WorldWideScience

Sample records for surveying methods and processes

  1. Survey: interpolation methods for whole slide image processing.

    Science.gov (United States)

    Roszkowiak, L; Korzynska, A; Zak, J; Pijanowska, D; Swiderska-Chadaj, Z; Markiewicz, T

    2017-02-01

    Evaluating whole slide images of histological and cytological samples is used in pathology for diagnostics, grading and prognosis. It is often necessary to rescale whole slide images of a very large size. Image resizing is one of the most common applications of interpolation. We collect the advantages and drawbacks of nine interpolation methods, and as a result of our analysis, we try to select one interpolation method as the preferred solution. To compare the performance of interpolation methods, test images were scaled and then rescaled to the original size using the same algorithm. The modified image was compared to the original image in various aspects. The time needed for calculations and results of quantification performance on modified images were also compared. For evaluation purposes, we used four general test images and 12 specialized biological immunohistochemically stained tissue sample images. The purpose of this survey is to determine which method of interpolation is the best to resize whole slide images, so they can be further processed using quantification methods. As a result, the interpolation method has to be selected depending on the task involving whole slide images. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.
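
    The paper's nine methods and quantification pipeline are not reproduced here; the sketch below only illustrates the scale-down-then-restore comparison the abstract describes, assuming Pillow (>= 9.1) and NumPy are installed and using a placeholder file name (slide_tile.png) for the input image.

    ```python
    # Minimal sketch of the resize-then-restore comparison described above.
    # Assumptions: Pillow >= 9.1 and NumPy installed; "slide_tile.png" is a placeholder path.
    import numpy as np
    from PIL import Image

    FILTERS = {
        "nearest": Image.Resampling.NEAREST,
        "bilinear": Image.Resampling.BILINEAR,
        "bicubic": Image.Resampling.BICUBIC,
        "lanczos": Image.Resampling.LANCZOS,
    }

    def mse(a, b):
        """Mean squared error between two images (as float arrays)."""
        return float(np.mean((a.astype(np.float64) - b.astype(np.float64)) ** 2))

    original = Image.open("slide_tile.png").convert("RGB")
    w, h = original.size
    ref = np.asarray(original)

    for name, flt in FILTERS.items():
        # Downscale to half size, then restore to the original size with the same filter.
        small = original.resize((w // 2, h // 2), resample=flt)
        restored = small.resize((w, h), resample=flt)
        print(f"{name:9s} MSE vs. original: {mse(ref, np.asarray(restored)):.2f}")
    ```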

  2. Survey of postharvest handling, preservation and processing ...

    African Journals Online (AJOL)

    Survey of postharvest handling, preservation and processing practices along the camel milk chain in Isiolo district, Kenya. ... Despite the important contribution of camel milk to food security for pastoralists in Kenya, little is known about the postharvest handling, preservation and processing practices. In this study, existing ...

  3. Survey Research: Methods, Issues and the Future

    Science.gov (United States)

    Brewer, Ernest W.; Torrisi-Steele, Geraldine; Wang, Victor C. X.

    2015-01-01

    Survey research is prevalent among many professional fields. Both cost effective and time efficient, this method of research is commonly used for the purposes of gaining insight into the attitudes, thoughts, and opinions of populations. Additionally, because there are several types of survey research designs and data collection instruments, the…

  4. Remote sensing, airborne radiometric survey and aeromagnetic survey data processing and analysis

    International Nuclear Information System (INIS)

    Dong Xiuzhen; Liu Dechang; Ye Fawang; Xuan Yanxiu

    2009-01-01

    Taking remote sensing data, airborne radiometric data and aeromagnetic survey data as an example, the authors elaborate on the basic thinking behind remote sensing data processing methods, spectral feature analysis and the adopted processing methods, explore the combination of remote sensing data with the processing of airborne radiometric and aeromagnetic survey data, and analyze the geological significance of the processed images. This is not only useful for geological environment research and uranium prospecting in the study area, but also a reference for applications in other areas. (authors)

  5. A Survey of Commonly Applied Methods for Software Process Improvement

    Science.gov (United States)

    1994-02-01

    modeling in the course of systems development and virtually every organization has a systems life-cycle model of some sort. But process definition as the...Taken to its logical conclusion, CASE could make coding as we now know it today virtually obsolete, just as third generation languages dramatically...promotion staff, and research and development (R&D) people in a team that worked on a design together from drawing board to dealer showroom. Many ac

  6. Microencapsulation and Electrostatic Processing Method

    Science.gov (United States)

    Morrison, Dennis R. (Inventor); Mosier, Benjamin (Inventor)

    2000-01-01

    Methods are provided for forming spherical multilamellar microcapsules having alternating hydrophilic and hydrophobic liquid layers, surrounded by flexible, semi-permeable hydrophobic or hydrophilic outer membranes which can be tailored specifically to control the diffusion rate. The methods of the invention rely on low shear mixing and liquid-liquid diffusion process and are particularly well suited for forming microcapsules containing both hydrophilic and hydrophobic drugs. These methods can be carried out in the absence of gravity and do not rely on density-driven phase separation, mechanical mixing or solvent evaporation phases. The methods include the process of forming, washing and filtering microcapsules. In addition, the methods contemplate coating microcapsules with ancillary coatings using an electrostatic field and free fluid electrophoresis of the microcapsules. The microcapsules produced by such methods are particularly useful in the delivery of pharmaceutical compositions.

  7. Survey of electronic payment methods and systems

    NARCIS (Netherlands)

    Havinga, Paul J.M.; Smit, Gerardus Johannes Maria; Helme, A.; Verbraeck, A.

    1996-01-01

    In this paper an overview of electronic payment methods and systems is given. This survey is done as part of the Moby Dick project. Electronic payment systems can be grouped into three broad classes: traditional money transactions, digital currency and credit-debit payments. Such payment systems have

  8. A survey of raw processing methods for kolanuts | Asamoah ...

    African Journals Online (AJOL)

    Although local preservatives such as wood ash, dried ground pepper, sliced lime and leaves of Jatropha were used in some areas of Ashanti to control storage insects, by far the commonest insect control agents in the two regions were chemicals which are dangerous to human health. Some of these chemicals are Gastoxin ...

  9. Image restoration and processing methods

    International Nuclear Information System (INIS)

    Daniell, G.J.

    1984-01-01

    This review will stress the importance of using image restoration techniques that deal with incomplete, inconsistent, and noisy data and do not introduce spurious features into the processed image. No single image is equally suitable for both the resolution of detail and the accurate measurement of intensities. A good general purpose technique is the maximum entropy method and the basis and use of this will be explained. (orig.)

  10. Radiological decontamination, survey, and statistical release method for vehicles

    International Nuclear Information System (INIS)

    Goodwill, M.E.; Lively, J.W.; Morris, R.L.

    1996-06-01

    Earth-moving vehicles (e.g., dump trucks, belly dumps) commonly haul radiologically contaminated materials from a site being remediated to a disposal site. Traditionally, each vehicle must be surveyed before being released. The logistical difficulties of implementing the traditional approach on a large scale demand that an alternative be devised. A statistical method for assessing product quality from a continuous process was adapted to the vehicle decontamination process. This method produced a sampling scheme that automatically compensates and accommodates fluctuating batch sizes and changing conditions without the need to modify or rectify the sampling scheme in the field. Vehicles are randomly selected (sampled) upon completion of the decontamination process to be surveyed for residual radioactive surface contamination. The frequency of sampling is based on the expected number of vehicles passing through the decontamination process in a given period and the confidence level desired. This process has been successfully used for 1 year at the former uranium millsite in Monticello, Utah (a cleanup site regulated under the Comprehensive Environmental Response, Compensation, and Liability Act). The method forces improvement in the quality of the decontamination process and results in a lower likelihood that vehicles exceeding the surface contamination standards are offered for survey. Implementation of this statistical sampling method on Monticello projects has resulted in more efficient processing of vehicles through decontamination and radiological release, saved hundreds of hours of processing time, provided a high level of confidence that release limits are met, and improved the radiological cleanliness of vehicles leaving the controlled site
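
    The report's actual sampling scheme and formulas are not given in the abstract; purely as a rough illustration of how a survey frequency can follow from an expected batch size and a desired confidence level, the sketch below uses a standard zero-failure binomial calculation with made-up numbers.

    ```python
    # Illustrative only: the report's actual sampling scheme is not reproduced here.
    # This sketch computes how many randomly selected vehicles must be surveyed so that,
    # if at least a fraction p_exceed of vehicles exceeded release limits, at least one
    # would be caught with the desired confidence (a zero-failure binomial calculation).
    import math
    import random

    def required_sample_size(p_exceed: float, confidence: float) -> int:
        """Smallest n with 1 - (1 - p_exceed)**n >= confidence."""
        return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p_exceed))

    expected_vehicles_per_period = 200          # assumed batch size for the period
    n = required_sample_size(p_exceed=0.05, confidence=0.95)
    fraction = min(1.0, n / expected_vehicles_per_period)
    print(f"Survey {n} vehicles (~{fraction:.0%} of the expected batch)")

    # Random selection at the computed fraction as vehicles exit decontamination.
    selected = [v for v in range(expected_vehicles_per_period) if random.random() < fraction]
    print(f"{len(selected)} vehicles flagged for survey this period")
    ```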

  11. Sample processing device and method

    DEFF Research Database (Denmark)

    2011-01-01

    A sample processing device is disclosed, which sample processing device comprises a first substrate and a second substrate, where the first substrate has a first surface comprising two area types, a first area type with a first contact angle with water and a second area type with a second contact angle with water, the first contact angle being smaller than the second contact angle. The first substrate defines an inlet system and a preparation system in areas of the first type, which two areas are separated by a barrier system in an area of the second type. The inlet system is adapted to receive a sample liquid comprising the sample, and the first preparation system is adapted to receive a receiving liquid. In a particular embodiment, a magnetic sample transport component, such as a permanent magnet or an electromagnet, is arranged to move magnetic beads in between the first and second substrates.

  12. Survey comparing team-based learning and lecture teaching methods on the learning-teaching process of nursing students in the Surgical and Internal Diseases course

    Directory of Open Access Journals (Sweden)

    AA Vaezi

    2015-12-01

    Introduction: The effect of teaching methods on the learning process of students will help teachers to improve the quality of teaching by selecting an appropriate method. This study aimed to compare the team-based learning and lecture teaching methods on the learning-teaching process of nursing students in surgical and internal diseases courses. Method: This quasi-experimental study was carried out on nursing students in the Schools of Nursing and Midwifery in Yazd and Meybod cities. The studied sample was all of the students in the sixth term in the Faculty of Nursing in Yazd (48 persons) and the Faculty of Nursing in Meybod (28 persons). The rate of students' learning through lecture was measured using MCQ tests, and teaching based on the team-based learning (TBL) method was run using MCQ tests (IRAT, GRAT, Appeals and Task group). In order to examine the students' satisfaction with the TBL method, a 5-point Likert scale (translated) questionnaire (1 = completely disagree, 2 = disagree, 3 = not effective, 4 = agree, and 5 = completely agree) consisting of 22 items was utilized. The reliability and validity of this translated questionnaire were measured. The collected data were analyzed through SPSS 17.0 using descriptive and analytical statistics. Result: The results showed that the mean scores in team-based learning were meaningful in the individual assessment (17±84) and the group assessment (17.2±1.17). The mean of overall scores in the TBL method (17.84±0.98) was higher compared with the lecture teaching method (16±2.31). Most of the students believed that the TBL method had improved their interpersonal and group interaction skills (100%). Among them, 97.7% of students mentioned that this method (TBL) helped them to understand the course content better. The lowest level of satisfaction related to continuous lifelong learning (51.2%). Conclusion: The results of the present study showed that the TBL method led to improving the communication skills, understanding

  13. Biological variables for the site survey of surface ecosystems - existing data and survey methods

    International Nuclear Information System (INIS)

    Kylaekorpi, Lasse; Berggren, Jens; Larsson, Mats; Liberg, Maria; Rydgren, Bernt

    2000-06-01

    In the process of selecting a safe and environmentally acceptable location for the deep level repository of nuclear waste, site surveys will be carried out. These site surveys will also include studies of the biota at the site, in order to assure that the chosen site will not conflict with important ecological interests, and to establish a thorough baseline for future impact assessments and monitoring programmes. As a preparation to the site survey programme, a review of the variables that need to be surveyed is conducted. This report contains the review for some of those variables. For each variable, existing data sources and their characteristics are listed. For those variables for which existing data sources are inadequate, suggestions are made for appropriate methods that will enable the establishment of an acceptable baseline. In this report the following variables are reviewed: Fishery, Landscape, Vegetation types, Key biotopes, Species (flora and fauna), Red-listed species (flora and fauna), Biomass (flora and fauna), Water level, water retention time (incl. water body and flow), Nutrients/toxins, Oxygen concentration, Layering, stratification, Light conditions/transparency, Temperature, Sediment transport, (Marine environments are excluded from this review). For a major part of the variables, the existing data coverage is most likely insufficient. Both the temporal and/or the geographical resolution is often limited, which means that complementary surveys must be performed during (or before) the site surveys. It is, however, in general difficult to make exact judgements on the extent of existing data, and also to give suggestions for relevant methods to use in the site surveys. This can be finally decided only when the locations for the sites are decided upon. The relevance of the different variables also depends on the environmental characteristics of the sites. Therefore, we suggest that when the survey sites are selected, an additional review is

  14. Biological variables for the site survey of surface ecosystems - existing data and survey methods

    Energy Technology Data Exchange (ETDEWEB)

    Kylaekorpi, Lasse; Berggren, Jens; Larsson, Mats; Liberg, Maria; Rydgren, Bernt [SwedPower AB, Stockholm (Sweden)

    2000-06-01

    In the process of selecting a safe and environmentally acceptable location for the deep level repository of nuclear waste, site surveys will be carried out. These site surveys will also include studies of the biota at the site, in order to assure that the chosen site will not conflict with important ecological interests, and to establish a thorough baseline for future impact assessments and monitoring programmes. As a preparation to the site survey programme, a review of the variables that need to be surveyed is conducted. This report contains the review for some of those variables. For each variable, existing data sources and their characteristics are listed. For those variables for which existing data sources are inadequate, suggestions are made for appropriate methods that will enable the establishment of an acceptable baseline. In this report the following variables are reviewed: Fishery, Landscape, Vegetation types, Key biotopes, Species (flora and fauna), Red-listed species (flora and fauna), Biomass (flora and fauna), Water level, water retention time (incl. water body and flow), Nutrients/toxins, Oxygen concentration, Layering, stratification, Light conditions/transparency, Temperature, Sediment transport, (Marine environments are excluded from this review). For a major part of the variables, the existing data coverage is most likely insufficient. Both the temporal and/or the geographical resolution is often limited, which means that complementary surveys must be performed during (or before) the site surveys. It is, however, in general difficult to make exact judgements on the extent of existing data, and also to give suggestions for relevant methods to use in the site surveys. This can be finally decided only when the locations for the sites are decided upon. The relevance of the different variables also depends on the environmental characteristics of the sites. Therefore, we suggest that when the survey sites are selected, an additional review is

  15. Research Methods in Healthcare Epidemiology: Survey and Qualitative Research.

    Science.gov (United States)

    Safdar, Nasia; Abbo, Lilian M; Knobloch, Mary Jo; Seo, Susan K

    2016-11-01

    Surveys are one of the most frequently employed study designs in healthcare epidemiology research. Generally easier to undertake and less costly than many other study designs, surveys can be invaluable to gain insights into opinions and practices in large samples and may be descriptive and/or be used to test associations. In this context, qualitative research methods may complement this study design either at the survey development phase and/or at the interpretation/extension of results stage. This methods article focuses on key considerations for designing and deploying surveys in healthcare epidemiology and antibiotic stewardship, including identification of whether or not de novo survey development is necessary, ways to optimally lay out and display a survey, denominator measurement, discussion of biases to keep in mind particularly in research using surveys, and the role of qualitative research methods to complement surveys. We review examples of surveys in healthcare epidemiology and antimicrobial stewardship and review the pros and cons of methods used. A checklist is provided to aid the design and deployment of surveys in healthcare epidemiology and antimicrobial stewardship. Infect Control Hosp Epidemiol 2016;1-6.

  16. Multivariate Statistical Process Control Process Monitoring Methods and Applications

    CERN Document Server

    Ge, Zhiqiang

    2013-01-01

      Given their key position in the process control industry, process monitoring techniques have been extensively investigated by industrial practitioners and academic control researchers. Multivariate statistical process control (MSPC) is one of the most popular data-based methods for process monitoring and is widely used in various industrial areas. Effective routines for process monitoring can help operators run industrial processes efficiently at the same time as maintaining high product quality. Multivariate Statistical Process Control reviews the developments and improvements that have been made to MSPC over the last decade, and goes on to propose a series of new MSPC-based approaches for complex process monitoring. These new methods are demonstrated in several case studies from the chemical, biological, and semiconductor industrial areas.   Control and process engineers, and academic researchers in the process monitoring, process control and fault detection and isolation (FDI) disciplines will be inter...
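
    The book's own case studies are not reproduced here; as context for what a data-based MSPC routine looks like, the sketch below fits PCA on synthetic "in-control" data and flags new samples whose Hotelling's T-squared (T2) or squared prediction error (SPE/Q) exceeds empirical 99th-percentile limits. The data, component count and limit choice are all assumptions.

    ```python
    # Minimal PCA-based process monitoring sketch (one classical MSPC routine):
    # fit PCA on normal-operation data, then flag new samples whose Hotelling's T^2
    # or squared prediction error (SPE/Q) exceeds empirical control limits.
    # Synthetic data and the 99th-percentile limits are illustrative assumptions.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    normal = rng.normal(size=(500, 10))                 # "in-control" training data
    faulty = rng.normal(size=(50, 10)) + 3.0            # shifted "faulty" samples

    mean, std = normal.mean(axis=0), normal.std(axis=0)
    def scale(x):
        return (x - mean) / std

    pca = PCA(n_components=3).fit(scale(normal))

    def t2_and_spe(x):
        scores = pca.transform(scale(x))
        t2 = np.sum(scores**2 / pca.explained_variance_, axis=1)   # Hotelling's T^2
        recon = pca.inverse_transform(scores)
        spe = np.sum((scale(x) - recon) ** 2, axis=1)              # SPE / Q statistic
        return t2, spe

    t2_train, spe_train = t2_and_spe(normal)
    t2_lim, spe_lim = np.percentile(t2_train, 99), np.percentile(spe_train, 99)

    t2_new, spe_new = t2_and_spe(faulty)
    alarms = (t2_new > t2_lim) | (spe_new > spe_lim)
    print(f"{alarms.mean():.0%} of faulty samples raise an alarm")
    ```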

  17. Fast and accurate methods of independent component analysis: A survey

    Czech Academy of Sciences Publication Activity Database

    Tichavský, Petr; Koldovský, Zbyněk

    2011-01-01

    Roč. 47, č. 3 (2011), s. 426-438 ISSN 0023-5954 R&D Projects: GA MŠk 1M0572; GA ČR GA102/09/1278 Institutional research plan: CEZ:AV0Z10750506 Keywords : Blind source separation * artifact removal * electroencephalogram * audio signal processing Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.454, year: 2011 http://library.utia.cas.cz/separaty/2011/SI/tichavsky-fast and accurate methods of independent component analysis a survey.pdf
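
    The surveyed algorithms themselves are only available through the link above; purely as context for the blind source separation task the survey addresses, the sketch below runs scikit-learn's FastICA on two synthetic mixed signals (not the authors' methods).

    ```python
    # Minimal blind source separation demo with FastICA (scikit-learn), shown only as
    # context for the ICA methods surveyed above; synthetic signals, not the authors' code.
    import numpy as np
    from sklearn.decomposition import FastICA

    t = np.linspace(0, 8, 2000)
    s1 = np.sin(2 * np.pi * 1.0 * t)                  # sinusoidal source
    s2 = np.sign(np.sin(2 * np.pi * 0.3 * t))         # square-wave source
    S = np.c_[s1, s2] + 0.05 * np.random.default_rng(0).normal(size=(2000, 2))

    A = np.array([[1.0, 0.5], [0.4, 1.0]])            # "unknown" mixing matrix
    X = S @ A.T                                       # observed mixtures

    ica = FastICA(n_components=2, random_state=0)
    S_est = ica.fit_transform(X)                      # estimated sources (up to scale/order)

    # Correlate estimates with true sources to check recovery quality.
    corr = np.corrcoef(np.c_[S, S_est].T)[:2, 2:]
    print(np.round(np.abs(corr), 2))
    ```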

  18. A Survey of Fish Production and Processing Machinery in Rivers ...

    African Journals Online (AJOL)

    Survey of fish production and processing machinery in Port Harcourt City Local Government Area of Rivers State, Nigeria was carried out to evaluate the followings: different machines used for fish production and processing, the most acceptable machine, effect of cost of machinery on the fish farmer, whether gender has ...

  19. Survey and Method for Determination of Trajectory Predictor Requirements

    Science.gov (United States)

    Rentas, Tamika L.; Green, Steven M.; Cate, Karen Tung

    2009-01-01

    A survey of air-traffic-management researchers, representing a broad range of automation applications, was conducted to document trajectory-predictor requirements for future decision-support systems. Results indicated that the researchers were unable to articulate a basic set of trajectory-prediction requirements for their automation concepts. Survey responses showed the need to establish a process to help developers determine the trajectory-predictor-performance requirements for their concepts. Two methods for determining trajectory-predictor requirements are introduced. A fast-time simulation method is discussed that captures the sensitivity of a concept to the performance of its trajectory-prediction capability. A characterization method is proposed to provide quicker, yet less precise results, based on analysis and simulation to characterize the trajectory-prediction errors associated with key modeling options for a specific concept. Concept developers can then identify the relative sizes of errors associated with key modeling options, and qualitatively determine which options lead to significant errors. The characterization method is demonstrated for a case study involving future airport surface traffic management automation. Of the top four sources of error, results indicated that the error associated with accelerations to and from turn speeds was unacceptable, the error associated with the turn path model was acceptable, and the error associated with taxi-speed estimation was of concern and needed a higher fidelity concept simulation to obtain a more precise result

  20. Digital image processing mathematical and computational methods

    CERN Document Server

    Blackledge, J M

    2005-01-01

    This authoritative text (the second part of a complete MSc course) provides mathematical methods required to describe images, image formation and different imaging systems, coupled with the principle techniques used for processing digital images. It is based on a course for postgraduates reading physics, electronic engineering, telecommunications engineering, information technology and computer science. This book relates the methods of processing and interpreting digital images to the 'physics' of imaging systems. Case studies reinforce the methods discussed, with examples of current research

  1. Methods of Analysis by the U.S. Geological Survey National Water Quality Laboratory - Processing, Taxonomy, and Quality Control of Benthic Macroinvertebrate Samples

    Science.gov (United States)

    2000-01-01

    Press, 330 p. APPENDIXES 37 Kondratieff, B.C., and Voshell, J.R., Jr., 1984, The North and Central American species of Isonychia (Ephemeroptera...Natural History Survey of Connecticut Bulletin, v. 107, p. 1–261. Kondratieff, B.C., and Nelson, C.H., 1995, A review of the genus Remenus Ricker...Plecoptera: Perlodidae), with the description of two new species: Proceedings of the Entomological Society of Washington, v. 97, p. 596–602. Kondratieff, B.C

  2. Processing module operating methods, processing modules, and communications systems

    Science.gov (United States)

    McCown, Steven Harvey; Derr, Kurt W.; Moore, Troy

    2014-09-09

    A processing module operating method includes using a processing module physically connected to a wireless communications device, requesting that the wireless communications device retrieve encrypted code from a web site and receiving the encrypted code from the wireless communications device. The wireless communications device is unable to decrypt the encrypted code. The method further includes using the processing module, decrypting the encrypted code, executing the decrypted code, and preventing the wireless communications device from accessing the decrypted code. Another processing module operating method includes using a processing module physically connected to a host device, executing an application within the processing module, allowing the application to exchange user interaction data communicated using a user interface of the host device with the host device, and allowing the application to use the host device as a communications device for exchanging information with a remote device distinct from the host device.

  3. Process tracing methods: foundation and guidelines

    DEFF Research Database (Denmark)

    Beach, Derek; Pedersen, Rasmus Brun

    Derek Beach and Rasmus Brun Pedersen have written the first practical guide for using process tracing in social science research. The book introduces a more refined definition of what process tracing methods are, differentiating it into three variants, showing the uses and limitations of each...... a set of tools for how the three variants of process tracing methods can be used in research, introducing a set of practical guidelines for each stage of the research process (working with theories, developing empirical tests, working with evidence, and case selection strategies, nesting case studies...

  4. Data acquisition and processing - helicopter radiometric survey, Krageroe, 1998

    Energy Technology Data Exchange (ETDEWEB)

    Beard, Les P.; Mogaard, John Olav

    2000-07-01

    On 07 October 1998 a helicopter radiometric survey was flown in the vicinity of Krageroe municipality. The purpose of the survey was to provide radiometric information to help assess radon hazard from radioactive rocks in the area. A total of 60 line-kilometres of radiometric data were acquired in a single flight, covering an area of approximately 3 square km with a 50-m line spacing. The data were collected by Geological Survey of Norway (NGU) personnel and processed at NGU. Radiometric data were reduced using the three-channel procedure recommended by the International Atomic Energy Agency. All data were gridded using square cells with 30-m sides and geophysical maps were produced at a scale of 1:5000. This report covers aspects of data acquisition and processing (Author)
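
    NGU's processing chain is not reproduced here; as a simple illustration of the final gridding step only (averaging point measurements into 30-m square cells), the sketch below bins synthetic along-line values with NumPy. All values are made up.

    ```python
    # Illustration of the gridding step only (30-m square cells), not NGU's actual code.
    # Synthetic point data stand in for the reduced radiometric measurements.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 5000
    x = rng.uniform(0, 3000, n)          # easting within a ~3 km block (metres)
    y = rng.uniform(0, 1000, n)          # northing (metres)
    counts = rng.gamma(shape=2.0, scale=50.0, size=n)   # stand-in for reduced count rates

    cell = 30.0                          # grid cell size in metres
    ix = (x // cell).astype(int)
    iy = (y // cell).astype(int)
    nx, ny = ix.max() + 1, iy.max() + 1

    # Accumulate sums and sample counts per cell, then take the mean.
    sums = np.zeros((ny, nx))
    hits = np.zeros((ny, nx))
    np.add.at(sums, (iy, ix), counts)
    np.add.at(hits, (iy, ix), 1)
    grid = np.where(hits > 0, sums / np.maximum(hits, 1), np.nan)

    print(f"grid shape: {grid.shape}, cells with data: {int((hits > 0).sum())}")
    ```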

  5. Data acquisition and processing - helicopter radiometric survey, Krageroe, 1998

    International Nuclear Information System (INIS)

    Beard, Les P.; Mogaard, John Olav

    2000-01-01

    On 07 October 1998 a helicopter radiometric survey was flown in the vicinity of Krageroe municipality. The purpose of the survey was to provide radiometric information to help assess radon hazard from radioactive rocks in the area. A total of 60 line-kilometres of radiometric data were acquired in a single flight, covering an area of approximately 3 square km with a 50-m line spacing. The data were collected by Geological Survey of Norway (NGU) personnel and processed at NGU. Radiometric data were reduced using the three-channel procedure recommended by the International Atomic Energy Agency. All data were gridded using square cells with 30-m sides and geophysical maps were produced at a scale of 1:5000. This report covers aspects of data acquisition and processing (Author)

  6. Method and apparatus for processing algae

    Science.gov (United States)

    Chew, Geoffrey; Reich, Alton J.; Dykes, Jr., H. Waite; Di Salvo, Roberto

    2012-07-03

    Methods and apparatus for processing algae are described in which a hydrophilic ionic liquid is used to lyse algae cells. The lysate separates into at least two layers including a lipid-containing hydrophobic layer and an ionic liquid-containing hydrophilic layer. A salt or salt solution may be used to remove water from the ionic liquid-containing layer before the ionic liquid is reused. The used salt may also be dried and/or concentrated and reused. The method can operate at relatively low lysis, processing, and recycling temperatures, which minimizes the environmental impact of algae processing while providing reusable biofuels and other useful products.

  7. ADASHI: User Survey and Focus Group Process with Final Results

    National Research Council Canada - National Science Library

    Buckless, Genna

    2004-01-01

    ...; Focus Group Assessment; and Summary of Results. The entire process was successful. The developers gained useful insight into the needs of the responder community. The results of this survey will enable the developers of ADASHI to make necessary improvements in the software making ADASHI a valuable tool for its users.

  8. Classification of engineering and hydrometeorological survey results regarding channel processes

    Directory of Open Access Journals (Sweden)

    Kondratiev A.N.

    2016-12-01

    According to the author, modern normative documents consider the profile of maximum possible erosion to be the result of engineering and hydrometeorological surveys in the part concerning channel processes. The article presents a classification of such results: selecting the target set of maximum-erosion profiles, the maximum-erosion profile for a given target, part of the profile, erosion in plan, and the demonstration of the safety of erosion or the absence of erosion.

  9. Literature Review on Processing and Analytical Methods for ...

    Science.gov (United States)

    The purpose of this report was to survey the open literature to determine the current state of the science regarding the processing and analytical methods currently available for recovery of F. tularensis from water and soil matrices, and to determine what gaps remain in the collective knowledge concerning F. tularensis identification from environmental samples.

  10. The IMACS Cluster Building Survey. I. Description of the Survey and Analysis Methods

    Science.gov (United States)

    Oemler Jr., Augustus; Dressler, Alan; Gladders, Michael G.; Rigby, Jane R.; Bai, Lei; Kelson, Daniel; Villanueva, Edward; Fritz, Jacopo; Rieke, George; Poggianti, Bianca M.

    2013-01-01

    The IMACS Cluster Building Survey uses the wide field spectroscopic capabilities of the IMACS spectrograph on the 6.5 m Baade Telescope to survey the large-scale environment surrounding rich intermediate-redshift clusters of galaxies. The goal is to understand the processes which may be transforming star-forming field galaxies into quiescent cluster members as groups and individual galaxies fall into the cluster from the surrounding supercluster. This first paper describes the survey: the data taking and reduction methods. We provide new calibrations of star formation rates (SFRs) derived from optical and infrared spectroscopy and photometry. We demonstrate that there is a tight relation between the observed SFR per unit B luminosity and the ratio of the extinctions of the stellar continuum and the optical emission lines. With this, we can obtain accurate extinction-corrected colors of galaxies. Using these colors as well as other spectral measures, we determine new criteria for the existence of ongoing and recent starbursts in galaxies.

  11. The Dark Energy Survey Data Processing and Calibration System

    Energy Technology Data Exchange (ETDEWEB)

    Mohr, Joseph J. [Munich U.; Armstrong, Robert [Penn State U.; Bertin, Emmanuel [Paris, Inst. Astrophys.; Daues, Gregory E. [NCSA, Urbana; Desai, Shantanu [Munich U.; Gower, Michelle [NCSA, Urbana; Gruendl, Robert [Illinois U., Urbana (main); Hanlon, William [Illinois U., Urbana (main); Kuropatkin, Nikolay [Fermilab; Lin, Huan [Fermilab; Marriner, John [Fermilab; Petravick, Don; Sevilla, Ignacio [Madrid, CIEMAT; Swanson, Molly [Harvard-Smithsonian Ctr. Astrophys.; Tomashek, Todd [NCSA, Urbana; Tucker, Douglas [Fermilab; Yanny, Brian [Fermilab

    2012-09-24

    The Dark Energy Survey (DES) is a 5000 deg2 grizY survey reaching characteristic photometric depths of 24th magnitude (10 sigma) and enabling accurate photometry and morphology of objects ten times fainter than in SDSS. Preparations for DES have included building a dedicated 3 deg2 CCD camera (DECam), upgrading the existing CTIO Blanco 4m telescope and developing a new high performance computing (HPC) enabled data management system (DESDM). The DESDM system will be used for processing, calibrating and serving the DES data. The total data volumes are high (~2PB), and so considerable effort has gone into designing an automated processing and quality control system. Special purpose image detrending and photometric calibration codes have been developed to meet the data quality requirements, while survey astrometric calibration, coaddition and cataloging rely on new extensions of the AstrOmatic codes which now include tools for PSF modeling, PSF homogenization, PSF corrected model fitting cataloging and joint model fitting across multiple input images. The DESDM system has been deployed on dedicated development clusters and HPC systems in the US and Germany. An extensive program of testing with small rapid turn-around and larger campaign simulated datasets has been carried out. The system has also been tested on large real datasets, including Blanco Cosmology Survey data from the Mosaic2 camera. In Fall 2012 the DESDM system will be used for DECam commissioning, and, thereafter, the system will go into full science operations.

  12. Beowulf Distributed Processing and the United States Geological Survey

    Science.gov (United States)

    Maddox, Brian G.

    2002-01-01

    Introduction In recent years, the United States Geological Survey's (USGS) National Mapping Discipline (NMD) has expanded its scientific and research activities. Work is being conducted in areas such as emergency response research, scientific visualization, urban prediction, and other simulation activities. Custom-produced digital data have become essential for these types of activities. High-resolution, remotely sensed datasets are also seeing increased use. Unfortunately, the NMD is also finding that it lacks the resources required to perform some of these activities. Many of these projects require large amounts of computer processing resources. Complex urban-prediction simulations, for example, involve large amounts of processor-intensive calculations on large amounts of input data. This project was undertaken to learn and understand the concepts of distributed processing. Experience was needed in developing these types of applications. The idea was that this type of technology could significantly aid the needs of the NMD scientific and research programs. Porting a numerically intensive application currently being used by an NMD science program to run in a distributed fashion would demonstrate the usefulness of this technology. There are several benefits that this type of technology can bring to the USGS's research programs. Projects can be performed that were previously impossible due to a lack of computing resources. Other projects can be performed on a larger scale than previously possible. For example, distributed processing can enable urban dynamics research to perform simulations on larger areas without making huge sacrifices in resolution. The processing can also be done in a more reasonable amount of time than with traditional single-threaded methods (a scaled version of Chester County, Pennsylvania, took about fifty days to finish its first calibration phase with a single-threaded program). This paper has several goals regarding distributed processing

  13. METHOD AND DEVICE FOR PROCESSING A SIGNAL

    NARCIS (Netherlands)

    Van Nee, D.J.R.

    1995-01-01

    Abstract of CA 2109759 (A1) A method and device for processing a signal are described, wherein an estimate of a multipath-induced contribution to a demodulated navigation signal is calculated and subtracted from said demodulated navigation signal to obtain an estimated line of sight contribution to

  14. A survey of infrared and visual image fusion methods

    Science.gov (United States)

    Jin, Xin; Jiang, Qian; Yao, Shaowen; Zhou, Dongming; Nie, Rencan; Hai, Jinjin; He, Kangjian

    2017-09-01

    Infrared (IR) and visual (VI) image fusion is designed to fuse multiple source images into a comprehensive image to boost imaging quality and reduce redundant information, and it is widely used in various imaging equipment to improve the visual ability of humans and robots. The accurate, reliable and complementary descriptions of the scene in fused images make these techniques widely used in various fields. In recent years, a large number of fusion methods for IR and VI images have been proposed due to ever-growing demands and the progress of image representation methods; however, no integrated survey of this field has been published in the last several years. Therefore, we survey and report the algorithmic developments of IR and VI image fusion. In this paper, we first characterize the applications based on IR and VI image fusion to give an overview of the research status. Then we present a synthesized survey of the state of the art. Thirdly, the frequently used image fusion quality measures are introduced. Fourthly, we perform experiments with typical methods and make the corresponding analysis. At last, we summarize the corresponding tendencies and challenges in IR and VI image fusion. This survey concludes that although various IR and VI image fusion methods have been proposed, there still exist further improvements or potential research directions in different applications of IR and VI image fusion.
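
    None of the surveyed fusion algorithms is reproduced here; the sketch below shows only the simplest pixel-level baselines (weighted averaging and per-pixel maximum) that such methods are commonly compared against, together with image entropy as one example quality measure. The input arrays are synthetic stand-ins for co-registered IR and VI images.

    ```python
    # Simplest pixel-level fusion baselines (weighted average and per-pixel maximum),
    # shown only as a point of reference for the surveyed methods; the input arrays
    # are synthetic stand-ins for co-registered IR and VI images of identical size.
    import numpy as np

    rng = np.random.default_rng(0)
    ir = rng.uniform(0.0, 1.0, size=(256, 256))   # infrared image (normalized)
    vi = rng.uniform(0.0, 1.0, size=(256, 256))   # visible image (normalized)

    fused_avg = 0.5 * ir + 0.5 * vi               # weighted-average fusion
    fused_max = np.maximum(ir, vi)                # choose-max fusion

    def entropy(img, bins=256):
        """Shannon entropy, one of the common fusion quality measures."""
        hist, _ = np.histogram(img, bins=bins, range=(0.0, 1.0))
        p = hist / hist.sum()
        p = p[p > 0]
        return float(-(p * np.log2(p)).sum())

    for name, img in [("average", fused_avg), ("maximum", fused_max)]:
        print(f"{name:8s} fusion entropy: {entropy(img):.2f}")
    ```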

  15. Survey of Processing Methods for High Strength High Conductivity Wires for High Field Magnet Applications

    International Nuclear Information System (INIS)

    Han, K.; Embury, J.D.

    1998-01-01

    This paper will deal with the basic concepts of attaining a combination of high strength and high conductivity in pure materials, in-situ composites and macrocomposites. It will survey current attainments, and outline where some future developments may lie in developing wire products that are close to the theoretical strength of future magnet applications

  16. Survey of Processing Methods for High Strength High Conductivity Wires for High Field Magnet Applications

    Energy Technology Data Exchange (ETDEWEB)

    Han, K.; Embury, J.D.

    1998-10-01

    This paper will deal with the basic concepts of attaining a combination of high strength and high conductivity in pure materials, in-situ composites and macrocomposites. It will survey current attainments, and outline where some future developments may lie in developing wire products that are close to the theoretical strength of future magnet applications.

  17. Survey and assessment of conventional software verification and validation methods

    International Nuclear Information System (INIS)

    Miller, L.A.; Groundwater, E.; Mirsky, S.M.

    1993-04-01

    By means of a literature survey, a comprehensive set of methods was identified for the verification and validation of conventional software. The 134 methods so identified were classified according to their appropriateness for various phases of a developmental lifecycle -- requirements, design, and implementation; the last category was subdivided into two, static testing and dynamic testing methods. The methods were then characterized in terms of eight rating factors, four concerning ease-of-use of the methods and four concerning the methods' power to detect defects. Based on these factors, two measurements were developed to permit quantitative comparisons among methods, a Cost-Benefit Metric and an Effectiveness Metric. The Effectiveness Metric was further refined to provide three different estimates for each method, depending on three classes of needed stringency of V&V (determined by ratings of a system's complexity and required integrity). Methods were then rank-ordered for each of the three classes in terms of their overall cost-benefits and effectiveness. The applicability was then assessed of each method for the four identified components of knowledge-based and expert systems, as well as the system as a whole.
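
    The report's actual metric formulas are not given in the abstract; purely as an illustration of the idea of rating methods on ease-of-use and defect-detection factors and then rank-ordering them, the sketch below uses a made-up composite score and hypothetical method names.

    ```python
    # Illustrative only: the report's actual Cost-Benefit and Effectiveness formulas are
    # not given in the abstract. This sketch just shows the general idea of rating each
    # V&V method on ease-of-use and defect-detection factors and rank-ordering the results.
    from dataclasses import dataclass

    @dataclass
    class MethodRating:
        name: str
        ease: list[float]      # four ease-of-use factor ratings (1-5)
        power: list[float]     # four defect-detection factor ratings (1-5)

        def composite(self) -> float:
            # Made-up composite: average detection power relative to average effort.
            effort = 6.0 - sum(self.ease) / len(self.ease)     # higher ease => lower effort
            return (sum(self.power) / len(self.power)) / effort

    # Hypothetical example methods and ratings, not taken from the report.
    ratings = [
        MethodRating("Requirements tracing",   [4, 4, 3, 4], [3, 3, 2, 3]),
        MethodRating("Design inspection",      [3, 3, 3, 2], [4, 4, 3, 4]),
        MethodRating("Dynamic branch testing", [2, 3, 2, 2], [4, 5, 4, 4]),
    ]

    for m in sorted(ratings, key=MethodRating.composite, reverse=True):
        print(f"{m.name:22s} composite score: {m.composite():.2f}")
    ```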

  18. Process control in municipal solid waste incinerators: survey and assessment.

    Science.gov (United States)

    El Asri, R; Baxter, D

    2004-06-01

    As there is only rare and scattered published information about process control in industrial incineration facilities for municipal solid waste (MSW), a survey of the literature has been supplemented by a number of waste incineration site visits in Belgium and The Netherlands, in order to make a realistic assessment of the current status of technology in the area. Owing to the commercial character of the plants, and therefore the confidentiality restrictions imposed by plant builders and many of the operators, much of the information collected has to be presented in a generalized manner and, in any case, anonymously. The survey focused on four major issues: process control strategy, process control systems, monitors used for process control and, finally, the correlation between the 850 degrees C/2 s rule in the European waste incineration directive and integrated process control. The process control strategies range from reaching good and stable emissions at the stack to stabilizing and maximizing the energy output from the process. The main indicator to be monitored, in cases in which the focus is controlling emissions, is the oxygen content in the stack. Keeping the oxygen concentration in a determined range (usually between 8 and 12 vol.%) ensures stable and tolerated concentrations of the gaseous emissions. In cases for which stabilization of energy production is the principal aim, the main controlled parameters are the steam temperature and flow rate, which are usually related to the energy input of the fuel. Many other parameters are used as alarm criteria, the most common of which is the carbon monoxide concentration. The process control systems used most commonly feature partially automated classical proportional integral derivative (PID) controllers. New and innovative process control systems, such as fuzzy-logic control systems, are still unknown to most plant managers while their performance is reported to be unsatisfactory in plants in which such systems

  19. Comparing Traditional and Crowdsourcing Methods for Pretesting Survey Questions

    Directory of Open Access Journals (Sweden)

    Jennifer Edgar

    2016-10-01

    Cognitive interviewing is a common method used to evaluate survey questions. This study compares traditional cognitive interviewing methods with crowdsourcing, or “tapping into the collective intelligence of the public to complete a task.” Crowdsourcing may provide researchers with access to a diverse pool of potential participants in a very timely and cost-efficient way. Exploratory work found that crowdsourcing participants, with self-administered data collection, may be a viable alternative, or addition, to traditional pretesting methods. Using three crowdsourcing designs (TryMyUI, Amazon Mechanical Turk, and Facebook), we compared the participant characteristics, costs, and quantity and quality of data with traditional laboratory-based cognitive interviews. Results suggest that crowdsourcing and self-administered protocols may be a viable way to collect survey pretesting information, as participants were able to complete the tasks and provide useful information; however, complex tasks may require the skills of an interviewer to administer unscripted probes.

  20. Advances in Packaging Methods, Processes and Systems

    Directory of Open Access Journals (Sweden)

    Nitaigour Premchand Mahalik

    2014-10-01

    The food processing and packaging industry is becoming a multi-trillion dollar global business. The reason is that the recent increase in incomes in traditionally less economically developed countries has led to a rise in standards of living that includes a significantly higher consumption of packaged foods. As a result, food safety guidelines have been more stringent than ever. At the same time, the number of research and educational institutions—that is, the number of potential researchers and stakeholders—has increased in the recent past. This paper reviews recent developments in food processing and packaging (FPP), keeping in view the aforementioned advancements and bearing in mind that FPP is an interdisciplinary area in that materials, safety, systems, regulation, and supply chains play vital roles. In particular, the review covers processing and packaging principles, standards, interfaces, techniques, methods, and state-of-the-art technologies that are currently in use or in development. Recent advances such as smart packaging, non-destructive inspection methods, printing techniques, application of robotics and machinery, automation architecture, software systems and interfaces are reviewed.

  1. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations (COSO), 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that although the introduction of quantifying risk to enhance the degree of objectivity in finance, for instance, was quite parallel to its development in the manufacturing industry, the same is not true in Higher Education Institutions (HEI). In this regard, the objective of the paper was to demonstrate the methods and process of Quantitative Risk Analysis (QRA) through the likelihood of occurrence of risk (phase I). This paper serves as the first of a two-phased study, which sampled one hundred (100) risk analysts in a University in the greater Eastern Cape Province of South Africa. The analysis of the likelihood of occurrence of risk by logistic regression and percentages was conducted to investigate whether or not there was a significant difference between groups (analysts) in respect of QRA. The Hosmer and Lemeshow test was non-significant with a chi-square (X2 = 8.181; p = 0.300), which indicated a good model fit, since the data did not deviate significantly from the model. The study concluded that to derive an overall likelihood rating that indicates the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat source motivation and capability, (2) nature of the vulnerability, and (3) existence and effectiveness of current controls (methods and process).
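
    The paper's dataset and model are not available here; purely as a sketch of the kind of analysis described (logistic regression with a Hosmer and Lemeshow goodness-of-fit check), the code below fits a model on synthetic data, assuming NumPy, SciPy and scikit-learn are installed.

    ```python
    # Sketch of the kind of analysis described above: fit a logistic regression and check
    # calibration with a Hosmer-Lemeshow test. Synthetic data; not the paper's dataset.
    import numpy as np
    from scipy import stats
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))                       # e.g. ratings of governing factors
    logit = 0.8 * X[:, 0] - 0.5 * X[:, 1] + 0.3
    y = rng.binomial(1, 1 / (1 + np.exp(-logit)))       # risk occurred (1) or not (0)

    model = LogisticRegression().fit(X, y)
    p = model.predict_proba(X)[:, 1]

    def hosmer_lemeshow(y_true, p_hat, groups=10):
        """Hosmer-Lemeshow chi-square over deciles of predicted probability."""
        order = np.argsort(p_hat)
        chunks = np.array_split(order, groups)
        chi2 = 0.0
        for idx in chunks:
            obs, exp, n = y_true[idx].sum(), p_hat[idx].sum(), len(idx)
            pi = exp / n
            chi2 += (obs - exp) ** 2 / (n * pi * (1 - pi))
        return chi2, stats.chi2.sf(chi2, groups - 2)

    chi2, p_value = hosmer_lemeshow(y, p)
    print(f"Hosmer-Lemeshow chi-square = {chi2:.3f}, p = {p_value:.3f}")
    ```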

  2. Methods and systems for the processing of physiological signals

    International Nuclear Information System (INIS)

    Cosnac, B. de; Gariod, R.; Max, J.; Monge, V.

    1975-01-01

    This note is a general survey of the processing of physiological signals. After an introduction about electrodes and their limitations, the physiological nature of the main signals is briefly recalled. Different methods (signal averaging, spectral analysis, morphological shape analysis) are described as applications to the fields of magnetocardiography, electroencephalography, cardiography and electronystagmography. As for processing means (single portable instruments and programmable systems), they are described through the example of application to rheography and to the Plurimat'S general system. In conclusion, the methods of signal processing are dominated by the morphological analysis of curves and by the need for a wider introduction of statistical classification. As for the instruments, microprocessors will appear, but specific operators linked to computers will certainly grow [fr
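
    As a small illustration of the signal averaging method mentioned above (averaging many stimulus-locked epochs so that uncorrelated noise cancels and the repeatable response remains), the sketch below uses a made-up evoked waveform, sampling rate and noise level.

    ```python
    # Minimal sketch of signal averaging, one of the methods mentioned above:
    # averaging many stimulus-locked epochs suppresses uncorrelated noise and
    # leaves the repeatable physiological response. Waveform and counts are made up.
    import numpy as np

    rng = np.random.default_rng(0)
    fs = 500                                    # sampling rate (Hz), assumed
    t = np.arange(0, 0.6, 1 / fs)               # 600 ms epoch
    evoked = 5e-6 * np.exp(-((t - 0.3) ** 2) / (2 * 0.03**2))   # small evoked "bump" (V)

    n_epochs = 200
    epochs = evoked + 20e-6 * rng.normal(size=(n_epochs, t.size))  # epochs buried in noise

    average = epochs.mean(axis=0)

    def snr_db(sig, noise_ref):
        return 20 * np.log10(np.abs(sig).max() / noise_ref.std())

    print(f"single-epoch SNR ~ {snr_db(epochs[0], epochs[0] - evoked):.1f} dB")
    print(f"averaged SNR     ~ {snr_db(average, average - evoked):.1f} dB")
    ```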

  3. Survey and evaluation of aging risk assessment methods and applications

    International Nuclear Information System (INIS)

    Sanzo, D.; Kvam, P.; Apostolakis, G.; Wu, J.; Milici, T.; Ghoniem, N.; Guarro, S.

    1994-11-01

    The US Nuclear Regulatory Commission initiated the nuclear power plant aging research program about 6 years ago to gather information about nuclear power plant aging. Since then, this program has collected a significant amount of information, largely qualitative, on plant aging and its potential effects on plant safety. However, this body of knowledge has not yet been integrated into formalisms that can be used effectively and systematically to assess plant risk resulting from aging, although models for assessing the effect of increasing failure rates on core damage frequency have been proposed. This report surveys the work on the aging of systems, structures, and components (SSCs) of nuclear power plants, as well as associated data bases. We take a critical look at the need to revise probabilistic risk assessments (PRAs) so that they will include the contribution to risk from plant aging, the adequacy of existing methods for evaluating this contribution, and the adequacy of the data that have been used in these evaluation methods. We identify a preliminary framework for integrating the aging of SSCs into the PRA and include the identification of necessary data for such an integration

  4. Application of gamma spectrometry survey and discussion on data processing

    International Nuclear Information System (INIS)

    Li Ji'an; He Jianguo

    2008-01-01

    This paper analyzed and discussed the differing opinions about the measured parameters of gamma spectrometry data and introduced the contribution of gamma spectrometry surveys to the search for sandstone-type uranium deposits. The author believes that it is necessary to perform ground gamma spectrometry surveys and to strengthen the development and application of airborne radiometric data, so that gamma spectrometry can play its role in the exploration of sandstone-type uranium deposits. (authors)

  5. The JCMT Transient Survey: Data Reduction and Calibration Methods

    Energy Technology Data Exchange (ETDEWEB)

    Mairs, Steve; Lane, James [Department of Physics and Astronomy, University of Victoria, Victoria, BC, V8P 1A1 (Canada); Johnstone, Doug; Kirk, Helen [NRC Herzberg Astronomy and Astrophysics, 5071 West Saanich Road, Victoria, BC, V9E 2E7 (Canada); Lacaille, Kevin; Chapman, Scott [Department of Physics and Atmospheric Science, Dalhousie University, Halifax, NS, B3H 4R2 (Canada); Bower, Geoffrey C. [Academia Sinica Institute of Astronomy and Astrophysics, 645 N. A‘ohōkū Place, Hilo, HI 96720 (United States); Bell, Graham S.; Graves, Sarah, E-mail: smairs@uvic.ca [East Asian Observatory, 660 North A‘ohōkū Place, University Park, Hilo, Hawaii 96720 (United States); Collaboration: JCMT Transient Team

    2017-07-01

    Though there has been a significant amount of work investigating the early stages of low-mass star formation in recent years, the evolution of the mass assembly rate onto the central protostar remains largely unconstrained. Examining in depth the variation in this rate is critical to understanding the physics of star formation. Instabilities in the outer and inner circumstellar disk can lead to episodic outbursts. Observing these brightness variations at infrared or submillimeter wavelengths constrains the current accretion models. The JCMT Transient Survey is a three-year project dedicated to studying the continuum variability of deeply embedded protostars in eight nearby star-forming regions at a one-month cadence. We use the SCUBA-2 instrument to simultaneously observe these regions at wavelengths of 450 and 850 μm. In this paper, we present the data reduction techniques, image alignment procedures, and relative flux calibration methods for 850 μm data. We compare the properties and locations of bright, compact emission sources fitted with Gaussians over time. Doing so, we achieve a spatial alignment of better than 1″ between the repeated observations and an uncertainty of 2%–3% in the relative peak brightness of significant, localized emission. This combination of imaging performance is unprecedented in ground-based, single-dish submillimeter observations. Finally, we identify a few sources that show possible and confirmed brightness variations. These sources will be closely monitored and presented in further detail in additional studies throughout the duration of the survey.

  6. Borehole survey method and apparatus for drilling substantially horizontal boreholes

    Energy Technology Data Exchange (ETDEWEB)

    Trowsdale, L.S.

    1982-11-30

    A borehole survey method and apparatus are claimed for use in drilling substantially horizontal boreholes through a mineral deposit wherein a dip accelerometer, a roll accelerometer assembly and a fluxgate are disposed near the drill bit, which is mounted on a bent sub, and connected to a surface computation and display unit by a cable which extends through the drill string. The dip angle of the borehole near the drill bit, the azimuth of the borehole near the drill bit and the roll angle or orientation of the bent sub are measured and selectively displayed at the surface while the drill string is in the borehole for utilization in guiding the drill bit through the mineral deposit along a predetermined path.

  7. Processing of soil survey data

    NARCIS (Netherlands)

    Bregt, A.K.

    1992-01-01

    This thesis focuses on processing soil survey data into user-specific information. Within this process four steps are distinguished: collection, storage, analysis and presentation. A review of each step is given, and detailed research on important aspects of the steps are

  8. Method of processing plutonium and uranium solution

    International Nuclear Information System (INIS)

    Otsuka, Katsuyuki; Kondo, Isao; Suzuki, Toru.

    1989-01-01

    Solutions of plutonium nitrate and uranyl nitrate recovered in the solvent extraction step in reprocessing plants and nuclear fuel production plants are given a low-temperature treatment by means of freeze-drying under vacuum into residues containing nitrates, which are denitrated under heating and calcined under reduction into powders. That is, since the complicated processes of heating, concentration and denitration conducted so far for the plutonium and uranyl solutions are replaced with the single step of freeze-drying under vacuum, the process can be simplified significantly. In addition, since the treatment is applied at low temperature, the occurrence of corrosion of the materials of the evaporator, etc. can be prevented. Further, the number of operators can be reduced by dividing the operations into recovery of solidification products, supply and sintering of the solutions, and vacuum sublimation. Further, since the nitrates processed at low temperature are powderized by heating and denitration, the powderization step can be simplified. The specific surface area and the grain size distribution of the powder are made appropriate, and it is possible to obtain oxide powders with physical properties suitable for preparation into pellets. (N.H.)

  9. Method and algorithm for image processing

    Science.gov (United States)

    He, George G.; Moon, Brain D.

    2003-12-16

    The present invention is a modified Radon transform. It is similar to the traditional Radon transform for the extraction of line parameters and similar to the traditional slant stack for the intensity summation of pixels away from a given pixel, for example ray paths that span 360 degrees at a given grid point in the time and offset domain. However, the present invention differs from these methods in that the intensity and direction of a composite intensity for each pixel are maintained separately instead of being combined after the transformation. An advantage of this approach is the elimination of the work required to extract the line parameters in the transformed domain. The advantage of the modified Radon transform method is amplified when many lines are present in the imagery or when the lines are just short segments, both of which occur in actual imagery.
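
    The patented modification itself is not reproduced here; as background on what the transform does, the sketch below applies the standard Radon transform from scikit-image to a small synthetic image containing two straight lines and locates the resulting sinogram peak.

    ```python
    # Background illustration only: the standard Radon transform as implemented in
    # scikit-image, not the modified transform described in the record above.
    import numpy as np
    from skimage.transform import radon
    from skimage.draw import line

    # Build a small test image containing a couple of straight lines.
    img = np.zeros((128, 128))
    rr, cc = line(20, 10, 110, 100)
    img[rr, cc] = 1.0
    rr, cc = line(100, 20, 30, 120)
    img[rr, cc] = 1.0

    theta = np.linspace(0.0, 180.0, 180, endpoint=False)
    sinogram = radon(img, theta=theta, circle=False)   # each line maps to a bright peak

    peak = np.unravel_index(np.argmax(sinogram), sinogram.shape)
    print(f"sinogram shape: {sinogram.shape}, strongest peak at (s, angle)={peak}")
    ```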

  10. Radiation methods in dairy production and processing

    International Nuclear Information System (INIS)

    Ganguli, N.C.

    1975-01-01

    Various uses of radiotracers and radiation in dairy technology are described. In dairy production, radiotracers are used for studying: (1) rumen metabolism leading to protein synthesis (2) total body water, blood volume and sodium (3) minerals metabolism (4) relation between climatic stress and thyroid functioning of dairy animals (5) volume of milk in mammary glands (6) hormone level in dairy animals and (7) spermatozoa metabolism. In dairy processing, radiotracers are used for studying: (1) compositional analysis of milk and milk products and (2) efficiency of cleaning agents for cleaning dairy equipment. Ionizing radiation is used for: (1) preservation of milk and milk products and (2) sterilization of packaging materials. A radiation source has been used to monitor the over-run in ice-cream and the fill control for fluid in paper cartons. (M.G.B.)

  11. Effect of processing and preservation method on the organoleptic ...

    African Journals Online (AJOL)

    Effect of processing and preservation method on the organoleptic and shelf life of meat products. ... Nigerian Journal of Animal Science ... to four processing methods-frying, boiling, roasting and oven-drying to investigate the effects of processing and preservation methods on the organoleptic and shelf-life of meat products.

  12. Man Versus Machine: Comparing Double Data Entry and Optical Mark Recognition for Processing CAHPS Survey Data.

    Science.gov (United States)

    Fifolt, Matthew; Blackburn, Justin; Rhodes, David J; Gillespie, Shemeka; Bennett, Aleena; Wolff, Paul; Rucks, Andrew

    Historically, double data entry (DDE) has been considered the criterion standard for minimizing data entry errors. However, previous studies considered data entry alternatives through the limited lens of data accuracy. This study supplies information regarding data accuracy, operational efficiency, and cost for DDE and Optical Mark Recognition (OMR) for processing the Consumer Assessment of Healthcare Providers and Systems 5.0 survey. To assess data accuracy, we compared error rates for DDE and OMR by dividing the number of surveys that were arbitrated by the total number of surveys processed for each method. To assess operational efficiency, we tallied the cost of data entry for DDE and OMR after survey receipt. Costs were calculated on the basis of personnel, depreciation for capital equipment, and costs of noncapital equipment. The cost savings attributed to this method were negated by the operational efficiency of OMR. There was a statistically significant difference in arbitration rates between DDE and OMR; however, this statistical significance did not translate into practical significance. The potential benefits of DDE in terms of data accuracy did not outweigh the operational efficiency, and thereby the financial savings, of OMR.
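
    The error-rate comparison described above (arbitrated surveys divided by the total processed per method, followed by a significance test) can be illustrated with a short sketch. The counts below are made up for demonstration; the chi-square test is one conventional way to compare two proportions and is not necessarily the test used in the study.

```python
# Hypothetical counts: arbitrated surveys / total surveys processed per method.
from scipy.stats import chi2_contingency

dde = {"arbitrated": 42, "total": 5000}   # illustrative numbers only
omr = {"arbitrated": 75, "total": 5000}

dde_rate = dde["arbitrated"] / dde["total"]
omr_rate = omr["arbitrated"] / omr["total"]

# 2x2 table: arbitrated vs not arbitrated, for each entry method.
table = [
    [dde["arbitrated"], dde["total"] - dde["arbitrated"]],
    [omr["arbitrated"], omr["total"] - omr["arbitrated"]],
]
chi2, p_value, _, _ = chi2_contingency(table)

print(f"DDE error rate: {dde_rate:.4f}, OMR error rate: {omr_rate:.4f}, p = {p_value:.3f}")
```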

  13. Method and apparatus for processing oily wastewater

    International Nuclear Information System (INIS)

    Torline, W.N.; Williams, R.K.

    1993-01-01

    A method of treating oily wastewater is described comprising the steps of passing oily wastewater through a coalescer to coalesce dispersed oil droplets; separating a free oil fraction as a liquid stream having a lower specific gravity from a contaminated water stream having a higher specific gravity; filtering particulate material from said contaminated water stream; passing the filtered water stream under pressure across an ultrafiltration membrane to separate a retentate fraction enriched in residual emulsified oil from an aqueous permeate fraction; recycling substantially only said retentate fraction to said coalescer; filtering said aqueous permeate through an activated carbon filter to remove low molecular weight organic materials; subjecting the filtrate from said activated carbon filter to cation exchange to remove heavy metal ions; and periodically flushing said ultrafilter with filtrate from said particulate filter to maintain the permeability of said ultrafiltration membrane.

  14. Review of Estimation Methods for Landline and Cell Phone Surveys

    Science.gov (United States)

    Arcos, Antonio; del Mar Rueda, María; Trujillo, Manuel; Molina, David

    2015-01-01

    The rapid proliferation of cell phone use and the accompanying decline in landline service in recent years have resulted in substantial potential for coverage bias in landline random-digit-dial telephone surveys, which has led to the implementation of dual-frame designs that incorporate both landline and cell phone samples. Consequently,…

  15. Effect of processing and preservation methods on Vitamin C and ...

    African Journals Online (AJOL)

    The study was aimed at determining the effects of processing and preservation methods on vitamin C and total carotenoid levels of some species of Vernonia (V. amygdalina, V. calvoana var. bitter, V. colorata and V. calvoana var. non bitter) consumed in Cameroon. The processing methods were squeeze–washing, boiling ...

  16. a survey of rice production and processing in south east nigeria

    African Journals Online (AJOL)

    An assessment of rice production and processing in South-Eastern Nigeria was carried out by investigative survey approach. The survey was basically to ascertain the extent of mechanization applicable in the area to enable the agricultural policy makers device the modalities for improving rice production and processing in ...

  17. Software Process Automation: Interviews, Survey, and Workshop Results

    National Research Council Canada - National Science Library

    Christie, Alan

    1997-01-01

    ...: First, in-depth interviews were conducted to assess the state of the practice. Second, a survey questionnaire was distributed to a wider number of organizations to obtain more quantitative data...

  18. Survey of indigenous knowledge on gathering, processing and use ...

    African Journals Online (AJOL)

    consuming EWM that were prepared as relish, or stew, or soup and were eaten with rice or maize, cassava or sorghum stiff porridges. During the wet season, EWM were eaten up to three times per week. More than 89% of farmers processed EWM mainly by sun-drying and stored them in plastic bags or by wrapping in the ...

  19. Process and device for automatically surveying complex installations

    International Nuclear Information System (INIS)

    Pekrul, P.J.; Thiele, A.W.

    1976-01-01

    A description is given of a process for automatically analysing separate signal-processing channels in real time, one channel per signal, in a facility with significant background noise in time-varying signals coming from transducers at selected points, for the continuous monitoring of the operating conditions of the various components of the installation. The signals are intended to detect potential breakdowns, draw conclusions as to the severity of these potential breakdowns and indicate to an operator the measures to be taken as a consequence. The feature of this process is that it comprises the automatic and successive selection of each channel for the purpose of spectral analysis, the automatic processing of the signal of each selected channel to produce energy spectral density data at predetermined frequencies, the automatic comparison of the energy spectral density data of each channel with predetermined sets of limits varying with frequency, and the automatic indication to the operator of the condition of the various components of the installation associated with each channel and of the measures to be taken depending on the set of limits [fr]
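
    A minimal sketch of the kind of per-channel monitoring described, assuming digitized transducer signals: estimate each channel's power spectral density and compare it with a frequency-dependent limit set. The limit curve, sampling rate and simulated fault component below are illustrative, not taken from the patent.

```python
import numpy as np
from scipy.signal import welch

def check_channel(signal, fs, limit_curve):
    """Estimate the power spectral density of one channel and compare it with
    a frequency-dependent limit curve (a callable f -> limit)."""
    freqs, psd = welch(signal, fs=fs, nperseg=1024)
    limits = np.array([limit_curve(f) for f in freqs])
    exceeded = freqs[psd > limits]
    return exceeded  # frequencies at which the limit set is violated

if __name__ == "__main__":
    fs = 2000.0
    t = np.arange(0, 4.0, 1.0 / fs)
    # Background noise plus a 120 Hz component standing in for a developing fault.
    channel = 0.1 * np.random.randn(t.size) + 0.5 * np.sin(2 * np.pi * 120 * t)
    flat_limit = lambda f: 1e-3          # illustrative limit set
    bad_freqs = check_channel(channel, fs, flat_limit)
    if bad_freqs.size:
        print("limit exceeded near", bad_freqs[:5], "Hz")
```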

  20. Implementing a Community-Driven Research Partnership: The Backyard Initiative Community Health Survey Methods and Approach.

    Science.gov (United States)

    Orionzi, Dimpho E; Mink, Pamela J; Azzahir, Atum; Yusuf, Amged A; Jernigan, Mau J; Dahlem, Janet L; Anderson, Mark J; Trahan, Lovel; Rosenberg-Carlson, Elena

    In community-based participatory research (CBPR), issues such as creating a setting where community members drive decisions and creating culturally relevant processes remain largely underachieved. The Backyard Initiative (BYI) provided the setting for implementing a community-centered collaborative research process. The BYI is a partnership between Allina Health, the Cultural Wellness Center (CWC), and community residents to improve health. To describe the unique community-centered method used in the 2013 BYI Community Health Survey (CHS) as a viable approach for collecting meaningful and valid health related data. With this approach, the community operates as the agent of change rather than the target. At the core was the BYI assessment team, which brought together conventional researchers and community members to collaboratively design, implement, analyze, interpret, and disseminate the CHS results. Focusing on the CHS, this structure and process permitted and facilitated important and difficult discussions about approach, content and outcomes of the research. We held seven sessions (239 participants). Participants were 37% African American/African and 34% Native American, 65% female, and 72% spoke English at home. Achievement of our recruitment goals, participation of groups typically underrepresented in research, and positive community feedback were indications that the BYI approach to survey research was successful. The BYI CHS community-centered methods built trust among research partners and participants, engaged populations often underrepresented in research, and collected meaningful data. Our success indicates that it is possible to co-design and implement a lengthy survey to inform future research and community activities.

  1. Survey of Living Organ Donors' Experience and Directions for Process Improvement.

    Science.gov (United States)

    Li, Timmy; Dokus, M Katherine; Kelly, Kristin N; Ugoeke, Nene; Rogers, James R; Asham, George; Sharma, Venkatesh Abhishek; Cirillo, Dominic J; Robinson, Mary K; Venniro, Erika K; Taylor, Jeremy G; Orloff, Mark S; McIntosh, Scott; Kashyap, Randeep

    2017-09-01

    Understanding living organ donors' experience with donation and challenges faced during the process is necessary to guide the development of effective strategies to maximize donor benefit and increase the number of living donors. An anonymous self-administered survey, specifically designed for this population based on key informant interviews, was mailed to 426 individuals who donated a kidney or liver at our institution. Quantitative and qualitative methods including open and axial coding were used to analyze donor responses. Of the 141 survey respondents, 94% would encourage others to become donors; however, nearly half (44%) thought the donation process could be improved and offered numerous suggestions. Five major themes arose: (1) desire for greater convenience in testing and scheduling; (2) involvement of previous donors throughout the process; (3) education and promotion of donation through social media; (4) unanticipated difficulties, specifically pain; and (5) financial concerns. Donor feedback has been translated into performance improvements at our hospital, many of which are applicable to other institutions. Population-specific survey development helps to identify vital patient concerns and provides valuable feedback to enhance the delivery of care.

  2. Geochemical drainage surveys for uranium: sampling and analytical methods based on trial surveys in Pennsylvania

    International Nuclear Information System (INIS)

    Rose, A.W.; Keith, M.L.; Suhr, N.H.

    1976-01-01

    Geochemical surveys near sandstone-type uranium prospects in northeastern and north-central Pennsylvania show that the deposits can be detected by carefully planned stream sediment surveys, but not by stream water surveys. Stream waters at single sites changed in U content by a factor of 10 to 50 during the 18 months of our studies and, even near known prospects, contain less than 0.2 ppb U most of the time. Uranium extractable from stream sediment by acetic acid-H2O2 provides useful contrast between mineralized and nonmineralized drainages of a square mile or less; total U in sediment does not. High organic material results in increased U content of sediments and must be corrected for. Changes in U content of sediment with time reach a maximum of a factor of 3 and appear to be of short duration. A sediment survey of about 200 mi² near Jim Thorpe detects anomalies extending over several square miles near known occurrences, and a second anomaly about two miles northeast of Penn Haven Jct. A similar survey in Lycoming-Sullivan Counties shows anomalous zones near known prospects of the Beaver Lake area and northwest of Muncy Creek. As, Mn, Pb, and V are enriched in the mineralized zones, and perhaps in surrounding halo zones, but do not appear to be pathfinder elements useful for reconnaissance exploration.

  3. GPR survey, as one of the best geophysical methods for social and industrial needs

    Science.gov (United States)

    Chernov, Anatolii

    2016-04-01

    This paper discusses ways and methods of applying a non-invasive geophysical method, the ground penetrating radar (GPR) survey, in different spheres of science, industry, social life and culture. The author aims to show that geophysical methods can be widely used to solve a great variety of industrial, human-safety and other problems, and takes the GPR survey as an example of such a useful method. Investigation of the near-surface underground medium is an important process that influences the development of different spheres of science and social life: investigation of near-surface geology (layering, distribution of rock types, identification of voids, etc.), hydrogeology (depth to water horizons, their thickness), preparatory work for the construction of roads and buildings (civil and engineering geology), investigation of cultural heritage (burial places, building remains, ...), ecological investigations (landslides, variation in groundwater level, etc.) and glaciology. These tasks can be solved by geophysical methods, but geophysical surveys usually take a lot of time and energy (especially electric current and resistivity methods and seismic surveys). The author argues that a GPR survey can be performed faster than other geophysical surveys and that its results are informative enough to draw proper conclusions. Some problems can hardly be solved without GPR at all. For example, in the identification of a burial place (one of the author's research objects), the results of magnetic and electric resistivity tomography surveys did not contain enough information to identify the burial place, whereas its presence could be proven from anomalies on the GPR radargrams. Identification of voids and non-magnetic objects can also hardly be done with other non-invasive geophysical surveys, and GPR is applicable for that purpose. GPR can also be applied for monitoring dangerous processes in the geological medium under roads, buildings, parks and other places of human …

  4. The swift UVOT stars survey. I. Methods and test clusters

    International Nuclear Information System (INIS)

    Siegel, Michael H.; Porterfield, Blair L.; Linevsky, Jacquelyn S.; Bond, Howard E.; Hoversten, Erik A.; Berrier, Joshua L.; Gronwall, Caryl A.; Holland, Stephen T.; Breeveld, Alice A.; Brown, Peter J.

    2014-01-01

    We describe the motivations and background of a large survey of nearby stellar populations using the Ultraviolet Optical Telescope (UVOT) on board the Swift Gamma-Ray Burst Mission. UVOT, with its wide field, near-UV sensitivity, and 2.″3 spatial resolution, is uniquely suited to studying nearby stellar populations and providing insight into the near-UV properties of hot stars and the contribution of those stars to the integrated light of more distant stellar populations. We review the state of UV stellar photometry, outline the survey, and address problems specific to wide- and crowded-field UVOT photometry. We present color–magnitude diagrams of the nearby open clusters M67, NGC 188, and NGC 2539, and the globular cluster M79. We demonstrate that UVOT can easily discern the young- and intermediate-age main sequences, blue stragglers, and hot white dwarfs, producing results consistent with previous studies. We also find that it characterizes the blue horizontal branch of M79 and easily identifies a known post-asymptotic giant branch star.

  5. Radioactive waste processing method and processing device therefor

    International Nuclear Information System (INIS)

    Matsuo, Toshiaki; Nishi, Takashi; Noge, Kenji; Matsuda, Masami; Takeshi, Kiyotaka

    1998-01-01

    Predetermined amounts of aggregates such as cement and sand, as water-hardening solidification materials, and of kneading water are charged from a solidification-material vessel, an aggregate vessel and a kneading-water vessel into the kneading vessel of a paste supply device. The cement, sand and kneading water are kneaded by the rotation of a kneader. The produced solidification material paste is transferred from the kneader to a drum through a paste transport pump. Miscellaneous radioactive solid wastes have already been placed in the drum. The solidification paste produced while the cement, sand and kneading water are being supplied to the kneader is discharged from the kneader. Since the increase in viscosity of the solidification material paste in the kneader is suppressed, the paste can easily flow into the narrow gaps between the miscellaneous radioactive solid wastes in the drum. (I.N.)

  6. Survey of current component reliability problems and methods for prevention.

    Science.gov (United States)

    Hamiter, L.; Villella, F.

    1972-01-01

    The current reliability problems related to electronic components and microcircuits are presented in this paper. Specific process controls, design, materials, application constraints, destructive testing, electrical tests, and procedures for implementation are recommended to improve the reliability of selected electronic components.

  7. Methods of practice and guidelines for using survey-grade global navigation satellite systems (GNSS) to establish vertical datum in the United States Geological Survey

    Science.gov (United States)

    Rydlund, Jr., Paul H.; Densmore, Brenda K.

    2012-01-01

    Geodetic surveys have evolved through the years to the use of survey-grade (centimeter level) global positioning to perpetuate and post-process vertical datum. The U.S. Geological Survey (USGS) uses Global Navigation Satellite Systems (GNSS) technology to monitor natural hazards, ensure geospatial control for climate and land use change, and gather data necessary for investigative studies related to water, the environment, energy, and ecosystems. Vertical datum is fundamental to a variety of these integrated earth sciences. Essentially GNSS surveys provide a three-dimensional position x, y, and z as a function of the North American Datum of 1983 ellipsoid and the most current hybrid geoid model. A GNSS survey may be approached with post-processed positioning for static observations related to a single point or network, or involve real-time corrections to provide positioning "on-the-fly." Field equipment required to facilitate GNSS surveys range from a single receiver, with a power source for static positioning, to an additional receiver or network communicated by radio or cellular for real-time positioning. A real-time approach in its most common form may be described as a roving receiver augmented by a single-base station receiver, known as a single-base real-time (RT) survey. More efficient real-time methods involving a Real-Time Network (RTN) permit the use of only one roving receiver that is augmented to a network of fixed receivers commonly known as Continually Operating Reference Stations (CORS). A post-processed approach in its most common form involves static data collection at a single point. Data are most commonly post-processed through a universally accepted utility maintained by the National Geodetic Survey (NGS), known as the Online Position User Service (OPUS). More complex post-processed methods involve static observations among a network of additional receivers collecting static data at known benchmarks. Both classifications provide users

  8. Method and apparatus for lysing and processing algae

    Science.gov (United States)

    Chew, Geoffrey; Reich, Alton J.; Dykes, Jr., H. Waite H.; Di Salvo, Roberto

    2013-03-05

    Methods and apparatus for processing algae are described in which a hydrophilic ionic liquid is used to lyse algae cells at lower temperatures than existing algae processing methods. A salt or salt solution is used as a separation agent and to remove water from the ionic liquid, allowing the ionic liquid to be reused. The used salt may be dried or concentrated and reused. The relatively low lysis temperatures and recycling of the ionic liquid and salt reduce the environmental impact of the algae processing while providing biofuels and other useful products.

  9. Calculation of radiation exposure in diagnostic radiology. Method and surveys

    International Nuclear Information System (INIS)

    Duvauferrier, R.; Ramee, A.; Ezzeldin, K.; Guibert, J.L.

    1984-01-01

    A computerized method for evaluating the radiation exposure of the main target organs during various diagnostic radiologic procedures is described. This technique was used for educational purposes: study of exposure variations according to the technical modalities of a given procedure, and study of exposure variations according to various technical protocols (IVU, EGD barium study, etc.). This method was also used for studying exposure of patients during hospitalization in the Rennes Regional Hospital Center (France) in 1982, according to departments (urology, neurology, etc.). This method and results of these three studies are discussed [fr

  10. Effects of processing methods on the antinutrional factor and the ...

    African Journals Online (AJOL)

    The effect of processing on phytic acid (PA) reduction and nutritional composition of sesame seed was investigated. Raw sesame seed (RASS) was compared with seeds processed by three different methods: roasted (ROSS), boiled (BOSS) and soaked (SOSS) sesame seeds. Processing had no significant (P>0.05) effects ...

  11. A Comparison of Neural Networks and Fuzzy Logic Methods for Process Modeling

    Science.gov (United States)

    Cios, Krzysztof J.; Sala, Dorel M.; Berke, Laszlo

    1996-01-01

    The goal of this work was to analyze the potential of neural networks and fuzzy logic methods to develop approximate response surfaces for process modeling, that is, for mapping inputs to outputs. Structural response was chosen as an example. Each of the many methods surveyed is explained and the results are presented. Future research directions are also discussed.
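
    A minimal sketch of using a small neural network as an approximate response surface (input-to-output mapping), in the spirit of the comparison above. It uses scikit-learn's MLPRegressor on a made-up two-input response; it is not the authors' implementation and says nothing about the fuzzy logic counterpart.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Illustrative "process": structural response as a function of two design inputs.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(500, 2))
y = np.sin(np.pi * X[:, 0]) + 0.5 * X[:, 1] ** 2   # stand-in response

# Small multilayer perceptron used as an approximate response surface.
surrogate = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0)
surrogate.fit(X, y)

print("predicted response at (0.5, -0.2):", surrogate.predict([[0.5, -0.2]])[0])
```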

  12. Process control and optimization with simple interval calculation method

    DEFF Research Database (Denmark)

    Pomerantsev, A.; Rodionova, O.; Høskuldsson, Agnar

    2006-01-01

    Methods of process control and optimization are presented and illustrated with a real-world example. The optimization methods are based on the PLS block modeling as well as on the simple interval calculation methods of interval prediction and object status classification. It is proposed to employ the series of expanding PLS/SIC models in order to support the on-line process improvements. This method helps to predict the effect of planned actions on the product quality and thus enables passive quality control. We have also considered an optimization approach that proposes the correcting actions for the quality improvement in the course of production. The latter is an active quality optimization, which takes into account the actual history of the process. The advocated approach is allied to the conventional method of multivariate statistical process control (MSPC) as it also employs the historical process …

  13. Process control and optimization with simple interval calculation method

    DEFF Research Database (Denmark)

    Pomerantsev, A.; Rodionova, O.; Høskuldsson, Agnar

    2006-01-01

    Methods of process control and optimization are presented and illustrated with a real-world example. The optimization methods are based on the PLS block modeling as well as on the simple interval calculation methods of interval prediction and object status classification. It is proposed to employ the series of expanding PLS/SIC models in order to support the on-line process improvements. This method helps to predict the effect of planned actions on the product quality and thus enables passive quality control. We have also considered an optimization approach that proposes the correcting actions for the quality improvement in the course of production. The latter is an active quality optimization, which takes into account the actual history of the process. The advocated approach is allied to the conventional method of multivariate statistical process control (MSPC) as it also employs the historical process …
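
    The simple interval calculation (SIC) method itself is not reproduced here, but the general idea of PLS-based monitoring against limits derived from historical process data can be sketched as follows. The data, the number of PLS components and the residual-based control limit are illustrative stand-ins for the interval prediction and status classification described in the abstract.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
coeffs = np.array([0.8, -0.5, 0.3, 0.0, 0.1, 0.2])
X_hist = rng.normal(size=(200, 6))                         # historical process variables
y_hist = X_hist @ coeffs + 0.05 * rng.normal(size=200)     # historical quality measurements

pls = PLSRegression(n_components=3)
pls.fit(X_hist, y_hist)

# Simple residual-based limit from the historical data (stand-in for SIC intervals).
residuals = y_hist - pls.predict(X_hist).ravel()
limit = 3 * residuals.std()

X_new = rng.normal(size=(5, 6))
y_new = X_new @ coeffs
y_pred = pls.predict(X_new).ravel()
for i, (pred, actual) in enumerate(zip(y_pred, y_new)):
    status = "in control" if abs(actual - pred) <= limit else "out of control"
    print(f"batch {i}: predicted {pred:.2f}, measured {actual:.2f} -> {status}")
```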

  14. Parallel processing and maintaining adequate alignment between instruments and methods.

    Science.gov (United States)

    Calleja, John

    2008-08-01

    Parallel processing of laboratory tests across more than one instrument platform permits dealing with increasing workloads but broadens the uncertainty of measurement; minimising measurement uncertainty means keeping assay performances continuously aligned. Important questions are: Why is there a need to demonstrate "acceptable alignment" between methods/instruments? What methods/tools can be used to test method/instrument alignment, and how can adjustments be made? What is an "acceptable" alignment? How often should alignments be checked, and what is the reasoning for this?

  15. Survey of artificial intelligence methods for detection and identification of component faults in nuclear power plants

    International Nuclear Information System (INIS)

    Reifman, J.

    1997-01-01

    A comprehensive survey of computer-based systems that apply artificial intelligence methods to detect and identify component faults in nuclear power plants is presented. Classification criteria are established that categorize artificial intelligence diagnostic systems according to the types of computing approaches used (e.g., computing tools, computer languages, and shell and simulation programs), the types of methodologies employed (e.g., types of knowledge, reasoning and inference mechanisms, and diagnostic approach), and the scope of the system. The major issues of process diagnostics and computer-based diagnostic systems are identified and cross-correlated with the various categories used for classification. Ninety-five publications are reviewed

  16. Questionnaires: the use and abuse of social survey methods in medical research

    OpenAIRE

    Eaden, J.; Mayberry, M.; Mayberry, J.

    1999-01-01

    We present a working review of survey methods based on market research technology. The structure of questionnaires, their distribution and analysis, are considered, together with techniques for increasing response rates.


Keywords: questionnaires; research methods

  17. Mathematical methods in time series analysis and digital image processing

    CERN Document Server

    Kurths, J; Maass, P; Timmer, J

    2008-01-01

    The aim of this volume is to bring together research directions in theoretical signal and image processing developed rather independently in electrical engineering, theoretical physics, mathematics and the computer sciences. In particular, mathematically justified algorithms and methods, the mathematical analysis of these algorithms and methods, as well as the investigation of connections between methods from time series analysis and image processing are reviewed. An interdisciplinary comparison of these methods, drawing upon common sets of test problems from medicine and the geophysical/environmental sciences, is also addressed. This volume coherently summarizes work carried out in the field of theoretical signal and image processing. It focuses on non-linear and non-parametric models for time series as well as on adaptive methods in image processing.

  18. A survey on the task analysis methods and techniques for nuclear power plant operators

    International Nuclear Information System (INIS)

    Lee, Yong Heui; Chun, Se Woo; Suh, Sang Moon; Lee, Jung Woon

    1994-04-01

    We have surveyed techniques and methods of task analysis, from very traditional ones to recently developed ones that are being applied in various industrial fields. We compare them and analyse their fundamental characteristics and methodological specifications in order to find one suitable for application to the tasks of nuclear power plant operators. In general, the fundamental process of task analysis is well understood, but its application in practice is not so simple because of the wide and varying range of applications in specific domains. Operators' tasks in NPPs are performed strictly according to written operational procedures in which operators are well trained, so a method of task analysis for operators' tasks in NPPs can be established with its own unique characteristics, based on the operational procedures. 8 figs., 10 tabs., 18 refs. (Author)

  19. A survey on the task analysis methods and techniques for nuclear power plant operators

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yong Heui; Chun, Se Woo; Suh, Sang Moon; Lee, Jung Woon [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1994-04-01

    We have surveyed techniques and methods of task analysis, from very traditional ones to recently developed ones that are being applied in various industrial fields. We compare them and analyse their fundamental characteristics and methodological specifications in order to find one suitable for application to the tasks of nuclear power plant operators. In general, the fundamental process of task analysis is well understood, but its application in practice is not so simple because of the wide and varying range of applications in specific domains. Operators' tasks in NPPs are performed strictly according to written operational procedures in which operators are well trained, so a method of task analysis for operators' tasks in NPPs can be established with its own unique characteristics, based on the operational procedures. 8 figs., 10 tabs., 18 refs. (Author).

  20. A Survey of Structured and Object-Oriented Software Specification Methods and Techniques

    NARCIS (Netherlands)

    Wieringa, Roelf J.

    1998-01-01

    This article surveys techniques used in structured and object-oriented software specification methods. The techniques are classified as techniques for the specification of external interaction and internal decomposition. The external specification techniques are further subdivided into techniques

  1. Survey of Methods and Algorithms of Robot Swarm Aggregation

    Science.gov (United States)

    E Shlyakhov, N.; Vatamaniuk, I. V.; Ronzhin, A. L.

    2017-01-01

    The paper considers the problem of swarm aggregation of autonomous robots with the use of three methods based on the analogy of the behavior of biological objects. The algorithms substantiating the requirements for hardware realization of sensor, computer and network resources and propulsion devices are presented. Techniques for efficiency estimation of swarm aggregation via space-time characteristics are described. The developed model of the robot swarm reconfiguration into a predetermined three-dimensional shape is presented.

  2. Survey of Existing and Promising New Methods of Surface Preparation

    Science.gov (United States)

    1982-04-01

    Effluent disposal: as with other systems, the effluent water may contain toxic compounds from the removed paint, which must be properly filtered ... conditions, types of abrasives, recycling, theory of the blasting process, and costs.

  3. Methods and analysis of processing signals of incremental optoelectronic transducer.

    Science.gov (United States)

    Szcześniak, Adam; Szcześniak, Zbigniew

    2009-09-01

    This article presents methods designed to interpolate the signals of an incremental optoelectronic transducer, which make it possible to distinguish the direction of motion of the transducer and to increase its accuracy. Methods based on logic functions, on logic functions combined with RC circuits, and on phase processing are analyzed. In the methods based on logic processing of the transducer signals, the resolution of the transducer glass scale can be increased two-fold or four-fold. The presented method of generating and processing sine signals with an 18-degree shift enables square signals to be obtained with a frequency five times higher than that of the basic signals. This method is universal and can be used for different frequency multiplication factors of the optoelectronic transducer. Simulations of the methods were performed using the MATLAB-SIMULINK software.
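
    The logic-function interpolation mentioned above builds on the classic quadrature scheme: two square-wave channels shifted by 90 degrees give the motion direction from their phase relation, and counting every edge of both channels yields a four-fold resolution increase. The sketch below decodes sampled A/B channels this way; it is a generic illustration, not the article's specific circuits.

```python
def decode_quadrature(a_samples, b_samples):
    """Count position from two square-wave channels in quadrature.

    Every edge on either channel is counted (x4 interpolation); the sign of
    each step, i.e. the motion direction, follows from the A/B phase relation.
    """
    # Transition table: (previous AB state, current AB state) -> step (+1/-1).
    steps = {
        (0b00, 0b01): +1, (0b01, 0b11): +1, (0b11, 0b10): +1, (0b10, 0b00): +1,
        (0b00, 0b10): -1, (0b10, 0b11): -1, (0b11, 0b01): -1, (0b01, 0b00): -1,
    }
    position = 0
    prev = (a_samples[0] << 1) | b_samples[0]
    for a, b in zip(a_samples[1:], b_samples[1:]):
        curr = (a << 1) | b
        position += steps.get((prev, curr), 0)  # 0 for no change or an invalid jump
        prev = curr
    return position

# One forward electrical period sampled four times per period -> +4 counts.
a = [0, 0, 1, 1, 0]
b = [0, 1, 1, 0, 0]
print(decode_quadrature(a, b))  # expected: 4
```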

  4. The Canadian Human Activity Pattern Survey: report of methods and population surveyed.

    Science.gov (United States)

    Leech, J A; Wilby, K; McMullen, E; Laporte, K

    1996-01-01

    The assessment of health risk due to environmental contaminants depends upon accurate estimates of the distribution of population exposures. Exposure assessment, in turn, requires information on the time people spend in micro-environments and their activities during periods of exposure. This paper describes preliminary results including study methodology and population sampled in a large Canadian survey of time-activity patterns. A 24-hour diary recall survey was performed in 2381 households (representing a 65% response rate) to describe in detail the timing, location and activity pattern of one household member (the adult or child with the next birthday). Four cities (Toronto, Vancouver, Edmonton and Saint John, NB) and their suburbs were sampled by random-digit dialling over a nine-month period in 1994/1995. Supplemental questionnaires inquiring about sociodemographic information, house and household characteristics and potential exposure to toxins in the air and water were also administered. In general, the results show that respondents spend the majority of their time indoors (88.6%) with smaller proportions of time outdoors (6.1%) and in vehicles (5.3%). Children under the age of 12 spend more time both indoors and outdoors and less time in transit than do adults. The data from this study will be used to define more accurately the exposure of Canadians to a variety of toxins in exposure assessment models and to improve upon the accuracy of risk assessment for a variety of acute and chronic health effects known or suspected to be related to environmental exposures.

  5. Measuring methods, registration and signal processing for magnetic field research

    International Nuclear Information System (INIS)

    Nagiello, Z.

    1981-01-01

    Some measuring methods and signal processing systems based on analogue and digital techniques, which have been applied in magnetic field research using magnetometers with ferromagnetic transducers, are presented. (author)

  6. Model based methods and tools for process systems engineering

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    Process systems engineering (PSE) provides means to solve a wide range of problems in a systematic and efficient manner. This presentation will give a perspective on model-based methods and tools needed to solve a wide range of problems in product-process synthesis-design. These methods and tools need to be integrated with work-flows and data-flows for specific product-process synthesis-design problems within a computer-aided framework. The framework therefore should be able to manage knowledge-data, models and the associated methods and tools needed by specific synthesis-design work ... of model-based methods and tools within a computer-aided framework for product-process synthesis-design will be highlighted.

  7. effect of cassava flour processing methods and substitution level

    African Journals Online (AJOL)

    MASAMBA

    produce bread. This study was carried out to assess the effect of using two differently processed cassava flours (fermented and unfermented) and substitution level on the proximate ... to improve the bread characteristics from different cassava processing methods and assessing ...

  8. Comparison of Satellite Surveying to Traditional Surveying Methods for the Resources Industry

    Science.gov (United States)

    Osborne, B. P.; Osborne, V. J.; Kruger, M. L.

    Modern ground-based survey methods involve detailed surveying, which provides three-space co-ordinates for surveyed points to a high level of accuracy. The instruments are operated by surveyors, who process the raw results to create survey location maps for the subject of the survey. Such surveys are conducted for a location or region and referenced to the Earth's global co-ordinate system with global positioning system (GPS) positioning. Due to this referencing, the survey is only as accurate as the GPS reference system. Satellite survey remote sensing utilises satellite imagery that has been processed using commercial geographic information system software. Three-space co-ordinate maps are generated, with an accuracy determined by the datum position accuracy and the optical resolution of the satellite platform. This paper presents a case study which compares topographic surveying undertaken by traditional survey methods with satellite surveying for the same location. The purpose of this study is to assess the viability of satellite remote sensing for surveying in the resources industry. The case study involves a topographic survey of a dune field for a prospective mining project area in Pakistan. This site has been surveyed using modern surveying techniques and the results are compared to a satellite survey performed on the same area. Analysis of the results from the traditional survey and from the satellite survey involved a comparison of the derived spatial co-ordinates from each method. In addition, comparisons have been made of costs and turnaround time for both methods. The results of this application of remote sensing are of particular interest for surveying in areas with remote and extreme environments, weather extremes, political unrest, and poor travel links, which are commonly associated with mining projects. Such areas frequently suffer language barriers, poor onsite technical support and resources.
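
    The coordinate comparison described in the case study can be illustrated with a short sketch: per-point offsets and a per-axis RMSE between ground-surveyed and satellite-derived coordinates. The coordinate values below are hypothetical and stand in for the actual survey data.

```python
import numpy as np

# Hypothetical easting/northing/elevation (metres) for the same points
# from a ground survey and from processed satellite imagery.
ground = np.array([[512034.2, 2765001.8, 311.42],
                   [512101.9, 2765088.3, 312.10],
                   [512177.5, 2765160.7, 313.05]])
satellite = np.array([[512034.9, 2765002.4, 311.95],
                      [512102.4, 2765089.1, 312.88],
                      [512178.6, 2765161.5, 313.71]])

differences = satellite - ground
rmse = np.sqrt(np.mean(differences ** 2, axis=0))  # per-axis RMSE
print("per-point offsets (m):\n", differences)
print("RMSE east/north/up (m):", rmse)
```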

  9. Process-driven architecture : Design techniques and methods

    NARCIS (Netherlands)

    Jaskiewicz, T.

    2007-01-01

    This paper explores the notion of process-driven architecture and, as a consequence, application of complex systems in the newly defined area of digital process-driven architectural design in order to formulate a suitable design method. Protospace software environment and SwarmCAD software

  10. Development of continuous pharmaceutical production processes supported by process systems engineering methods and tools

    DEFF Research Database (Denmark)

    Gernaey, Krist; Cervera Padrell, Albert Emili; Woodley, John

    2012-01-01

    The pharmaceutical industry is undergoing a radical transition towards continuous production processes. Systematic use of process systems engineering (PSE) methods and tools forms the key to achieving this transition in a structured and efficient way.

  11. Workshop on Survey Methods in Education Research: Facilitator's Guide and Resources. REL 2017-214

    Science.gov (United States)

    Walston, Jill; Redford, Jeremy; Bhatt, Monica P.

    2017-01-01

    This Workshop on Survey Methods in Education Research tool consists of a facilitator guide and workshop handouts. The toolkit is intended for use by state or district education leaders and others who want to conduct training on developing and administering surveys. The facilitator guide provides materials related to various phases of the survey…

  12. Effect of Processing Methods on the Nutrients and Anti Nutrients ...

    African Journals Online (AJOL)

    appropriate methods for retaining its nutrients and reducing to a moderate level, its antinutrients compositions. The leaves were subjected to different processing methods such as boiling for 3mins at 100oC, blanching at 62oC for 5mins, squeeze-washing with 250ml of clean water for 3 rounds each lasting 3mins, a combine ...

  13. An assessment of oil processing methods and technology in Taraba ...

    African Journals Online (AJOL)

    Objective: The study assessed the various methods and technology of vegetable oil processing in Taraba State. Methods: A total of 250 questionnaires were administered to vegetable oil processors, with in-depth interviews of 28 respondents who were randomly selected based on a preliminary study in six (6) Local ...

  14. A comparison of web-based and paper-based survey methods: testing assumptions of survey mode and response cost.

    Science.gov (United States)

    Greenlaw, Corey; Brown-Welty, Sharon

    2009-10-01

    Web-based surveys have become more prevalent in areas such as evaluation, research, and marketing research, to name a few. The proliferation of these online surveys raises the question of how their response rates compare with those of traditional surveys, and at what cost. This research explored response rates and costs for Web-based surveys, paper surveys, and mixed-mode surveys. The participants included evaluators from the American Evaluation Association (AEA). The results indicated that the mixed-mode approach, while more expensive, had higher response rates.

  15. A survey of current practices for genomic sequencing test interpretation and reporting processes in US laboratories.

    Science.gov (United States)

    O'Daniel, Julianne M; McLaughlin, Heather M; Amendola, Laura M; Bale, Sherri J; Berg, Jonathan S; Bick, David; Bowling, Kevin M; Chao, Elizabeth C; Chung, Wendy K; Conlin, Laura K; Cooper, Gregory M; Das, Soma; Deignan, Joshua L; Dorschner, Michael O; Evans, James P; Ghazani, Arezou A; Goddard, Katrina A; Gornick, Michele; Farwell Hagman, Kelly D; Hambuch, Tina; Hegde, Madhuri; Hindorff, Lucia A; Holm, Ingrid A; Jarvik, Gail P; Knight Johnson, Amy; Mighion, Lindsey; Morra, Massimo; Plon, Sharon E; Punj, Sumit; Richards, C Sue; Santani, Avni; Shirts, Brian H; Spinner, Nancy B; Tang, Sha; Weck, Karen E; Wolf, Susan M; Yang, Yaping; Rehm, Heidi L

    2017-05-01

    While the diagnostic success of genomic sequencing expands, the complexity of this testing should not be overlooked. Numerous laboratory processes are required to support the identification, interpretation, and reporting of clinically significant variants. This study aimed to examine the workflow and reporting procedures among US laboratories to highlight shared practices and identify areas in need of standardization. Surveys and follow-up interviews were conducted with laboratories offering exome and/or genome sequencing to support a research program or for routine clinical services. The 73-item survey elicited multiple choice and free-text responses that were later clarified with phone interviews. Twenty-one laboratories participated. Practices highly concordant across all groups included consent documentation, multiperson case review, and enabling patient opt-out of incidental or secondary findings analysis. Noted divergence included use of phenotypic data to inform case analysis and interpretation and reporting of case-specific quality metrics and methods. Few laboratory policies detailed procedures for data reanalysis, data sharing, or patient access to data. This study provides an overview of practices and policies of experienced exome and genome sequencing laboratories. The results enable broader consideration of which practices are becoming standard approaches, where divergence remains, and areas of development in best practice guidelines that may be helpful. Genet Med advance online publication 03 November 2016.

  16. The Jamaica asthma and allergies national prevalence survey: rationale and methods

    Directory of Open Access Journals (Sweden)

    Edwards Nancy C

    2010-04-01

    Abstract. Background: Asthma is a significant public health problem in the Caribbean. Prevalence surveys using standardized measures of asthma provide valid prevalence estimates to facilitate regional and international comparisons and monitoring of trends. This paper describes methods used in the Jamaica Asthma and Allergies National Prevalence Survey, challenges associated with this survey and strategies used to overcome these challenges. Methods/Design: An island-wide, cross-sectional, community-based survey of asthma, asthma symptoms and allergies was done among adults and children using the European Community Respiratory Health Survey Questionnaire for adults and the International Study of Asthma and Allergies in Childhood questionnaire. Stratified multi-stage cluster sampling was used to select 2,163 adults aged 18 years and older and 2,017 children aged 2-17 years for the survey. The Kish selection table was used to select one adult and one child per household. Data analysis accounted for the sampling design, and prevalence estimates were weighted to produce national estimates. Discussion: The Jamaica Asthma and Allergies National Prevalence Survey is the first population-based survey in the Caribbean to determine the prevalence of asthma and allergies in both adults and children using standardized methods. With response rates exceeding 80% in both groups, this approach facilitated cost-effective gathering of high-quality asthma prevalence data that will facilitate international and regional comparison and monitoring of asthma prevalence trends. Another unique feature of this study was the partnership with the Ministry of Health in Jamaica, which ensured the collection of data relevant for decision-making and facilitated the uptake of research evidence. The findings of this study will provide important data on the burden of asthma and allergies in Jamaica and contribute to evidence-informed planning of comprehensive asthma management and education programs.

  17. Exploring the response process of culturally differing survey respondents with a response style: A sequential mixed-methods study

    NARCIS (Netherlands)

    Morren, M.H.; Gelissen, J.P.T.M.; Vermunt, J.K.

    2013-01-01

    This article presents a mixed-methods approach that integrates quantitative and qualitative methods to analyze why the four largest minorities in the Netherlands (Turks, Moroccans, Antilleans, and Surinamese) respond differently to items treating cultural topics. First, we conducted latent class

  18. Unsupervised process monitoring and fault diagnosis with machine learning methods

    CERN Document Server

    Aldrich, Chris

    2013-01-01

    This unique text/reference describes in detail the latest advances in unsupervised process monitoring and fault diagnosis with machine learning methods. Abundant case studies throughout the text demonstrate the efficacy of each method in real-world settings. The broad coverage examines such cutting-edge topics as the use of information theory to enhance unsupervised learning in tree-based methods, the extension of kernel methods to multiple kernel learning for feature extraction from data, and the incremental training of multilayer perceptrons to construct deep architectures for enhanced data

  19. System and method for cognitive processing for data fusion

    Science.gov (United States)

    Duong, Tuan A. (Inventor); Duong, Vu A. (Inventor)

    2012-01-01

    A system and method for cognitive processing of sensor data. A processor array receiving analog sensor data and having programmable interconnects, multiplication weights, and filters provides for adaptive learning in real-time. A static random access memory contains the programmable data for the processor array and the stored data is modified to provide for adaptive learning.

  20. Bridging Technometric Method and Innovation Process: An Initial Study

    Science.gov (United States)

    Rumanti, A. A.; Reynaldo, R.; Samadhi, T. M. A. A.; Wiratmadja, I. I.; Dwita, A. C.

    2018-03-01

    The process of innovation is one of the ways used to increase the capability of a technology component so that it reflects the needs of an SME. The technometric method can be used to identify the level of technology advancement in an SME and which technology component needs to be strengthened in order to deliver a significant innovation. This paper serves as an early study that lays out a conceptual framework identifying and elaborating the principles of the innovation process, bridging a well-established innovation model by Martin with the technometric method, based on the initial background research conducted at the SME Ira Silver in Jogjakarta, Indonesia.

  1. Information processing systems, reasoning modules, and reasoning system design methods

    Science.gov (United States)

    Hohimer, Ryan E; Greitzer, Frank L; Hampton, Shawn D

    2014-03-04

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.
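
    A minimal sketch of the architecture this abstract describes: working memory holds abstractions whose individuals carry an ontology classification type, and each reasoning module processes only abstractions of its own type. The class and module names are illustrative; this is not the patented implementation.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Abstraction:
    individual: str
    classification_type: str   # type from the ontology, e.g. "Person" or "Event"

class ReasoningSystem:
    def __init__(self):
        # Each reasoning module handles one classification type of the ontology.
        self.modules: Dict[str, Callable[[Abstraction], str]] = {}

    def register(self, classification_type: str, module: Callable[[Abstraction], str]):
        self.modules[classification_type] = module

    def process(self, working_memory: List[Abstraction]) -> List[str]:
        results = []
        for abstraction in working_memory:
            module = self.modules.get(abstraction.classification_type)
            if module:
                results.append(module(abstraction))
        return results

# Working memory: a (flattened) semantic graph of abstractions.
memory = [Abstraction("alice", "Person"), Abstraction("login-42", "Event")]

system = ReasoningSystem()
system.register("Person", lambda a: f"profile inferred for {a.individual}")
system.register("Event", lambda a: f"anomaly score computed for {a.individual}")
print(system.process(memory))
```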

  2. Information processing systems, reasoning modules, and reasoning system design methods

    Energy Technology Data Exchange (ETDEWEB)

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2015-08-18

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  3. Information processing systems, reasoning modules, and reasoning system design methods

    Energy Technology Data Exchange (ETDEWEB)

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2016-08-23

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  4. Effect of Processing Method on Composition and Consumer ...

    African Journals Online (AJOL)

    Processing method significantly (p<0.05) affected protein contents of raw beef from 18.88% to 20.26% in balangu, 19.69% in kilishi and 24.78% in soye. Protein content ... It was recommended that camel meat be adopted for use in soye, kilishi and balangu making at small scale and commercial production levels. Keywords: ...

  5. Effect of processing methods and storage environment on moisture ...

    African Journals Online (AJOL)

    Effect of processing methods and storage environment on moisture adsorption characteristics of ginger (Zingiber Officianale) ... At any given storage temperature and relative humidity, unpeeled ginger samples were more hygroscopic than the peeled. Also at the temperatures and relative humidities studied, ginger powders ...

  6. Effect of tomato cultivars, honey finisher and processing methods on ...

    African Journals Online (AJOL)

    The experiment was carried out using factorial combination of two tomato varieties, two processing methods and three honey and one sugar concentrations with three replications. Tomato ketchups prepared from Melkashola subjected to pre-heat treatment before cooking exhibited higher total soluble solids, reducing and ...

  7. National Survey on Access, Use and Promotion of Rational Use of Medicines (PNAUM: household survey component methods

    Directory of Open Access Journals (Sweden)

    Sotero Serrate Mengue

    ABSTRACT. Objective: To describe methodological aspects of the household survey of the National Survey on Access, Use and Promotion of Rational Use of Medicines (PNAUM) related to sampling design and implementation, the actual obtained sample, instruments and fieldwork. Methods: A cross-sectional, population-based study with probability sampling in three stages of the population living in households located in Brazilian urban areas. Fieldwork was carried out between September 2013 and February 2014. The data collection instrument included questions related to: information about households, residents and respondents; chronic diseases and medicines used; use of health services; acute diseases and events treated with drugs; use of contraceptives; use of pharmacy services; behaviors that may affect drug use; package inserts and packaging; lifestyle and health insurance. Results: In total, 41,433 interviews were carried out in 20,404 households and 576 urban clusters corresponding to 586 census tracts distributed in the five Brazilian regions, according to eight domains defined by age and gender. Conclusions: The results of the survey may be used as a baseline for future studies aiming to assess the impact of government action on drug access and use. For local studies using a compatible method, PNAUM may serve as a reference point to evaluate variations in space and population. With a comprehensive evaluation of drug-related aspects, PNAUM is a major source of data for a variety of analyses to be carried out at both academic and government levels.

  8. Pyrochemical and Dry Processing Methods Program. A selected bibliography

    Energy Technology Data Exchange (ETDEWEB)

    McDuffie, H.F.; Smith, D.H.; Owen, P.T.

    1979-03-01

    This selected bibliography with abstracts was compiled to provide information support to the Pyrochemical and Dry Processing Methods (PDPM) Program sponsored by DOE and administered by the Argonne National Laboratory. Objectives of the PDPM Program are to evaluate nonaqueous methods of reprocessing spent fuel as a route to the development of proliferation-resistant and diversion-resistant methods for widespread use in the nuclear industry. Emphasis was placed on the literature indexed in the ERDA--DOE Energy Data Base (EDB). The bibliography includes indexes to authors, subject descriptors, EDB subject categories, and titles.

  9. Pyrochemical and Dry Processing Methods Program. A selected bibliography

    International Nuclear Information System (INIS)

    McDuffie, H.F.; Smith, D.H.; Owen, P.T.

    1979-03-01

    This selected bibliography with abstracts was compiled to provide information support to the Pyrochemical and Dry Processing Methods (PDPM) Program sponsored by DOE and administered by the Argonne National Laboratory. Objectives of the PDPM Program are to evaluate nonaqueous methods of reprocessing spent fuel as a route to the development of proliferation-resistant and diversion-resistant methods for widespread use in the nuclear industry. Emphasis was placed on the literature indexed in the ERDA--DOE Energy Data Base (EDB). The bibliography includes indexes to authors, subject descriptors, EDB subject categories, and titles

  10. Collection, Processing, and Accuracy of Mobile Terrestrial Lidar Survey Data in the Coastal Environment

    Science.gov (United States)

    2017-04-01

    ERDC/CHL TR-17-5, Coastal Field Data Collection Program (April 2017): Collection, Processing, and Accuracy of Mobile Terrestrial Lidar Survey Data in the Coastal Environment. Only report-cover text survives in this record, identifying the work as ERDC research in water resources and environmental sciences for the Army, the Department of Defense, and civilian agencies.

  11. Anthropometric Survey (ANSUR) II Pilot Study: Methods and Summary Statistics

    Science.gov (United States)

    2009-04-01

    Fragments of the abstract survive in this record: measurements were taken by Mr. Jeremy Carson and Drs. Brian Corner and Peng Li of Natick's Ergonomics Team, and project coordination was conducted by Ms. Belva Hodge ... photographic in nature; Soldiers were requested to appear in Physical Training (PT) attire, and if they did not have their own PT gear, running shorts and ... images will have some role in the future of U.S. Army anthropology, by way of developing an initial database of digital images which might help ...

  12. Literature Survey and Preliminary Evaluation of Streambank Protection Methods

    Science.gov (United States)

    1977-05-01

    Fragments of the report survive in this record: bank preparation, bedding material and transportation of stone can vary greatly depending on the location and availability of suitable rock. The ... "Replaces Porous Concrete Slope Protection," Formes de la Construccion (Spain), Vol 2, No. 13, Aug-Sep 1949, pp. 531-532. Johnson, P. L ...

  13. A Survey On Various Web Template Detection And Extraction Methods

    Directory of Open Access Journals (Sweden)

    Neethu Mary Varghese

    2015-03-01

    Full Text Available Abstract In today's digital world, reliance on the World Wide Web as a source of information is extensive. Users increasingly rely on web-based search engines to provide accurate search results on a wide range of topics that interest them. The search engines in turn parse the vast repository of web pages searching for relevant information. However, the majority of web portals are built from web templates that are designed to provide a consistent look and feel to end users. The presence of these templates can influence search results, leading to inaccurate results being delivered to users. Therefore, to improve the accuracy and reliability of search results, identification and removal of web templates from the actual content is essential. A wide range of approaches is commonly employed to achieve this, and this paper focuses on the study of the various approaches to template detection and extraction that can be applied across homogeneous as well as heterogeneous web pages.

  14. Decontamination and Disposal Methods for Chemical Agents - A Literature Survey

    Science.gov (United States)

    1982-11-01

    Fragments of the report survive in this record: ... destruction of AC. Although no measurements were made of the hypochlorite solution, use of a starch-iodine indicator scrubber showed that less than ... as the reaction was found to be highly exothermic. As measured with a starch-iodine bubbler mixture, with a limit of 45 pg of cyanogen chloride ... Distribution authorized to U.S. Gov't agencies only; Test and Evaluation; Nov 1982. Other requests shall be referred to Commander, US Army Toxic ...

  15. Methods and tools for sustainable chemical process design

    DEFF Research Database (Denmark)

    Loureiro da Costa Lira Gargalo, Carina; Chairakwongsa, Siwanat; Quaglia, Alberto

    2015-01-01

    As the pressure on chemical and biochemical processes to achieve a more sustainable performance increases, the need to define a systematic and holistic way to accomplish this is becoming more urgent. In this chapter, a multilevel computer-aided framework for systematic design of more sustainable chemical processes is presented. The framework allows the use of appropriate computer-aided methods and tools in a hierarchical manner according to a developed work flow for a multilevel criteria analysis that helps generate competing and more sustainable process design options. The application of the framework as well as the related computer-aided methods and tools are highlighted through a case study involving the production of bioethanol from various renewable raw materials.

  16. [Essential procedure and key methods for survey of traditional knowledge related to Chinese materia medica resources].

    Science.gov (United States)

    Cheng, Gong; Huang, Lu-qi; Xue, Da-yuan; Zhang, Xiao-bo

    2014-12-01

    The survey of traditional knowledge related to Chinese materia medica resources is an important component and one of the innovative aspects of the fourth national survey of Chinese materia medica resources. China has rich traditional knowledge of traditional Chinese medicine (TCM), and the comprehensive investigation of TCM traditional knowledge aims to promote conservation and sustainable use of Chinese materia medica resources. Building upon the field work of pilot investigations, this paper introduces the essential procedures and key methods for conducting the survey of traditional knowledge related to Chinese materia medica resources. The essential procedures are as follows. First is the preparation phase. It is important to review all relevant literature and provide training to the survey teams so that they have a clear understanding of the concept of traditional knowledge and master key survey methods. Second is the field investigation phase. When conducting field investigations, survey teams should identify the traditional knowledge holders by using the 'snowball method' and record the traditional knowledge after obtaining prior informed consent from the traditional knowledge holders. Researchers should fill out the survey forms provided by the Technical Specification of the Fourth National Survey of Chinese Materia Medica Resources. Researchers should pay particular attention to the scope of traditional knowledge and the method of inheriting the knowledge, which are the key information for traditional knowledge holders and potential users to reach mutually agreed terms to achieve benefit sharing. Third is the data compilation and analysis phase. Researchers should try to compile and edit the TCM traditional knowledge in accordance with intellectual property rights requirements so that the information collected through the national survey can serve as the basic data for the TCM traditional knowledge database. The key methods of the survey include regional

  17. Individual Differences in the Encoding Processes of Egocentric and Allocentric Survey Knowledge

    Science.gov (United States)

    Wen, Wen; Ishikawa, Toru; Sato, Takao

    2013-01-01

    This study examined how different components of working memory are involved in the acquisition of egocentric and allocentric survey knowledge by people with a good and poor sense of direction (SOD). We employed a dual-task method and asked participants to learn routes from videos with verbal, visual, and spatial interference tasks and without any…

  18. Application of combined temporal and spectral processing methods ...

    Indian Academy of Sciences (India)

    2016-08-26

    Aug 26, 2016 ... This paper presents an experimental evaluation of the combined temporal and spectral processing methods for speaker recognition task under noise, reverberation or multi-speaker environments. Automatic speaker recognition system gives good performance in controlled environments. Speech recorded ...

  19. Indigenous processing methods and raw materials of borde , an ...

    African Journals Online (AJOL)

    A flow chart of borde production was constructed showing four major processing stages. The short shelf life of borde and the seasonal variations in production volume were identified as major problems for the vendors in the study areas. Keywords: indigenous methods; cereal fermentation; borde; beverage; Ethiopia J Food ...

  20. Methods of surveying and monitoring marine radioactivity. Report of an ad hoc panel of experts

    International Nuclear Information System (INIS)

    1965-01-01

    An effective control of the radioactive pollution of the sea depends partly on the availability of adequate technical methods for surveying and monitoring the sea and marine products with regard to the presence of radioactive substances. The purpose of this manual is to offer such methods.

  1. Processing method and device for radioactive liquid waste

    International Nuclear Information System (INIS)

    Matsuo, Toshiaki; Nishi, Takashi; Matsuda, Masami; Yukita, Atsushi.

    1997-01-01

    When radioactive washing liquid wastes contain only suspended particulate ingredients as COD components, the liquid wastes are heated in a first process, for example an adsorption step in which the suspended particulate ingredients are adsorbed onto activated carbon, and the suspended particulate ingredients are then separated and removed by filtration. When both suspended particulate ingredients and soluble organic ingredients are contained, the suspended particulate ingredients are separated and removed by the first process and the soluble organic ingredients are then removed by another process, or both the suspended particulate ingredients and the soluble organic ingredients are removed by the first process. In the existing method of adding activated carbon and then filtering at normal temperature, the suspended particulate ingredients cover the layer of activated carbon formed on the filter paper or fabric and sometimes cause clogging. According to the method of the present invention, since such disturbance by the suspended particulate ingredients does not occur, the COD components can be separated and removed sufficiently without lowering the liquid waste processing speed. (T.M.)

  2. Establishing survey validity and reliability for American Indians through "think aloud" and test-retest methods.

    Science.gov (United States)

    Hauge, Cindy Horst; Jacobs-Knight, Jacque; Jensen, Jamie L; Burgess, Katherine M; Puumala, Susan E; Wilton, Georgiana; Hanson, Jessica D

    2015-06-01

    The purpose of this study was to use a mixed-methods approach to determine the validity and reliability of measurements used within an alcohol-exposed pregnancy prevention program for American Indian women. To develop validity, content experts provided input into the survey measures, and a "think aloud" methodology was conducted with 23 American Indian women. After revising the measurements based on this input, a test-retest was conducted with 79 American Indian women who were randomized to complete either the original measurements or the new, modified measurements. The test-retest revealed that some of the questions performed better for the modified version, whereas others appeared to be more reliable for the original version. The mixed-methods approach was a useful methodology for gathering feedback on survey measurements from American Indian participants and in indicating specific survey questions that needed to be modified for this population. © The Author(s) 2015.
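
    Test-retest comparisons of the kind described above are often summarized with simple agreement statistics; the sketch below computes percent agreement and Cohen's kappa for paired categorical answers, using invented responses rather than the study's data.

        from collections import Counter

        def percent_agreement(test, retest):
            return sum(a == b for a, b in zip(test, retest)) / len(test)

        def cohens_kappa(test, retest):
            """Chance-corrected agreement between two administrations of one item."""
            n = len(test)
            po = percent_agreement(test, retest)
            c1, c2 = Counter(test), Counter(retest)
            pe = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / (n * n)
            return (po - pe) / (1 - pe)

        # Toy yes/no answers from the original interview and the retest
        t1 = ["yes", "no", "yes", "yes", "no", "yes"]
        t2 = ["yes", "no", "no", "yes", "no", "yes"]
        print(percent_agreement(t1, t2), cohens_kappa(t1, t2))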

  3. Fiscal 1999 survey report. Survey and research concerning development of next-generation chemical process technologies; 1999 nendo jisedai kagaku process gijutsu kaihatsu ni kansuru chosa kenkyu hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    To further enhance resource/energy conservation and environmental impact reduction, it is necessary to develop innovative chemical reaction process technologies. It is for this reason that 'development of next-generation chemical reaction process technologies' is being carried out under the New Sunshine Program. To fulfil this goal, the survey and research aim to select important technologies and to organize the problems in the associated technologies so that tasks for future studies can be picked out and subjects for future development suggested. In addition, studies are made on how a comprehensive technology assessment system should be structured. In this fiscal year, propositions are compiled for research and development projects on five subjects. Studies of subjects other than these five will also continue, with the aim of eventually building concrete propositions on them. The five subjects are 1) the development and application of nonaqueous biotechnologies, 2) biotechnology-aided polymeric material creation processes, 3) construction of high-efficiency energy conservation processes using innovative grain handling technologies in the high-temperature reaction field, 4) manufacture of high-performance polymeric materials for batteries and development of battery fabrication processes, and 5) the development of an energy conservation process maximally utilizing environmentally friendly polyolefin. (NEDO)

  4. Process Synthesis, Design and Analysis using Process-Group Contribution Method

    DEFF Research Database (Denmark)

    Kumar Tula, Anjan; Eden, Mario Richard; Gani, Rafiqul

    Process synthesis implies the investigation of chemical reactions needed to produce the desired product, selection of the separation tec hniques needed for downstream processing, as well as making decisions on sequencing the involved reaction and separation operations. This work highlights the de...... methods [2] with mathematical programming techniques [3] to formulate and solve a superstructure based optimization problem....

  5. Scientists' attitudes on science and values: Case studies and survey methods in philosophy of science.

    Science.gov (United States)

    Steel, Daniel; Gonnerman, Chad; O'Rourke, Michael

    2017-06-01

    This article examines the relevance of survey data of scientists' attitudes about science and values to case studies in philosophy of science. We describe two methodological challenges confronting such case studies: 1) small samples, and 2) potential for bias in selection, emphasis, and interpretation. Examples are given to illustrate that these challenges can arise for case studies in the science and values literature. We propose that these challenges can be mitigated through an approach in which case studies and survey methods are viewed as complementary, and use data from the Toolbox Dialogue Initiative to illustrate this claim. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. An UAV scheduling and planning method for post-disaster survey

    Science.gov (United States)

    Li, G. Q.; Zhou, X. G.; Yin, J.; Xiao, Q. Y.

    2014-11-01

    Annually, extreme climate and special geological environments lead to frequent natural disasters, e.g., earthquakes, floods, etc. These disasters often bring serious casualties and enormous economic losses. Post-disaster surveying is very important for disaster relief and assessment. Because Unmanned Aerial Vehicle (UAV) remote sensing offers high efficiency, high precision, high flexibility, and low cost, it has been widely used in emergency surveying in recent years. Since the UAVs used in emergency surveying cannot stand by waiting for a disaster to happen, they are usually distributed across many locations when a disaster occurs. To improve emergency surveying efficiency, the UAVs therefore need to be tracked and assigned emergency surveying tasks. A UAV tracking and scheduling method for post-disaster survey is presented in this paper. In this method, the Global Positioning System (GPS) and the GSM network are used to track the UAVs; an emergency tracking UAV information database is built in advance by registration, containing at least the ID and the communication number of each UAV. When a catastrophe happens, the real-time locations of all UAVs in the database are first obtained using the emergency tracking method; the travel-cost time for each UAV to reach the disaster region is then calculated from the UAV's real-time location and the road network using a nearest-services analysis algorithm. The disaster region is subdivided into several emergency surveying regions based on the DEM, area, and the population distribution map, and the emergency surveying regions are assigned to the appropriate UAVs according to a shortest-cost-time rule, as sketched below. The UAV tracking and scheduling prototype was implemented using SQL Server 2008, ArcEngine 10.1 SDK, Visual Studio 2010 C#, Android, an SMS modem, and the Google Maps API.
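
    The shortest-cost-time assignment rule can be written as a simple greedy loop; the travel times would come from the road-network nearest-services analysis described above, which is replaced here by a made-up cost matrix, and the one-region-per-UAV restriction is an assumption for illustration.

        def assign_regions(cost):
            """cost[uav][region] = travel-cost time (minutes). Each surveying region
            is given to the fastest still-free UAV, cheapest pairs handled first."""
            n_uav, n_region = len(cost), len(cost[0])
            free_uavs = set(range(n_uav))
            assignment = {}
            pairs = sorted((cost[u][r], u, r) for u in range(n_uav) for r in range(n_region))
            for c, u, r in pairs:
                if r not in assignment and u in free_uavs:
                    assignment[r] = (u, c)
                    free_uavs.discard(u)
            return assignment

        travel_minutes = [[35, 80, 60],   # UAV 0
                          [50, 20, 90],   # UAV 1
                          [70, 65, 25]]   # UAV 2
        print(assign_regions(travel_minutes))  # region -> (uav, minutes)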

  7. Dental ceramics: a review of new materials and processing methods

    Directory of Open Access Journals (Sweden)

    Lucas Hian da SILVA

    2017-08-01

    Full Text Available Abstract The evolution of computerized systems for the production of dental restorations associated to the development of novel microstructures for ceramic materials has caused an important change in the clinical workflow for dentists and technicians, as well as in the treatment options offered to patients. New microstructures have also been developed by the industry in order to offer ceramic and composite materials with optimized properties, i.e., good mechanical properties, appropriate wear behavior and acceptable aesthetic characteristics. The objective of this literature review is to discuss the main advantages and disadvantages of the new ceramic systems and processing methods. The manuscript is divided in five parts: I) monolithic zirconia restorations; II) multilayered dental prostheses; III) new glass-ceramics; IV) polymer infiltrated ceramics; and V) novel processing technologies. Dental ceramics and processing technologies have evolved significantly in the past ten years, with most of the evolution being related to new microstructures and CAD-CAM methods. In addition, a trend towards the use of monolithic restorations has changed the way clinicians produce all-ceramic dental prostheses, since the more aesthetic multilayered restorations unfortunately are more prone to chipping or delamination. Composite materials processed via CAD-CAM have become an interesting option, as they have intermediate properties between ceramics and polymers and are more easily milled and polished.

  8. Dental ceramics: a review of new materials and processing methods.

    Science.gov (United States)

    Silva, Lucas Hian da; Lima, Erick de; Miranda, Ranulfo Benedito de Paula; Favero, Stéphanie Soares; Lohbauer, Ulrich; Cesar, Paulo Francisco

    2017-08-28

    The evolution of computerized systems for the production of dental restorations associated to the development of novel microstructures for ceramic materials has caused an important change in the clinical workflow for dentists and technicians, as well as in the treatment options offered to patients. New microstructures have also been developed by the industry in order to offer ceramic and composite materials with optimized properties, i.e., good mechanical properties, appropriate wear behavior and acceptable aesthetic characteristics. The objective of this literature review is to discuss the main advantages and disadvantages of the new ceramic systems and processing methods. The manuscript is divided in five parts: I) monolithic zirconia restorations; II) multilayered dental prostheses; III) new glass-ceramics; IV) polymer infiltrated ceramics; and V) novel processing technologies. Dental ceramics and processing technologies have evolved significantly in the past ten years, with most of the evolution being related to new microstructures and CAD-CAM methods. In addition, a trend towards the use of monolithic restorations has changed the way clinicians produce all-ceramic dental prostheses, since the more aesthetic multilayered restorations unfortunately are more prone to chipping or delamination. Composite materials processed via CAD-CAM have become an interesting option, as they have intermediate properties between ceramics and polymers and are more easily milled and polished.

  9. Fault-Tolerant Process Control Methods and Applications

    CERN Document Server

    Mhaskar, Prashant; Christofides, Panagiotis D

    2013-01-01

    Fault-Tolerant Process Control focuses on the development of general, yet practical, methods for the design of advanced fault-tolerant control systems; these ensure an efficient fault detection and a timely response to enhance fault recovery, prevent faults from propagating or developing into total failures, and reduce the risk of safety hazards. To this end, methods are presented for the design of advanced fault-tolerant control systems for chemical processes which explicitly deal with actuator/controller failures and sensor faults and data losses. Specifically, the book puts forward: · a framework for detection, isolation and diagnosis of actuator and sensor faults for nonlinear systems; · controller reconfiguration and safe-parking-based fault-handling methodologies; · integrated data- and model-based fault-detection and isolation and fault-tolerant control methods; · methods for handling sensor faults and data losses; and · ...

  10. Comparing Coral Reef Survey Methods. Unesco Reports in Marine Science No. 21 Report of a Regional Unesco/UNEP Workshop on Coral Reef Survey Management and Assessment Methods in Asia and the Pacific (Phuket, Thailand, December 13-17, 1982).

    Science.gov (United States)

    United Nations Educational, Scientific, and Cultural Organization, Paris (France). Div. of Marine Sciences.

    This report includes nine papers prepared for a workshop on coral reef survey management and assessment methods in Asia and the Pacific. The papers are: "A Contrast in Methodologies between Surveying and Testing" (Charles Birkeland); "Coral Reef Survey Methods in the Andaman Sea" (Hansa Chansang); "A Review of Coral Reef…

  11. Method and equipment of processing radioactive laundry wastes

    International Nuclear Information System (INIS)

    Shirai, Takamori; Suzuki, Takeo; Tabata, Masayuki; Takada, Takao; Yamaguchi, Shin-ichi; Noda, Tetsuya.

    1985-01-01

    Purpose: To effectively process radioactive laundry wastes generated due to water-washing after dry-cleaning of protective clothings which have been put on in nuclear facilities. Method: Dry cleaning soaps and ionic radioactive materials contained in radioactive laundry wastes are selectively adsorbed to decontaminate by adsorbents. Then, the adsorbents having adsorbed dry cleaning soaps and ionic radioactive materials are purified by being removed with these radioactive materials. The purified adsorbents are re-used. (Seki, T.)

  12. Using Reinterview and Reconciliation Methods to Design and Evaluate Survey Questions

    Directory of Open Access Journals (Sweden)

    Jeremy E. Morton

    2008-06-01

    Full Text Available Conducting reinterviews is an effective method to estimate and reduce response errors in interview surveys. As part of the School Health Policies and Programs Study 2000 (SHPPS, RTI used reinterview methods to assist in designing and evaluating survey questions. Reinterviews were conducted in the field test with selected respondents to identify discrepancies between the original interviews and reinterviews. Reconciliation interviews were then conducted to determine the reasons for the discrepancies in terms of comprehension, recall, encoding, response options, or other problems. In this paper, we describe the design of the reinterview and reconciliation study and discuss the implications of using these methods for questionnaire design and evaluation, specifically in comparison to cognitive interviewing.

  13. Comparing two survey methods of measuring health-related indicators: Lot Quality Assurance Sampling and Demographic Health Surveys.

    Science.gov (United States)

    Anoke, Sarah C; Mwai, Paul; Jeffery, Caroline; Valadez, Joseph J; Pagano, Marcello

    2015-12-01

    Two common methods used to measure indicators for health programme monitoring and evaluation are the demographic and health surveys (DHS) and lot quality assurance sampling (LQAS); each one has different strengths. We report on both methods when utilised in comparable situations. We compared 24 indicators in south-west Uganda, where data for prevalence estimations were collected independently for the two methods in 2011 (LQAS: n = 8876; DHS: n = 1200). Data were stratified (e.g. gender and age) resulting in 37 comparisons. We used a two-sample two-sided Z-test of proportions to compare both methods. The average difference between LQAS and DHS for 37 estimates was 0.062 (SD = 0.093; median = 0.039). The average difference among the 21 failures to reject equality of proportions was 0.010 (SD = 0.041; median = 0.009); among the 16 rejections, it was 0.130 (SD = 0.010, median = 0.118). Seven of the 16 rejections exhibited absolute differences of 0.10 and 0.20 (mean = 0.261, SD = 0.083). There is 75.7% agreement across the two surveys. Both methods yield regional results, but only LQAS provides information at less granular levels (e.g. the district level) where managerial action is taken. The cost advantage and localisation make LQAS feasible to conduct more frequently, and provides the possibility for real-time health outcomes monitoring. © 2015 The Authors. Tropical Medicine & International Health Published by John Wiley & Sons Ltd.
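
    The underlying comparison is a two-sample two-sided Z-test of proportions; a minimal implementation is sketched below with illustrative counts in the spirit of the sample sizes quoted above, not the actual Ugandan estimates.

        from math import sqrt, erf

        def two_proportion_z(x1, n1, x2, n2):
            """Two-sided Z-test of equality of two proportions (pooled variance)."""
            p1, p2 = x1 / n1, x2 / n2
            pooled = (x1 + x2) / (n1 + n2)
            se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
            z = (p1 - p2) / se
            phi = 0.5 * (1 + erf(abs(z) / sqrt(2)))   # standard normal CDF
            return z, 2 * (1 - phi)

        # Hypothetical indicator: 62% coverage in an LQAS-style sample of 8876
        # versus 58% in a DHS-style sample of 1200.
        print(two_proportion_z(5503, 8876, 696, 1200))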

  14. FY1998 report on the surveys and studies on developing next generation chemical process technologies; 1998 nendo jisedai kagaku process gijutsu kaihatsu ni kansuru chosa kenkyu hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    For further resource and energy conservation and environmental load reduction, innovative chemical reaction technologies need to be developed. This paper describes surveys on next-generation chemical processes. As non-halogen processes requiring the development of new catalysts, new processes were investigated, and exploratory experiments and discussions were carried out on isocyanate, propylene oxide, and phenol. Technological progress in C1 chemistry was investigated. Problems in the oxidation, hydroxylation, and decomposition of hydrocarbon compounds using microorganisms were organized as applications of environmentally friendly technologies. Possibilities in marine biotechnology were surveyed. Surveys were also made on new processes utilizing a phase-transfer catalyst that forms a third phase, on the manufacture of biodegradable plastics, and on a novel reaction system combined with a self-separation process using molecular assembly. Possibilities were explored for designing a truly simple, highly energy-saving production system. Fundamental common technologies such as structure analysis, property control and reaction engineering were investigated as methods to manufacture functional micro-powder chemical materials. Development of a system for technology assessment over the whole product life cycle was discussed in order to establish a technology assessment basis. (NEDO)

  15. Quantitative Sociodynamics Stochastic Methods and Models of Social Interaction Processes

    CERN Document Server

    Helbing, Dirk

    2010-01-01

    This new edition of Quantitative Sociodynamics presents a general strategy for interdisciplinary model building and its application to a quantitative description of behavioral changes based on social interaction processes. Originally, the crucial methods for the modeling of complex systems (stochastic methods and nonlinear dynamics) were developed in physics and mathematics, but they have very often proven their explanatory power in chemistry, biology, economics and the social sciences as well. Quantitative Sociodynamics provides a unified and comprehensive overview of the different stochastic methods, their interrelations and properties. In addition, it introduces important concepts from nonlinear dynamics (e.g. synergetics, chaos theory). The applicability of these fascinating concepts to social phenomena is carefully discussed. By incorporating decision-theoretical approaches, a fundamental dynamic model is obtained, which opens new perspectives in the social sciences. It includes many established models a...

  16. Results of a survey on accident and safety analysis codes, benchmarks, verification and validation methods

    International Nuclear Information System (INIS)

    Lee, A.G.; Wilkin, G.B.

    1996-03-01

    During the 'Workshop on R and D needs' at the 3rd Meeting of the International Group on Research Reactors (IGORR-III), the participants agreed that it would be useful to compile a survey of the computer codes and nuclear data libraries used in accident and safety analyses for research reactors and the methods various organizations use to verify and validate their codes and libraries. Five organizations, Atomic Energy of Canada Limited (AECL, Canada), China Institute of Atomic Energy (CIAE, People's Republic of China), Japan Atomic Energy Research Institute (JAERI, Japan), Oak Ridge National Laboratories (ORNL, USA), and Siemens (Germany) responded to the survey. The results of the survey are compiled in this report. (author) 36 refs., 3 tabs

  17. Natural language processing-based COTS software and related technologies survey.

    Energy Technology Data Exchange (ETDEWEB)

    Stickland, Michael G.; Conrad, Gregory N.; Eaton, Shelley M.

    2003-09-01

    Natural language processing-based knowledge management software, traditionally developed for security organizations, is now becoming commercially available. An informal survey was conducted to discover and examine current NLP and related technologies and potential applications for information retrieval, information extraction, summarization, categorization, terminology management, link analysis, and visualization for possible implementation at Sandia National Laboratories. This report documents our current understanding of the technologies, lists software vendors and their products, and identifies potential applications of these technologies.

  18. Studies of neutron methods for process control and criticality surveillance of fissile material processing facilities

    International Nuclear Information System (INIS)

    Zoltowski, T.

    1988-01-01

    The development of radiochemical processes for fissile material processing and spent fuel handling needs new control procedures enabling an improvement of plant throughput. This is closely related to the implementation of a continuous criticality control policy and to developing reliable methods for monitoring the reactivity of radiochemical plant operations in the presence of process perturbations. Neutron methods appear applicable to fissile material control in some technological facilities. The measurement of epithermal neutron source multiplication, with heuristic evaluation of the measured data, enables surveillance of anomalous reactivity enhancement leading to unsafe states. 80 refs., 47 figs., 33 tabs. (author)

  19. Quantitative sociodynamics stochastic methods and models of social interaction processes

    CERN Document Server

    Helbing, Dirk

    1995-01-01

    Quantitative Sociodynamics presents a general strategy for interdisciplinary model building and its application to a quantitative description of behavioural changes based on social interaction processes. Originally, the crucial methods for the modeling of complex systems (stochastic methods and nonlinear dynamics) were developed in physics but they have very often proved their explanatory power in chemistry, biology, economics and the social sciences. Quantitative Sociodynamics provides a unified and comprehensive overview of the different stochastic methods, their interrelations and properties. In addition, it introduces the most important concepts from nonlinear dynamics (synergetics, chaos theory). The applicability of these fascinating concepts to social phenomena is carefully discussed. By incorporating decision-theoretical approaches a very fundamental dynamic model is obtained which seems to open new perspectives in the social sciences. It includes many established models as special cases, e.g. the log...

  20. A Survey of Hospice Volunteer Coordinators: Training Methods and Objectives of Current Hospice Volunteer Training Programs.

    Science.gov (United States)

    Brock, Cara M; Herndon, Christopher M

    2017-06-01

    Currently more than 5800 hospice organizations operate in the United States.[1] Hospice organizations are required by the Centers for Medicare and Medicaid Services (CMS) to use volunteers for services provided to patients.[2] Although CMS regulates the number of hours hospice volunteers should provide, there are currently no national requirements for objectives of training.[3] The purpose of this study was to gather information from a sample of hospices regarding volunteer coordinator background, current training for volunteers, importance of training objectives, and any comments regarding additional objectives. Representative state hospice organizations were contacted by e-mail requesting their participation and distribution of the survey throughout their member hospices. The survey asked demographic questions, along with ratings of training components based on perceived level of importance and time spent on each objective. A total of 90 surveys were received, and the response rate was undeterminable. Results showed the majority of hospices were nonprofit, had fewer than 100 currently trained volunteers, and maintained an average daily patient census of less than 50. Questions regarding training programs indicated that most use live lecture methods of approximately 19 hours or less in duration. Overall, responding hospice organizations agreed that all objectives surveyed were important in training volunteers. The small number of respondents to this survey makes nationwide generalization difficult; however, it is a strong starting point for the development of further surveys on hospice volunteer training and for achieving a standardized set of training objectives and delivery methods.

  1. Multidisciplinary eHealth Survey Evaluation Methods

    Science.gov (United States)

    Karras, Bryant T.; Tufano, James T.

    2006-01-01

    This paper describes the development process of an evaluation framework for describing and comparing web survey tools. We believe that this approach will help shape the design, development, deployment, and evaluation of population-based health interventions. A conceptual framework for describing and evaluating web survey systems will enable the…

  2. Digital signal processor and processing method for GPS receivers

    Science.gov (United States)

    Thomas, Jr., Jess B. (Inventor)

    1989-01-01

    A digital signal processor and processing method therefor for use in receivers of the NAVSTAR/GLOBAL POSITIONING SYSTEM (GPS) employs a digital carrier down-converter, digital code correlator and digital tracking processor. The digital carrier down-converter and code correlator consist of an all-digital, minimum bit implementation that utilizes digital chip and phase advancers, providing exceptional control and accuracy in feedback phase and in feedback delay. Roundoff and commensurability errors can be reduced to extremely small values (e.g., less than 100 nanochips and 100 nanocycles roundoff errors and 0.1 millichip and 1 millicycle commensurability errors). The digital tracking processor bases the fast feedback for phase and for group delay in the C/A, P1 and P2 channels on the L1 C/A carrier phase, thereby maintaining lock at lower signal-to-noise ratios, reducing errors in feedback delays, reducing the frequency of cycle slips and in some cases obviating the need for quadrature processing in the P channels. Simple and reliable methods are employed for data bit synchronization, data bit removal and cycle counting. Improved precision in averaged output delay values is provided by carrier-aided data-compression techniques. The signal processor employs purely digital operations in the sense that exactly the same carrier phase and group delay measurements are obtained, to the last decimal place, every time the same sampled data (i.e., exactly the same bits) are processed.

  3. Systematic Development of Miniaturized (Bio)Processes using Process Systems Engineering (PSE) Methods and Tools

    DEFF Research Database (Denmark)

    Krühne, Ulrich; Larsson, Hilde; Heintz, Søren

    2014-01-01

    The focus of this work is on process systems engineering (PSE) methods and tools, and especially on how such PSE methods and tools can be used to accelerate and support systematic bioprocess development at a miniature scale. After a short presentation of the PSE methods and the bioprocess...... of substrate and product, which is otherwise difficult to access. In the last example, a new approach to the design of microbioreactor layouts using topology optimization is presented and discussed. Finally, the PSE methods are carefully discussed with respect to the complexity of the presented approaches...

  4. Survey the Process of Collection and Turnover of Receivables, Yearly Budget Laws in Health Sector

    Directory of Open Access Journals (Sweden)

    Ahmad Rahbar

    2016-09-01

    Full Text Available Background & Aims of the Study: The evaluation of financial performance is one of the most important tasks of a manager. Rational decision-making and planning aimed at increasing productivity, and assessing the effect of adopted decisions on organizational performance, depend on an accurate assessment of financial performance. The aim of this study was to investigate the process of collection and turnover of receivables and the hospital budget rules in Qom University of Medical Sciences during the fourth and fifth development plans. Materials and Methods: This is a descriptive-analytic, cross-sectional study. Researcher-made forms were used to collect the data. After collection from the financial filing unit, the data were entered into Excel, and the receivables collection period, receivables turnover, average daily sales and average accounts receivable were analyzed using activity-ratio formulas. Results: Our findings show that during the fourth and fifth development programs the turnover of receivables decreased from four times to three times per year, which is below the minimum standard of five times per year. The receivables collection period increased from 72 days to 147 days, which exceeds the maximum standard of 67 days. This indicates the poor performance of the insurance companies under contract with hospitals in paying their obligations during the study period. Conclusion: The collection period and turnover of receivables of the selected hospitals diverge from the standard values, which calls for intervention at the macro level of decision-making. This study showed that resorting to legal leverage over the fourth and fifth development plans had no impact on improving the reimbursement process. Therefore, practical action by policy makers to reform the insurance structure and review the ways of financing could be effective.

  5. Research on Methods of Processing Transit IC Card Information and Constructing Transit OD Matrix

    Science.gov (United States)

    Han, Xiuhua; Li, Jin; Peng, Han

    The transit OD matrix is of vital importance when planning an urban transit system. The traditional method of constructing a transit OD matrix requires a large-scale spot-check survey; it is expensive and needs a long cycle time to process the information. Recently, transit IC card fare systems have been widely deployed in big cities. If processed appropriately, the transit passenger information stored in the IC card database becomes an information resource and greatly reduces survey cost. The concept of the transit trip chain is put forward in this paper. Based on the characteristics of a closed transit trip chain, the paper discusses how to process IC card information and construct the transit OD matrix (a minimal sketch follows below). It also points out that an urban transit information platform and data warehouse should be constructed, and how IC card information should be integrated.
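
    A minimal sketch of the closed-trip-chain idea follows: boarding records are grouped by card, ordered in time within a day, and each leg's destination is taken to be the next boarding stop, with the last leg closing the chain back to the first stop. The record layout and stop names are assumptions for illustration, not the paper's data model.

        from collections import defaultdict

        def build_od_matrix(transactions):
            """transactions: (card_id, boarding_time, boarding_stop) tuples for one
            service day, as might be exported from an IC card database."""
            by_card = defaultdict(list)
            for card, ts, stop in transactions:
                by_card[card].append((ts, stop))
            od = defaultdict(int)
            for legs in by_card.values():
                legs.sort()                         # chronological boarding order
                stops = [s for _, s in legs]
                if len(stops) < 2:
                    continue                        # no closed chain can be inferred
                for i, origin in enumerate(stops):
                    dest = stops[(i + 1) % len(stops)]   # last leg returns to first stop
                    od[(origin, dest)] += 1
            return dict(od)

        records = [("c1", "08:00", "A"), ("c1", "17:30", "B"),
                   ("c2", "09:10", "A"), ("c2", "12:00", "C"), ("c2", "18:05", "B")]
        print(build_od_matrix(records))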

  6. Examining Stress in Graduate Assistants: Combining Qualitative and Quantitative Survey Methods

    Science.gov (United States)

    Mazzola, Joseph J.; Walker, Erin J.; Shockley, Kristen M.; Spector, Paul E.

    2011-01-01

    The aim of this study was to employ qualitative and quantitative survey methods in a concurrent mixed model design to assess stressors and strains in graduate assistants. The stressors most frequently reported qualitatively were work overload, interpersonal conflict, and organizational constraints; the most frequently reported psychological…

  7. Optimal and adaptive methods of processing hydroacoustic signals (review)

    Science.gov (United States)

    Malyshkin, G. S.; Sidel'nikov, G. B.

    2014-09-01

    Different methods of optimal and adaptive processing of hydroacoustic signals for multipath propagation and scattering are considered. Advantages and drawbacks of the classical adaptive (Capon, MUSIC, and Johnson) algorithms and "fast" projection algorithms are analyzed for the case of multipath propagation and scattering of strong signals. The classical optimal approaches to detecting multipath signals are presented. A mechanism of controlled normalization of strong signals is proposed to automatically detect weak signals. The results of simulating the operation of different detection algorithms for a linear equidistant array under multipath propagation and scattering are presented. An automatic detector is analyzed, which is based on classical or fast projection algorithms, which estimates the background proceeding from median filtering or the method of bilateral spatial contrast.
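
    As a concrete instance of the classical adaptive algorithms mentioned (Capon's minimum-variance method), the sketch below evaluates the Capon spatial spectrum P(theta) = 1 / (a(theta)^H R^-1 a(theta)) for a uniform linear array; the synthetic single-source data and the diagonal loading level are illustrative assumptions.

        import numpy as np

        def capon_spectrum(snapshots, d=0.5, angles=np.linspace(-90, 90, 181)):
            """snapshots: complex array (n_sensors, n_snapshots); d = spacing in wavelengths."""
            n_sensors = snapshots.shape[0]
            R = snapshots @ snapshots.conj().T / snapshots.shape[1]
            R += 1e-3 * np.trace(R).real / n_sensors * np.eye(n_sensors)  # diagonal loading
            Rinv = np.linalg.inv(R)
            k = np.arange(n_sensors)
            power = []
            for theta in np.deg2rad(angles):
                a = np.exp(2j * np.pi * d * k * np.sin(theta))            # steering vector
                power.append(1.0 / np.real(a.conj() @ Rinv @ a))
            return angles, np.array(power)

        # Synthetic data: one plane wave from 20 degrees plus noise, 8-element array
        rng = np.random.default_rng(0)
        n, snaps = 8, 200
        a0 = np.exp(2j * np.pi * 0.5 * np.arange(n) * np.sin(np.deg2rad(20)))
        x = a0[:, None] * rng.standard_normal(snaps) + 0.3 * (
            rng.standard_normal((n, snaps)) + 1j * rng.standard_normal((n, snaps)))
        ang, p = capon_spectrum(x)
        print(ang[np.argmax(p)])  # expected to peak near 20 degrees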

  8. Survey of systems safety analysis methods and their application to nuclear waste management systems

    Energy Technology Data Exchange (ETDEWEB)

    Pelto, P.J.; Winegardner, W.K.; Gallucci, R.H.V.

    1981-11-01

    This report reviews system safety analysis methods and examines their application to nuclear waste management systems. The safety analysis methods examined include expert opinion, maximum credible accident approach, design basis accidents approach, hazard indices, preliminary hazards analysis, failure modes and effects analysis, fault trees, event trees, cause-consequence diagrams, G0 methodology, Markov modeling, and a general category of consequence analysis models. Previous and ongoing studies on the safety of waste management systems are discussed along with their limitations and potential improvements. The major safety methods and waste management safety related studies are surveyed. This survey provides information on what safety methods are available, what waste management safety areas have been analyzed, and what are potential areas for future study.

  9. Survey of systems safety analysis methods and their application to nuclear waste management systems

    International Nuclear Information System (INIS)

    Pelto, P.J.; Winegardner, W.K.; Gallucci, R.H.V.

    1981-11-01

    This report reviews system safety analysis methods and examines their application to nuclear waste management systems. The safety analysis methods examined include expert opinion, maximum credible accident approach, design basis accidents approach, hazard indices, preliminary hazards analysis, failure modes and effects analysis, fault trees, event trees, cause-consequence diagrams, G0 methodology, Markov modeling, and a general category of consequence analysis models. Previous and ongoing studies on the safety of waste management systems are discussed along with their limitations and potential improvements. The major safety methods and waste management safety related studies are surveyed. This survey provides information on what safety methods are available, what waste management safety areas have been analyzed, and what are potential areas for future study

  10. Survey of systems safety analysis methods and their application to nuclear waste management systems

    Science.gov (United States)

    Pelto, P. J.; Winegardner, W. K.; Gallucci, R. H. V.

    1981-11-01

    This report reviews system safety analysis methods and examines their application to nuclear waste management systems. The safety analysis methods examined include expert opinion, maximum credible accident approach, design basis accidents approach, hazard indices, preliminary hazards analysis, failure modes and effects analysis, fault trees, event trees, cause consequence diagrams, GO methodology, Markov modeling, and a general category of consequence analysis models. Previous and ongoing studies on the safety of waste management systems are discussed along with their limitations and potential improvements. The major safety methods and waste management safety related studies are surveyed. This survey provides information on what safety methods are available, what waste management safety areas have been analyzed, and what are potential areas for future study.

  11. Data processing and image reconstruction methods for pixel detectors

    International Nuclear Information System (INIS)

    Jakubek, Jan

    2007-01-01

    Semiconductor single-particle-counting pixel detectors offer many advantages for radiation imaging: high detection efficiency, energy discrimination, noiseless digital integration (counting), high frame rate and virtually unlimited dynamic range. All these properties allow to achieve high quality images. Examples of transmission images and 3D tomographic reconstruction using X-rays and slow neutrons are presented demonstrating effects that can affect the quality of images. A number of obstacles can limit detector performance if not handled. The pixel detector is in fact an array of individual detectors (pixels), each of them has its own efficiency, energy calibration and also noise. The common effort is to make all these parameters uniform for all pixels. However, an ideal uniformity can be never reached. Moreover, it is often seen that the signal in one pixel affects neighboring pixels due to various reasons (charge sharing, crosstalk, etc.). All such effects have to be taken into account during data processing to avoid false data interpretation. The main intention of this contribution is to summarize techniques of data processing and image correction to eliminate residual drawbacks of pixel detectors. It is shown how to extend these methods to handle further physical effects such as hardening of the beam and edge enhancement by deflection. Besides, more advanced methods of data processing such as tomographic 3D reconstruction are discussed. All methods are demonstrated on real experiments from biology and material science performed mostly with the Medipix2 pixel device. A brief view to the future of pixel detectors and their applications also including spectroscopy and particle tracking is given too
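
    One routine per-pixel correction implied above (making individual pixel efficiencies uniform) is a flat-field normalization against an open-beam frame; the sketch below is a generic illustration under that assumption, not the Medipix2-specific calibration chain.

        import numpy as np

        def flat_field_correct(raw, open_beam, dark=None, min_counts=1.0):
            """Divide a transmission frame by an open-beam (flat) frame so that
            per-pixel efficiency differences cancel; `dark` removes fixed offsets."""
            raw = raw.astype(float)
            flat = open_beam.astype(float)
            if dark is not None:
                raw, flat = raw - dark, flat - dark
            flat = np.clip(flat, min_counts, None)   # avoid dividing by dead pixels
            return raw / flat                        # approximates per-pixel transmission

        rng = np.random.default_rng(1)
        efficiency = rng.uniform(0.8, 1.2, size=(4, 4))       # per-pixel response
        sample_frame = rng.poisson(1000 * 0.6 * efficiency)   # object transmits ~60%
        open_frame = rng.poisson(1000 * efficiency)
        print(np.round(flat_field_correct(sample_frame, open_frame), 2))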

  12. The Dark Energy Survey Image Processing Pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Morganson, E.; et al.

    2018-01-09

    The Dark Energy Survey (DES) is a five-year optical imaging campaign with the goal of understanding the origin of cosmic acceleration. DES performs a 5000 square degree survey of the southern sky in five optical bands (g,r,i,z,Y) to a depth of ~24th magnitude. Contemporaneously, DES performs a deep, time-domain survey in four optical bands (g,r,i,z) over 27 square degrees. DES exposures are processed nightly with an evolving data reduction pipeline and evaluated for image quality to determine if they need to be retaken. Difference imaging and transient source detection are also performed in the time domain component nightly. On a bi-annual basis, DES exposures are reprocessed with a refined pipeline and coadded to maximize imaging depth. Here we describe the DES image processing pipeline in support of DES science, as a reference for users of archival DES data, and as a guide for future astronomical surveys.

  13. Methods and representativeness of a European survey in children and adolescents: the KIDSCREEN study

    Directory of Open Access Journals (Sweden)

    von Rueden Ursula

    2007-07-01

    Full Text Available Abstract Background: The objective of the present study was to compare three different sampling and questionnaire administration methods used in the international KIDSCREEN study in terms of participation, response rates, and external validity. Methods: Children and adolescents aged 8–18 years were surveyed in 13 European countries using either telephone sampling and mail administration, random sampling of school listings followed by classroom or mail administration, or multistage random sampling of communities and households with self-administration of the survey materials at home. Cooperation, completion, and response rates were compared across countries and survey methods. Data on non-respondents were collected in 8 countries. The population fraction (PF: respondents in each sex-age or educational-level category, divided by the population in the same category from Eurostat census data) and the population fraction ratio (PFR: ratio of PF), with their corresponding 95% confidence intervals, were used to analyze differences by country between the KIDSCREEN samples and a reference Eurostat population. Results: Response rates by country ranged from 18.9% to 91.2%. Response rates were highest in the school-based surveys (69.0%–91.2%). Sample proportions by age and gender were similar to the reference Eurostat population in most countries, although boys and adolescents were slightly underrepresented (PFR ... Conclusion: School-based sampling achieved the highest overall response rates but also produced slightly more biased samples than the other methods. The results suggest that the samples were sufficiently representative to provide reference population values for the KIDSCREEN instrument.
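
    The representativeness measures used above are straightforward to compute once sample and census counts are tabulated; in the sketch below the PFR is taken as the category population fraction relative to the overall sampling fraction (one common reading of the ratio), the confidence intervals are omitted, and all counts are invented.

        def population_fractions(sample_counts, census_counts):
            """PF = respondents in a category / census population in that category;
            PFR (here) = category PF / overall sampling fraction. PFR < 1 flags
            underrepresented groups."""
            overall = sum(sample_counts.values()) / sum(census_counts.values())
            pf = {k: sample_counts[k] / census_counts[k] for k in sample_counts}
            pfr = {k: v / overall for k, v in pf.items()}
            return pf, pfr

        sample = {"8-11": 420, "12-15": 380, "16-18": 200}          # respondents
        census = {"8-11": 2_000_000, "12-15": 2_000_000, "16-18": 1_500_000}
        pf, pfr = population_fractions(sample, census)
        print(pfr)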

  14. Surface defect detection in tiling Industries using digital image processing methods: analysis and evaluation.

    Science.gov (United States)

    Karimi, Mohammad H; Asemani, Davud

    2014-05-01

    Ceramic and tile industries should indispensably include a grading stage to quantify the quality of products. Actually, human control systems are often used for grading purposes. An automatic grading system is essential to enhance the quality control and marketing of the products. Since there generally exist six different types of defects originating from various stages of tile manufacturing lines with distinct textures and morphologies, many image processing techniques have been proposed for defect detection. In this paper, a survey has been made on the pattern recognition and image processing algorithms which have been used to detect surface defects. Each method appears to be limited for detecting some subgroup of defects. The detection techniques may be divided into three main groups: statistical pattern recognition, feature vector extraction and texture/image classification. The methods such as wavelet transform, filtering, morphology and contourlet transform are more effective for pre-processing tasks. Others including statistical methods, neural networks and model-based algorithms can be applied to extract the surface defects. Although, statistical methods are often appropriate for identification of large defects such as Spots, but techniques such as wavelet processing provide an acceptable response for detection of small defects such as Pinhole. A thorough survey is made in this paper on the existing algorithms in each subgroup. Also, the evaluation parameters are discussed including supervised and unsupervised parameters. Using various performance parameters, different defect detection algorithms are compared and evaluated. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.
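
    As a small, generic instance of the statistical branch of methods discussed (it is not one of the surveyed algorithms), the sketch below flags spot-like defects by thresholding robust z-scores of pixel intensity and rejecting isolated detections with a simple neighbourhood check.

        import numpy as np

        def spot_defects(img, z_thresh=4.0):
            """Flag pixels deviating strongly from the tile's intensity distribution,
            then keep only detections that have at least one flagged 4-neighbour."""
            img = img.astype(float)
            med = np.median(img)
            mad = np.median(np.abs(img - med)) + 1e-9          # robust spread estimate
            candidates = np.abs(img - med) / (1.4826 * mad) > z_thresh
            padded = np.pad(candidates, 1)
            neighbours = (padded[:-2, 1:-1] | padded[2:, 1:-1] |
                          padded[1:-1, :-2] | padded[1:-1, 2:])
            return candidates & neighbours

        rng = np.random.default_rng(2)
        tile = np.full((50, 50), 120.0) + rng.normal(0, 2, (50, 50))
        tile[10:13, 30:33] += 60                               # injected spot defect
        print(np.argwhere(spot_defects(tile)))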

  15. Method of and apparatus for thermomagnetically processing a workpiece

    Science.gov (United States)

    Kisner, Roger A.; Rios, Orlando; Wilgen, John B.; Ludtka, Gerard M.; Ludtka, Gail M.

    2014-08-05

    A method of thermomagnetically processing a material includes disposing a workpiece within a bore of a magnet; exposing the workpiece to a magnetic field of at least about 1 Tesla generated by the magnet; and, while exposing the workpiece to the magnetic field, applying heat energy to the workpiece at a plurality of frequencies to achieve spatially-controlled heating of the workpiece. An apparatus for thermomagnetically processing a material comprises: a high field strength magnet having a bore extending therethrough for insertion of a workpiece therein; and an energy source disposed adjacent to an entrance to the bore. The energy source is an emitter of variable frequency heat energy, and the bore comprises a waveguide for propagation of the variable frequency heat energy from the energy source to the workpiece.

  16. Optimising UAV topographic surveys processed with structure-from-motion: Ground control quality, quantity and bundle adjustment

    Science.gov (United States)

    James, Mike R.; Robson, Stuart; d'Oleire-Oltmanns, Sebastian; Niethammer, Uwe

    2016-04-01

    Structure-from-motion (SfM) algorithms are greatly facilitating the production of detailed topographic models based on images collected by unmanned aerial vehicles (UAVs). However, SfM-based software does not generally provide the rigorous photogrammetric analysis required to fully understand survey quality. Consequently, error related to problems in control point data or the distribution of control points can remain undiscovered. Even if these errors are not large in magnitude, they can be systematic, and thus have strong implications for the use of products such as digital elevation models (DEMs) and orthophotos. Here, we develop a Monte Carlo approach to (1) improve the accuracy of products when SfM-based processing is used and (2) reduce the associated field effort by identifying suitable lower density deployments of ground control points. The method highlights over-parameterisation during camera self-calibration and provides enhanced insight into control point performance when rigorous error metrics are not available. Processing was implemented using commonly used SfM-based software (Agisoft PhotoScan), which we augment with semi-automated and automated GCP image measurement. We apply the Monte Carlo method to two contrasting case studies - an erosion gully survey (Taurodont, Morocco) carried out with a fixed-wing UAV, and an active landslide survey (Super-Sauze, France), acquired using a manually controlled quadcopter. The results highlight the differences in the control requirements for the two sites, and we explore the implications for future surveys. We illustrate DEM sensitivity to critical processing parameters and show how the use of appropriate parameter values increases DEM repeatability and reduces the spatial variability of error due to processing artefacts.
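
    The Monte Carlo idea above (repeatedly re-running the processing with different control-point selections and examining the spread of check-point error) can be outlined as follows; the `process_with_gcps` callable stands in for a PhotoScan batch run and is a placeholder assumption, as is the toy error model used to make the sketch executable.

        import random
        import statistics

        def monte_carlo_gcp(gcps, n_control, n_trials, process_with_gcps, seed=0):
            """Repeatedly pick `n_control` GCPs as control and use the rest as
            independent check points; `process_with_gcps(control, checks)` is assumed
            to run the SfM/bundle adjustment and return a check-point RMSE."""
            rng = random.Random(seed)
            rmses = []
            for _ in range(n_trials):
                control = rng.sample(gcps, n_control)
                checks = [g for g in gcps if g not in control]
                rmses.append(process_with_gcps(control, checks))
            return statistics.mean(rmses), statistics.pstdev(rmses)

        def fake_run(control, checks):
            # Stand-in error model: more control points -> lower check-point RMSE.
            return 0.05 + 0.2 / len(control) + random.uniform(0.0, 0.01)

        print(monte_carlo_gcp(list(range(20)), n_control=8, n_trials=50,
                              process_with_gcps=fake_run))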

  17. Random sampling of quantum states: a survey of methods and some issues regarding the Overparametrized Method

    Energy Technology Data Exchange (ETDEWEB)

    Maziero, Jonas, E-mail: jonas.maziero@ufsm.br [Universidade Federal de Santa Maria (UFSM), Santa Maria, RS (Brazil). Dept. de Fisica

    2015-12-15

    The numerical generation of random quantum states (RQS) is an important procedure for investigations in quantum information science. Here, we review some methods that may be used for performing that task. We start by presenting a simple procedure for generating random state vectors, for which the main tool is the random sampling of unbiased discrete probability distributions (DPD). Afterwards, the creation of random density matrices is addressed. In this context, we first present the standard method, which consists in using the spectral decomposition of a quantum state for getting RQS from random DPDs and random unitary matrices. In the sequence, the Bloch vector parametrization method is described. This approach, despite being useful in several instances, is not in general convenient for RQS generation. In the last part of the article, we regard the overparametrized method (OPM) and the related Ginibre and Bures techniques. The OPM can be used to create random positive semidefinite matrices with unit trace from randomly produced general complex matrices in a simple way that is friendly for numerical implementations. We consider a physically relevant issue related to the possible domains that may be used for the real and imaginary parts of the elements of such general complex matrices. Subsequently, a too fast concentration of measure in the quantum state space that appears in this parametrization is noticed. (author)
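
    The overparametrized/Ginibre construction described above reduces to drawing a general complex matrix G and forming rho = G G† / Tr(G G†), which is positive semidefinite with unit trace by construction; a short sketch follows (the dimension and seed are arbitrary).

        import numpy as np

        def random_density_matrix(d, rng=None):
            """Ginibre-style draw: rho = G G^dagger / Tr(G G^dagger), with G a d x d
            matrix of independent complex standard normal entries."""
            rng = np.random.default_rng() if rng is None else rng
            g = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))
            rho = g @ g.conj().T
            return rho / np.trace(rho)

        rho = random_density_matrix(3, np.random.default_rng(42))
        eigvals = np.linalg.eigvalsh(rho)
        print(np.isclose(np.trace(rho).real, 1.0), bool((eigvals >= -1e-12).all()))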

  18. [Work process and working conditions in poultry processing plants: report of a survey on occupational health surveillance].

    Science.gov (United States)

    Oliveira, Paulo Antonio Barros; Mendes, Jussara Maria Rosa

    2014-12-01

    This article presents the report of a survey on health surveillance activities performed in poultry processing plants in the south of Brazil. It aims to contribute to an understanding of the work process developed, the growth of the sector, the organization of labor and the confrontation with the economic model of this sector, which has been exposing employees to working conditions that undermine their health. The working conditions identified are considered largely incompatible with health and human dignity. The study supports interinstitutional intervention, especially with the Public Ministry of Labor, criticizes the weak implementation of specific government interventions in health conditions in the industry and introduces the new Regulatory Standard 36 as a positive perspective for the near future.

  19. Design and methodology of a mixed methods follow-up study to the 2014 Ghana Demographic and Health Survey.

    Science.gov (United States)

    Staveteig, Sarah; Aryeetey, Richmond; Anie-Ansah, Michael; Ahiadeke, Clement; Ortiz, Ladys

    2017-01-01

    The intended meaning behind responses to standard questions posed in large-scale health surveys is not always well understood. Systematic follow-up studies, particularly those which pose a few repeated questions followed by open-ended discussions, are well positioned to gauge stability and consistency of data and to shed light on the intended meaning behind survey responses. Such follow-up studies require extensive coordination and face challenges in protecting respondent confidentiality during the process of recontacting and reinterviewing participants. We describe practical field strategies for undertaking a mixed methods follow-up study during a large-scale health survey. The study was designed as a mixed methods follow-up study embedded within the 2014 Ghana Demographic and Health Survey (GDHS). The study was implemented in 13 clusters. Android tablets were used to import reference data from the parent survey and to administer the questionnaire, which asked a mixture of closed- and open-ended questions on reproductive intentions, decision-making, and family planning. Despite a number of obstacles related to recontacting respondents and concern about respondent fatigue, over 92 percent of the selected sub-sample were successfully recontacted and reinterviewed; all consented to audio recording. A confidential linkage between GDHS data, follow-up tablet data, and audio transcripts was successfully created for the purpose of analysis. We summarize the challenges in follow-up study design, including ethical considerations, sample size, auditing, filtering, successful use of tablets, and share lessons learned for future such follow-up surveys.

  20. Deformation Study of Papandayan Volcano using GPS Survey Method and Its Correlation with Seismic Data Observation

    Directory of Open Access Journals (Sweden)

    Dina A. Sarsito

    2006-11-01

    Full Text Available Papandayan volcano is located in the southern part of Garut regency, around 70 km from Bandung city, West Java. Many methods are used to monitor the activity of the volcano, either continuously or periodically; one of these is the periodic GPS survey. Basically, these surveys are carried out to understand the pattern and velocity of displacement occurring in the volcano body, both horizontally and vertically, as well as other deformation elements such as translation, rotation and dilatation. Mogi modeling was also used to determine the location and volume of the pressure source which caused deformation of the volcano body. By comparing seismic activity with the deformation revealed by GPS measurements before, during and after the eruption, a correlation between the seismicity and the deformation can be recognised. These studies suggest that GPS measurement at Papandayan volcano could serve as a supporting method for determining volcanic activity, at least at Papandayan volcano.
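A minimal sketch of the Mogi point-source model referred to above, in its standard textbook form rather than the authors' implementation, assuming a source volume change dV at depth d beneath an elastic half-space: surface displacements at radial distance r scale as d/(r^2 + d^2)^(3/2) vertically and r/(r^2 + d^2)^(3/2) radially.

```python
import numpy as np

def mogi_displacements(r, depth, dvol):
    """Surface displacements (m) of a Mogi point source.

    r     : radial distance from the source axis (m), scalar or array
    depth : source depth (m)
    dvol  : source volume change (m^3)
    Returns (u_r, u_z) for an elastic half-space (Poisson solid assumed).
    """
    r = np.asarray(r, dtype=float)
    R3 = (r ** 2 + depth ** 2) ** 1.5
    coeff = 3.0 * dvol / (4.0 * np.pi)
    return coeff * r / R3, coeff * depth / R3

# Example: 1e6 m^3 of inflation at 2 km depth, observed 0-5 km from the axis
u_r, u_z = mogi_displacements(np.linspace(0.0, 5000.0, 6), depth=2000.0, dvol=1e6)
```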

  1. Measuring walking and cycling using the PABS (pedestrian and bicycling survey) approach : a low-cost survey method for local communities [research brief].

    Science.gov (United States)

    2010-10-01

    Many communities want to promote walking and cycling. However, few know how much nonmotorized travel already occurs in their communities. This research project developed the Pedestrian and Bicycling Survey (PABS), a method that local governments can ...

  2. Methods of the NSW Schools Physical Activity and Nutrition Survey 2010 (SPANS 2010).

    Science.gov (United States)

    Hardy, L L; King, L; Espinel, P; Okely, A D; Bauman, A

    2011-09-01

    Addressing the high prevalence of overweight and obesity and unhealthy lifestyles among New South Wales (NSW) (the most populous state in Australia) youth is a government priority. The primary aim of the NSW Schools Physical Activity and Nutrition Survey (SPANS 2010; n=8058) was to monitor progress towards the NSW State Plan and State Health Plan priorities and targets for child obesity. SPANS 2010 is the third in a series of NSW cross-sectional representative population surveys of school children in Kindergarten, Grades 2, 4, 6, 8 and 10. SPANS 2010 was conducted in schools in February-April 2010 (summer school term). The survey comprises measures of weight status (anthropometry) and weight-related behaviours including the assessment of cardiorespiratory fitness, fundamental movement skills and questionnaires asking about diet habits and patterns, sedentary behaviours, school travel and physical activity. Parents of students in Kindergarten and Grades 2 and 4 proxy-reported for their child, and students in Grades 6, 8 and 10 self-reported. This paper describes the survey methods of SPANS 2010. Survey information will be used to guide policies and interventions which promote healthy weight and lifestyles among young people, and to monitor the overall impact of recent interventions and policies. Crown Copyright © 2011. Published by Elsevier Ltd. All rights reserved.

  3. Data Processing Procedures and Methodology for Estimating Trip Distances for the 1995 American Travel Survey (ATS)

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, H.-L.; Rollow, J.

    2000-05-01

    The 1995 American Travel Survey (ATS) collected information from approximately 80,000 U.S. households about their long distance travel (one-way trips of 100 miles or more) during the year of 1995. It is the most comprehensive survey of where, why, and how U.S. residents travel since 1977. ATS is a joint effort by the U.S. Department of Transportation (DOT) Bureau of Transportation Statistics (BTS) and the U.S. Department of Commerce Bureau of Census (Census); BTS provided the funding and supervision of the project, and Census selected the samples, conducted interviews, and processed the data. This report documents the technical support for the ATS provided by the Center for Transportation Analysis (CTA) in Oak Ridge National Laboratory (ORNL), which included the estimation of trip distances as well as data quality editing and checking of variables required for the distance calculations.

  4. High-speed multispectral infrared imaging and data processing method

    Science.gov (United States)

    Rhee, Kyung T.

    1995-05-01

    The new imaging system developed in our laboratory facilitates the solution of problems otherwise difficult to remedy. In this lecture, the progressive steps taken for building our new diagnostic tool are explained, followed by a description of the system and our data processing methods. Some of the results obtained by using the device are presented. The system design was directed to incorporation of off-the-shelf components with several newly fabricated units in order to overcome limitations in existing infrared (IR) imaging systems. In the new IR imaging system which recently became operational, four high-speed IR camera units are aligned with a single (reflective) optical unit having three spectral beam splitters. This permits simultaneous framing of four geometrically (pixel-to-pixel) identical images of the same object in respective spectral bands. The multispectral imaging by the camera is activated either by the internal clock (at a rate over 1,800 frames/sec) or by an external signal such as pulses generated by an encoder. Unique features incorporated in the system include: independent variation of the framing rate and the exposure period in terms of time period (as short as 30 µsec) or the number of external pulses; control of the total number of images to be obtained per event from successive cyclic processes. The new device is applied to 'quantitative imaging' of rapidly reacting events/objects, e.g. determination of temporal and spatial variations of the thermochemical characteristics. Thermal objects, which typically involve a reactor wall and a gaseous mixture in front, are studied by obtaining the high-speed digital readout from the corresponding pixels of: two wall images in separate wavebands and two mixture images in other bands, i.e., a total of four matrices of digital output at a time. The results are processed by the conventional two-color method and a new dual-band spectrometric algorithm.
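A minimal sketch of the conventional two-color (ratio) method mentioned above, under Wien's approximation and a gray-body assumption (equal emissivity in both bands); the wavelengths and the measured intensity ratio are illustrative values, not the system's actual calibration.

```python
import math

C2 = 1.4388e-2  # second radiation constant, m*K

def two_color_temperature(i1, i2, lam1, lam2):
    """Temperature (K) from the intensity ratio of two narrow wavebands.

    Uses Wien's approximation L ~ eps * lam**-5 * exp(-C2/(lam*T)) and assumes
    equal emissivity in both bands, so the ratio i1/i2 depends only on T.
    """
    ratio = i1 / i2
    return C2 * (1.0 / lam1 - 1.0 / lam2) / (5.0 * math.log(lam2 / lam1) - math.log(ratio))

# Illustrative pixel values in bands centred at 2.3 um and 3.8 um
print(two_color_temperature(1.00, 0.62, 2.3e-6, 3.8e-6))
```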

  5. Graph Processing on GPUs: A Survey

    DEFF Research Database (Denmark)

    Shi, Xuanhua; Zheng, Zhigao; Zhou, Yongluan

    2018-01-01

    In the big data era, much real-world data can be naturally represented as graphs. Consequently, many application domains can be modeled as graph processing. Graph processing, especially the processing of the large-scale graphs with the number of vertices and edges in the order of billions or even......, utilizing GPU to accelerate graph processing proves to be a promising solution. This article surveys the key issues of graph processing on GPUs, including data layout, memory access pattern, workload mapping, and specific GPU programming. In this article, we summarize the state-of-the-art research on GPU...

  6. Topsoil thickness mapping at watershed scale by integration of field survey, geophysics and remote sensing methods

    Science.gov (United States)

    Francés, Alain Pascal; Lubczynski, Maciek

    2010-05-01

    The adequate parameterisation of the near subsurface is a critical issue due to the large spatial variability of soil properties. Direct observations made by common invasive field sampling procedures, such as drilling and trench excavation, can be complemented efficiently by non-invasive geophysical methods, improving spatial data coverage in a cost- and time-efficient way. Geophysical methods measure a physical property of the subsurface that is convertible into the parameter or variable of interest. Such conversion requires the development of a data integration method. In this study, we present a data integration methodology to assess topsoil thickness spatially at the watershed scale. To capture the spatial variability of the soil characteristics, we used a combination of field survey, ground geophysics, satellite and aerial imagery processing, and statistical estimation techniques. Ground geophysics was used to complement and extend the direct field observations of topsoil thickness. The conversion of the geophysical data into topsoil thickness and the estimation of topsoil thickness over the catchment were done through statistical methods that integrated auxiliary variables derived from the remote sensing imagery (soil and geomorphology classifications and terrain attributes). A simple and expedient soil classification based on multi-resolution segmentation of image objects and fuzzy logic was derived from a high-resolution multispectral QuickBird image combined with an aerial photograph. Landform classes and terrain attributes were computed from the Global Digital Elevation Model (GDEM) of the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) satellite. We applied this methodology to the Pisões catchment (~19 km², Portugal), where the AB horizon, following the standard pedologic classification, is characterized by its high concentration of swelling clay. In the first step, we elaborated the sampling schema of the geophysical

  7. Survey Methods, Traditional, Public Opinion Polling

    DEFF Research Database (Denmark)

    Elmelund-Præstekær, Christian; Hopmann, David Nicolas; Pedersen, Rasmus Tue

    2017-01-01

    Traditional public opinion polls are surveys in which a random sample of a given population is asked questions about their attitudes, knowledge, or behavior. If conducted properly, the answers from such surveys are approximately representative of the entire population. Traditional public opinion...... polling is typically based on four different methods of data gathering, or combinations hereof: face-to-face, postal surveys, phone surveys, and web surveys. Given that opinion polls are based on a sample, we cannot be sure that the sample reflects public opinion perfectly, however—even if randomness...... is perfect. Moreover, responses may be highly dependent on the contextual information provided with the question. Also, it may be difficult to capture past or complex causes of attitudes or behavior. In short, surveys are a precise way of measuring public opinion, but they do not come without challenges....

  8. A Survey of Symplectic and Collocation Integration Methods for Orbit Propagation

    Science.gov (United States)

    Jones, Brandon A.; Anderson, Rodney L.

    2012-01-01

    Demands on numerical integration algorithms for astrodynamics applications continue to increase. Common methods, like explicit Runge-Kutta, meet the orbit propagation needs of most scenarios, but more specialized scenarios require new techniques to meet both computational efficiency and accuracy needs. This paper provides an extensive survey on the application of symplectic and collocation methods to astrodynamics. Both of these methods benefit from relatively recent theoretical developments, which improve their applicability to artificial satellite orbit propagation. This paper also details their implementation, with several tests demonstrating their advantages and disadvantages.
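As a concrete illustration of the symplectic family discussed above (my own minimal example, not the paper's implementation), the kick-drift-kick leapfrog/Stormer-Verlet scheme applied to a Keplerian two-body problem preserves orbital energy over long propagation spans far better than a naive Euler step of the same cost.

```python
import numpy as np

def kepler_accel(r, mu=1.0):
    """Two-body gravitational acceleration for parameter mu (normalised units)."""
    return -mu * r / np.linalg.norm(r) ** 3

def leapfrog(r, v, dt, n_steps, mu=1.0):
    """Kick-drift-kick (Stormer-Verlet) propagation: second order and symplectic."""
    for _ in range(n_steps):
        v = v + 0.5 * dt * kepler_accel(r, mu)   # half kick
        r = r + dt * v                           # drift
        v = v + 0.5 * dt * kepler_accel(r, mu)   # half kick
        yield r.copy(), v.copy()

# Circular orbit in normalised units: radius 1, speed 1
states = list(leapfrog(np.array([1.0, 0.0]), np.array([0.0, 1.0]), dt=0.01, n_steps=10_000))
```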

  9. Optimizing process and equipment efficiency using integrated methods

    Science.gov (United States)

    D'Elia, Michael J.; Alfonso, Ted F.

    1996-09-01

    The semiconductor manufacturing industry is continually riding the edge of technology as it tries to push toward higher design limits. Mature fabs must cut operating costs while increasing productivity to remain profitable and cannot justify large capital expenditures to improve productivity. Thus, they must push current tool production capabilities to cut manufacturing costs and remain viable. Working to continuously improve mature production methods requires innovation. Furthermore, testing and successful implementation of these ideas into modern production environments require both supporting technical data and commitment from those working with the process daily. At AMD, natural work groups (NWGs) composed of operators, technicians, engineers, and supervisors collaborate to foster innovative thinking and secure commitment. Recently, an AMD NWG improved equipment cycle time on the Genus tungsten silicide (WSi) deposition system. The team used total productive manufacturing (TPM) to identify areas for process improvement. Improved in-line equipment monitoring was achieved by constructing a real-time overall equipment effectiveness (OEE) calculator which tracked equipment down, idle, qualification, and production times. In-line monitoring results indicated that qualification time associated with slow Inspex turn-around time and machine downtime associated with manual cleans contributed greatly to reduced availability. Qualification time was reduced by 75% by implementing a new Inspex monitor pre-staging technique. Downtime associated with manual cleans was reduced by implementing an in-situ plasma etch back to extend the time between manual cleans. A designed experiment was used to optimize the process. Time between 18-hour manual cleans has been improved from every 250 to every 1500 cycles. Moreover, defect density realized a 3X improvement. Overall, the team achieved a 35% increase in tool availability. This paper details the above strategies and accomplishments.
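A minimal sketch of the kind of OEE bookkeeping described above, using the standard availability x performance x quality decomposition; the time buckets, their assignment to availability, and the counts are illustrative assumptions, not AMD's actual tracking categories.

```python
from dataclasses import dataclass

@dataclass
class ToolLog:
    planned_time_h: float   # scheduled production time
    down_time_h: float      # unscheduled downtime (e.g. manual cleans)
    idle_time_h: float      # waiting, no lots staged
    qual_time_h: float      # qualification / monitor runs
    ideal_cycle_h: float    # ideal processing time per wafer
    wafers_out: int         # total wafers processed
    wafers_good: int        # wafers passing inspection

def oee(log: ToolLog) -> float:
    """Overall equipment effectiveness = availability * performance * quality."""
    run_time = log.planned_time_h - log.down_time_h - log.idle_time_h - log.qual_time_h
    availability = run_time / log.planned_time_h
    performance = (log.ideal_cycle_h * log.wafers_out) / run_time
    quality = log.wafers_good / log.wafers_out
    return availability * performance * quality

print(oee(ToolLog(168, 20, 12, 10, 0.25, 450, 441)))  # weekly log, invented numbers
```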

  10. [Survey of methods of cleaning, decontamination, disinfection and sterilization in dental health services in tropical areas].

    Science.gov (United States)

    Clapeau, G; Decroix, B; Bakayoko-Ly, R; Varenne, B; Dosso-Hien, D; Decroix, M O

    1997-01-01

    hygiene standards despite difficult practice conditions, exacerbated by supply problems. In all applications, hygiene involves a succession of closely-related, logical steps, which form an asepsis chain aimed at preventing the transmission of infection. Our survey shows that fundamental elements of hygiene require attention to achieve this aim. The cleaning, disinfection and sterilizing of floor surfaces and equipment should be improved and more widespread use made of disposable items. It is important to define the hygiene level required for particular treatments, taking into account the oral and dental micro flora and whether the equipment has been decontaminated, disinfected or sterilized. A piece of equipment is decontaminated if it has been mechanically cleaned and decontaminated. It is disinfected if these steps are followed by rinsing with sterile water, drying and conditioning. An item is described as sterilized if it is cleaned, decontaminated, rinsed, dried, conditioned and then sterilized. We found that a wide variety of chemicals were used to clean hands, surfaces and equipment. The nature and appropriate methods of use of these chemicals were not widely known. Understanding the chemical composition of these chemicals makes it possible to classify them into cleaning agents, detergents, decontaminating agents and disinfectants. The definition, choice and use of antiseptics and disinfectants should be strictly controlled. It is also vital that single-use disposable items are used only once and are never reused. Hygiene in the dental surgery is a chain of processes aimed at protecting the patient and the medical staff. There are many links in the chain, involving floor and surface hygiene, hand washing by dentists and dental assistants, washing of surgery linen and treatment of equipment. Dental practitioners should continually focus on ensuring that the chain of hygiene procedures is not broken, in their own interests as well as in those of their patients.

  11. Data processing device test apparatus and method therefor

    Science.gov (United States)

    Wilcox, Richard Jacob; Mulig, Jason D.; Eppes, David; Bruce, Michael R.; Bruce, Victoria J.; Ring, Rosalinda M.; Cole, Jr., Edward I.; Tangyunyong, Paiboon; Hawkins, Charles F.; Louie, Arnold Y.

    2003-04-08

    A method and apparatus for testing data processing devices are implemented. The test mechanism isolates critical paths by correlating a scanning microscope image with a selected speed path failure. A trigger signal having a preselected value is generated at the start of each pattern vector. The sweep of the scanning microscope is controlled by a computer, which also receives and processes the image signals returned from the microscope. The value of the trigger signal is correlated with a set of pattern lines being driven on the device under test (DUT). The trigger is either asserted or negated depending on the detection of a pattern line failure and the particular line that failed. In response to the detection of the particular speed path failure being characterized, and the trigger signal, the control computer overlays a mask on the image of the DUT. The overlaid image provides a visual correlation of the failure with the structural elements of the DUT at the level of resolution of the microscope itself.

  12. Control and monitoring method and system for electromagnetic forming process

    Science.gov (United States)

    Kunerth, Dennis C.; Lassahn, Gordon D.

    1990-01-01

    A process, system, and improvement for a process for electromagnetic forming of a workpiece in which characteristics of the workpiece such as its geometry, electrical conductivity, quality, and magnetic permeability can be determined by monitoring the current and voltage in the workcoil. In an electromagnetic forming process in which a power supply provides current to a workcoil and the electromagnetic field produced by the workcoil acts to form the workpiece, the dynamic interaction of the electromagnetic fields produced by the workcoil with the geometry, electrical conductivity, and magnetic permeability of the workpiece provides information pertinent to the physical condition of the workpiece that is available for determination of quality and process control. This information can be obtained by deriving in real time the first several time derivatives of the current and voltage in the workcoil. In addition, the process can be extended by injecting test signals into the workcoil during the electromagnetic forming and monitoring the response to the test signals in the workcoil.

  13. The SAGES Legacy Unifying Globulars and Galaxies survey (SLUGGS): sample definition, methods, and initial results

    Energy Technology Data Exchange (ETDEWEB)

    Brodie, Jean P.; Romanowsky, Aaron J.; Jennings, Zachary G.; Pota, Vincenzo; Kader, Justin; Roediger, Joel C.; Villaume, Alexa; Arnold, Jacob A.; Woodley, Kristin A. [University of California Observatories, 1156 High Street, Santa Cruz, CA 95064 (United States); Strader, Jay [Department of Physics and Astronomy, Michigan State University, East Lansing, MI 48824 (United States); Forbes, Duncan A.; Pastorello, Nicola; Usher, Christopher; Blom, Christina; Kartha, Sreeja S. [Centre for Astrophysics and Supercomputing, Swinburne University, Hawthorn, VIC 3122 (Australia); Foster, Caroline; Spitler, Lee R., E-mail: jbrodie@ucsc.edu [Australian Astronomical Observatory, P.O. Box 915, North Ryde, NSW 1670 (Australia)

    2014-11-20

    We introduce and provide the scientific motivation for a wide-field photometric and spectroscopic chemodynamical survey of nearby early-type galaxies (ETGs) and their globular cluster (GC) systems. The SAGES Legacy Unifying Globulars and GalaxieS (SLUGGS) survey is being carried out primarily with Subaru/Suprime-Cam and Keck/DEIMOS. The former provides deep gri imaging over a 900 arcmin² field-of-view to characterize GC and host galaxy colors and spatial distributions, and to identify spectroscopic targets. The NIR Ca II triplet provides GC line-of-sight velocities and metallicities out to typically ∼8 R_e, and to ∼15 R_e in some cases. New techniques to extract integrated stellar kinematics and metallicities to large radii (∼2-3 R_e) are used in concert with GC data to create two-dimensional (2D) velocity and metallicity maps for comparison with simulations of galaxy formation. The advantages of SLUGGS compared with other, complementary, 2D-chemodynamical surveys are its superior velocity resolution, radial extent, and multiple halo tracers. We describe the sample of 25 nearby ETGs, the selection criteria for galaxies and GCs, the observing strategies, the data reduction techniques, and modeling methods. The survey observations are nearly complete and more than 30 papers have so far been published using SLUGGS data. Here we summarize some initial results, including signatures of two-phase galaxy assembly, evidence for GC metallicity bimodality, and a novel framework for the formation of extended star clusters and ultracompact dwarfs. An integrated overview of current chemodynamical constraints on GC systems points to separate, in situ formation modes at high redshifts for metal-poor and metal-rich GCs.

  14. Disinfection methods in general practice and health authority clinics: a telephone survey

    OpenAIRE

    Farrow, S.C.; Kaul, S.; Littlepage, B.C.

    1988-01-01

    Concern about the epidemic of the acquired immune deficiency syndrome led to discussions in one health district about the dangers of cross-infection from instruments in general practice and health authority clinics. In order to establish what current disinfection practices were in use a telephone survey was adopted as a quick and easy method of data collection. Information was collected on who was responsible for disinfection as well as details of how each instrument was disinfected. Results ...

  15. HIDE & SEEK: End-to-end packages to simulate and process radio survey data

    Science.gov (United States)

    Akeret, J.; Seehars, S.; Chang, C.; Monstein, C.; Amara, A.; Refregier, A.

    2017-01-01

    As several large single-dish radio surveys begin operation within the coming decade, a wealth of radio data will become available and provide a new window to the Universe. In order to fully exploit the potential of these datasets, it is important to understand the systematic effects associated with the instrument and the analysis pipeline. A common approach to tackle this is to forward-model the entire system - from the hardware to the analysis of the data products. For this purpose, we introduce two newly developed, open-source Python packages: the HI Data Emulator (HIDE) and the Signal Extraction and Emission Kartographer (SEEK) for simulating and processing single-dish radio survey data. HIDE forward-models the process of collecting astronomical radio signals in a single-dish radio telescope instrument and outputs pixel-level time-ordered-data. SEEK processes the time-ordered-data, removes artifacts from Radio Frequency Interference (RFI), automatically applies flux calibration, and aims to recover the astronomical radio signal. The two packages can be used separately or together depending on the application. Their modular and flexible nature allows easy adaptation to other instruments and datasets. We describe the basic architecture of the two packages and examine in detail the noise and RFI modeling in HIDE, as well as the implementation of gain calibration and RFI mitigation in SEEK. We then apply HIDE & SEEK to forward-model a Galactic survey in the frequency range 990-1260 MHz based on data taken at the Bleien Observatory. For this survey, we expect to cover 70% of the full sky and achieve a median signal-to-noise ratio of approximately 5-6 in the cleanest channels including systematic uncertainties. However, we also point out the potential challenges of high RFI contamination and baseline removal when examining the early data from the Bleien Observatory. The fully documented HIDE & SEEK packages are available at http://hideseek.phys.ethz.ch/ and are published

  16. NEURAL NETWORK METHODS FOR ESTIMATING THE COSTS OF RESEARCH AND DESIGN AND SURVEY WORKS

    Directory of Open Access Journals (Sweden)

    M. A. Karpovich

    2014-01-01

    Full Text Available Summary. The article describes neural network methods for estimating the costs of research, design and survey work in road construction, which allow price-setting factors to be ranked, on the basis of data from previously designed objects, according to their impact on the cost of research and development work undertaken for public-private partnership (PPP) projects. The advantages of neural network methods are determined by the following circumstances: neural network models automatically take into account the mutual influence of the pricing factors, and neural methods are free of subjective factors. Optimization of the neural network made it possible to rank the price-setting parameters according to their impact on the cost of research, design and survey work under PPP. The four factors "customer status", "type of work", "kind of competition" and "road category" together determine more than 87% of the unit price of a project. Specific calculations show that the neural network describes most objects very accurately (with a relative error of less than 0.2%), and only a small fraction of subsets - less than 5% - show significant error, from 9% to 17%.
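A minimal sketch of the general approach described above (generic scikit-learn, not the authors' model): one-hot encode the categorical pricing factors and fit a small feed-forward network to unit prices. The factor names follow the abstract; the example records and prices are invented.

```python
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import OneHotEncoder

# Pricing factors per the abstract: customer status, type of work,
# kind of competition, road category (the values below are invented examples)
X = [["state", "design", "open", "II"],
     ["municipal", "survey", "closed", "III"],
     ["state", "design", "closed", "I"],
     ["private", "survey", "open", "IV"]]
y = [120.0, 85.0, 160.0, 60.0]   # unit prices, illustrative only

model = make_pipeline(
    OneHotEncoder(handle_unknown="ignore"),
    MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0),
)
model.fit(X, y)
print(model.predict([["state", "survey", "open", "II"]]))
```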

  17. Nonaqueous processing methods

    Energy Technology Data Exchange (ETDEWEB)

    Coops, M.S.; Bowersox, D.F.

    1984-09-01

    A high-temperature process utilizing molten salt extraction from molten metal alloys has been developed for purification of spent power reactor fuels. Experiments with laboratory-scale processing operations show that purification and throughput parameters comparable to the Barnwell Purex process can be achieved by pyrochemical processing in equipment one-tenth the size, with all wastes being discharged as stable metal alloys at greatly reduced volume and disposal cost. This basic technology can be developed for large-scale processing of spent reactor fuels. 13 references, 4 figures.

  18. Curriculum and instructional methods for drug information, literature evaluation, and biostatistics: survey of US pharmacy schools.

    Science.gov (United States)

    Phillips, Jennifer A; Gabay, Michael P; Ficzere, Cathy; Ward, Kristina E

    2012-06-01

    The drug information curriculum in US colleges of pharmacy continues to evolve. The American College of Clinical Pharmacy (ACCP) Drug Information Practice and Research Network (DI PRN) published an opinion paper with specific recommendations regarding drug information education in 2009. Adoption of these recommendations has not been evaluated. To assess which recommendations made in the ACCP DI PRN opinion paper are included in US pharmacy school curricula and characterize faculty qualifications, educational methods, and recent changes in drug information education. An electronic survey was designed using the ACCP DI PRN opinion paper and the Accreditation Council for Pharmacy Education standards and guidelines for accreditation of PharmD programs in the US. Survey questions addressed curricular content within the following categories: drug information, literature evaluation, and biostatistics. A letter including the online survey link was sent via email to the dean of each US college/school of pharmacy (N = 128). Recipients were instructed to forward the email to the individual at their institution who was the most knowledgeable about the content and methodology used for didactic drug information education. Sixty-four responses were included in the final analysis. Of the 19 ACCP DI PRN minimum core concepts, 9 (47%) were included in curricula of all responding institutions; 14 of 19 (74%) were included in curricula for all but 1 institution. In contrast, 5 of 16 concepts (31%) were not formally taught by a number of institutions. Many respondents noted an increased focus on evidence-based medicine, medication safety, and informatics. Although a survey of drug information curricula documented substantial inclusion of the essential concepts presented in the ACCP DI PRN opinion paper, room for improvement remains in drug information curricula in US colleges of pharmacy.

  19. Medication non-adherence and uncertainty: Information-seeking and processing in the Danish LIFESTAT survey.

    Science.gov (United States)

    Kriegbaum, Margit; Lau, Sofie Rosenlund

    2017-09-23

    Statins are widely prescribed to lower cardiovascular morbidity and mortality. However, statin non-adherence is very high. The aim of this paper was to investigate reasons for stopping statin treatment in the general population and to study how aspects of information-seeking and processing are associated with statin non-adherence. This study used a population survey of 3050 Danish residents aged 45-65 years. Reasons for statin discontinuation were studied among those who were previous statin users. The association between information seeking and processing and statin discontinuation was analysed using multivariate logistic regression models. Experience of side effects and fear of side effects played an important role in the discontinuation of statin treatment. Feelings of uncertainty and confusion regarding information on statins predicted statin discontinuation. This applied to information from both mass media and general practitioners. There was no clear pattern linking information seeking and statin non-adherence. The article points to the impact of information-seeking on the decision to take cholesterol-lowering medication. This included contributions from information disseminated by media outlets. Side effects and fear of side effects should be addressed in clinical practice. Health care professionals should pay attention to emotional aspects of how information is disseminated and perceived by statin users. Copyright © 2017. Published by Elsevier Inc.

  20. Survey of an evaluation method for research and development projects; Kenkyu kaihatsu project no hyoka shuho ni kansuru chosa

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    This report describes an interim evaluation method and concrete evaluation procedures for projects promoted by the Agency of Industrial Science and Technology and NEDO. As a result of the survey, a highly practical interim evaluation method, concrete evaluation items, and evaluation criteria are proposed, on the assumption that projects are evaluated by an evaluation department independent of the project promotion department. Long-term issues in constructing the evaluation system are also described. It is most essential that the evaluation fulfill its function of promoting subsequent projects effectively. It is also indispensable that the evaluation method and issues proposed in this report be communicated closely to project promoters and researchers, and that the projects be reassessed continuously. Continuous attention to feedback from the evaluation process and to improvement of the evaluation itself is important for building the system over the long term. 21 refs., 9 figs., 23 tabs.

  1. METHODS FOR IMPROVING PROCESS CONTROL AND CORRECTION IN FLEXOGRAPHIC PRINTING

    Directory of Open Access Journals (Sweden)

    Dorin PIȚIGOI

    2016-05-01

    Full Text Available The printability tester is defined as a device for uniformly applying a reproducible amount of ink to a substrate under specified conditions, using motorized control of the functions specific to the ink transfer process. This repeatable printing capability also facilitates correlation with the actual printing conditions set by the primary process parameters (printing process, substrate, ink, screening, printing order, etc.). By extending the usage and functionality of the printability tester, printers have the possibility to use laboratory test prints as a means of process control and correction at a fraction of the cost and time.

  2. Survey of engineering computational methods and experimental programs for estimating supersonic missile aerodynamic characteristics

    Science.gov (United States)

    Sawyer, W. C.; Allen, J. M.; Hernandez, G.; Dillenius, M. F. E.; Hemsch, M. J.

    1982-01-01

    This paper presents a survey of engineering computational methods and experimental programs used for estimating the aerodynamic characteristics of missile configurations. Emphasis is placed on those methods which are suitable for preliminary design of conventional and advanced concepts. An analysis of the technical approaches of the various methods is made in order to assess their suitability to estimate longitudinal and/or lateral-directional characteristics for different classes of missile configurations. Some comparisons between the predicted characteristics and experimental data are presented. These comparisons are made for a large variation in flow conditions and model attitude parameters. The paper also presents known experimental research programs developed for the specific purpose of validating analytical methods and extending the capability of data-base programs.

  3. METHODS FOR IMPROVING PROCESS CONTROL AND CORRECTION IN FLEXOGRAPHIC PRINTING

    OpenAIRE

    Dorin PIȚIGOI; Emilia BĂLAN

    2016-01-01

    The printability tester is defined as a device for uniformly applying a reproducible amount of ink to a substrate under specified conditions using a motorized control of the ink transfer process specific function. This repeatable print operation characteristic may also facilitate the correlation of the actual printing condition set by the primary process parameters specified by printing process, substrate, ink, screening, printing order, etc. By extending the usage and functionality of the pr...

  4. Modelling and simulation of diffusive processes methods and applications

    CERN Document Server

    Basu, SK

    2014-01-01

    This book addresses the key issues in the modeling and simulation of diffusive processes from a wide spectrum of different applications across a broad range of disciplines. Features: discusses diffusion and molecular transport in living cells and suspended sediment in open channels; examines the modeling of peristaltic transport of nanofluids, and isotachophoretic separation of ionic samples in microfluidics; reviews thermal characterization of non-homogeneous media and scale-dependent porous dispersion resulting from velocity fluctuations; describes the modeling of nitrogen fate and transport
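As a concrete toy example of the diffusive processes the book models (my own illustration, not taken from the book), a one-dimensional diffusion equation du/dt = D d²u/dx² can be stepped with the explicit FTCS scheme, which is stable for D*dt/dx² <= 1/2.

```python
import numpy as np

def diffuse_1d(u0, diffusivity, dx, dt, n_steps):
    """Explicit FTCS integration of du/dt = D d2u/dx2 with fixed-value ends."""
    alpha = diffusivity * dt / dx ** 2
    assert alpha <= 0.5, "explicit scheme unstable for D*dt/dx^2 > 1/2"
    u = np.asarray(u0, dtype=float).copy()
    for _ in range(n_steps):
        u[1:-1] += alpha * (u[2:] - 2.0 * u[1:-1] + u[:-2])
    return u

# Initial concentration spike in the middle of a 1 m domain
u0 = np.zeros(101)
u0[50] = 1.0
profile = diffuse_1d(u0, diffusivity=1e-4, dx=0.01, dt=0.4, n_steps=500)
```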

  5. From "models" to "reality", and Return. Some Reflections on the Interaction Between Survey and Interpretative Methods for Built Heritage Conservation

    Science.gov (United States)

    Ottoni, F.; Freddi, F.; Zerbi, A.

    2017-05-01

    It is well known that increasingly accurate methodologies and automatic tools are now available in the field of geometric survey and image processing, and that they constitute a fundamental instrument for the knowledge and preservation of cultural heritage; on the other side, ever more refined and precise numerical models are continuously improved and used to simulate the mechanical behaviour of masonry structures. Both sets of instruments and technologies are important parts of a global process of knowledge that underlies any conservation project for cultural heritage. Despite the high accuracy and level of automation reached by both technologies and programs, the transfer of data between them is not an easy task, and defining the most reliable way to translate and exchange information without data loss is still an open issue. The goal of the present paper is to analyse the complex process of translation from the very precise (and sometimes redundant) information obtainable with modern survey methodologies for historic buildings (such as laser scanning) into the very simplified (perhaps too simplified) schemes used to understand their real structural behaviour, with the final aim of contributing to the discussion on reliable methods for improving cultural heritage knowledge through empiricism.

  6. On Intelligent Design and Planning Method of Process Route Based on Gun Breech Machining Process

    Science.gov (United States)

    Hongzhi, Zhao; Jian, Zhang

    2018-03-01

    The paper presents an approach to the intelligent design and planning of process routes based on the gun breech machining process, addressing several problems such as the complexity of gun breech machining, the tedium of route design, and the long lead time of traditional, hard-to-manage process routes. Based on the gun breech machining process, an intelligent process route design and planning system is developed using DEST and VC++. The system includes two functional modules: intelligent process route design and process route planning. The intelligent design module analyses the gun breech machining process and summarizes breech process knowledge to build the knowledge base and inference engine, from which the gun breech process route is output automatically. Building on the intelligent design module, the final process route is created, edited and managed in the process route planning module.

  7. National Survey on Access, Use and Promotion of Rational Use of Medicines (PNAUM): household survey component methods.

    Science.gov (United States)

    Mengue, Sotero Serrate; Bertoldi, Andréa Dâmaso; Boing, Alexandra Crispim; Tavares, Noemia Urruth Leão; Pizzol, Tatiane da Silva Dal; Oliveira, Maria Auxiliadora; Arrais, Paulo Sérgio Dourado; Ramos, Luiz Roberto; Farias, Mareni Rocha; Luiza, Vera Lucia; Bernal, Regina Tomie Ivata; Barros, Aluísio Jardim Dornellas de

    2016-12-01

    To describe methodological aspects of the household survey National Survey on Access, Use and Promotion of Rational Use of Medicines (PNAUM) related to sampling design and implementation, the actual obtained sample, instruments and fieldwork. A cross-sectional, population-based study with probability sampling in three stages of the population living in households located in Brazilian urban areas. Fieldwork was carried out between September 2013 and February 2014. The data collection instrument included questions related to: information about households, residents and respondents; chronic diseases and medicines used; use of health services; acute diseases and events treated with drugs; use of contraceptives; use of pharmacy services; behaviors that may affect drug use; package inserts and packaging; lifestyle and health insurance. In total, 41,433 interviews were carried out in 20,404 households and 576 urban clusters corresponding to 586 census tracts distributed in the five Brazilian regions, according to eight domains defined by age and gender. The results of the survey may be used as a baseline for future studies aiming to assess the impact of government action on drug access and use. For local studies using a compatible method, PNAUM may serve as a reference point to evaluate variations in space and population. With a comprehensive evaluation of drug-related aspects, PNAUM is a major source of data for a variety of analyses to be carried out both at academic and government level.

  8. Hyperspectral image processing methods

    Science.gov (United States)

    Hyperspectral image processing refers to the use of computer algorithms to extract, store and manipulate both spatial and spectral information contained in hyperspectral images across the visible and near-infrared portion of the electromagnetic spectrum. A typical hyperspectral image processing work...

  9. Remote sensing models and methods for image processing

    CERN Document Server

    Schowengerdt, Robert A

    1997-01-01

    This book is a completely updated, greatly expanded version of the previously successful volume by the author. The Second Edition includes new results and data, and discusses a unified framework and rationale for designing and evaluating image processing algorithms.Written from the viewpoint that image processing supports remote sensing science, this book describes physical models for remote sensing phenomenology and sensors and how they contribute to models for remote-sensing data. The text then presents image processing techniques and interprets them in terms of these models. Spectral, s

  10. A Critical Evaluation and Framework of Business Process Improvement Methods

    NARCIS (Netherlands)

    Vanwersch, R.J.B.; Shahzad, K.; Vanderfeesten, I.; Vanhaecht, K.; Grefen, P.; Pintelon, L.M.; Mendling, J.; van Merode, G.G.; Reijers, H.A.

    2016-01-01

    The redesign of business processes has a huge potential in terms of reducing costs and throughput times, as well as improving customer satisfaction. Despite rapid developments in the business process management discipline during the last decade, a comprehensive overview of the options to

  11. Y-STR frequency surveying method

    DEFF Research Database (Denmark)

    Willuweit, Sascha; Caliebe, Amke; Andersen, Mikkel Meyer

    2011-01-01

    Reasonable formalized methods to estimate the frequencies of DNA profiles generated from lineage markers have been proposed in the past years and were discussed in the forensic community. Recently, collections of population data on the frequencies of variations in Y chromosomal STR profiles have reached a new quality with the establishment of the comprehensive, neatly quality-controlled reference database YHRD. Grounded on such unrivalled empirical material from hundreds of population studies, the core assumption of the Haplotype Frequency Surveying Method originally described 10 years ago can be tested and improved. Here we provide new approaches to calculate the parameters used in the frequency surveying method: a maximum likelihood estimation of the regression parameters (r1, r2, s1 and s2) and a revised Frequency Surveying framework with variable binning and a database preprocessing to take...

  12. Volcanic Processes, and Possible Precursors of Eruptions at Etna and Stromboli Volcanoes Revealed by Thermal Surveys

    Science.gov (United States)

    Calvari, S.

    2007-05-01

    Thermal imaging has recently been introduced in volcanology to analyze a number of different volcanic processes. This technique allows us to detect magma movements within the summit conduits of volcanoes, and thus to reveal volcanic activity within the craters even through the thick curtain of gases usually released by active volcanoes such as Mt Etna and Stromboli. Thermal mapping is essential during effusive eruptions, since it distinguishes lava flows of different ages and the paths of concealed lava tubes, improving hazard evaluation. Recently, thermal imaging has also been applied to reveal failure planes and instability on the flanks of active volcanoes. Excellent results were obtained in terms of volcanic prediction during the eruptions of Mt Etna and Stromboli that occurred in 2002-2003. On Etna, thermal images recorded monthly on the summit of the volcano revealed the opening of fissure systems several months in advance. At Stromboli, helicopter-borne thermal surveys allowed us to recognize the opening of fractures one hour before the large failure that caused severe destruction on the island on 30 December 2002. The INGV - Sezione di Catania started in 2001 to monitor active volcanoes using a hand-held thermal camera. This instrument was used in the field and from a helicopter to detect any thermal anomaly on the surface of active volcanoes, and has since been applied to a number of eruptions and eruptive processes. After the two major eruptions at Etna and Stromboli, fixed thermal cameras have been installed on Stromboli, Etna and Vulcano, allowing us to monitor eruptive activity, flank stability and ash emission. On Etna, we have monitored the 2002-03, 2004-05, July 2006 and August-December 2006 eruptions. On Stromboli, thermal surveys from helicopter allowed us to follow the propagation of ephemeral vents and thus the path of hidden lava tubes, as well as the stages of inflation and deflation of the upper lava flow field. Thermal cameras have

  13. Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method, 4th Edition

    Science.gov (United States)

    Dillman, Don A.; Smyth, Jolene D.; Christian, Lean Melani

    2014-01-01

    For over two decades, Dillman's classic text on survey design has aided both students and professionals in effectively planning and conducting mail, telephone, and, more recently, Internet surveys. The new edition is thoroughly updated and revised, and covers all aspects of survey research. It features expanded coverage of mobile phones, tablets,…

  14. Photometric redshifts for the next generation of deep radio continuum surveys - II. Gaussian processes and hybrid estimates

    Science.gov (United States)

    Duncan, Kenneth J.; Jarvis, Matt J.; Brown, Michael J. I.; Röttgering, Huub J. A.

    2018-04-01

    Building on the first paper in this series (Duncan et al. 2018), we present a study investigating the performance of Gaussian process photometric redshift (photo-z) estimates for galaxies and active galactic nuclei detected in deep radio continuum surveys. A Gaussian process redshift code is used to produce photo-z estimates targeting specific subsets of both the AGN population - infrared, X-ray and optically selected AGN - and the general galaxy population. The new estimates for the AGN population are found to perform significantly better at z > 1 than the template-based photo-z estimates presented in our previous study. Our new photo-z estimates are then combined with template estimates through hierarchical Bayesian combination to produce a hybrid consensus estimate that outperforms both of the individual methods across all source types. Photo-z estimates for radio sources that are X-ray sources or optical/IR AGN are significantly improved in comparison to previous template-only estimates - with outlier fractions and robust scatter reduced by up to a factor of ˜4. The ability of our method to combine the strengths of the two input photo-z techniques and the large improvements we observe illustrate its potential for enabling future exploitation of deep radio continuum surveys for both the study of galaxy and black hole co-evolution and for cosmological studies.
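A minimal sketch of the Gaussian process regression idea (generic scikit-learn, not the GPz-style code used in the paper): train on photometric features of sources with known spectroscopic redshifts and predict both a photo-z and its uncertainty for new sources; all arrays here are placeholders.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
# Placeholder training set: 500 sources, 5 photometric features (e.g. colours),
# with known spectroscopic redshifts z_spec
X_train = rng.normal(size=(500, 5))
z_spec = np.abs(rng.normal(1.0, 0.5, size=500))

gp = GaussianProcessRegressor(
    kernel=1.0 * RBF(length_scale=np.ones(5)) + WhiteKernel(noise_level=0.05),
    normalize_y=True,
)
gp.fit(X_train, z_spec)

# Photo-z point estimate and 1-sigma uncertainty for new photometry
X_new = rng.normal(size=(10, 5))
z_phot, z_sigma = gp.predict(X_new, return_std=True)
```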

  15. Methods and compositions for controlling gene expression by RNA processing

    Science.gov (United States)

    Doudna, Jennifer A.; Qi, Lei S.; Haurwitz, Rachel E.; Arkin, Adam P.

    2017-08-29

    The present disclosure provides nucleic acids encoding an RNA recognition sequence positioned proximal to an insertion site for the insertion of a sequence of interest; and host cells genetically modified with the nucleic acids. The present disclosure also provides methods of modifying the activity of a target RNA, and kits and compositions for carrying out the methods.

  16. COLLABORATIVE AND PARTICIPATORY PLANNING PROCESSES AND METHODS FOR LOCAL DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Mauricio Hernandez Bonilla

    2009-11-01

    Full Text Available In 2006, The University of Veracruz (UV, the State Government of Veracruz and the United Nations Program for Human Settlements (UN/Habitat signed an agreement to promote the urban and territorial development of the Veracruz State. This event is the result of The University of Veracruz’s policies on the production of knowledge for the improvement of social and economic conditions in Veracruz through the involvement of the University organization in the real problems of different groups within society. Under this agreement, the University has made a commitment to promote sustainable development in the main regions of Veracruz, firstly through the implementation of strategic regional planning using inclusive and participatory methods; and secondly, through the strengthening of state and local authorities’ capacities for the development and implementation of urban and regional policies that have an effective and integral impact on the social, environmental and economic dimensions of cities. The purpose of this paper is to present the participatory exercises conducted by the multidisciplinary academic group of The University of Veracruz, under UN/Habitat-UVVeracruz State Government agreement following the UN methodology to promote Local Economic and Territorial Development. This paper explains these participatory planning experiences, methods and the results in the context of the central urban region of Veracruz State (made up of 15 municipalities.

  17. Report on evaluation/selection surveys on coal species, processes and others; Tanshu process nado hyoka sentei chosa hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1981-03-01

    This program analyzes applicable coal species, centered on Australia's Victoria brown coal and Chinese coal, which are promising alternative fuel sources for Japan in terms of reserves, prices, availability and suitability for liquefaction, in order to clarify possible problems and commercialize liquefaction techniques at an early stage. The report consists of 6 chapters. Chapter 1 describes the development situation of brown coal, specifically Australia's Victoria brown coal and Chinese coal. Chapter 2 describes characteristics of the reactions involved in brown coal liquefaction. Chapter 3 describes the current status of various liquefaction processes (solvolysis, solvent extraction, direct hydrogenation and C-SRC) under development in Japan, and the problems involved in their future development. Chapter 4 describes the current status of the elementary techniques, e.g., those for slurry pretreatment (dehydration and crushing), solid/liquid separation, secondary hydrogenation, product upgrading and gasification. Chapter 5 describes related techniques, and Chapter 6 discusses the results of demonstration surveys of de-ashing, primary/secondary hydrogenation, and dehydration of brown coal. (NEDO)

  18. Survey and analysis of deep water mineral deposits using nuclear methods

    International Nuclear Information System (INIS)

    Staehle, C.M.; Noakes, J.E.; Spaulding, J.

    1991-01-01

    Present knowledge of the location, quality, quantity and recoverability of sea floor minerals is severely limited, particularly in the abyssal depths and deep water within the 200-mile Exclusive Economic Zone (EEZ) surrounding the U.S. Pacific Islands. To improve this understanding and permit exploitation of these mineral reserves, much additional data is needed. This paper will discuss a sponsored program for extending existing proven nuclear survey methods, currently used on the shallow continental margins of the Atlantic and Gulf of Mexico, into the deeper waters of the Pacific. This nuclear technology can be readily integrated and extended to depths of 2000 m using the existing RCV-150 remotely operated vehicle (ROV) and the PISCES V manned deep submersible vehicle (DSV) operated by The University of Hawaii's Hawaii Underseas Research Laboratory (HURL). Previous papers by the authors have also proposed incorporating these nuclear analytical methods for survey of the deep ocean through the use of an Autonomous Underwater Vehicle (AUV). Such a vehicle could extend the use of passive nuclear instrument operation, in addition to conventional analytical methods, into the abyssal depths, and do so with speed and economy not otherwise possible. The natural radioactivity associated with manganese nodules and crustal deposits is sufficiently above normal background levels to allow discrimination and quantification in near real time

  19. Application of QMC methods to PDEs with random coefficients : a survey of analysis and implementation

    KAUST Repository

    Kuo, Frances

    2016-01-05

    In this talk I will provide a survey of recent research efforts on the application of quasi-Monte Carlo (QMC) methods to PDEs with random coefficients. Such PDE problems occur in the area of uncertainty quantification. In recent years many papers have been written on this topic using a variety of methods. QMC methods are relatively new to this application area. I will consider different models for the randomness (uniform versus lognormal) and contrast different QMC algorithms (single-level versus multilevel, first order versus higher order, deterministic versus randomized). I will give a summary of the QMC error analysis and proof techniques in a unified view, and provide a practical guide to the software for constructing QMC points tailored to the PDE problems.
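A minimal sketch of a randomly shifted rank-1 lattice rule of the kind the QMC analyses consider (my own illustration; the generating vector below is arbitrary, not an optimized one from the lattice-rule literature): the random shifts give an unbiased estimate of the expectation and a simple error estimate from the spread across shifts.

```python
import numpy as np

def shifted_lattice_estimate(f, gen_vector, n_points, n_shifts=16, seed=0):
    """Estimate E[f(U)] for U ~ Uniform([0,1]^d) with a randomly shifted lattice rule."""
    rng = np.random.default_rng(seed)
    z = np.asarray(gen_vector)
    d = z.size
    i = np.arange(n_points)[:, None]
    base = (i * z[None, :] / n_points) % 1.0      # rank-1 lattice points in [0,1)^d
    means = []
    for _ in range(n_shifts):
        shift = rng.random(d)
        pts = (base + shift) % 1.0                # randomly shifted copy of the lattice
        means.append(np.mean(f(pts)))
    means = np.asarray(means)
    return means.mean(), means.std(ddof=1) / np.sqrt(n_shifts)

# Toy integrand standing in for a PDE quantity of interest
f = lambda x: np.exp(np.sum(0.1 * x, axis=1))
est, err = shifted_lattice_estimate(f, gen_vector=[1, 182667, 469891, 498753], n_points=2 ** 20)
```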

  20. Method and system for processing optical elements using magnetorheological finishing

    Science.gov (United States)

    Menapace, Joseph Arthur; Schaffers, Kathleen Irene; Bayramian, Andrew James; Molander, William A

    2012-09-18

    A method of finishing an optical element includes mounting the optical element in an optical mount having a plurality of fiducials overlapping with the optical element and obtaining a first metrology map for the optical element and the plurality of fiducials. The method also includes obtaining a second metrology map for the optical element without the plurality of fiducials, forming a difference map between the first metrology map and the second metrology map, and aligning the first metrology map and the second metrology map. The method further includes placing mathematical fiducials onto the second metrology map using the difference map to form a third metrology map and associating the third metrology map to the optical element. Moreover, the method includes mounting the optical element in the fixture in an MRF tool, positioning the optical element in the fixture; removing the plurality of fiducials, and finishing the optical element.

  1. Statistical problems raised by data processing of food surveys

    International Nuclear Information System (INIS)

    Lacourly, Nancy

    1974-01-01

    The methods used for the analysis of dietary habits of national populations - food surveys - have been studied. S. Lederman's linear model for estimating average individual consumption from total family diets was assessed in the light of a food survey carried out on 250 Roman families in 1969. An important bias in the estimates thus obtained was revealed by a simulation assuming a 'housewife's dictatorship'; these assumptions should contribute to setting up an unbiased model. Several techniques of multidimensional analysis were therefore used, and the theoretical aspects of linear regression had to be investigated for some particular situations: quasi-collinear 'independent variables', measurements with errors, and positive constraints on regression coefficients. A new survey methodology was developed taking account of the new 'Integrated Information Systems', which have an impact on all the stages of a consumption survey: organization, data collection, constitution of an information bank and data processing. (author) [fr
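
    The "positive constraints on regression coefficients" mentioned above can be illustrated with a non-negative least-squares fit of per-class individual intakes to family totals; the design matrix and intake values below are simulated placeholders, not the Roman survey data:

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical design: each row is one family, each column the number of
# household members in an age/sex class; `totals` is the family's total
# intake of some nutrient. The fitted coefficients are the average
# individual intakes per class, constrained to be non-negative.
rng = np.random.default_rng(1)
composition = rng.integers(0, 4, size=(250, 5)).astype(float)  # 250 families, 5 classes
true_intake = np.array([55.0, 70.0, 95.0, 60.0, 40.0])
totals = composition @ true_intake + rng.normal(0, 10, size=250)

estimated_intake, residual_norm = nnls(composition, totals)
print(estimated_intake)
```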

  2. A survey on processing and marketing characteristics of peri-urban ...

    African Journals Online (AJOL)

    A survey was conducted in five districts on the Accra plains to characterize the peri-urban dairy production system. Results from the survey indicated that farmers were keeping cattle, sheep, goats, and poultry (Guinea fowl, turkey, chicken, and duck), cattle being the only species milked. The mean flock size was 73.6 TLU ...

  3. Method and apparatus for monitoring plasma processing operations

    Science.gov (United States)

    Smith, Jr., Michael Lane; Ward, Pamela Denise Peardon; Stevenson, Joel O'Don

    2002-01-01

    The invention generally relates to various aspects of a plasma process, and more specifically to the monitoring of such plasma processes. One aspect relates in at least some manner to calibrating or initializing a plasma monitoring assembly. This type of calibration may be used to address wavelength shifts, intensity shifts, or both associated with optical emissions data obtained on a plasma process. A calibration light may be directed at a window through which optical emissions data is being obtained to determine the effect, if any, that the inner surface of the window is having on the optical emissions data being obtained therethrough, the operation of the optical emissions data gathering device, or both. Another aspect relates in at least some manner to various types of evaluations which may be undertaken of a plasma process which was run, and more typically one which is currently being run, within the processing chamber. Plasma health evaluations and process identification through optical emissions analysis are included in this aspect. Yet another aspect associated with the present invention relates in at least some manner to the endpoint of a plasma process (e.g., plasma recipe, plasma clean, conditioning wafer operation) or discrete/discernible portion thereof (e.g., a plasma step of a multiple step plasma recipe). Another aspect associated with the present invention relates to how one or more of the above-noted aspects may be implemented into a semiconductor fabrication facility, such as the distribution of wafers to a wafer production system. A final aspect of the present invention relates to a network of a plurality of plasma monitoring systems, including remote capabilities (i.e., outside of the clean room).

  4. Mobile acoustic transects miss rare bat species: implications of survey method and spatio-temporal sampling for monitoring bats

    Directory of Open Access Journals (Sweden)

    Elizabeth C. Braun de Torrez

    2017-11-01

    Full Text Available Due to increasing threats facing bats, long-term monitoring protocols are needed to inform conservation strategies. Effective monitoring should be easily repeatable while capturing spatio-temporal variation. Mobile acoustic driving transect surveys (‘mobile transects’) have been touted as a robust, cost-effective method to monitor bats; however, it is not clear how well mobile transects represent dynamic bat communities, especially when used as the sole survey approach. To assist biologists who must select a single survey method due to resource limitations, we assessed the effectiveness of three acoustic survey methods at detecting species richness in a vast protected area (Everglades National Park): (1) mobile transects, (2) stationary surveys that were strategically located by sources of open water and (3) stationary surveys that were replicated spatially across the landscape. We found that mobile transects underrepresented bat species richness compared to stationary surveys across all major vegetation communities and in two distinct seasons (dry/cool and wet/warm). Most critically, mobile transects failed to detect three rare bat species, one of which is federally endangered. Spatially replicated stationary surveys did not estimate higher species richness than strategically located stationary surveys, but increased the rate at which species were detected in one vegetation community. The survey strategy that detected maximum species richness and the highest mean nightly species richness with minimal effort was a strategically located stationary detector in each of two major vegetation communities during the wet/warm season.

  5. Mobile acoustic transects miss rare bat species: implications of survey method and spatio-temporal sampling for monitoring bats.

    Science.gov (United States)

    Braun de Torrez, Elizabeth C; Wallrichs, Megan A; Ober, Holly K; McCleery, Robert A

    2017-01-01

    Due to increasing threats facing bats, long-term monitoring protocols are needed to inform conservation strategies. Effective monitoring should be easily repeatable while capturing spatio-temporal variation. Mobile acoustic driving transect surveys ('mobile transects') have been touted as a robust, cost-effective method to monitor bats; however, it is not clear how well mobile transects represent dynamic bat communities, especially when used as the sole survey approach. To assist biologists who must select a single survey method due to resource limitations, we assessed the effectiveness of three acoustic survey methods at detecting species richness in a vast protected area (Everglades National Park): (1) mobile transects, (2) stationary surveys that were strategically located by sources of open water and (3) stationary surveys that were replicated spatially across the landscape. We found that mobile transects underrepresented bat species richness compared to stationary surveys across all major vegetation communities and in two distinct seasons (dry/cool and wet/warm). Most critically, mobile transects failed to detect three rare bat species, one of which is federally endangered. Spatially replicated stationary surveys did not estimate higher species richness than strategically located stationary surveys, but increased the rate at which species were detected in one vegetation community. The survey strategy that detected maximum species richness and the highest mean nightly species richness with minimal effort was a strategically located stationary detector in each of two major vegetation communities during the wet/warm season.

  6. Laser Scanning in Engineering Surveying: Methods of Measurement and Modeling of Structures

    Directory of Open Access Journals (Sweden)

    Lenda Grzegorz

    2016-06-01

    Full Text Available The study is devoted to the uses of laser scanning in the field of engineering surveying. It is currently one of the main trends of research developed at the Department of Engineering Surveying and Civil Engineering at the Faculty of Mining Surveying and Environmental Engineering of AGH University of Science and Technology in Krakow. The applications mainly relate to issues associated with tower and shell structures, the infrastructure of rail routes, and the development of digital elevation models for a wide range of uses. These issues often require the use of a variety of scanning techniques (stationary, mobile), but the differences also concern the planning of measurement stations and the methods of merging point clouds. Significant differences appear during the analysis of point clouds, especially when modeling objects. Analysis of selected parameters is already possible based on ad hoc measurements carried out on a point cloud. However, only the construction of three-dimensional models provides complete information about the shape of structures, allows the analysis to be performed at any location and reduces the amount of stored data. Some structures can be modeled in the form of simple axes, sections, or solids; for others it becomes necessary to create sophisticated surface models depicting local deformations. The examples selected for the study allow an assessment of the scope of measurement and office work for a variety of uses related to the issue set forth in the title of this study. Additionally, the latest, forward-looking technology is presented: laser scanning performed from Unmanned Aerial Vehicles (drones). Currently it is basically in the prototype phase, but it may be expected to make significant progress in numerous applications in the field of engineering surveying.

  7. Effect of processing methods on the nutritional values and anti ...

    African Journals Online (AJOL)

    This research aimed at determining the effect of processing on the nutritional and anti-nutritional values of "food tree" [Adenanthera pavonina L. (Fabaceae)] seeds, a highly nutritious and underutilized legume. The seeds were separated into three groups, namely boiled, roasted and raw. Quantitative analysis was carried ...

  8. Processing and discarding method for contaminated concrete wastes

    International Nuclear Information System (INIS)

    Yamamoto, Kazuo; Konishi, Masao; Matsuda, Atsuo; Iwamoto, Yoshiaki; Yoshikane, Toru; Koie, Toshio; Nakajima, Yoshiro

    1998-01-01

    Contaminated concrete wastes are crushed into granular concrete wastes having a successive grain size distribution. They are filled into a contamination processing vessel and made hardenable in the presence of a water-hardenable material in the granular concrete wastes. When underground water intrudes into the contamination processing vessel filled with the granular concrete wastes during long-term storage, the underground water reacts with the water-hardenable material, which provides the solidification effect. Accordingly, leaching of contaminated materials due to intrusion of underground water can be suppressed. Since the concrete wastes have a successive grain size distribution, coarse grains can be used as coarse aggregates, medium grains as fine aggregates and fine grains as a solidifying material. Accordingly, the amount of wastes after processing can be remarkably reduced, with no supply of a solidifying material from outside. (T.M.)

  9. Technical Evaluation of Sample-Processing, Collection, and Preservation Methods

    Science.gov (United States)

    2014-07-01

    policy document entitled The National Strategy for Biosurveillance was released (White House, July 2012) as part of the National Security Strategy...concept of leveraging existing capabilities to “scan and discern the environment,” which implies the use of current technical biosurveillance ...testing of existing sample-processing technologies are expected to enable in silico evaluations of biosurveillance methodologies, equipment, and

  10. Profile and Instrumentation Driven Methods for Embedded Signal Processing

    Science.gov (United States)

    2015-01-01

    dataflow model match cannot be found, a less efficient, generic scheduler and more conservative memory allocation may need to be employed. Economic ...Processing, Taipei, Taiwan, April 2009, pp. 565-568. [75] Shuvra S. Bhattacharyya et al., "Heterogeneous concurrent modeling and design in java, volume 1

  11. Validation Process Methods

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, John E. [National Renewable Energy Lab. (NREL), Golden, CO (United States); English, Christine M. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Gesick, Joshua C. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mukkamala, Saikrishna [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2018-01-04

    This report documents the validation process as applied to projects awarded through Funding Opportunity Announcements (FOAs) within the U.S. Department of Energy Bioenergy Technologies Office (DOE-BETO). It describes the procedures used to protect and verify project data, as well as the systematic framework used to evaluate and track performance metrics throughout the life of the project. This report also describes the procedures used to validate the proposed process design, cost data, analysis methodologies, and supporting documentation provided by the recipients.

  12. Grasping devices and methods in automated production processes

    DEFF Research Database (Denmark)

    Fantoni, Gualtiero; Santochi, Marco; Dini, Gino

    2014-01-01

    assembly to disassembly, from aerospace to food industry, from textile to logistics) are discussed. Finally, the most recent research is reviewed in order to introduce the new trends in grasping. They provide an outlook on the future of both grippers and robotic hands in automated production processes.

  13. Survey of Nuclear Methods in Chemical Technology

    International Nuclear Information System (INIS)

    Broda, E.

    1966-01-01

    An attempt is made to classify nuclear methods on a logical basis to facilitate assimilation by the technologist. The three main groups are: (I) Tracer methods, (II) Methods based on the influence of absorbers on radiations to be measured, and (III) Radiation chemical methods. The variants of the first two groups are discussed in some detail, and typical examples are given. Group I can be subdivided into (1) Indicator methods, (2) Emanation methods, (3) Radioreagent methods, and (4) Isotope dilution methods, Group II into (5) Activation methods, (6) Absorption methods, (7) Induced Nuclear Reaction methods, (8) Scattering methods, and (9) Fluorescence methods. While the economic benefits due to nuclear methods already run into hundreds of millions of dollars annually, owing to radiation protection problems radiochemical methods in the strict sense are not widely used in actual production. It is suggested that more use should be made of pilot plant tracer studies of chemical processes as used in industry. (author)

  14. Method and apparatus for processing radioactive laundry waste

    International Nuclear Information System (INIS)

    Shirai, Takamori; Suzuki, Takeo; Inami, Ichiro

    1979-01-01

    Purpose: To improve processing efficiency by treating radioactive laundry wastes, first removing solid components and then eliminating the ionic components contained therein. Constitution: Stored liquid wastes are sent to a cyclone to remove particles of greater specific gravity, and sludges are accumulated in a sludge reservoir. The liquid wastes are sent to a self-cleaning strainer to remove thread dust, hairs and the like, and then sent to a storage tank. The liquid wastes are further sent to a self-cleaning centrifugal settler. Fine crud separated there is stored in a sludge reservoir, and the cleaned liquid wastes are sent to an ultrafilter and an ion exchange resin column to remove radioactive components. The concentrated liquid wastes from the ultrafilter are dried in a drier and the resin wastes from the ion exchange column are subjected to solidification. Sludges in the sludge reservoirs are solidified in a solidifier and then packed in drum cans. (Kawakami, Y.)

  15. Improvement of high resolution borehole seismics. Part 1: Development of processing methods for VSP surveys. Part 2: Piezoelectric signal transmitter for seismic measurements

    International Nuclear Information System (INIS)

    Cosma, C.; Heikkinen, P.; Pekonen, S.

    1991-05-01

    The purpose of the high resolution borehole seismics project has been to improve the reliability and resolution of seismic methods in the particular environment of nuclear waste repository sites. The results obtained, especially the data processing and interpretation methods developed, are applicable also to other geophysical methods (e.g. Georadar). The goals of the seismic development project have been: the development of processing and interpretation techniques for mapping fractured zones, and the design and construction of a seismic source complying with the requirements of repository site characterization programs. Because these two aspects of the work are very different in nature, we have structured the report as two self-contained parts. Part 1 describes the development of interpretive techniques. To demonstrate the effect of the different methods we have used a VSP data set collected at the SCV site during Stage 1 of the project. Five techniques have been studied: FK-filtering, three versions of Tau-p filtering and a new technique that we have developed lately, Image Space filtering. Part 2 refers to the construction of the piezoelectric source. Earlier results obtained over short distances with low energy piezoelectric transmitters led us to believe that the same principle could be applied to seismic signal transmitters, if solutions for higher energy and lower frequency output were found. The instrument which we have constructed is a cylindrical unit which can be placed in a borehole and is able to produce a radial strain when excited axially. The minimum borehole diameter is 56 mm. (au)
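
    As an illustrative sketch (not the authors' implementation), an FK fan filter of the kind mentioned in Part 1 can be expressed as a mask applied in the 2-D Fourier domain of a VSP gather; all numbers below are hypothetical:

```python
import numpy as np

def fk_filter(data, dt, dx, v_min):
    """Very small frequency-wavenumber (FK) fan filter sketch.

    data  : 2-D array, shape (n_time_samples, n_traces)
    dt    : sample interval in seconds
    dx    : trace spacing in metres
    v_min : reject energy with apparent velocity below v_min (m/s)
    """
    nt, nx = data.shape
    spec = np.fft.fft2(data)
    f = np.fft.fftfreq(nt, d=dt)[:, None]   # temporal frequencies (Hz)
    k = np.fft.fftfreq(nx, d=dx)[None, :]   # spatial wavenumbers (1/m)
    # keep only components whose apparent velocity |f/k| exceeds v_min
    with np.errstate(divide="ignore", invalid="ignore"):
        v_app = np.abs(f) / np.abs(k)
    mask = (np.abs(k) < 1e-12) | (v_app >= v_min)
    return np.real(np.fft.ifft2(spec * mask))

# Hypothetical VSP gather: 1 ms sampling, 5 m receiver spacing,
# muting slow (tube-wave-like) arrivals below 1500 m/s.
gather = np.random.default_rng(0).normal(size=(1024, 48))
filtered = fk_filter(gather, dt=0.001, dx=5.0, v_min=1500.0)
```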

  16. Method and apparatus for semi-solid material processing

    Science.gov (United States)

    Han, Qingyou [Knoxville, TN; Jian, Xiaogang [Knoxville, TN; Xu, Hanbing [Knoxville, TN; Meek, Thomas T [Knoxville, TN

    2009-02-24

    A method of forming a material includes the steps of: vibrating a molten material at an ultrasonic frequency while cooling the material to a semi-solid state to form non-dendritic grains therein; forming the semi-solid material into a desired shape; and cooling the material to a solid state. The method makes semi-solid castings directly from molten materials (usually a metal), produces grain sizes usually smaller than 50 µm, and can be easily retrofitted into existing conventional forming machines.

  17. Ethnic Differences in the Quality of the Interview Process and Implications for Survey Analysis: The Case of Indigenous Australians.

    Directory of Open Access Journals (Sweden)

    Francisco Perales

    Full Text Available Comparable survey data on Indigenous and non-Indigenous Australians are highly sought after by policymakers to inform policies aimed at closing ethnic socio-economic gaps. However, collection of such data is compromised by group differences in socio-economic status and cultural norms. We use data from the Household, Income and Labour Dynamics in Australia Survey and multiple-membership multilevel regression models that allow for individual and interviewer effects to examine differences between Indigenous and non-Indigenous Australians in approximate measures of the quality of the interview process. We find that there are both direct and indirect ethnic effects on different dimensions of interview process quality, with Indigenous Australians faring worse than non-Indigenous Australians in all outcomes ceteris paribus. This indicates that nationwide surveys must feature interview protocols that are sensitive to the needs and culture of Indigenous respondents to improve the quality of the survey information gathered from this subpopulation.

  18. Strategic Planning, Implementation, and Evaluation Processes in Hospital Systems: A Survey From Iran

    OpenAIRE

    Sadeghifar, Jamil; Jafari, Mehdi; Tofighi, Shahram; Ravaghi, Hamid; Maleki, Mohammad Reza

    2014-01-01

    Aim & Background: Strategic planning has been presented as an important management practice. However, evidence of its deployment in healthcare systems in low-income and middle-income countries (LMICs) is limited. This study investigated the strategic management process in Iranian hospitals. Methods: The present study was accomplished in 24 teaching hospitals in Tehran, Iran from September 2012 to March 2013. The data collection instrument was a questionnaire including 130 items. This question...

  19. Geostatistical methods for rock mass quality prediction using borehole and geophysical survey data

    Science.gov (United States)

    Chen, J.; Rubin, Y.; Sege, J. E.; Li, X.; Hehua, Z.

    2015-12-01

    For long, deep tunnels, the number of geotechnical borehole investigations during the preconstruction stage is generally limited. Yet tunnels are often constructed in geological structures with complex geometries, and in which the rock mass is fragmented from past structural deformations. Tunnel Geology Prediction (TGP) is a geophysical technique widely used during tunnel construction in China to ensure safety during construction and to prevent geological disasters. In this paper, geostatistical techniques were applied in order to integrate seismic velocity from TGP and borehole information into spatial predictions of RMR (Rock Mass Rating) in unexcavated areas. This approach is intended to apply conditional probability methods to transform seismic velocities to directly observed RMR values. The initial spatial distribution of RMR, inferred from the boreholes, was updated by including geophysical survey data in a co-kriging approach. The method applied to a real tunnel project shows significant improvements in rock mass quality predictions after including geophysical survey data, leading to better decision-making for construction safety design.
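
    A minimal sketch of the geostatistical idea, assuming a simplified 1-D setting along the tunnel chainage and omitting the seismic-velocity secondary variable that the paper's co-kriging uses, could look like this (all values are hypothetical):

```python
import numpy as np

def ordinary_kriging(x_obs, y_obs, x_new, sill=1.0, rng_param=50.0, nugget=1e-6):
    """Minimal 1-D ordinary kriging with an exponential covariance model."""
    def cov(h):
        return sill * np.exp(-np.abs(h) / rng_param)

    n = len(x_obs)
    K = cov(x_obs[:, None] - x_obs[None, :]) + nugget * np.eye(n)
    # ordinary kriging system with a Lagrange multiplier for unbiasedness
    A = np.zeros((n + 1, n + 1))
    A[:n, :n] = K
    A[:n, n] = 1.0
    A[n, :n] = 1.0

    preds = np.empty(len(x_new))
    for j, x0 in enumerate(x_new):
        b = np.append(cov(x_obs - x0), 1.0)
        w = np.linalg.solve(A, b)[:n]
        preds[j] = w @ y_obs
    return preds

# Hypothetical borehole RMR values along the tunnel chainage (metres),
# interpolated to unexcavated chainages.
x_obs = np.array([0.0, 120.0, 260.0, 400.0])
rmr_obs = np.array([62.0, 48.0, 55.0, 70.0])
x_new = np.linspace(0.0, 400.0, 9)
print(ordinary_kriging(x_obs, rmr_obs, x_new))
```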

  20. Methods and Conditions for Achieving Continuous Improvement of Processes

    OpenAIRE

    Florica BADEA; Catalina RADU; Ana-Maria GRIGORE

    2010-01-01

    In the early twentieth century, the Taylor model improved the efficiency of production processes in a spectacular manner. It allowed high productivity to be obtained from low-skilled workers employed in large numbers in production. Currently this model is questioned by experts and has been replaced by the concept of "continuous improvement". The first signs of change date from the '80s, with the appearance of quality circles and groups of operators working on quality issues, principles whi...

  1. A survey of analytical methods employed for monitoring of Advanced Oxidation/Reduction Processes for decomposition of selected perfluorinated environmental pollutants.

    Science.gov (United States)

    Trojanowicz, Marek; Bobrowski, Krzysztof; Szostek, Bogdan; Bojanowska-Czajka, Anna; Szreder, Tomasz; Bartoszewicz, Iwona; Kulisa, Krzysztof

    2018-01-15

    The monitoring of Advanced Oxidation/Reduction Processes (AO/RPs) for the evaluation of the yield and mechanisms of decomposition of perfluorinated compounds (PFCs) is often a more difficult task than their determination in environmental, biological or food samples with complex matrices. This is mostly due to the formation of hundreds, or even thousands, of both intermediate and final products. The AO/RPs considered, all involving free radical reactions, include photolytic and photocatalytic processes, Fenton reactions, sonolysis, ozonation, application of ionizing radiation and several wet oxidation processes. The main attention is paid to the PFCs most commonly occurring in the environment, namely PFOA and PFOS. The most powerful and widely exploited method for this purpose is without a doubt LC/MS/MS, which allows the identification and trace quantitation of all species, with detectability and resolution power depending on the particular instrumental configuration. GC/MS is often employed for the monitoring of volatile fluorocarbons, confirming the formation of radicals in the processes of C‒C and C‒S bond cleavage. For the direct monitoring of radicals participating in the reactions of PFC decomposition, molecular spectroscopy is employed, especially electron paramagnetic resonance (EPR). UV/Vis spectrophotometry as a detection method is of special importance in evaluating the kinetics of radical reactions with the use of pulse radiolysis methods. The method most commonly employed for determining the yield of mineralization of PFCs is ion chromatography, but potentiometry with an ion-selective electrode and measurements of general parameters such as Total Organic Carbon and Total Organic Fluoride are also used. The presented review is based on about 100 original papers published in both analytical and environmental journals. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Remote sensing models and methods for image processing

    CERN Document Server

    Schowengerdt, Robert A

    2007-01-01

    Remote sensing is a technology that engages electromagnetic sensors to measure and monitor changes in the earth's surface and atmosphere. Normally this is accomplished through the use of a satellite or aircraft. This book, in its 3rd edition, seamlessly connects the art and science of earth remote sensing with the latest interpretative tools and techniques of computer-aided image processing. Newly expanded and updated, this edition delivers more of the applied scientific theory and practical results that helped the previous editions earn wide acclaim and become classroom and industry standards.

  3. Method and system for nanoscale plasma processing of objects

    Science.gov (United States)

    Oehrlein, Gottlieb S [Clarksville, MD; Hua, Xuefeng [Hyattsville, MD; Stolz, Christian [Baden-Wuerttemberg, DE

    2008-12-30

    A plasma processing system includes a source of plasma, a substrate and a shutter positioned in close proximity to the substrate. The substrate/shutter relative disposition is changed for precise control of substrate/plasma interaction. This way, the substrate interacts only with a fully established, stable plasma for short times required for nanoscale processing of materials. The shutter includes an opening of a predetermined width, and preferably is patterned to form an array of slits with dimensions that are smaller than the Debye screening length. This enables control of the substrate/plasma interaction time while avoiding the ion bombardment of the substrate in an undesirable fashion. The relative disposition between the shutter and the substrate can be made either by moving the shutter or by moving the substrate.
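
    Since the slit dimensions are compared against the Debye screening length, a quick back-of-the-envelope evaluation of that length is sketched below; the plasma parameters are hypothetical, not taken from the patent:

```python
import math

def debye_length(T_e_eV, n_e_m3):
    """Electron Debye screening length in metres.

    lambda_D = sqrt(eps0 * k_B * T_e / (n_e * e^2)); with T_e expressed in eV
    (k_B * T_e = T_e_eV * e joules) this reduces to sqrt(eps0 * T_e_eV / (n_e * e)).
    """
    eps0 = 8.8541878128e-12   # F/m
    e = 1.602176634e-19       # C
    return math.sqrt(eps0 * T_e_eV / (n_e_m3 * e))

# Hypothetical processing-plasma conditions: 3 eV electrons, 1e16 m^-3 density.
print(f"{debye_length(3.0, 1e16) * 1e6:.0f} micrometres")
```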

  4. Processing method and results of meteor shower radar observations

    International Nuclear Information System (INIS)

    Belkovich, O.I.; Suleimanov, N.I.; Tokhtasjev, V.S.

    1987-01-01

    Studies of meteor showers permit the solving of some principal problems of meteor astronomy: to obtain the structure of a stream in cross section and along its orbits; to retrace the evolution of particle orbits of the stream taking into account gravitational and nongravitational forces and to discover the orbital elements of its parent body; to find out the total mass of solid particles ejected from the parent body taking into account physical and chemical evolution of meteor bodies; and to use meteor streams as natural probes for investigation of the average characteristics of the meteor complex in the solar system. A simple and effective method of determining the flux density and mass exponent parameter was worked out. This method and its results are discussed
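
    The "mass exponent parameter" is commonly related to the meteoroid flux through the standard mass-distribution relations below (a common formulation in meteor astronomy, not necessarily the authors' exact notation):

```latex
% Differential and cumulative mass distributions of shower meteoroids;
% s is the mass (distribution) index constrained by radar echo counts.
\[
  \mathrm{d}N \;\propto\; m^{-s}\,\mathrm{d}m ,
  \qquad
  N(\geq m) \;=\; N_0 \left(\frac{m}{m_0}\right)^{1-s},
\]
% where N_0 is the flux density of meteoroids with mass of at least m_0.
```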

  5. Chapter 12: Survey Design and Implementation for Estimating Gross Savings Cross-Cutting Protocol. The Uniform Methods Project: Methods for Determining Energy Efficiency Savings for Specific Measures

    Energy Technology Data Exchange (ETDEWEB)

    Kurnik, Charles W [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Baumgartner, Robert [Tetra Tech, Madison, WI (United States)

    2017-10-05

    This chapter presents an overview of best practices for designing and executing survey research to estimate gross energy savings in energy efficiency evaluations. A detailed description of the specific techniques and strategies for designing questions, implementing a survey, and analyzing and reporting the survey procedures and results is beyond the scope of this chapter. So for each topic covered below, readers are encouraged to consult articles and books cited in References, as well as other sources that cover the specific topics in greater depth. This chapter focuses on the use of survey methods to collect data for estimating gross savings from energy efficiency programs.

  6. Wetland and Sensitive Species Survey Report for Y-12: Proposed Uranium Processing Facility (UPF)

    Energy Technology Data Exchange (ETDEWEB)

    Giffen, N.; Peterson, M.; Reasor, S.; Pounds, L.; Byrd, G.; Wiest, M. C.; Hill, C. C.

    2009-11-01

    This report summarizes the results of an environmental survey conducted at sites associated with the proposed Uranium Processing Facility (UPF) at the Y-12 National Security Complex in September-October 2009. The survey was conducted in order to evaluate potential impacts of the overall project. This project includes the construction of a haul road, concrete batch plant, wet soil storage area and dry soil storage area. The environmental surveys were conducted by natural resource experts at ORNL who routinely assess the significance of various project activities on the Oak Ridge Reservation (ORR). Natural resource staff assistance on this project included the collection of environmental information that can aid in project location decisions that minimize impacts to sensitive resources such as significant wildlife populations, rare plants and wetlands. Natural resources work was conducted in various habitats, corresponding to the proposed areas of impact. The credentials/qualifications of the researchers are contained in Appendix A. The proposed haul road traverses a number of different habitats including a power-line right-of-way, wetlands, streams, forest and mowed areas. It extends from what is known as the New Salvage Yard on the west to the Polaris Parking Lot on the east. This haul road is meant to connect the proposed concrete batch plant to the UPF building site. The proposed site of the concrete batch plant itself is a highly disturbed fenced area. This area of the project is shown in Fig. 1. The proposed Wet Soils Disposal Area is located on the north side of Bear Creek Road at the former Control Burn Study Area. This is a second-growth area containing thick vegetation, and extensive dead and down woody material. This area of the project is shown in Fig. 2. The dry soils storage area is proposed for what is currently known as the West Borrow Area. This site is located on the west side of Reeves Road south of Bear Creek Road. The site is an early successional

  7. Survey field methods for expanded biospecimen and biomeasure collection in NSHAP Wave 2.

    Science.gov (United States)

    O'Doherty, Katie; Jaszczak, Angela; Hoffmann, Joscelyn N; You, Hannah M; Kern, David W; Pagel, Kristina; McPhillips, Jane; Schumm, L Philip; Dale, William; Huang, Elbert S; McClintock, Martha K

    2014-11-01

    The National Social Life, Health, and Aging Project is a nationally representative, longitudinal survey of older adults. A main component is the collection of biomeasures to objectively assess physiological status relevant to psychosocial variables, aging conditions, and disease. Wave 2 added novel biomeasures, refined those collected in Wave 1, and provides a reference for the collection protocols and strategy common to the biomeasures. The effects of aging, gender, and their interaction are presented in the specific biomeasure papers included in this Special Issue. A transdisciplinary working group expanded the biomeasures collected to include physiological, genetic, anthropometric, functional, neuropsychological, and sensory measures, yielding 37 more than in Wave 1. All were designed for collection in respondents' homes by nonmedically trained field interviewers. Both repeated and novel biomeasures were successful. Those in Wave 1 were refined to improve quality, and ensure consistency for longitudinal analysis. Four new biospecimens yielded 27 novel measures. During the interview, 19 biomeasures were recorded covering anthropometric, functional, neuropsychological, and sensory measures and actigraphy provided data on activity and sleep. Improved field methods included in-home collection, temperature control, establishment of a central survey biomeasure laboratory, and shipping, all of which were crucial for successful collection by the field interviewers and accurate laboratory assay of the biomeasures (92.1% average co-operation rate and 97.3% average assay success rate). Developed for home interviews, these biomeasures are readily applicable to other surveys. © The Author 2014. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  8. OVERVIEW OF VALIDATION, BASIC CONCEPTS AND ANALYTICAL METHOD PROCESS VALIDATION

    OpenAIRE

    Indu Gurram* , M.V.S.Kavitha, M.V.Nagabhushnam, Brahmaiah Bonthagara, D.Nagarjuna Reddy

    2017-01-01

    Quality is the primary concern of any industry and of the products it manufactures. Multiple views on how to achieve such quality are of current interest in the pharmaceutical industry. Validation is the art of designing and practicing the designed steps alongside the documentation. Validation and quality assurance go hand in hand, ensuring thorough quality for the products. When an analytical method is utilized to generate results about the characteristics of drug-related samples it is ...

  9. THE BASE OF THE METHODICAL DESIGN AND IMPLEMENTATION OF ENGINEERING EDUCATION PROCESS

    Directory of Open Access Journals (Sweden)

    Renata Lis

    2012-12-01

    Full Text Available The article is devoted to the methodology of implementing the European and national qualifications frameworks in the academic process. It consists of the methodology for designing degree programmes and classes, and the methodology of the teaching process.

  10. Comparison of Health Examination Survey Methods in Brazil, Chile, Colombia, Mexico, England, Scotland, and the United States.

    Science.gov (United States)

    Mindell, Jennifer S; Moody, Alison; Vecino-Ortiz, Andres I; Alfaro, Tania; Frenz, Patricia; Scholes, Shaun; Gonzalez, Silvia A; Margozzini, Paula; de Oliveira, Cesar; Sanchez Romero, Luz Maria; Alvarado, Andres; Cabrera, Sebastián; Sarmiento, Olga L; Triana, Camilo A; Barquera, Simón

    2017-09-15

    Comparability of population surveys across countries is key to appraising trends in population health. Achieving this requires deep understanding of the methods used in these surveys to examine the extent to which the measurements are comparable. In this study, we obtained detailed protocols of 8 nationally representative surveys from 2007-2013 from Brazil, Chile, Colombia, Mexico, the United Kingdom (England and Scotland), and the United States, countries that differ in economic and inequity indicators. Data were collected on sampling frame, sample selection procedures, recruitment, data collection methods, content of interview and examination modules, and measurement protocols. We also assessed their adherence to the World Health Organization's "STEPwise Approach to Surveillance" framework for population health surveys. The surveys, which included half a million participants, were highly comparable on sampling methodology, survey questions, and anthropometric measurements. Heterogeneity was found for physical activity questionnaires and biological sample collection. The common age range included by the surveys was adults aged 18-64 years. The methods used in these surveys were similar enough to enable comparative analyses of the data across the 7 countries. This comparability is crucial in assessing and comparing national and subgroup population health, and in assisting the transfer of research and policy knowledge across countries. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  11. A Survey of Methods for Computing Best Estimates of Endoatmospheric and Exoatmospheric Trajectories

    Science.gov (United States)

    Bernard, William P.

    2018-01-01

    Beginning with the mathematical prediction of planetary orbits in the early seventeenth century up through the most recent developments in sensor fusion methods, many techniques have emerged that can be employed on the problem of endo and exoatmospheric trajectory estimation. Although early methods were ad hoc, the twentieth century saw the emergence of many systematic approaches to estimation theory that produced a wealth of useful techniques. The broad genesis of estimation theory has resulted in an equally broad array of mathematical principles, methods and vocabulary. Among the fundamental ideas and methods that are briefly touched on are batch and sequential processing, smoothing, estimation, and prediction, sensor fusion, sensor fusion architectures, data association, Bayesian and non Bayesian filtering, the family of Kalman filters, models of the dynamics of the phases of a rocket's flight, and asynchronous, delayed, and asequent data. Along the way, a few trajectory estimation issues are addressed and much of the vocabulary is defined.
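
    As a minimal sketch of the Kalman-filter family mentioned in the survey (a generic textbook form with made-up numbers, not any particular trajectory estimator from the paper), one predict/update cycle for a constant-velocity model is:

```python
import numpy as np

def kalman_step(x, P, z, F, Q, H, R):
    """One predict/update cycle of a linear Kalman filter."""
    # predict
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # update with measurement z
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Hypothetical 1-D constant-velocity target tracked from noisy position fixes.
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition (position, velocity)
Q = 0.01 * np.eye(2)                    # process noise
H = np.array([[1.0, 0.0]])              # only position is measured
R = np.array([[4.0]])                   # measurement noise variance

x, P = np.zeros(2), np.eye(2) * 100.0
for z in [1.2, 2.9, 4.1, 5.8, 7.3]:
    x, P = kalman_step(x, P, np.array([z]), F, Q, H, R)
print(x)  # estimated position and velocity after five fixes
```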

  12. Adapting data collection methods in the Australian Life Histories and Health Survey: a retrospective life course study

    OpenAIRE

    Kendig, Hal; Byles, Julie E; O'Loughlin, Kate; Nazroo, James Y; Mishra, Gita; Noone, Jack; Loh, Vanessa; Forder, Peta M

    2014-01-01

    Objective Ideally, life course data are collected prospectively through an ongoing longitudinal study. We report adaptive multimethod fieldwork procedures that gathered life history data by mail survey and telephone interview, comparable with the face-to-face methods employed in the English Longitudinal Study on Ageing (ELSA). Design The Australian Life Histories and Health (LHH) Survey was a substudy of the Australian 45 and Up Study, with data collection methods modified from the ELSA Study...

  13. Contraception coverage and methods used among women in South Africa: A national household survey

    Directory of Open Access Journals (Sweden)

    M F Chersich

    2017-04-01

    Full Text Available Background. Globally, family planning services are being strengthened and the range of contraceptive choices expanded. Data on contraceptive coverage and service gaps could help to shape these initiatives. Objective. To assess contraception coverage in South Africa (SA and identify underserved populations and aspects of programming that require strengthening. Methods. Data from a 2012 SA household survey assessed contraception coverage among 6 296 women aged 15 - 49 years and identified underserved populations. Results. Two-thirds had an unintended pregnancy in the past 5 years, a quarter of which were contraceptive failures. Most knew of injectable (92.0% and oral contraception (89.9%, but fewer of intrauterine devices (56.1% and emergency contraception (47.3%. Contraceptive prevalence was 49.1%, and 41.8% women used modern non-barrier methods. About half had ever used injectable contraception. Contraception was lower in black Africans and younger women, who used a limited range of methods. Conclusion. Contraception coverage is higher than many previous estimates. Rates of unintended pregnancy, contraceptive failure and knowledge gaps, however, demonstrate high levels of unmet need, especially among black Africans and young women.

  14. Generations and Gender Survey (GGS: Towards a Better Understanding of Relationships and Processes in the Life Course

    Directory of Open Access Journals (Sweden)

    Zsolt Spéder

    2007-11-01

    Full Text Available The Generations and Gender Survey (GGS is one of the two pillars of the Generations and Gender Programme designed to improve understanding of demographic and social development and of the factors that influence these developments. This article describes how the theoretical perspectives applied in the survey, the survey design and the questionnaire are related to this objective. The key features of the survey include panel design, multidisciplinarity, comparability, context-sensitivity, inter-generational and gender relationships. The survey applies the life course approach, focussing on the processes of childbearing, partnership dynamics, home leaving, and retiring. The selection of topics for data collection mainly follows the criterion of theoretically grounded relevance to explaining one or more of the mentioned processes. A large portion of the survey deals with economic aspects of life, such as economic activity, income, and economic well-being; a comparably large section is devoted to values and attitudes. Other domains covered by the survey include gender relationships, household composition and housing, residential mobility, social networks and private transfers, education, health, and public transfers. The third chapter of the article describes the motivations for their inclusion. The GGS questionnaire is designed for a face-to-face interview. It includes the core that each participating country needs to implement in full, and four optional sub-modules on nationality and ethnicity, on previous partners, on intentions of breaking up, and on housing, respectively. The participating countries are encouraged to include also the optional sub-modules to facilitate comparative research on these topics.

  15. A Survey of tooth morphology teaching methods employed in the United Kingdom and Ireland.

    Science.gov (United States)

    Lone, M; McKenna, J P; Cryan, J F; Downer, E J; Toulouse, A

    2018-01-15

    Tooth morphology is a central component of the dental curriculum and is applicable to all dental specialities. Traditional teaching methods are being supplemented with innovative strategies to tailor teaching and accommodate the learning styles of the recent generation of students. An online survey was compiled and distributed to the staff involved in teaching tooth morphology in the United Kingdom and Ireland to assess the importance of tooth morphology in the dentistry curriculum and the methodologies employed in teaching. The results of the survey show that tooth morphology constitutes a small module in the dental curriculum. It is taught in the first 2 years of the dental curriculum but is applicable in the clinical years and throughout the dental career. Traditional teaching methods, lecture and practical, are being augmented with innovative teaching including e-learning via virtual learning environment, tooth atlas and e-books leading to blended learning. The majority of the schools teach both normal dental anatomy and morphologic variations of dental anatomy and utilise plastic teeth for practical and examination purposes. Learning the 3D aspects of tooth morphology was deemed important by most of the respondents who also agreed that tooth morphology is a difficult topic for the students. Despite being core to the dental curriculum, overall minimal time is dedicated to the delivery of tooth morphology, creating a reliance on the student to learn the material. New forms of delivery including computer-assisted learning tools should help sustain learning and previously acquired knowledge. © 2018 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  16. Vision and Control for UAVs: A Survey of General Methods and of Inexpensive Platforms for Infrastructure Inspection

    Directory of Open Access Journals (Sweden)

    Koppány Máthé

    2015-06-01

    Full Text Available Unmanned aerial vehicles (UAVs) have gained significant attention in recent years. Low-cost platforms using inexpensive sensor payloads have been shown to provide satisfactory flight and navigation capabilities. In this report, we survey vision and control methods that can be applied to low-cost UAVs, and we list some popular inexpensive platforms and application fields where they are useful. We also highlight the sensor suites used where this information is available. We overview, among others, feature detection and tracking, optical flow and visual servoing, low-level stabilization and high-level planning methods. We then list popular low-cost UAVs, selecting mainly quadrotors. We discuss applications, restricting our focus to the field of infrastructure inspection. Finally, as an example, we formulate two use-cases for railway inspection, a less explored application field, and illustrate the usage of the vision and control techniques reviewed by selecting appropriate ones to tackle these use-cases. To select vision methods, we run a thorough set of experimental evaluations.
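
    As a small, hedged example of the optical-flow ingredient mentioned above (synthetic frames and OpenCV's Farneback dense flow; not taken from any of the surveyed systems):

```python
import cv2
import numpy as np

# Two synthetic greyscale frames: the second is the first shifted right by
# 3 pixels, standing in for consecutive frames from a UAV camera.
frame1 = np.zeros((120, 160), dtype=np.uint8)
frame1[40:80, 50:90] = 255
frame2 = np.roll(frame1, 3, axis=1)

# Dense Farneback optical flow; positional arguments are
# (prev, next, flow, pyr_scale, levels, winsize, iterations, poly_n, poly_sigma, flags).
flow = cv2.calcOpticalFlowFarneback(frame1, frame2, None,
                                    0.5, 3, 15, 3, 5, 1.2, 0)

# Mean horizontal motion over the bright object; roughly +3 px for this shift.
dx = flow[..., 0][frame1 > 0].mean()
print(f"mean horizontal motion: {dx:.1f} px")
```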

  17. Citizens' perceptions of political processes. A critical evaluation of preference consistency and survey items

    Directory of Open Access Journals (Sweden)

    Bengtsson, Åsa

    2012-12-01

    Full Text Available The current state of research does not tell us much about citizens’ expectations of political decision making. Most surveys allow respondents to evaluate how the current system is working, but do not inquire about alternative political decision-making procedures. The lack of established survey items can be explained by the fact that radical changes in decision-making procedures have been hard to envisage, but also by a general scepticism regarding people’s ability to form opinions on these matters. Political processes are, without doubt, complex matters that do not lend themselves very well to simplistic survey questions. Moreover, previous research has convincingly shown that most people in general have difficulties forming single, coherent and stable attitudes even towards far more straightforward political issues. In order to determine if trying to grasp attitudes towards political decision-making in future empirical studies can be considered a fruitful endeavour, this study sets out to critically assess the extent to which people express coherent preferences on these matters, and if preferences are in line with expectations in previous, rather scattered research. The study is based on the Finnish National Election Study 2011; a study which, contrary to most other election studies, includes a rich variety of survey items on the topic, and utilises a combination of strategies in order to explore patterns in the opinions held by citizens.


  18. Analytical techniques for in-line/on-line monitoring of uranium and plutonium in process solutions : a brief literature survey

    International Nuclear Information System (INIS)

    Marathe, S.G.; Sood, D.D.

    1991-01-01

    In-line/on-line monitoring of various parameters such as uranium-plutonium-fission product concentration, acidity, density etc. plays an important role in quickly understanding the efficiency of processes in a reprocessing plant. Efforts to study and install such analytical instruments have been going on for more than three decades, with the adaptation of newer methods and technologies. A review of the development of in-line analytical instrumentation was carried out in this laboratory about two decades ago. This report presents a very short literature survey of the work in the last two decades. The report includes an outline of the principles of the main techniques employed in in-line/on-line monitoring. (author). 77 refs., 6 tabs

  19. A survey of the use of soy in processed Turkish meat products and detection of genetic modification.

    Science.gov (United States)

    Ulca, Pelin; Balta, Handan; Senyuva, Hamide Z

    2014-01-01

    To screen for possible illegal use of soybeans in meat products, the performance characteristics of a commercial polymerase chain reaction (PCR) kit for the detection of soybean DNA in raw and cooked meat products were established. Minced chicken and beef products containing soybean at levels from 0.1% to 10.0% were analysed by real-time PCR to amplify the soybean lectin gene. The PCR method could reliably detect the addition of soybean at a level of 0.1%. A survey of 38 Turkish processed meat products found only six samples to be negative for the presence of soybean. Of the 32 (84%) positive samples, 13 (34%) contained levels of soy above 0.1%. For the soybean-positive samples, further DNA analysis was conducted by real-time PCR to detect whether genetically modified (GM) soybean had been used. Of the 32 meat samples containing soybean, two samples were positive for genetic modification.

  20. The prioritization and categorization method (PCM) process evaluation at Ericsson : a case study

    NARCIS (Netherlands)

    Ohlsson, Jens; Han, Shengnan; Bouwman, W.A.G.A.

    2017-01-01

    Purpose: The purpose of this paper is to demonstrate and evaluate the prioritization and categorization method (PCM), which facilitates the active participation of process stakeholders (managers, owners, customers) in process assessments. Stakeholders evaluate processes in terms of effectiveness,

  1. Multiresolution, Geometric, and Learning Methods in Statistical Image Processing, Object Recognition, and Sensor Fusion

    National Research Council Canada - National Science Library

    Willsky, Alan

    2004-01-01

    Our research blends methods from several fields: statistics and probability, signal and image processing, mathematical physics, scientific computing, statistical learning theory, and differential...

  2. Dark Energy Survey Year 1 Results: Cross-Correlation Redshifts - Methods and Systematics Characterization

    Science.gov (United States)

    Gatti, M.; Vielzeuf, P.; Davis, C.; Cawthon, R.; Rau, M. M.; DeRose, J.; De Vicente, J.; Alarcon, A.; Rozo, E.; Gaztanaga, E.; Hoyle, B.; Miquel, R.; Bernstein, G. M.; Bonnett, C.; Carnero Rosell, A.; Castander, F. J.; Chang, C.; da Costa, L. N.; Gruen, D.; Gschwend, J.; Hartley, W. G.; Lin, H.; MacCrann, N.; Maia, M. A. G.; Ogando, R. L. C.; Roodman, A.; Sevilla-Noarbe, I.; Troxel, M. A.; Wechsler, R. H.; Asorey, J.; Davis, T. M.; Glazebrook, K.; Hinton, S. R.; Lewis, G.; Lidman, C.; Macaulay, E.; Möller, A.; O'Neill, C. R.; Sommer, N. E.; Uddin, S. A.; Yuan, F.; Zhang, B.; Abbott, T. M. C.; Allam, S.; Annis, J.; Bechtol, K.; Brooks, D.; Burke, D. L.; Carollo, D.; Carrasco Kind, M.; Carretero, J.; Cunha, C. E.; D'Andrea, C. B.; DePoy, D. L.; Desai, S.; Eifler, T. F.; Evrard, A. E.; Flaugher, B.; Fosalba, P.; Frieman, J.; García-Bellido, J.; Gerdes, D. W.; Goldstein, D. A.; Gruendl, R. A.; Gutierrez, G.; Honscheid, K.; Hoormann, J. K.; Jain, B.; James, D. J.; Jarvis, M.; Jeltema, T.; Johnson, M. W. G.; Johnson, M. D.; Krause, E.; Kuehn, K.; Kuhlmann, S.; Kuropatkin, N.; Li, T. S.; Lima, M.; Marshall, J. L.; Melchior, P.; Menanteau, F.; Nichol, R. C.; Nord, B.; Plazas, A. A.; Reil, K.; Rykoff, E. S.; Sako, M.; Sanchez, E.; Scarpine, V.; Schubnell, M.; Sheldon, E.; Smith, M.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Tucker, B. E.; Tucker, D. L.; Vikram, V.; Walker, A. R.; Weller, J.; Wester, W.; Wolf, R. C.

    2018-02-01

    We use numerical simulations to characterize the performance of a clustering-based method to calibrate photometric redshift biases. In particular, we cross-correlate the weak lensing (WL) source galaxies from the Dark Energy Survey Year 1 (DES Y1) sample with redMaGiC galaxies (luminous red galaxies with secure photometric redshifts) to estimate the redshift distribution of the former sample. The recovered redshift distributions are used to calibrate the photometric redshift bias of standard photo-z methods applied to the same source galaxy sample. We apply the method to two photo-z codes run in our simulated data: Bayesian Photometric Redshift (BPZ) and Directional Neighborhood Fitting (DNF). We characterize the systematic uncertainties of our calibration procedure, and find that these systematic uncertainties dominate our error budget. The dominant systematics are due to our assumption of unevolving bias and clustering across each redshift bin, and to differences between the shapes of the redshift distributions derived by clustering vs photo-z's. The systematic uncertainty in the mean redshift bias of the source galaxy sample is Δz ≲ 0.02, though the precise value depends on the redshift bin under consideration. We discuss possible ways to mitigate the impact of our dominant systematics in future analyses.
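
    A highly simplified sketch of the clustering-redshift idea, assuming the unknown sample's bias and the matter clustering vary slowly across the thin reference bins, and using made-up correlation amplitudes (the DES pipeline is considerably more involved), is:

```python
import numpy as np

def clustering_nz(w_ur, w_rr, dz):
    """Crude clustering-redshift estimate of n(z) for an 'unknown' sample.

    w_ur : measured angular cross-correlation amplitude with the reference
           sample, one value per thin reference redshift bin
    w_rr : reference-sample autocorrelation amplitude in the same bins,
           used to divide out the reference bias (b_r ~ sqrt(w_rr))
    dz   : width of the redshift bins
    """
    nz = w_ur / np.sqrt(np.clip(w_rr, 1e-12, None))
    nz = np.clip(nz, 0.0, None)       # negative noise fluctuations -> 0
    return nz / (nz.sum() * dz)       # normalise to unit integral

# Hypothetical measurements in bins of width dz = 0.05:
z = np.arange(0.15, 0.95, 0.05)
w_ur = np.array([0.001, 0.003, 0.008, 0.015, 0.02, 0.018, 0.012, 0.007,
                 0.004, 0.002, 0.001, 0.0005, 0.0002, 0.0001, 0.0, 0.0])
w_rr = np.full_like(w_ur, 0.05)
nz = clustering_nz(w_ur, w_rr, dz=0.05)
print(f"estimated mean redshift: {np.sum(z * nz) * 0.05:.3f}")
```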

  3. Assessing the oral health of an ageing population: methods, challenges and predictors of survey participation.

    Science.gov (United States)

    Matthews, Debora C; Brillant, Martha G S; Clovis, Joanne B; McNally, Mary E; Filiaggi, Mark J; Kotzer, Robert D; Lawrence, Herenia P

    2012-06-01

    To examine predictors of participation and to describe the methodological considerations of conducting a two-stage population-based oral health survey. An observational, cross-sectional survey (telephone interview and clinical oral examination) of community-dwelling adults aged 45-64 and ≥65 living in Nova Scotia, Canada was conducted. The survey response rate was 21% for the interview and 13.5% for the examination. A total of 1141 participants completed one or both components of the survey. Both age groups had higher levels of education than the target population; the age 45-64 sample also had a higher proportion of females and lower levels of employment than the target population. Completers (participants who completed interview and examination) were compared with partial completers (who completed only the interview), and stepwise logistic regression was performed to examine predictors of completion. Identified predictors were as follows: not working, post-secondary education and frequent dental visits. Recruitment, communications and logistics present challenges in conducting a province-wide survey. Identification of employment, education and dental visit frequency as predictors of survey participation provide insight into possible non-response bias and suggest potential for underestimation of oral disease prevalence in this and similar surveys. This potential must be considered in analysis and in future recruitment strategies. © 2011 The Gerodontology Society and John Wiley & Sons A/S.

  4. [Research on the evolution and transition of processing methods of Fuzi in ancient and modern times].

    Science.gov (United States)

    Liu, Chan-Chan; Cheng, Ming-En; Duan, Hai-Yan; Peng, Hua-Sheng

    2014-04-01

    Fuzi is a medicine used for rescuing from collapse by restoring yang, as well as a famous toxic traditional Chinese medicine. In order to ensure efficacy and safe medication, Fuzi has mostly been applied after being processed. Different Fuzi processing methods have been recorded by doctors of previous generations. Besides, there are also differences between the Fuzi processing methods recorded in the modern pharmacopeia and those in ancient medical books. In this study, the authors traced medical books from the Han Dynasty to the period of the Republic of China, and summarized the Fuzi processing methods collected in ancient and modern literature. According to the results, Fuzi processing and usage methods have changed along with the succession of dynasties, with differences between ancient and modern processing methods. Before the Tang Dynasty, Fuzi had mostly been processed and soaked. From the Tang to the Ming Dynasties, Fuzi had mostly been processed, soaked and stir-fried. During the Qing Dynasty, Fuzi had mostly been soaked and boiled. In modern times, Fuzi is mostly processed by being boiled and soaked. Before the Tang Dynasty, whole pieces of Fuzi herbs or their fragments had been applied in medicines, whereas their fragments are primarily used in modern times. Because different processing methods have great impacts on the toxicity of Fuzi, it is suggested to study Fuzi processing methods.

  5. System Maturity and Architecture Assessment Methods, Processes, and Tools

    Science.gov (United States)

    2012-03-02

    an architecture can give a false impression of "architecting" being complete (Bergey, Blanchette et al. 2009). Throughout the development of a... World Congress on Engineering and Computer Science, San Francisco, CA. Bergey, J., J. S. Blanchette, et al. (2009). U.S. Army Workshop on Exploring

  6. Effect of processing and preservation methods on Vitamin C and ...

    African Journals Online (AJOL)

    African Journal of Food, Agriculture, Nutrition and Development. Journal Home · ABOUT THIS JOURNAL · Advanced Search · Current Issue · Archives · Journal Home > Vol 5, No 2 (2005) >. Log in or Register to get access to full text downloads.

  7. Control and optimization system and method for chemical looping processes

    Science.gov (United States)

    Lou, Xinsheng; Joshi, Abhinaya; Lei, Hao

    2015-02-17

    A control system for optimizing a chemical loop system includes one or more sensors for measuring one or more parameters in a chemical loop. The sensors are disposed on or in a conduit positioned in the chemical loop. The sensors generate one or more data signals representative of an amount of solids in the conduit. The control system includes a data acquisition system in communication with the sensors and a controller in communication with the data acquisition system. The data acquisition system receives the data signals and the controller generates the control signals. The controller is in communication with one or more valves positioned in the chemical loop. The valves are configured to regulate a flow of the solids through the chemical loop.

  8. Comparison between Digital Image Processing and Spectrophotometric Measurements Methods

    Directory of Open Access Journals (Sweden)

    Bogdan Adnan HAIFA

    2011-03-01

    Full Text Available Background: The spectrophotometer is a very common instrument in various scientific fields and gives accurate information about light absorbance and transmittance through materials using a monochromatic light source. However, devices used in spectrophotometry can be quite expensive, using components with high technical specifications, and the procedure itself is time consuming. Regular digital image acquisition instruments like scanners and cameras, on the other hand, use very cheap electronic components to record the information on three wide-band channels (Red, Green, Blue). Purpose: This paper studies the possibility of correlating the measurements from the spectrophotometer with raw data from digital image acquisition instruments. Materials and Methods: Because the results will be used in protein electrophoresis, we prepared a set of plates with blood serum in different dilutions, stained with Coomassie Brilliant Blue. The absorbance of the resulting plates was measured using a spectrophotometer and after that the plates were scanned with a regular office scanner. The digital image was converted into different color spaces (gray scale, RGB, HSV, HSL, CIEXYZ and CIELAB) using custom developed software in C++. We statistically measured the correlation coefficient of different parameters from the color spaces with the absorption measured with the spectrophotometer. Results and Discussion: The findings of this work show that a consumer digital scanner can be used as a fast and inexpensive alternative to spectrophotometers. This offers the possibility of using scanned images of protein electrophoresis to make quantitative estimations regarding the proteinogram.
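
    The correlation described above can be sketched in a few lines. The following is an illustrative example only, with hypothetical paired measurements (it is not the authors' custom C++ software); in practice the mean channel intensity would be extracted from the scanned image with an image library before the Beer-Lambert-style transform.

    ```python
    import numpy as np
    from scipy.stats import pearsonr

    def pseudo_absorbance(mean_intensity, white_level=255.0):
        """Beer-Lambert-style transform of a mean channel intensity from a scan."""
        return -np.log10(np.clip(mean_intensity, 1.0, None) / white_level)

    # Hypothetical paired measurements for a dilution series of stained plates:
    # spectrophotometer absorbance vs. mean blue-channel intensity of the scan.
    spec_absorbance = np.array([0.12, 0.25, 0.48, 0.93])
    scan_blue_mean  = np.array([210.0, 175.0, 120.0, 60.0])

    scan_absorbance = pseudo_absorbance(scan_blue_mean)
    r, p = pearsonr(spec_absorbance, scan_absorbance)
    print(f"Pearson r = {r:.3f} (p = {p:.3g})")
    ```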

  9. Upper and lower bounds for stochastic processes modern methods and classical problems

    CERN Document Server

    Talagrand, Michel

    2014-01-01

    The book develops modern methods, and in particular the "generic chaining", to bound stochastic processes. This method allows in particular to get optimal bounds for Gaussian and Bernoulli processes. Applications are given to stable processes, infinitely divisible processes, matching theorems, the convergence of random Fourier series and of orthogonal series, and to functional analysis. The solution of a number of classical problems is given in complete detail, and an ambitious program for future research is laid out.

  10. [Monitoring method of extraction process for Schisandrae Chinensis Fructus based on near infrared spectroscopy and multivariate statistical process control].

    Science.gov (United States)

    Xu, Min; Zhang, Lei; Yue, Hong-Shui; Pang, Hong-Wei; Ye, Zheng-Liang; Ding, Li

    2017-10-01

    To establish an on-line monitoring method for the extraction process of Schisandrae Chinensis Fructus, a medicinal material in the formula of Yiqi Fumai lyophilized injection, near infrared spectroscopy was combined with multivariate data analysis technology. The multivariate statistical process control (MSPC) model was established based on 5 normal batches in production, and 2 test batches were monitored by PC scores, DModX and Hotelling T² control charts. The results showed that the MSPC model had a good monitoring ability for the extraction process. The application of the MSPC model to the actual production process could effectively achieve on-line monitoring of the extraction process of Schisandrae Chinensis Fructus and reflect the change of material properties in the production process in real time. The established process monitoring method could provide a reference for the application of process analysis technology in the process quality control of traditional Chinese medicine injections. Copyright© by the Chinese Pharmaceutical Association.
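
    As a rough illustration of the kind of MSPC statistics mentioned above (PC scores, Hotelling T², and a residual statistic standing in for DModX), the sketch below builds a PCA model on data from normal batches and scores new observations. It is a generic outline under assumed variable names, not the published model.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    def fit_mspc_model(X_normal, n_components=3):
        """Fit a PCA model on spectra from normal batches (rows = observations)."""
        mean, std = X_normal.mean(0), X_normal.std(0) + 1e-12
        Z = (X_normal - mean) / std
        pca = PCA(n_components=n_components).fit(Z)
        return {"mean": mean, "std": std, "pca": pca}

    def mspc_statistics(model, X_new):
        """Scores, Hotelling T^2 and squared prediction error (a simple stand-in for DModX)."""
        Z = (X_new - model["mean"]) / model["std"]
        scores = model["pca"].transform(Z)
        lam = model["pca"].explained_variance_           # per-component score variance
        t2 = np.sum(scores**2 / lam, axis=1)
        residual = Z - model["pca"].inverse_transform(scores)
        spe = np.sum(residual**2, axis=1)
        return scores, t2, spe

    # Demo on synthetic "spectra": 50 normal observations of 20 wavelengths.
    # Control limits (e.g. 95th/99th percentiles of t2 and spe over the normal
    # batches) would flag out-of-control points in a test batch.
    rng = np.random.default_rng(0)
    X_normal = rng.normal(size=(50, 20))
    model = fit_mspc_model(X_normal)
    _, t2, spe = mspc_statistics(model, rng.normal(size=(5, 20)))
    ```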

  11. Process synthesis, design and analysis using a process-group contribution method

    DEFF Research Database (Denmark)

    Kumar Tula, Anjan; Eden, Mario R.; Gani, Rafiqul

    2015-01-01

    ... techniques. The fundamental pillars of this framework are the definition and use of functional process-groups (building blocks) representing a wide range of process operations, flowsheet connectivity rules to join the process-groups to generate all the feasible flowsheet alternatives, and flowsheet property models like energy consumption, atom efficiency and environmental impact to evaluate the performance of the generated alternatives. In this way, a list of feasible flowsheets is quickly generated, screened and selected for further analysis. Since the flowsheet is synthesized and the operations ...

  12. Business Process Re-engineering in Saudi Arabia: A Survey of Understanding and Attitudes

    Directory of Open Access Journals (Sweden)

    Christopher Chiu

    2009-12-01

    Full Text Available This survey was conducted in the Kingdom of Saudi Arabia (KSA) to investigate the level of awareness of BPR. Respondents (customers, employees and managers) had different educational backgrounds and were from private and public sectors. Findings of the study indicate a general awareness of BPR in KSA.

  13. Optimization Models and Methods for Demand-Side Management of Residential Users: A Survey

    Directory of Open Access Journals (Sweden)

    Antimo Barbato

    2014-09-01

    Full Text Available The residential sector is currently one of the major contributors to the global energy balance. However, the energy demand of residential users has so far been largely uncontrollable and inelastic with respect to the power grid conditions. With the massive introduction of renewable energy sources and the large variations in energy flows, the residential sector is also required to provide some flexibility in energy use so as to contribute to the stability and efficiency of the electric system. To address this issue, demand management mechanisms can be used to optimally manage the energy resources of customers and their energy demand profiles. A very promising technique is demand-side management (DSM), a proactive method aimed at making users energy-efficient in the long term. In this paper, we survey the most relevant studies on optimization methods for DSM of residential consumers. Specifically, we review the related literature according to three axes defining contrasting characteristics of the schemes proposed: DSM for individual users versus DSM for cooperative consumers, deterministic DSM versus stochastic DSM, and day-ahead DSM versus real-time DSM. Based on this classification, we provide a big picture of the key features of different approaches and techniques and discuss future research directions.

  14. Longitudinal cohort survey of women's smoking behaviour and attitudes in pregnancy: study methods and baseline data

    Science.gov (United States)

    Orton, Sophie; Bowker, Katharine; Cooper, Sue; Naughton, Felix; Ussher, Michael; Pickett, Kate E; Leonardi-Bee, Jo; Sutton, Stephen; Dhalwani, Nafeesa N; Coleman, Tim

    2014-01-01

    Objectives To report the methods used to assemble a contemporary pregnancy cohort for investigating influences on smoking behaviour before, during and after pregnancy and to report characteristics of women recruited. Design Longitudinal cohort survey. Setting Two maternity hospitals, Nottingham, England. Participants 3265 women who attended antenatal ultrasound scan clinics were offered cohort enrolment; those who were 8–26 weeks pregnant and were currently smoking or had recently stopped smoking were eligible. Cohort enrollment took place between August 2011 and August 2012. Primary and secondary outcome measures Prevalence of smoking at cohort entry and at two follow-up time points (34–36 weeks gestation and 3 months postnatally); response rate, participants’ sociodemographic characteristics. Results 1101 (33.7%, 95% CI 32.1% to 35.4%) women were eligible for inclusion in the cohort, and of these 850 (77.2%, 95% CI 74.6% to 79.6%) were recruited. Within the cohort, 57.4% (N=488, 95% CI 54.1% to 60.7%) reported to be current smokers. Current smokers were significantly younger than ex-smokers (p<0.05), more likely to have no formal qualifications and to not be in current paid employment compared to recent ex-smokers (p<0.001). Conclusions This contemporary cohort, which seeks very detailed information on smoking in pregnancy and its determinants, includes women with comparable sociodemographic characteristics to those in other UK cross-sectional studies and cohorts. This suggests that future analyses using this cohort and aimed at understanding smoking behaviour in pregnancy may produce findings that are broadly generalisable. PMID:24833689

  15. Processing Trade, Productivity and Prices: Evidence from a Chinese Production Survey

    DEFF Research Database (Denmark)

    Li, Yao Amber; Smeets, Valerie; Warzynski, Frederic

    In this paper, we use a detailed production survey in the Chinese manufacturing industry to estimate both revenue and physical productivity and relate our measurements to firms' trade activity. We find that Chinese exporters for largely export oriented products like leather shoes or shirts appear...

  16. A survey of decontamination processes applicable to DOE nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    Chen, L.; Chamberlain, D.B.; Conner, C.; Vandegrift, G.F.

    1997-05-01

    The objective of this survey was to select an appropriate technology for in situ decontamination of equipment interiors as part of the decommissioning of U.S. Department of Energy nuclear facilities. This selection depends on knowledge of existing chemical decontamination methods. This report provides an up-to-date review of chemical decontamination methods. According to available information, aqueous systems are probably the most universally used method for decontaminating and cleaning metal surfaces. We have subdivided the technologies, on the basis of the types of chemical solvents, into acid, alkaline permanganate, highly oxidizing, peroxide, and miscellaneous systems. Two miscellaneous chemical decontamination methods (electrochemical processes and foam and gel systems) are also described. A concise technical description of various processes is given, and the report also outlines technical considerations in the choice of technologies, including decontamination effectiveness, waste handling, fields of application, and the advantages and limitations in application. On the basis of this survey, six processes were identified for further evaluation. 144 refs., 2 tabs.

  17. A survey of decontamination processes applicable to DOE nuclear facilities

    International Nuclear Information System (INIS)

    Chen, L.; Chamberlain, D.B.; Conner, C.; Vandegrift, G.F.

    1997-05-01

    The objective of this survey was to select an appropriate technology for in situ decontamination of equipment interiors as part of the decommissioning of U.S. Department of Energy nuclear facilities. This selection depends on knowledge of existing chemical decontamination methods. This report provides an up-to-date review of chemical decontamination methods. According to available information, aqueous systems are probably the most universally used method for decontaminating and cleaning metal surfaces. We have subdivided the technologies, on the basis of the types of chemical solvents, into acid, alkaline permanganate, highly oxidizing, peroxide, and miscellaneous systems. Two miscellaneous chemical decontamination methods (electrochemical processes and foam and gel systems) are also described. A concise technical description of various processes is given, and the report also outlines technical considerations in the choice of technologies, including decontamination effectiveness, waste handling, fields of application, and the advantages and limitations in application. On the basis of this survey, six processes were identified for further evaluation. 144 refs., 2 tabs.

  18. Hyperspectral imager survey and developments for scientific and operational land processes monitoring applications

    Science.gov (United States)

    Kunkel, Bernd P.; Posselt, Winfried; Schmidt, Elke; Del Bello, Umberto; Harnisch, Bernd; Meynart, Roland

    1997-05-01

    The recent development of airborne imaging spectrometers, currently mostly designated hyperspectral imagers, in the spectral regime from 400-2400 nm has revealed and proved an enormous application potential for remote sensing of vegetation in particular. Current spaceborne instrument developments and upcoming missions will expand these applications to regional and global scale surveys and monitoring. Hyperspectral imagers covering the aforementioned spectral range promise to represent the ideal future remote sensing tool for vegetation type and status monitoring. The paper starts with a compilation of relevant applications - with emphasis on vegetation and soils - and their particular spectral and radiometric requirements, which was established by the main author recently as part of a Dornier Satellitensysteme (DSS) in-house activity, including a survey of existing and planned instruments of this type. To the extent possible, airborne measurement data from existing instruments are included to underline the application potential. The second part provides an insight into current development activities at DSS, mainly as results of ESA contracts, covering instruments such as ROSIS, the HRIS demo model and current PRISM studies. The two latter instruments are ideally suited for vegetation monitoring in terms of pixel size, spectral resolution and range from 450-2350 nm, and radiometric performance. An outlook on future developments and planning for operational hyperspectral missions concludes the paper.

  19. Survey of molds, yeast and Alicyclobacillus spp. from a concentrated apple juice productive process.

    Science.gov (United States)

    de Cássia Martins Salomão, Beatriz; Muller, Chalana; do Amparo, Hudson Couto; de Aragão, Gláucia Maria Falcão

    2014-01-01

    Bacteria and molds may spoil and/or contaminate apple juice either by direct microbial action or indirectly by the uptake of metabolites such as off-flavours and toxins. Some of these microorganisms and/or metabolites may remain in the food even after extensive procedures. This study aims to identify the presence of molds (including heat-resistant species) and Alicyclobacillus spp. during concentrated apple juice processing. Molds were isolated at different steps and then identified by their macroscopic and microscopic characteristics after cultivation on standard media at 5, 25 and 37 °C for 7 days. Among the 19 isolates found, 63% were identified as Penicillium, with 50% belonging to the species P. expansum. With regard to heat-resistant molds, the species Neosartorya fischeri and Byssochlamys fulva and the genera Eupenicillium sp., Talaromyces sp. and Eurotium sp. were isolated. The thermoacidophilic spore-forming bacteria were identified as A. acidoterrestris by further investigation based on 16S rRNA sequence similarity. The extensive contamination found indicates the need for methods to eliminate or prevent the presence of these microorganisms in the processing plants in order to avoid both spoilage of apple juice and toxin production.

  20. Methods of digital image processing

    International Nuclear Information System (INIS)

    Doeler, W.

    1985-01-01

    Increasing use of computerized methods for diagnostic imaging of radiological problems will open up a wide field of applications for digital image processing. The requirements set by routine diagnostics in medical radiology point to image data storage, documentation and communication as the main points of interest for application of digital image processing. As to the purely radiological problems, the value of digital image processing is to be sought in the improved interpretability of the image information in those cases where the expert's experience and image interpretation by human visual capacities do not suffice. There are many other domains of imaging in medical physics where digital image processing and evaluation are very useful. The paper reviews the various methods available for a variety of problem solutions, and explains the hardware available for the tasks discussed. (orig.) [de]

  1. Process qualification and control in electron beams--requirements, methods, new concepts and challenges

    International Nuclear Information System (INIS)

    Mittendorfer, J.; Gratzl, F.; Hanis, D.

    2004-01-01

    In this paper the status of process qualification and control in electron beam irradiation is analyzed in terms of requirements, concepts, methods and challenges for a state-of-the-art process control concept for medical device sterilization. Aspects from process qualification to routine process control are described together with the associated process variables. As a case study the 10 MeV beams at Mediscan GmbH are considered. Process control concepts like statistical process control (SPC) and a new concept to determine process capability are briefly discussed.

  2. A Survey on Formal Methods for Web Service Composition

    OpenAIRE

    Wang, Yong

    2013-01-01

    Web Service Composition creates new composite Web Services from existing Web Services, which embodies the added value of Web Service technology and is a key technology for solving cross-organizational business process integration. We survey formal methods for Web Service Composition in the following way. Through analyses of Web Service Composition, we establish a reference model called RM-WSComposition to capture the elements of Web Service Composition. Based on the RM-WSComposition, issues...

  3. Methods of extending signatures and training without ground information. [data processing, pattern recognition

    Science.gov (United States)

    Henderson, R. G.; Thomas, G. S.; Nalepka, R. F.

    1975-01-01

    Methods of performing signature extension, using LANDSAT-1 data, are explored. The emphasis is on improving the performance and cost-effectiveness of large area wheat surveys. Two methods were developed: ASC and MASC. Two other methods, Ratio and RADIFF, previously used with aircraft data, were adapted to and tested on LANDSAT-1 data. An investigation into the sources and nature of between-scene data variations was included. Initial investigations into the selection of training fields without in situ ground truth were undertaken.

  4. Adapting data collection methods in the Australian Life Histories and Health Survey: a retrospective life course study.

    Science.gov (United States)

    Kendig, Hal; Byles, Julie E; O'Loughlin, Kate; Nazroo, James Y; Mishra, Gita; Noone, Jack; Loh, Vanessa; Forder, Peta M

    2014-03-24

    Ideally, life course data are collected prospectively through an ongoing longitudinal study. We report adaptive multimethod fieldwork procedures that gathered life history data by mail survey and telephone interview, comparable with the face-to-face methods employed in the English Longitudinal Study on Ageing (ELSA). The Australian Life Histories and Health (LHH) Survey was a substudy of the Australian 45 and Up Study, with data collection methods modified from the ELSA Study. A self-complete questionnaire and life history calendar were completed by the participants, followed by a computer-assisted telephone interview recording key life events. The LHH survey developed and tested procedures and instruments that gathered rich life history data within an ongoing Australian longitudinal survey on ageing. Data collection proved to be economical. The use of a self-complete questionnaire in conjunction with a life history calendar and coordinated computer-assisted telephone interview was successful in collecting retrospective life course information, in terms of being thorough, practical and efficient. This study has a diverse collection of data covering the life course, starting with early life experiences and continuing with socioeconomic and health exposures and outcomes during adult life. Mail and telephone methodology can accurately and economically add a life history dimension to an ongoing longitudinal survey. The method is particularly valuable for surveying widely dispersed populations. The results will facilitate understanding of the social determinants of health by gathering data on earlier life exposures as well as comparative data across geographical and societal contexts.

  5. Formal methods for industrial critical systems a survey of applications

    CERN Document Server

    Margaria-Steffen, Tiziana

    2012-01-01

    "Today, formal methods are widely recognized as an essential step in the design process of industrial safety-critical systems. In its more general definition, the term formal methods encompasses all notations having a precise mathematical semantics, together with their associated analysis methods, that allow description and reasoning about the behavior of a system in a formal manner.Growing out of more than a decade of award-winning collaborative work within the European Research Consortium for Informatics and Mathematics, Formal Methods for Industrial Critical Systems: A Survey of Applications presents a number of mainstream formal methods currently used for designing industrial critical systems, with a focus on model checking. The purpose of the book is threefold: to reduce the effort required to learn formal methods, which has been a major drawback for their industrial dissemination; to help designers to adopt the formal methods which are most appropriate for their systems; and to offer a panel of state-of...

  6. ASYMPTOTICS FOR EXPONENTIAL LÉVY PROCESSES AND THEIR VOLATILITY SMILE: SURVEY AND NEW RESULTS

    OpenAIRE

    LEIF ANDERSEN; ALEXANDER LIPTON

    2013-01-01

    Exponential Lévy processes can be used to model the evolution of various financial variables such as FX rates, stock prices, and so on. Considerable efforts have been devoted to pricing derivatives written on underliers governed by such processes, and the corresponding implied volatility surfaces have been analyzed in some detail. In the non-asymptotic regimes, option prices are described by the Lewis-Lipton formula, which allows one to represent them as Fourier integrals, and the prices can ...
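
    For reference, one common form of the Lewis-Lipton Fourier representation for a European call under an exponential Lévy model is reproduced below. Notation and sign conventions vary between sources; this is a standard textbook form, not necessarily the exact one used in the paper.

    ```latex
    % S_0 spot, K strike, r risk-free rate, T maturity, \phi_T the characteristic
    % function of the log-price under the risk-neutral measure, k = \ln(S_0/K) + rT.
    C(S_0, K, T) \;=\; S_0 \;-\; \frac{\sqrt{S_0 K}\, e^{-rT}}{\pi}
      \int_0^{\infty} \operatorname{Re}\!\left[ e^{iuk}\, \phi_T\!\left(u - \tfrac{i}{2}\right) \right]
      \frac{du}{u^2 + \tfrac14}
    ```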

  7. 40 CFR 63.1323 - Batch process vents-methods and procedures for group determination.

    Science.gov (United States)

    2010-07-01

    40 CFR, Protection of Environment, vol. 11 (2010-07-01), Polymers and Resins, § 63.1323 Batch process vents—methods and procedures for group determination. (a) ... in § 63.1322(a)(1) or § 63.1322(b)(1), or routing the batch process vent to a control device to comply ...

  8. Process synthesis, design and analysis using process-group contribution method

    DEFF Research Database (Denmark)

    Kumar Tula, Anjan; Eden, Mario Richard; Gani, Rafiqul

    2014-01-01

    This paper describes the development and application of a framework for synthesis, design, and analysis of chemical and biochemical processes. The framework is based on the principle of group contribution used for prediction of physical properties. The fundamental pillars of this methodology...

  9. Active and passive seismic methods for characterization and monitoring of unstable rock masses: field surveys, laboratory tests and modeling.

    Science.gov (United States)

    Colombero, Chiara; Baillet, Laurent; Comina, Cesare; Jongmans, Denis; Vinciguerra, Sergio

    2016-04-01

    Appropriate characterization and monitoring of potentially unstable rock masses may provide better knowledge of the active processes and help to forecast the evolution to failure. Among the available geophysical methods, active seismic surveys are often suitable to infer the internal structure and the fracturing conditions of the unstable body. For monitoring purposes, although remote-sensing techniques and in-situ geotechnical measurements have been successfully tested on landslides, they may not be suitable for early forecasting of sudden, rapid rockslides. Passive seismic monitoring can help for this purpose. Detection, classification and localization of microseismic events within the prone-to-fall rock mass can provide information about the incipient failure of internal rock bridges. Acceleration to failure can be detected from an increasing microseismic event rate, which can be compared with meteorological data to understand the external factors controlling stability. On the other hand, seismic noise recorded on prone-to-fall rock slopes shows that the temporal variations in spectral content and correlation of ambient vibrations can be related to both reversible and irreversible changes within the rock mass. We present the results of the active and passive seismic data acquired at the potentially unstable granitic cliff of Madonna del Sasso (NW Italy). Down-hole tests, surface refraction and cross-hole tomography were carried out for the characterization of the fracturing state of the site. Field surveys were complemented by laboratory determination of physico-mechanical properties on rock samples and measurements of the ultrasonic pulse velocity. This multi-scale approach led to a lithological interpretation of the seismic velocity field obtained at the site and to a systematic correlation of the measured velocities with physical properties (density and porosity) and macroscopic features of the granitic cliff (fracturing, weathering and anisotropy). Continuous ...

  10. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality

    Science.gov (United States)

    Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne

    2018-01-01

    Introduction Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. Materials and methods The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. Results The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. Conclusions The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in
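
    The outcome rates quoted above follow the AAPOR definitions; a small sketch of how such rates can be computed from call-outcome counts is given below. The counts and the exact rate variants (for example, which fraction of unknown-eligibility numbers is treated as eligible) are assumptions for illustration, not the survey's actual dispositions.

    ```python
    def aapor_rates(complete, partial, refusal, non_contact, other,
                    unknown_eligibility, e=1.0):
        """Survey outcome rates in the spirit of the AAPOR definitions.

        Uses a Response Rate 2-style numerator (complete + partial) and counts a
        fraction e of unknown-eligibility cases as eligible. This is a sketch; the
        published AAPOR rate numbers differ in exactly these choices.
        """
        eligible = complete + partial + refusal + non_contact + other
        denom = eligible + e * unknown_eligibility
        return {
            "response":    (complete + partial) / denom,
            "cooperation": (complete + partial) / (complete + partial + refusal + other),
            "refusal":     refusal / denom,
            "contact":     (complete + partial + refusal + other) / denom,
        }

    # Hypothetical counts, not the Ghana survey's call outcomes:
    print(aapor_rates(complete=900, partial=350, refusal=70, non_contact=1200,
                      other=80, unknown_eligibility=600, e=0.5))
    ```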

  11. A method to evaluate process performance by integrating time and resources

    Science.gov (United States)

    Wang, Yu; Wei, Qingjie; Jin, Shuang

    2017-06-01

    The purpose of process mining is to improve an enterprise's existing processes, so measuring process performance is particularly important. However, current research on performance evaluation methods is still insufficient: evaluations mainly rely on time or resource statistics alone, and these basic statistics cannot evaluate process performance very well. In this paper, a method for evaluating process performance that integrates the time dimension and the resource dimension is proposed. The method can be used to measure the utilization and redundancy of resources in a process. The paper introduces the design principle and formula of the evaluation algorithm, then describes the design and implementation of the evaluation method. Finally, the method is used to analyse the event log of a telephone maintenance process and to propose an optimization plan.
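
    A minimal sketch of this kind of time-plus-resource profiling of an event log is given below. The event fields, activities and numbers are hypothetical, and the paper's actual formula is not reproduced here.

    ```python
    from collections import defaultdict
    from datetime import datetime

    # Hypothetical event-log rows: (case id, activity, resource, start, end).
    events = [
        ("c1", "register fault",      "agent_a", "2017-06-01 09:00", "2017-06-01 09:20"),
        ("c1", "dispatch technician", "agent_b", "2017-06-01 09:20", "2017-06-01 10:05"),
        ("c2", "register fault",      "agent_a", "2017-06-01 09:30", "2017-06-01 09:40"),
    ]

    def parse(ts):
        return datetime.strptime(ts, "%Y-%m-%d %H:%M")

    def activity_profile(events):
        """Mean duration (minutes) and number of distinct resources per activity."""
        durations, resources = defaultdict(list), defaultdict(set)
        for _, activity, resource, start, end in events:
            durations[activity].append((parse(end) - parse(start)).total_seconds() / 60)
            resources[activity].add(resource)
        return {a: {"mean_minutes": sum(d) / len(d), "resources": len(resources[a])}
                for a, d in durations.items()}

    print(activity_profile(events))
    ```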

  12. Establishing Survey Validity and Reliability for American Indians Through “Think Aloud” and Test–Retest Methods

    Science.gov (United States)

    Hauge, Cindy Horst; Jacobs-Knight, Jacque; Jensen, Jamie L.; Burgess, Katherine M.; Puumala, Susan E.; Wilton, Georgiana; Hanson, Jessica D.

    2015-01-01

    The purpose of this study was to use a mixed-methods approach to determine the validity and reliability of measurements used within an alcohol-exposed pregnancy prevention program for American Indian women. To establish validity, content experts provided input into the survey measures, and a "think aloud" methodology was conducted with 23 American Indian women. After revising the measurements based on this input, a test–retest was conducted with 79 American Indian women who were randomized to complete either the original measurements or the new, modified measurements. The test–retest revealed that some of the questions performed better in the modified version, whereas others appeared to be more reliable in the original version. The mixed-methods approach was useful for gathering feedback on survey measurements from American Indian participants and for indicating specific survey questions that needed to be modified for this population. PMID:25888693

  13. Data-driven fault detection for industrial processes canonical correlation analysis and projection based methods

    CERN Document Server

    Chen, Zhiwen

    2017-01-01

    Zhiwen Chen aims to develop advanced fault detection (FD) methods for the monitoring of industrial processes. With the ever-increasing demands on reliability and safety in industrial processes, fault detection has become an important issue. Although model-based fault detection theory has been well studied in the past decades, its application to large-scale industrial processes is limited because it is difficult to build accurate models. Furthermore, motivated by the limitations of existing data-driven FD methods, novel canonical correlation analysis (CCA) and projection-based methods are proposed from the perspectives of process input and output data, less engineering effort and wide application scope. For performance evaluation of FD methods, a new index is also developed. Contents: A New Index for Performance Evaluation of FD Methods; CCA-based FD Method for the Monitoring of Stationary Processes; Projection-based FD Method for the Monitoring of Dynamic Processes; Benchmark Study and Real-Time Implementat...
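
    As a rough sketch of CCA-based fault detection in the spirit described above, the code below fits a CCA model between input and output data recorded under normal operation and monitors a Hotelling-type T² statistic on the canonical residual. It is a simplified illustration using scikit-learn, not the algorithms from the book.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import CCA

    def fit_cca_fd(U_normal, Y_normal, n_components=2):
        """Fit CCA between process inputs U and outputs Y from normal operation."""
        cca = CCA(n_components=n_components).fit(U_normal, Y_normal)
        Uc, Yc = cca.transform(U_normal, Y_normal)
        residual = Uc - Yc                       # residual in the canonical space
        cov = np.cov(residual, rowvar=False) + 1e-9 * np.eye(n_components)
        return cca, np.linalg.inv(cov)

    def t2_statistic(cca, cov_inv, U_new, Y_new):
        """Hotelling-type T^2 on the canonical residual; large values suggest a fault."""
        Uc, Yc = cca.transform(U_new, Y_new)
        r = Uc - Yc
        return np.einsum("ij,jk,ik->i", r, cov_inv, r)

    # Synthetic demo: outputs driven by inputs plus noise.
    rng = np.random.default_rng(1)
    U = rng.normal(size=(200, 4))
    Y = U @ rng.normal(size=(4, 3)) + 0.1 * rng.normal(size=(200, 3))
    cca, cov_inv = fit_cca_fd(U, Y)
    print(t2_statistic(cca, cov_inv, U[:5], Y[:5]))
    ```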

  14. Asymptotics for Exponential Levy Processes and their Volatility Smile: Survey and New Results

    OpenAIRE

    Leif Andersen; Alexander Lipton

    2012-01-01

    Exponential Lévy processes can be used to model the evolution of various financial variables such as FX rates, stock prices, etc. Considerable efforts have been devoted to pricing derivatives written on underliers governed by such processes, and the corresponding implied volatility surfaces have been analyzed in some detail. In the non-asymptotic regimes, option prices are described by the Lewis-Lipton formula which allows one to represent them as Fourier integrals; the prices can be trivia...

  15. Collection, Processing and Accuracy of Mobile Terrestrial Lidar Survey Data in the Coastal Environment

    Science.gov (United States)

    2017-04-01

    calibration is performed. Inaccurate boresight alignment will have a similar effect to IMU attitude errors but will remain constant for the entire ... error propagation to each point. ... A typical mobile survey begins with initialization of the GPS/INS system to acquire stable position, attitude, and velocity data through a series of ...

  16. Effect of tomato cultivars, honey finisher and processing methods on ...

    African Journals Online (AJOL)

    2011-12-14

    Dec 14, 2011 ... higher reducing sugar, total sugar, titratable acidity and total soluble solids content. ... samples were placed in a hot air drying oven for moisture content ... honey and spices. It was shown that the tomato variety with high reducing and total sugars resulted in ketchups with high reducing and total sugars.

  17. Effect of processing methods on the nutritional values and anti ...

    African Journals Online (AJOL)

    2017-01-18

    Jan 18, 2017 ... Department of Plant Science and Biotechnology, University of Nigeria, Nsukka, Enugu State, Nigeria. Received 15 November ... powdered seeds are applied as a poultice to abscesses and to promote suppuration ... composition, dietary fibre and resistant starch contents of raw and cooked pea, common ...

  18. Effects of processing method and age of leaves on phytochemical profiles and bioactivity of coffee leaves.

    Science.gov (United States)

    Chen, Xiu-Min; Ma, Zhili; Kitts, David D

    2018-05-30

    The use of coffee leaves as a novel beverage has recently received consumer interest, but little is known about how processing methods affect the quality of the final product. We applied tea (white, green, oolong and black tea) processing methods to coffee leaves and then investigated their effects on phytochemical composition and related antioxidant and anti-inflammatory properties. Japanese-style green tea processing of young leaves and black tea processing of mature (BTP-M) coffee leaves produced contrasting effects on phenolic content, and on the associated antioxidant activity and nitric oxide (NO) inhibitory activity in IFN-γ- and LPS-induced RAW 264.7 cells. BTP-M coffee leaves also had significantly (P ...) ... The age of coffee leaves and the type of processing method affect phytochemical profiles sufficiently to produce characteristic antioxidant and anti-inflammatory activities. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.

  19. Character of GPR wave in air and processed method

    International Nuclear Information System (INIS)

    Shi Jianping; Zhang Zhiyong; Deng Juzhi

    2009-01-01

    Waves reflected by objects in the air are unavoidable because the electromagnetic wave of GPR is emitted in all directions. There are three types of air reflections: the directly arrived wave, system ringing and reflected waves. The directly arrived waves do not disturb the recognition of reflections from the ground because they affect only the first short interval of the GPR trace record. System ringing and reflections from the air, however, are the main sources of disturbance. The time-distance curves of reflections from the air can be classified into two types: hyperbola type and line type. Reflections from the air and from the ground can be distinguished by calculating the velocity of the electromagnetic wave. Line-type reflections can be filtered by background removal and 2-D filtering; by comparing profiles migrated with the velocity in air and in the ground, the interpretation becomes more exact. (authors)

  20. Microfluidic device and method for processing of macromolecules

    DEFF Research Database (Denmark)

    2012-01-01

    ... The device further comprises first inlet and outlet channels for filling the reaction channels via the manifolds with one or more macromolecule containers suspended in a first carrier fluid, wherein the first inlet and outlet channels are configured such that a flow established from the first set of inlets to the first set of outlets is guided through the reaction channels, and second inlet and outlet channels for feeding an enzymatic reagent to the reaction chamber essentially without displacing the macromolecule containers trapped in the reaction channels, wherein the second set of inlets and outlets are configured such that a flow established from the second inlet to the second outlet is guided through at least one of the manifolds and bypasses the reaction channels.

  1. Does the underground sidewall station survey method meet MHSA ...

    African Journals Online (AJOL)

    Results indicated that the second method of surveying appears to have a greater probability of severe bearing error propagation over distances in excess of 180m and would require regular check surveys in order to strengthen the network. Some of the advantages and disadvantages of the sidewall survey system are ...

  2. Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality.

    Science.gov (United States)

    L'Engle, Kelly; Sefa, Eunice; Adimazoya, Edward Akolgo; Yartey, Emmanuel; Lenzi, Rachel; Tarpo, Cindy; Heward-Mills, Nii Lante; Lew, Katherine; Ampeh, Yvonne

    2018-01-01

    Generating a nationally representative sample in low and middle income countries typically requires resource-intensive household level sampling with door-to-door data collection. High mobile phone penetration rates in developing countries provide new opportunities for alternative sampling and data collection methods, but there is limited information about response rates and sample biases in coverage and nonresponse using these methods. We utilized data from an interactive voice response, random-digit dial, national mobile phone survey in Ghana to calculate standardized response rates and assess representativeness of the obtained sample. The survey methodology was piloted in two rounds of data collection. The final survey included 18 demographic, media exposure, and health behavior questions. Call outcomes and response rates were calculated according to the American Association of Public Opinion Research guidelines. Sample characteristics, productivity, and costs per interview were calculated. Representativeness was assessed by comparing data to the Ghana Demographic and Health Survey and the National Population and Housing Census. The survey was fielded during a 27-day period in February-March 2017. There were 9,469 completed interviews and 3,547 partial interviews. Response, cooperation, refusal, and contact rates were 31%, 81%, 7%, and 39% respectively. Twenty-three calls were dialed to produce an eligible contact: nonresponse was substantial due to the automated calling system and dialing of many unassigned or non-working numbers. Younger, urban, better educated, and male respondents were overrepresented in the sample. The innovative mobile phone data collection methodology yielded a large sample in a relatively short period. Response rates were comparable to other surveys, although substantial coverage bias resulted from fewer women, rural, and older residents completing the mobile phone survey in comparison to household surveys. Random digit dialing of mobile

  3. The effect of processing and preservation methods on the oxalate ...

    African Journals Online (AJOL)

    Dr. J. T. Ekanem

    ... This limits the importance of vegetables in nutrition, as the nutritional importance of any given food is a function of its nutrient and antinutrient composition [10]. In recognition of the importance of food safety in public health, and as part of our ...

  4. Influence of rural processing methods and postharvest storage ...

    African Journals Online (AJOL)

    About 0.45 kg of kola nuts (coated and uncoated), numbering 20 nuts, was put in each storage container. Sensory quality of the nuts after storage was determined by quantitative descriptive analysis with an expert panel, using six quality attributes. Physical parameters measured were: weight loss, pest incidence and sprouting ...

  5. The Tianjin Mental Health Survey (TJMHS) : study rationale, design and methods

    NARCIS (Netherlands)

    Yin, Huifang; Phillips, Michael R; Wardenaar, Klaas J; Xu, Guangming; Ormel, Johan; Tian, Hongjun; Schoevers, Robert A

    Mental health in China is of growing concern to both policy-makers and researchers. The Tianjin Mental Health Survey (TJMHS) was conducted between July 2011 and March 2012 to assess the prevalence and risk factors of mental disorders in the context of recent economic growth and other

  6. Processing method and device for iodine adsorbing material

    International Nuclear Information System (INIS)

    Watanabe, Shin-ichi; Shiga, Reiko.

    1997-01-01

    An iodine adsorbing material carrying silver compounds is reacted with a reducing gas so that the silver compounds are converted to metallic silver before storage. The silver compounds are then neither melted nor recrystallized even under highly humid conditions; accordingly, peeling of the adsorbed material from the carrier can be prevented and the iodine adsorbing material can be stored stably. The device is disposed in an off-gas line for discharging off-gases from a nuclear power facility and contains the iodine adsorbing material formed by depositing silver halides on the carrier; a reducing or oxidizing gas is supplied to the vessel as required, so that the silver halides can be converted to metallic silver or the metallic silver can be returned to silver halide. (T.M.)

  7. Combustible gas recombining method and processing facility for gas waste

    Energy Technology Data Exchange (ETDEWEB)

    Watabe, Atsushi; Murakami, Kazuo

    1998-09-02

    Combustible gases (hydrogen and oxygen) generated by radiation decomposition of reactor water in the vicinity of the reactor core in the reactor pressure vessel of a BWR type nuclear power plant pass, together with the flow of steam, through a gas/water separator and a steam dryer disposed at the upper portion of the core. A catalyst that allows hydrogen and oxygen to react efficiently and recombine into water is plated on the surface of the steam dryer. The catalyst comprises palladium (Pd), platinum (Pt) or a Pd-Pt alloy. The combustible gases passing through the steam dryer are recombined into steam by the catalyst. The slight amounts of hydrogen and oxygen that are not recombined are carried with the main steam from the main steam pipe to the main condenser by way of the turbine, and are then released, together with air from the air extraction device, from an activated carbon-type rare gas hold-up tower. (I.N.)

  8. Acquisition and understanding of process knowledge using problem solving methods

    CERN Document Server

    Gómez-Pérez, JM

    2010-01-01

    The development of knowledge-based systems is usually approached through the combined skills of knowledge engineers (KEs) and subject matter experts (SMEs). One of the most critical steps in this activity aims at transferring knowledge from SMEs to formal, machine-readable representations, which allow systems to reason with such knowledge. However, this is a costly and error-prone task. Alleviating the knowledge acquisition bottleneck requires enabling SMEs with the means to produce the desired knowledge representations without the help of KEs. This is especially difficult in the case of compl...

  9. Influence of rural processing methods and postharvest storage ...

    African Journals Online (AJOL)

    2015-02-11

    Feb 11, 2015 ... The treatments comprised three different colour plastic buckets – red, green, white – and three inner linings ... lower in the white plastic storage container for coated kola nuts, while the green plastic container reduced weight loss for uncoated nuts. ... industrial usage in pharmaceuticals for the production of soft drinks ...

  10. Effect of processing methods on the physico-chemical and ...

    African Journals Online (AJOL)

    ... (a commercial complementary food) as control. The various complementary foods were evaluated for proximate, functional and nutritional properties. Feeding trials were also carried out using albino rats to evaluate the growth promoting quality of the complementary foods. Malted complementary food had the lowest water ...

  11. Emerging methods, technologies and process management in software engineering

    CERN Document Server

    Ferrucci, Filomena; Tortora, Genny; Tucci, Maurizio

    2007-01-01

    A high-level introduction to new technologies and methods in the field of software engineering. Recent years have witnessed rapid evolution of software engineering methodologies, and until now, there has been no single-source introduction to emerging technologies in the field.

  12. Effects of Processing Method and Consumers' Geo-Political ...

    African Journals Online (AJOL)

    The three ugba samples were subjected to a sensory evaluation test on a 9-point hedonic scale, using a 24-member semi-trained panel, 8 of whom were indigenes from each of the three Imo State geo-political zones (Okigwe, Orlu and Owerri). Results from the sensory assessments showed that the traditional ugba sample ...

  13. Bayesian signal processing classical, modern, and particle filtering methods

    CERN Document Server

    Candy, James V

    2016-01-01

    This book aims to give readers a unified Bayesian treatment starting from the basics (Bayes' rule) to the more advanced (Monte Carlo sampling), evolving to the next-generation model-based techniques (sequential Monte Carlo sampling). This next edition incorporates a new chapter on "Sequential Bayesian Detection," a new section on "Ensemble Kalman Filters" as well as an expansion of Case Studies that detail Bayesian solutions for a variety of applications. These studies illustrate Bayesian approaches to real-world problems incorporating detailed particle filter designs, adaptive particle filters and sequential Bayesian detectors. In addition to these major developments a variety of sections are expanded to "fill in the gaps" of the first edition. Here metrics for particle filter (PF) designs with emphasis on classical "sanity testing" lead to ensemble techniques as a basic requirement for performance analysis. The expansion of information theory metrics and their application to PF designs is fully developed an...

  14. A method of encountering the ratio of adjacent sides and its applied study in nuclear engineering survey

    International Nuclear Information System (INIS)

    Wu Jingqin

    1996-01-01

    The cross-side or range-net survey method computes the average error of the measured side lengths. As the side length increases, the observation variance increases greatly. Electro-optical distance survey equipment generally has high internal precision, but it is affected by a typical weather error, so the external precision is decreased; this weather error, similar to a systematic error, greatly decreases the precision of the observed sides. To solve this problem, theoretical study and field tests were carried out on the correlation of the ratios among short sides measured by electro-optical survey and on the stability of those ratios, and a new method of encountering the ratio of adjacent sides is put forward. Because the weights of the ratio variance σ_γ² = 2η²γ² and the angular variance σ_β² = 2J²ρ² match each other, the systematic error can be eliminated completely and survey point coordinates of high precision can be obtained. The method is easy to operate, as it does not require multi-photo-band surveys or observation at the optimal time, and it is especially suitable for nuclear engineering survey applications. (3 tabs.)

  15. Development and Psychometric Evaluation of the HPV Clinical Trial Survey for Parents (CTSP-HPV) Using Traditional Survey Development Methods and Community Engagement Principles.

    Science.gov (United States)

    Cunningham, Jennifer; Wallston, Kenneth A; Wilkins, Consuelo H; Hull, Pamela C; Miller, Stephania T

    2015-12-01

    This study describes the development and psychometric evaluation of the HPV Clinical Trial Survey for Parents with Children Aged 9 to 15 (CTSP-HPV) using traditional instrument development methods and community engagement principles. An expert panel and parental input informed survey content, and parents recommended study design changes (e.g., flyer wording). A convenience sample of 256 parents completed the final survey measuring parental willingness to consent to HPV clinical trial (CT) participation and other factors hypothesized to influence willingness (e.g., HPV vaccine benefits). Cronbach's α, Spearman correlations, and multiple linear regression were used to estimate internal consistency, convergent and discriminant validity, and predictive validity, respectively. Internal reliability was confirmed for all scales (α ≥ 0.70). Parental willingness was positively associated (p ...) with advantages of adolescent CTs (r range 0.33-0.42), supporting convergent validity. Moderate discriminant construct validity was also demonstrated. Regression results indicate reasonable predictive validity, with the six scales accounting for 31% of the variance in parents' willingness. This instrument can inform interventions based on factors that influence parental willingness, which may lead to the eventual increase in trial participation. Further psychometric testing is warranted. © 2015 Wiley Periodicals, Inc.

  16. The Extreme Ice Survey: Capturing and Conveying Glacial Processes Through Time-Lapse Imagery and Narration

    Science.gov (United States)

    Balog, J. D.; Box, J. E.; Pfeffer, W. T.; Hood, E. W.; Fagre, D. B.; Anker, C.; O'Neel, S.

    2010-12-01

    The Extreme Ice Survey (EIS) uses time-lapse photography, conventional photography, and video to document rapid change in the Earth's glacial ice. The EIS team currently has 38 time-lapse cameras at sites in Greenland, Iceland, Alaska, the Rocky Mountains and Nepal. EIS supplements this ongoing record with annual repeat photography in British Columbia, Iceland, the Alps, and Bolivia. EIS imagery supplies basic knowledge in glacier dynamics to the science community, as well as compelling, engaging narratives to the general public about the immediacy of the Anthropocene and climate change. Visual materials from EIS have impacted more than 150 million people, ranging from White House staff, the U. S. Congress and government agency officials to globally influential corporate officers and all age strata of the general public. Media products include a National Geographic/NOVA special, two National Geographic magazine articles, a feature in Parade magazine (circulation 71 million), and numerous presentations on CNN, NBC, BBC and National Public Radio. (Figure: Columbia Glacier, Alaska, with terminus positions indicated for June 2006, May 2007 and June 2008.)

  17. Prediction of periodically correlated processes by wavelet transform and multivariate methods with applications to climatological data

    Science.gov (United States)

    Ghanbarzadeh, Mitra; Aminghafari, Mina

    2015-05-01

    This article studies the prediction of periodically correlated processes using the wavelet transform and multivariate methods, with applications to climatological data. Periodically correlated processes can be reformulated as multivariate stationary processes. Considering this fact, two new prediction methods are proposed. In the first method, we use stepwise regression between the principal components of the multivariate stationary process and past wavelet coefficients of the process to get a prediction. In the second method, we propose its multivariate version without principal component analysis a priori. Also, we study a generalization of the prediction methods dealing with a deterministic trend using exponential smoothing. Finally, we illustrate the performance of the proposed methods on simulated and real climatological data (ozone amounts, flows of a river, solar radiation, and sea levels) compared with the multivariate autoregressive model. The proposed methods give good results, as we expected.

  18. (3) Simple processing method

    African Journals Online (AJOL)

    Adeyinka Odunsi

    ... for nutrient composition and microbial loads in order to select the most suitable for use as a feedstuff. Broiler ... Poultry waste (PW) is predominantly solid and includes faecal and urinary wastes, bedding material, wasted feed, feathers and non-degradable materials. ... therefore, designed to evaluate the prospect ...

  19. Process control monitoring systems, industrial plants, and process control monitoring methods

    Science.gov (United States)

    Skorpik, James R [Kennewick, WA; Gosselin, Stephen R [Richland, WA; Harris, Joe C [Kennewick, WA

    2010-09-07

    A system comprises a valve; a plurality of RFID sensor assemblies coupled to the valve to monitor a plurality of parameters associated with the valve; a control tag configured to wirelessly communicate with the respective tags that are coupled to the valve, the control tag being further configured to communicate with an RF reader; and an RF reader configured to selectively communicate with the control tag, the reader including an RF receiver. Other systems and methods are also provided.

  20. Method of test and survey of caprolactam migration into foods packaged in nylon-6.

    Science.gov (United States)

    Bradley, E L; Speck, D R; Read, W A; Castle, L

    2004-12-01

    An analytical method for the determination of the nylon-6 monomer caprolactam in foods is described. The foodstuff was extracted with ethanol: water (1:2) containing capryllactam as internal standard and the extract was defatted using hexane. The extract was analysed by liquid chromatography coupled with mass spectrometry. The test method was calibrated down to 0.7 mg kg⁻¹. The repeatability of the method was good, with a relative standard deviation of 9% at the 15 mg kg⁻¹ level. The method was demonstrated to be accurate in an independent external check sample exercise. The new method was applied to the analysis of 50 retail foodstuffs packaged in nylon-6. Caprolactam was detected and confirmed in nine of the 50 food samples, in the range 2.8-13 mg kg⁻¹. The presence of caprolactam was indicated in a further 15 samples, in the range 0.8-11 mg kg⁻¹, but these samples did not meet all of the five confirmation criteria applied. All migration levels (both confirmed and unconfirmed) were below the European specific migration limit for caprolactam, which is 15 mg kg⁻¹. The average migration for all 50 samples, setting non-detectables at half the limit of detection, was 2.6 mg kg⁻¹ with a standard deviation of 3.1 mg kg⁻¹ (n = 50). All samples found to contain detectable levels of caprolactam migration were for applications involving heating the food in the packaging. They were packs of, for example, sausage meat for which the food would have been heat processed in the nylon casing, or they were nylon pouches for heating foods by boiling, microwaving or roasting.
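
    Quantification against an internal standard such as capryllactam typically reduces to a response-factor calculation; a small sketch is given below. The peak areas, spike level and sample amounts are hypothetical, and the formula is a generic internal-standard calibration, not necessarily the exact procedure of the paper.

    ```python
    def response_factor(area_analyte, conc_analyte, area_istd, conc_istd):
        """Relative response factor from a calibration standard."""
        return (area_analyte / area_istd) * (conc_istd / conc_analyte)

    def quantify(area_analyte, area_istd, conc_istd_spiked, rf,
                 sample_mass_g, extract_volume_ml):
        """Caprolactam content (mg/kg) in a food sample via internal-standard calibration."""
        conc_in_extract = (area_analyte / area_istd) * conc_istd_spiked / rf   # mg/L
        return conc_in_extract * (extract_volume_ml / 1000.0) / (sample_mass_g / 1000.0)

    # Hypothetical run: 5 g sample extracted into 10 mL, internal standard at 2 mg/L.
    rf = response_factor(area_analyte=120000, conc_analyte=1.0,
                         area_istd=250000, conc_istd=2.0)
    result = quantify(98000, 240000, 2.0, rf, sample_mass_g=5.0, extract_volume_ml=10.0)
    print(f"{result:.2f} mg/kg")   # about 1.70 mg/kg for these made-up numbers
    ```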

  1. Design, implementation, and analysis methods for the National Woodland Owner Survey

    Science.gov (United States)

    Brett J. Butler; Earl C. Leatherberry; Michael S. Williams; Michael S. Williams

    2005-01-01

    The National Woodland Owner Survey (NWOS) is conducted by the USDA Forest Service, Forest Inventory and Analysis program to increase our understanding of private forest-land owners in the United States. The information is intended to help policy makers, resource managers, and others interested in the forest resources of the United States better understand the social...

  2. The use of systematic and heuristic methods in the basic design cycle: A comparative survey of students' method usage

    DEFF Research Database (Denmark)

    Person, O.; Daalhuizen, Jaap; Gattol, V.

    2013-01-01

    In the present paper, we study the reported use of systematic and heuristic methods for 304 students enrolled in a master-level course on design theory and methodology. What to teach design and engineering students about methods is an important topic for discussion. One reason for this is that the experiences of design educators when using methods in their teaching do not always sit well with how methods are portrayed in the literature. Based on self-reports of the students, we study the use of systematic and heuristic methods for the five activities in the basic design cycle: (1) analysis, (2) synthesis, (3) simulation, (4) evaluation, and (5) decision-making. The results of our study suggest that systematic and heuristic methods fulfil different roles for the students when designing. The students reported to use heuristic methods significantly more for synthesis, while they reported to use...

  3. The use of systematic and heuristic methods in the basic design cycle: A comparative survey of students' method usage

    DEFF Research Database (Denmark)

    Person, O.; Daalhuizen, Jaap; Gattol, V.

    2013-01-01

    In the present paper, we study the reported use of systematic and heuristic methods for 304 students enrolled in a master-level course on design theory and methodology. What to teach design and engineering students about methods is an important topic for discussion. One reason for this is that the experiences of design educators when using methods in their teaching do not always sit well with how methods are portrayed in the literature. Based on self-reports of the students, we study the use of systematic and heuristic methods for the five activities in the basic design cycle: (1) analysis, (2) synthesis, (3) simulation, (4) evaluation, and (5) decision-making. The results of our study suggest that systematic and heuristic methods fulfil different roles for the students when designing. The students reported to use heuristic methods significantly more for synthesis, while they reported to use...

  4. Cement solidification method for miscellaneous radioactive solid, processing device and processing tool therefor

    International Nuclear Information System (INIS)

    Mihara, Shigeru; Suzuki, Kazunori; Hasegawa, Akira.

    1994-01-01

    A basket made of metal netting and a lid with a spacer, which constitute the tool for processing miscellaneous radioactive solid wastes, are formed with a mesh that scarcely passes the miscellaneous solids but passes mortar. The mesh size is usually about 10 to 30 mm. Since this mesh lets through fine, nearly powdery solids such as incineration ash and heat-insulation material, these fall to the bottom of the drum and can cause corrosion. The bottom of the drum and its corners are therefore coated with cement. The miscellaneous solid wastes are loaded, the metal-net lid with its spacer is set at the upper portion, a provisional lid is put on, the drum is evacuated, and mortar is injected. Because light, fine radioactive powders may be exposed on the surface of the mortar after it coagulates and hardens during curing, additional mortar is added as a conditioning step to fix them and prevent scattering of the radioactive powder. With these procedures, a satisfactorily safe solidified product can be formed. (T.M.)

  5. Engaging Students in Survey Research Projects across Research Methods and Statistics Courses

    Science.gov (United States)

    Lovekamp, William E.; Soboroff, Shane D.; Gillespie, Michael D.

    2017-01-01

    One innovative way to help students make sense of survey research has been to create a multifaceted, collaborative assignment that promotes critical thinking, comparative analysis, self-reflection, and statistical literacy. We use a short questionnaire adapted from the Higher Education Research Institute's Cooperative Institutional Research…

  6. A review of the uses and methods of processing banana and ...

    African Journals Online (AJOL)

    ... Journal of Agricultural Research and Development ... Different processing methods of Musa spp. into new food products, which include production of flour, preparation of jams and jellies, and the quality attributes of the products obtained from ...

  7. Applying Process Improvement Methods to Clinical and Translational Research: Conceptual Framework and Case Examples.

    Science.gov (United States)

    Daudelin, Denise H; Selker, Harry P; Leslie, Laurel K

    2015-12-01

    There is growing appreciation that process improvement holds promise for improving quality and efficiency across the translational research continuum but frameworks for such programs are not often described. The purpose of this paper is to present a framework and case examples of a Research Process Improvement Program implemented at Tufts CTSI. To promote research process improvement, we developed online training seminars, workshops, and in-person consultation models to describe core process improvement principles and methods, demonstrate the use of improvement tools, and illustrate the application of these methods in case examples. We implemented these methods, as well as relational coordination theory, with junior researchers, pilot funding awardees, our CTRC, and CTSI resource and service providers. The program focuses on capacity building to address common process problems and quality gaps that threaten the efficient, timely and successful completion of clinical and translational studies. © 2015 The Authors. Clinical and Translational Science published by Wiley Periodicals, Inc.

  8. Modelling and control for laser based welding processes: modern methods of process control to improve quality of laser-based joining methods

    Science.gov (United States)

    Zäh, Ralf-Kilian; Mosbach, Benedikt; Hollwich, Jan; Faupel, Benedikt

    2017-02-01

    To ensure the competitiveness of manufacturing companies, it is indispensable to optimize their manufacturing processes. Slight variations of process parameters and machine settings should have only marginal effects on product quality; therefore, the largest possible processing window is required. Such parameters include, for example, the movement of the laser beam across the component in laser keyhole welding. It is therefore necessary to keep the formation of welding seams within specified limits. The quality of laser welding processes is currently ensured by post-process methods, such as ultrasonic inspection, or by special in-process methods. These in-process systems achieve only a simple evaluation that shows whether the weld seam is acceptable or not. Furthermore, they provide no feedback for changing control variables such as the speed of the laser or the adjustment of laser power. In this paper the research group presents current results in the research fields of online monitoring, online controlling and model predictive control of laser welding processes to increase product quality. To record the characteristics of the welding process, tested online methods are used during the process. Based on the measurement data, a state-space model is identified which includes all the control variables of the system. Using simulation tools, a model predictive controller (MPC) is designed for this model and integrated into an NI Real-Time System.
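
    As a hedged illustration of the receding-horizon idea described above (not the authors' controller), the sketch below builds a condensed finite-horizon quadratic program for a toy two-state model in which the input might represent laser power and the output melt-pool depth; all matrices and weights are invented for the example.

        import numpy as np

        # A toy second-order state-space model (assumed, not from the paper):
        # states might represent melt-pool depth and its rate; the input is laser power.
        A = np.array([[1.0, 0.1],
                      [0.0, 0.9]])
        B = np.array([[0.0],
                      [0.05]])
        C = np.array([[1.0, 0.0]])        # measured output: melt-pool depth

        N = 10                             # prediction horizon
        q, r = 10.0, 0.1                   # output-tracking and input weights

        # Condensed prediction matrices: y = F x0 + G u over the horizon
        F = np.vstack([C @ np.linalg.matrix_power(A, i + 1) for i in range(N)])
        G = np.zeros((N, N))
        for i in range(N):
            for j in range(i + 1):
                G[i, j] = (C @ np.linalg.matrix_power(A, i - j) @ B).item()

        def mpc_step(x0, y_ref):
            """Return the first optimal input of the horizon (receding horizon)."""
            ref = np.full(N, y_ref)
            H = q * G.T @ G + r * np.eye(N)
            g = q * G.T @ (F @ x0 - ref)
            u = np.linalg.solve(H, -g)     # unconstrained quadratic program
            return u[0]

        # closed-loop usage: regulate the depth toward 1.0 (arbitrary units)
        x = np.zeros(2)
        for k in range(20):
            u = mpc_step(x, y_ref=1.0)
            x = A @ x + (B * u).flatten()
        print("final depth:", (C @ x).item())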

  9. The Scientific Method and the Creative Process: Implications for the K-6 Classroom

    Science.gov (United States)

    Nichols, Amanda J.; Stephens, April H.

    2013-01-01

    Science and the arts might seem very different, but the processes that both fields use are very similar. The scientific method is a way to explore a problem, form and test a hypothesis, and answer questions. The creative process creates, interprets, and expresses art. Inquiry is at the heart of both of these methods. The purpose of this article is…

  10. Unmanned aircraft systems image collection and computer vision image processing for surveying and mapping that meets professional needs

    Science.gov (United States)

    Peterson, James Preston, II

    Unmanned Aerial Systems (UAS) are rapidly blurring the lines between traditional and close range photogrammetry, and between surveying and photogrammetry. UAS are providing an economic platform for performing aerial surveying on small projects. The focus of this research was to describe traditional photogrammetric imagery and Light Detection and Ranging (LiDAR) geospatial products, describe close range photogrammetry (CRP), introduce UAS and computer vision (CV), and investigate whether industry mapping standards for accuracy can be met using UAS collection and CV processing. A 120-acre site was selected and 97 aerial targets were surveyed for evaluation purposes. Four UAS flights of varying heights above ground level (AGL) were executed, and three different target patterns of varying distances between targets were analyzed for compliance with American Society for Photogrammetry and Remote Sensing (ASPRS) and National Standard for Spatial Data Accuracy (NSSDA) mapping standards. This analysis resulted in twelve datasets. Error patterns were evaluated and reasons for these errors were determined. The relationship between the AGL, ground sample distance, target spacing and the root mean square error of the targets is exploited by this research to develop guidelines that use the ASPRS and NSSDA map standard as the template. These guidelines allow the user to select the desired mapping accuracy and determine what target spacing and AGL is required to produce the desired accuracy. These guidelines also address how UAS/CV phenomena affect map accuracy. General guidelines and recommendations are presented that give the user helpful information for planning a UAS flight using CV technology.
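
    The NSSDA statistic referred to above is computed from checkpoint residuals between surveyed targets and their mapped positions. The sketch below applies the standard NSSDA multipliers (1.7308 for horizontal accuracy when the x and y RMSEs are similar, 1.9600 for vertical) to hypothetical residuals; the numbers are illustrative, not results from this study.

        import math

        def rmse(errors):
            return math.sqrt(sum(e * e for e in errors) / len(errors))

        def nssda_accuracy(dx, dy, dz):
            """NSSDA accuracy at the 95% confidence level from checkpoint residuals.

            dx, dy, dz are lists of differences (surveyed target minus mapped position).
            Horizontal accuracy uses 1.7308 * RMSE_r when RMSE_x and RMSE_y are similar;
            vertical accuracy uses 1.9600 * RMSE_z.
            """
            rmse_r = math.sqrt(rmse(dx) ** 2 + rmse(dy) ** 2)
            return {"horizontal_95": 1.7308 * rmse_r,
                    "vertical_95": 1.9600 * rmse(dz)}

        # hypothetical residuals (metres) from a handful of surveyed targets
        dx = [0.03, -0.02, 0.05, -0.04]
        dy = [0.02, 0.01, -0.03, 0.04]
        dz = [0.06, -0.05, 0.08, -0.07]
        print(nssda_accuracy(dx, dy, dz))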

  11. Hollow fiber structures, methods of use thereof, methods of making, and pressure-retarded processes

    KAUST Repository

    Le, Lieu Ngoc

    2016-12-08

    Embodiments of the present disclosure provide for composite materials, methods of making composite materials, methods of using composite materials, and the like. In particular, the present application relates to hollow fibers and to pressure-retarded osmosis systems comprising said fibers. The hollow fibers have an inside layer and an outside layer, wherein the outside layer covers an outside surface of the inside layer, wherein the inside layer forms a boundary around the lumen, wherein the inside layer includes a bi-layer structure, wherein the bi-layer structure includes a sponge-like layer and a finger-like layer, wherein the sponge-like layer is disposed closer to the lumen of the hollow fiber and the finger-like layer is disposed on the sponge-like layer on the side opposite the lumen, wherein the outside layer includes a polyamide layer.

  12. A robust method for processing scanning probe microscopy images and determining nanoobject position and dimensions

    NARCIS (Netherlands)

    Silly, F.

    2009-01-01

    Processing of scanning probe microscopy (SPM) images is essential to explore nanoscale phenomena. Image processing and pattern recognition techniques are developed to improve the accuracy and consistency of nanoobject and surface characterization. We present a robust and versatile method to

  13. China Kadoorie Biobank of 0.5 million people: survey methods, baseline characteristics and long-term follow-up.

    Science.gov (United States)

    Chen, Zhengming; Chen, Junshi; Collins, Rory; Guo, Yu; Peto, Richard; Wu, Fan; Li, Liming

    2011-12-01

    Large blood-based prospective studies can provide reliable assessment of the complex interplay of lifestyle, environmental and genetic factors as determinants of chronic disease. The baseline survey of the China Kadoorie Biobank took place during 2004-08 in 10 geographically defined regions, with collection of questionnaire data, physical measurements and blood samples. Subsequently, a re-survey of 25,000 randomly selected participants was done (80% responded) using the same methods as in the baseline. All participants are being followed for cause-specific mortality and morbidity, and for any hospital admission, through linkages with registries and health insurance (HI) databases. Overall, 512,891 adults aged 30-79 years were recruited, including 41% men and 56% from rural areas; the mean age was 52 years. The prevalence of ever-regular smoking was 74% in men and 3% in women. The mean blood pressure was 132/79 mmHg in men and 130/77 mmHg in women. The mean body mass index (BMI) was 23.4 kg/m² in men and 23.8 kg/m² in women, with only 4% being obese (>30 kg/m²) and 3.2% being diabetic. Blood collection was successful in 99.98%, and the mean delay from sample collection to processing was 10.6 h. For each of the main baseline variables there is good reproducibility but large heterogeneity by age, sex and study area. By 1 January 2011, over 10,000 deaths had been recorded, with 91% of surviving participants already linked to HI databases. This established large biobank will be a rich and powerful resource for investigating genetic and non-genetic causes of many common chronic diseases in the Chinese population.

  14. Apparatus and method for converting biomass to feedstock for biofuel and biochemical manufacturing processes

    Science.gov (United States)

    Kania, John; Qiao, Ming; Woods, Elizabeth M.; Cortright, Randy D.; Myren, Paul

    2015-12-15

    The present invention includes improved systems and methods for producing biomass-derived feedstocks for biofuel and biochemical manufacturing processes. The systems and methods use components that are capable of transferring relatively high concentrations of solid biomass utilizing pressure variations between vessels, and allow for the recovery and recycling of heterogeneous catalyst materials.

  15. The use of systematic and heuristic methods in the basic design cycle : A comparative survey of students' method usage

    NARCIS (Netherlands)

    Person, F.E.O.K.; Daalhuizen, J.J.; Gattol, V.

    2013-01-01

    In the present paper, we study the reported use of systematic and heuristic methods for 304 students enrolled in a master-level course on design theory and methodology. What to teach design and engineering students about methods is an important topic for discussion. One reason for this is that the

  16. Development of a numerical simulation method for melting/solidification and dissolution/precipitation phenomena. 1. Literature survey for computer program design

    International Nuclear Information System (INIS)

    Uchibori, Akihiro; Ohshima, Hiroyuki

    2004-04-01

    Survey research of numerical methods for melting/solidification and dissolution/precipitation phenomena was performed to determine the policy for development of a simulation program. Melting/solidification and dissolution/precipitation have been key issues for feasibility evaluation of several techniques applied in nuclear fuel cycle processes. Physical models for single-component melting/solidification, two-component solution solidification or precipitation by cooling, and precipitation by electrolysis, which are moving boundary problems, were made clear from the literature survey. The transport equations are used for thermal hydraulic analysis in the solid and the liquid regions. Behavior of the solid-liquid interface is described by the heat and mass transfer model. These physical models need to be introduced into the simulation program. The numerical methods for the moving boundary problems are categorized into two types: the interface tracking method and the interface capturing method. Based on this classification, the performance of each numerical method was evaluated. The interface tracking method using a Lagrangian moving mesh requires a relatively complicated algorithm. The algorithm has high accuracy for predicting the moving interface. On the other hand, the interface capturing method uses an Eulerian fixed mesh, leading to a simple algorithm. The prediction accuracy of the method is relatively low. The extended finite element method, classified as an interface capturing method, can predict the interface behavior accurately even though an Eulerian fixed mesh is used. We decided to apply the extended finite element method to the simulation program. (author)
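
    As a concrete example of the interface-capturing family discussed above (not code from the cited work), the sketch below advances a one-dimensional enthalpy formulation of melting on a fixed Eulerian grid: the phase front is recovered from the liquid fraction rather than tracked explicitly. All material properties are non-dimensional assumptions.

        import numpy as np

        # One-dimensional enthalpy method: an interface-capturing scheme on a fixed
        # (Eulerian) grid. All material properties are illustrative, non-dimensional.
        nx, L = 100, 1.0
        dx = L / nx
        dt = 2e-5        # satisfies the explicit stability limit dt <= 0.5*dx**2/(k/c)
        k = 1.0          # thermal conductivity
        c = 1.0          # heat capacity
        Lf = 1.0         # latent heat of fusion
        Tm = 0.0         # melting temperature

        H = np.full(nx, -c * 0.5)        # initial enthalpy: solid at T = -0.5
        H[0] = c * 1.0 + Lf              # hot, fully melted boundary cell

        def temperature(H):
            """Recover temperature from enthalpy (solid / mushy / liquid branches)."""
            T = np.where(H < 0, H / c, Tm)              # solid branch
            T = np.where(H > Lf, (H - Lf) / c, T)       # liquid branch
            return T

        for step in range(20000):
            T = temperature(H)
            # explicit diffusion update of the enthalpy field; boundary cells held fixed
            H[1:-1] += dt * k / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])

        # the liquid fraction marks the captured phase front without tracking it explicitly
        liquid_fraction = np.clip(H / Lf, 0.0, 1.0)
        front_index = int(np.argmax(liquid_fraction < 1.0))
        print("melt front near x =", front_index * dx)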

  17. Photography - Determination of thiosulphate and other residual chemicals in processed photographic films, plates and papers - Methylene blue photometric method and silver sulphide densitometric method

    CERN Document Server

    International Organization for Standardization. Geneva

    1977-01-01

    Photography - Determination of thiosulphate and other residual chemicals in processed photographic films, plates and papers - Methylene blue photometric method and silver sulphide densitometric method

  18. Present status of processing method

    Energy Technology Data Exchange (ETDEWEB)

    Kosako, Kazuaki [Sumitomo Atomic Energy Industries Ltd., Tokyo (Japan)

    1998-11-01

    The present status of processing methods for high-energy nuclear data files was examined. The NJOY94 code is the only one available for this processing. In Japan, processing with NJOY94 is currently oriented toward the production of traditional cross-section libraries, because no high-energy transport code that would use a high-energy cross-section library has yet been clearly established. (author)

  19. The study on the method of image recognition and processing for digital nuclear signals

    International Nuclear Information System (INIS)

    Wang Dongyang; Zhang Ruanyu; Wang Peng; Yan Yangyang; Hao Dejian

    2012-01-01

    Since the traditional DSP approach has many limitations, a new method of digital nuclear signal processing based on digital image recognition is presented in this paper. This method converts the time-series digital nuclear signal into a pulse image with adjustable pixels. A new principle and method are adopted to improve the SNR of the digital nuclear signal using the theory and methods of digital image processing. A method called ISC is presented, by which it is convenient to extract the template parameters. (authors)

  20. Effect of the method of processing on quality and oxidative stability ...

    African Journals Online (AJOL)

    In this study four samn samples prepared from cow milk using two processing methods (traditional T1, T2 and factory processed T3, T4) were investigated for their physico-chemical properties, fatty acids composition, oxidative stability and sensory properties. The traditionally processed samples showed a significance ...

  1. The Research Process in a Multi-Level Mixed-Methods Case Study: International Organization Headquarters and Field Employee Perspectives of a Program in Southern Sudan

    Science.gov (United States)

    Eschenbacher, Heidi

    2012-01-01

    This article provides an overview of the methods and data-collection process for a multi-level mixed-methods case study. Data for the study were gathered through phone interviews and electronic surveys from individuals working on the same educational program in Southern Sudan, though some were supporting the program from outside the country. The…

  2. Methods of Model Reduction for Large-Scale Biological Systems: A Survey of Current Methods and Trends.

    Science.gov (United States)

    Snowden, Thomas J; van der Graaf, Piet H; Tindall, Marcus J

    2017-07-01

    Complex models of biochemical reaction systems have become increasingly common in the systems biology literature. The complexity of such models can present a number of obstacles for their practical use, often making problems difficult to intuit or computationally intractable. Methods of model reduction can be employed to alleviate the issue of complexity by seeking to eliminate those portions of a reaction network that have little or no effect upon the outcomes of interest, hence yielding simplified systems that retain an accurate predictive capacity. This review paper seeks to provide a brief overview of a range of such methods and their application in the context of biochemical reaction network models. To achieve this, we provide a brief mathematical account of the main methods including timescale exploitation approaches, reduction via sensitivity analysis, optimisation methods, lumping, and singular value decomposition-based approaches. Methods are reviewed in the context of large-scale systems biology type models, and future areas of research are briefly discussed.
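
    Of the approaches listed, the singular-value-decomposition route is the easiest to show compactly. The sketch below is a minimal proper orthogonal decomposition, one SVD-based reduction technique, applied to a synthetic snapshot matrix; it is illustrative only and not drawn from the review itself.

        import numpy as np

        def pod_reduce(snapshots, r):
            """Reduce a snapshot matrix (states x time) to an r-dimensional basis.

            A minimal proper orthogonal decomposition: the dominant left singular
            vectors give a projection basis; a large ODE model x' = f(x) can then be
            simulated in the reduced coordinates z = V.T @ x.
            """
            U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
            energy = np.cumsum(s**2) / np.sum(s**2)
            V = U[:, :r]
            return V, energy[r - 1]

        # toy example: 50 "species" whose trajectories really live on a 3-D subspace
        rng = np.random.default_rng(1)
        basis = rng.standard_normal((50, 3))
        latent = rng.standard_normal((3, 200))
        X = basis @ latent + 0.01 * rng.standard_normal((50, 200))

        V, captured = pod_reduce(X, r=3)
        print(f"3 modes capture {captured:.1%} of the snapshot energy")
        # reduced trajectories: z = V.T @ X; reconstruct with V @ z
        err = np.linalg.norm(X - V @ (V.T @ X)) / np.linalg.norm(X)
        print(f"relative reconstruction error: {err:.2e}")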

  3. [Comparison of dietary survey, frequency and 24 hour urinary Na methods in evaluation of salt intake in the population].

    Science.gov (United States)

    Li, Jianhong; Lu, Zilong; Yan, Liuxia; Zhang, Jiyu; Tang, Junli; Cai, Xiaoning; Guo, Xiaolei; Ma, Jixiang; Xu, Aiqiang

    2014-12-01

    To compare the differences and correlations among dietary salt intakes assessed by the 24 hours urinary Na method, the food weighted record method and the food frequency questionnaire method. A total of 2 184 subjects aged 18 to 69 years were selected by a multistage stratified cluster random sampling method in Shandong province from June to September 2011. Dietary salt intakes were measured by the 24 hours urinary Na method, the food weighted record method and the food frequency questionnaire method. Information on the gender, age, dining locations and labour intensity of members dining at home for 3 days was recorded, and dietary habits were surveyed by questionnaire. Salt intakes were 14.0, 12.0 and 10.5 g/d as assessed by the 24 hours urinary Na method, the food weighted record method and the food frequency questionnaire, respectively. Compared with the 24 hours urinary Na method, salt intakes assessed by the food weighted record method and the food frequency questionnaire method were 2.0 g (14.3% undervalued) and 3.4 g (24.3% undervalued) lower, respectively. Compared with the 24 hours urinary Na method, the proportions of individuals with salt intake over-reported and under-reported were 42.4% (856/2 020) and 55.3% (1 117/2 020) by the food weighted record method, and 20.7% (418/2 020) and 16.3% (329/2 020) by the food frequency questionnaire method, respectively; the proportions of individuals with salt intakes within ±25% of the 24 hours urinary Na method were 36.9% (745/2 020) and 28.4% (574/2 020), respectively. Salt intakes assessed by the 24 hours urinary method correlated significantly with salt intakes assessed by both the food weighted record method and the food frequency questionnaire method; the correlation coefficients were 0.13 and 0.07, respectively. As subjects' self-judged taste preference became saltier, salt intakes rose significantly under all three survey methods. Salt intakes of the three groups with light, moderate and salty taste preferences were 13.6, 13.6 and 14.7 g/d by the 24 hours urinary Na method (F

  4. Processing method and processing device for liquid waste containing surface active agent and radioactive material

    International Nuclear Information System (INIS)

    Nishi, Takashi; Matsuda, Masami; Baba, Tsutomu; Yoshikawa, Ryozo; Yukita, Atsushi.

    1998-01-01

    Washing liquid wastes containing surface active agents and radioactive materials are sent to a deaerating vessel. Ozone is blown into the deaerating vessel. The washing liquid wastes dissolved with ozone are introduced to a UV ray irradiation vessel. UV rays are irradiated to the washing liquid wastes, and hydroxy radicals generated by photodecomposition of dissolved ozone oxidatively decompose surface active agents contained in the washing liquid wastes. The washing liquid wastes discharged from the UV ray irradiation vessel are sent to an activated carbon mixing vessel and mixed with powdery activated carbon. The surface active agents not decomposed in the UV ray irradiation vessel are adsorbed to the activated carbon. Then, the activated carbon and washing liquid wastes are separated by an activated carbon separating/drying device. Radioactive materials (iron oxide and the like) contained in the washing liquid wastes are mostly granular, and they are separated and removed from the washing liquid wastes in the activated carbon separating/drying device. (I.N.)

  5. Industrial Process Identification and Control Design Step-test and Relay-experiment-based Methods

    CERN Document Server

    Liu, Tao

    2012-01-01

      Industrial Process Identification and Control Design is devoted to advanced identification and control methods for the operation of continuous-time processes both with and without time delay, in industrial and chemical engineering practice.   The simple and practical step- or relay-feedback test is employed when applying the proposed identification techniques, which are classified in terms of common industrial process type: open-loop stable; integrating; and unstable, respectively. Correspondingly, control system design and tuning models that follow are presented for single-input-single-output processes.   Furthermore, new two-degree-of-freedom control strategies and cascade control system design methods are explored with reference to independently-improving, set-point tracking and load disturbance rejection. Decoupling, multi-loop, and decentralized control techniques for the operation of multiple-input-multiple-output processes are also detailed. Perfect tracking of a desire output trajectory is realiz...

  6. Downstream processing and chromatography based analytical methods for production of vaccines, gene therapy vectors, and bacteriophages

    Science.gov (United States)

    Kramberger, Petra; Urbas, Lidija; Štrancar, Aleš

    2015-01-01

    Downstream processing of nanoplexes (viruses, virus-like particles, bacteriophages) is characterized by complexity of the starting material, number of purification methods to choose from, regulations that are setting the frame for the final product and analytical methods for upstream and downstream monitoring. This review gives an overview on the nanoplex downstream challenges and chromatography based analytical methods for efficient monitoring of the nanoplex production. PMID:25751122

  7. Active Parent Consent for Health Surveys with Urban Middle School Students: Processes and Outcomes

    Science.gov (United States)

    Secor-Turner, Molly; Sieving, Renee; Widome, Rachel; Plowman, Shari; Vanden Berk, Eric

    2010-01-01

    Background: To achieve high participation rates and a representative sample, active parent consent procedures require a significant investment of study resources. The purpose of this article is to describe processes and outcomes of utilizing active parent consent procedures with sixth-grade students from urban, ethnically diverse, economically…

  8. MULTIPLE CRITERA METHODS WITH FOCUS ON ANALYTIC HIERARCHY PROCESS AND GROUP DECISION MAKING

    Directory of Open Access Journals (Sweden)

    Lidija Zadnik-Stirn

    2010-12-01

    Managing natural resources is a group multiple criteria decision making problem. In this paper the analytic hierarchy process is the chosen method for handling natural resource problems. The single decision maker problem is discussed, and three methods, the eigenvector method, the data envelopment analysis method, and the logarithmic least squares method, are presented for the derivation of the priority vector. Further, the group analytic hierarchy process is discussed, and six methods for the aggregation of individual judgments or priorities are compared: the weighted arithmetic mean method, the weighted geometric mean method, and four methods based on data envelopment analysis. A case study on land use in Slovenia is presented. The conclusions review consistency, sensitivity analyses, and some future directions of research.
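
    A minimal sketch of the eigenvector method named above, which derives the priority vector as the principal eigenvector of a pairwise comparison matrix; the comparison values and criteria are hypothetical, and the consistency index is the standard (lambda_max - n)/(n - 1).

        import numpy as np

        def ahp_priorities(pairwise):
            """Priority vector from a pairwise comparison matrix (eigenvector method).

            The principal right eigenvector of the reciprocal matrix, normalised to
            sum to one, gives the criteria weights; the consistency index
            CI = (lambda_max - n) / (n - 1) flags judgement inconsistency.
            """
            A = np.asarray(pairwise, dtype=float)
            eigvals, eigvecs = np.linalg.eig(A)
            k = np.argmax(eigvals.real)
            w = np.abs(eigvecs[:, k].real)
            w /= w.sum()
            n = A.shape[0]
            ci = (eigvals[k].real - n) / (n - 1)
            return w, ci

        # hypothetical 3-criteria comparison (e.g. timber value vs recreation vs habitat)
        A = [[1,   3,   5],
             [1/3, 1,   2],
             [1/5, 1/2, 1]]
        weights, ci = ahp_priorities(A)
        print("weights:", weights.round(3), "consistency index:", round(ci, 3))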

  9. System and method for integrating hazard-based decision making tools and processes

    Science.gov (United States)

    Hodgin, C Reed [Westminster, CO]

    2012-03-20

    A system and method for inputting, analyzing, and disseminating information necessary for identified decision-makers to respond to emergency situations. This system and method provides consistency and integration among multiple groups, and may be used for both initial consequence-based decisions and follow-on consequence-based decisions. The system and method in a preferred embodiment also provides tools for accessing and manipulating information that are appropriate for each decision-maker, in order to achieve more reasoned and timely consequence-based decisions. The invention includes processes for designing and implementing a system or method for responding to emergency situations.

  10. Methods and resources for physics education in radiology residency programs: survey results.

    Science.gov (United States)

    Bresolin, Linda; Bisset, George S; Hendee, William R; Kwakwa, Francis A

    2008-11-01

    Over the past 2 years, ongoing efforts have been made to reevaluate and restructure the way physics education is provided to radiology residents. Program directors and faculty from North American radiology residency programs were surveyed about how physics is being taught and what resources are currently being used for their residents. Substantial needs were identified for additional educational resources in physics, better integration of physics into clinical training, and a standardized physics curriculum closely linked to the initial certification examination of the American Board of Radiology. (c) RSNA, 2008.

  11. Method of processing liquid wastes

    International Nuclear Information System (INIS)

    Naba, Katsumi; Oohashi, Takeshi; Kawakatsu, Ryu; Kuribayashi, Kotaro.

    1980-01-01

    Purpose: To process radioactive liquid wastes safely by distilling them while passing gases, properly treating the distillation fractions, adding combustible, liquid synthetic resin material to the distillation residues, polymerizing them to solidify, and then burning them. Method: Liquid wastes containing radioactive substances are distilled while passing gases, and the distillation fractions, which contain essentially no radioactive substances, are treated by an appropriate method. Synthetic resin material, which may be a mixture of polymer and monomer, is added together with a catalyst to the distillation residues containing almost all of the radioactive substances, and the mixture is polymerized and solidified. Water or solvent may remain to an extent that does not hinder solidification. The solidified products are burnt to facilitate the treatment of the radioactive substances. The resin material can be selected as appropriate, methacrylate syrup (mainly a solution of polymethylmethacrylate in methylmethacrylate) being preferred. (Seki, T.)

  12. Investigation of test methods, material properties, and processes for solar cell encapsulants. Annual report

    Energy Technology Data Exchange (ETDEWEB)

    Willis, P. B.; Baum, B.

    1979-06-01

    The goal of this program is to identify, evaluate, and recommend encapsulant materials and processes for the production of cost-effective, long-life solar cell modules. During the past year, the technical activities emphasized the reformulation of a commercial grade of ethylene/vinyl acetate copolymer for use as a pottant in solar cell module manufacture. After experimenting with a variety of techniques, a vacuum-bag process was developed and found to be an excellent encapsulation method. Adhesive strengths and primers for the bonding of ethylene/vinyl acetate to superstrate and substrate materials were assessed with encouraging results. The effects on ten other polymers of twelve months of weathering in Arizona, Florida, and under EMMAQUA were evaluated by determination of tensile strengths, elongations, optical transmission, etc. As may be expected, the best overall retention of mechanical properties is found for the fluorocarbon polymers, especially FEP. Hard coatings containing ultraviolet absorbers were investigated for the purpose of providing a soil-resistant surface and additional weathering stability to the soft EVA pottant. Corrosion studies using a standard salt spray test were used to determine the degree of protection offered to a variety of metals by encapsulation in EVA pottant. A survey of scrim materials was also conducted. These open-hole weaves are intended for use as spacers between the cell and substrate to provide a mechanical barrier, improve insulation resistance and prevent migration of the pigmented pottant over the cell surface. A mechanical engineering analysis of composite structural materials for use as substrates was performed. Results are presented in detail. (WHK)

  13. Processing of fallopian tube, ovary, and endometrial surgical pathology specimens: A survey of U.S. laboratory practices.

    Science.gov (United States)

    Samimi, Goli; Trabert, Britton; Duggan, Máire A; Robinson, Jennifer L; Coa, Kisha I; Waibel, Elizabeth; Garcia, Edna; Minasian, Lori M; Sherman, Mark E

    2018-03-01

    Many high-grade serous carcinomas initiate in fallopian tubes as serous tubal intraepithelial carcinoma (STIC), a microscopic lesion identified with specimen processing according to the Sectioning and Extensive Examination of the Fimbria protocol (SEE-Fim). Given that the tubal origin of these cancers was recently recognized, we conducted a survey of pathology practices to assess processing protocols that are applied to gynecologic surgical pathology specimens in clinical contexts in which finding STIC might have different implications. We distributed a survey electronically to the American Society for Clinical Pathology list-serve to determine practice patterns and compared results between practice types by chi-square (χ2) tests for categorical variables. Free text comments were qualitatively reviewed. Survey responses were received from 159 laboratories (72 academic, 87 non-academic), which reported diverse specimen volumes and percentage of gynecologic samples. Overall, 74.1% of laboratories reported performing SEE-Fim for risk-reducing surgical specimens (82.5% academic versus 65.7% non-academic, p STIC or early cancer precursors. Published by Elsevier Inc.

  14. The relative size of measurement error and attrition error in a panel survey. Comparing them with a new multi-trait multi-method model

    NARCIS (Netherlands)

    Lugtig, Peter

    2017-01-01

    This paper proposes a method to simultaneously estimate both measurement and nonresponse errors for attitudinal and behavioural questions in a longitudinal survey. The method uses a Multi-Trait Multi-Method (MTMM) approach, which is commonly used to estimate the reliability and validity of survey

  15. A survey on the methodological processes and policies of renal guideline groups as a first step to harmonize renal guidelines.

    Science.gov (United States)

    Haller, Maria C; van der Veer, Sabine N; Nagler, Evi V; Tomson, Charlie; Lewington, Andrew; Hemmelgarn, Brenda R; Gallagher, Martin; Rocco, Michael; Obrador, Gregorio; Vanholder, Raymond; Craig, Jonathan C; van Biesen, Wim

    2015-07-01

    Worldwide, several bodies produce renal guidelines, potentially leading to duplication of effort while other topics may remain uncovered. A collaborative work plan could improve efficiency and impact, but requires a common approved methodology. The aim of this study was to identify organizational and methodological similarities and differences among seven major renal guideline bodies to identify methodological barriers to a collaborative effort. An electronic 62-item survey with questions based on the Institute of Medicine standards for guidelines was completed by representatives of seven major organizations producing renal guidelines: the Canadian Society of Nephrology (CSN), European Renal Best Practice (ERBP), Kidney Disease Improving Global Outcome (KDIGO), Kidney Health Australia-Caring for Australians with Renal Insufficiency (KHA-CARI), Kidney Disease Outcome Quality Initiative (KDOQI), Sociedad Latino-Americano de Nefrologia e Hipertension (SLANH) and United Kingdom Renal Association (UK-RA). Five of the seven groups conduct systematic searches for evidence, two include detailed critical appraisal and all use the GRADE framework. Five have public review of the guideline draft. Guidelines are updated as new evidence comes up in all, and/or after a specified time frame has passed (N = 3). Commentaries or position statements on guidelines published by other groups are produced by five, with the ADAPTE framework (N = 1) and the AGREEII (N = 2) used by some. Funding is from their parent organizations (N = 5) or directly from industry (N = 2). None allow funders to influence topic selection or guideline content. The budgets to develop a full guideline vary from $2000 to $500 000. Guideline development groups vary in size from <5 (N = 1) to 13-20 persons (N = 3). Three explicitly seek patient perspectives, for example, by involving patients in the scoping process, and four incorporate health economic considerations. All provide training in methodology for

  16. Integrated Safety and Security Risk Assessment Methods: A Survey of Key Characteristics and Applications

    NARCIS (Netherlands)

    Chockalingam, Sabarathinam; Hadziosmanovic, D.; Pieters, Wolter; Texeira, Andre; van Gelder, Pieter

    2016-01-01

    Over the last years, we have seen several security incidents that compromised system safety, of which some caused physical harm to people. Meanwhile, various risk assessment methods have been developed that integrate safety and security, and these could help to address the corresponding threats by

  17. Survey of methods and measurements of nuclear reactor time and frequency responses

    International Nuclear Information System (INIS)

    Cummins, J.D.

    1961-11-01

    Methods of measuring reactivity effects in nuclear reactors are described and the main control engineering analytical problems in nuclear reactors are detailed. A description of the use of reactor models and adaptive control in improving the economy of power producing nuclear reactors is included. (author)

  18. Fault Diagnosis Method on Polyvinyl Chloride Polymerization Process Based on Dynamic Kernel Principal Component and Fisher Discriminant Analysis Method

    Directory of Open Access Journals (Sweden)

    Shu-zhi Gao

    2016-01-01

    In view of the fact that the production process of polyvinyl chloride (PVC) polymerization involves many fault types of a complex nature, a fault diagnosis algorithm based on the hybrid Dynamic Kernel Principal Component Analysis-Fisher Discriminant Analysis (DKPCA-FDA) method is proposed in this paper. Kernel principal component analysis and dynamic kernel principal component analysis are used for fault diagnosis of the PVC polymerization process, while the Fisher Discriminant Analysis (FDA) method is adopted to further separate the fault data. The simulation results show that dynamic kernel principal component analysis applied to fault diagnosis of the PVC polymerization process has better diagnostic accuracy, that FDA can further realize fault isolation, and that actual faults in PVC polymerization production can be monitored by dynamic kernel principal component analysis.
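
    A hedged sketch of the overall DKPCA-FDA idea on toy data (not the PVC process data): lag-augmented samples give the "dynamic" part, kernel PCA extracts nonlinear features, and scikit-learn's linear discriminant analysis stands in for Fisher discriminant analysis to separate normal and faulty operation. Every dataset and parameter here is an assumption made for illustration.

        import numpy as np
        from sklearn.decomposition import KernelPCA
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        def add_lags(X, lags=2):
            """Augment each sample with its previous `lags` samples (the 'dynamic' part)."""
            rows = [X[i - lags:i + 1].ravel() for i in range(lags, len(X))]
            return np.array(rows)

        rng = np.random.default_rng(0)
        normal = rng.standard_normal((300, 4))                  # hypothetical normal operation
        fault = rng.standard_normal((300, 4)) + [2, 0, 0, 0]    # hypothetical sensor-bias fault

        Xn, Xf = add_lags(normal), add_lags(fault)
        X = np.vstack([Xn, Xf])
        y = np.array([0] * len(Xn) + [1] * len(Xf))             # 0 = normal, 1 = fault class

        # Dynamic kernel PCA: nonlinear feature extraction on the lag-augmented data
        kpca = KernelPCA(n_components=5, kernel="rbf", gamma=0.1).fit(Xn)
        scores = kpca.transform(X)

        # Discriminant analysis separates the classes in the KPCA score space
        fda = LinearDiscriminantAnalysis().fit(scores, y)
        print("classification accuracy on the toy data:", fda.score(scores, y))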

  19. Team sport athletes' perceptions and use of recovery strategies: a mixed-methods survey study.

    Science.gov (United States)

    Crowther, Fiona; Sealey, Rebecca; Crowe, Melissa; Edwards, Andrew; Halson, Shona

    2017-01-01

    A variety of recovery strategies are used by athletes, although there is currently no research that investigates perceptions and usage of recovery by different competition levels of team sport athletes. The recovery techniques used by team sport athletes of different competition levels were investigated by survey. Specifically, this study investigated if, when, why and how the following recovery strategies were used: active land-based recovery (ALB), active water-based recovery (AWB), stretching (STR), cold water immersion (CWI) and contrast water therapy (CWT). Three hundred and thirty-one athletes were surveyed. Fifty-seven percent were found to utilise one or more recovery strategies. Stretching was rated the most effective recovery strategy (4.4/5), with ALB considered the least effective by its users (3.6/5). The water immersion strategies were considered effective/ineffective mainly for psychological reasons; in contrast, STR and ALB were considered effective/ineffective mainly for physical reasons. This study demonstrates that athletes may not be aware of the specific effects that a recovery strategy has upon their physical recovery, and thus athlete and coach education on recovery is encouraged. This study also provides new information on the prevalence of different recovery strategies and contextual information that may be useful to inform best practice among coaches and athletes.

  20. [Precautions of physical performance requirements and test methods during product standard drafting process of medical devices].

    Science.gov (United States)

    Song, Jin-Zi; Wan, Min; Xu, Hui; Yao, Xiu-Jun; Zhang, Bo; Wang, Jin-Hong

    2009-09-01

    The major idea of this article is to discuss standardization and normalization of product standards for medical devices. We analyse the problems related to physical performance requirements and test methods during the product standard drafting process and make corresponding suggestions.

  1. Communication methods and production techniques in fixed prosthesis fabrication: a UK based survey. Part 1: communication methods.

    Science.gov (United States)

    Berry, J; Nesbit, M; Saberi, S; Petridis, H

    2014-09-01

    The General Dental Council (GDC) states that members of the dental team have to 'communicate clearly and effectively with other team members and colleagues in the interest of patients'. A number of studies from different parts of the world have highlighted problems and confirmed the need for improved communication methods and production techniques between dentists and dental technicians. The aim of this study was to identify the communication methods and production techniques used by dentists and dental technicians for the fabrication of fixed prostheses within the UK from the dental technicians' perspective. The current publication reports on the communication methods. Seven hundred and eighty-two online questionnaires were distributed to the Dental Laboratories Association membership and included a broad range of topics. Statistical analysis was undertaken to test the influence of various demographic variables. The number of completed responses totalled 248 (32% response rate). The laboratory prescription and the telephone were the main communication tools used. Statistical analysis of the results showed that a greater number of communication methods were used by large laboratories. Frequently missing items from the laboratory prescription were the shade and the date required. The majority of respondents (73%) stated that a single shade was selected in over half of cases. Sixty-eight percent replied that the dentist allowed sufficient laboratory time. Twenty-six percent of laboratories felt either rarely involved or not involved at all as part of the dental team. This study suggests that there are continuing communication and teamwork issues between dentists and dental laboratories.

  2. Data Processing And Machine Learning Methods For Multi-Modal Operator State Classification Systems

    Science.gov (United States)

    Hearn, Tristan A.

    2015-01-01

    This document is intended as an introduction to a set of common signal processing learning methods that may be used in the software portion of a functional crew state monitoring system. This includes overviews of both the theory of the methods involved, as well as examples of implementation. Practical considerations are discussed for implementing modular, flexible, and scalable processing and classification software for a multi-modal, multi-channel monitoring system. Example source code is also given for all of the discussed processing and classification methods.

  3. First-order Convex Optimization Methods for Signal and Image Processing

    DEFF Research Database (Denmark)

    Jensen, Tobias Lindstrøm

    2012-01-01

    In this thesis we investigate the use of first-order convex optimization methods applied to problems in signal and image processing. First we make a general introduction to convex optimization, first-order methods and their iteration complexity. Then we look at different techniques, which can be used with first-order methods such as smoothing, Lagrange multipliers and proximal gradient methods. We continue by presenting different applications of convex optimization and notable convex formulations with an emphasis on inverse problems and sparse signal processing. We also describe the multiple...
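
    Among the techniques mentioned, the proximal gradient method has a particularly compact form for sparse signal processing. The sketch below (a generic ISTA loop, not taken from the thesis) solves an l1-regularised least-squares problem by alternating a gradient step with soft-thresholding:

        import numpy as np

        def ista(A, b, lam, step=None, n_iter=200):
            """Proximal gradient (ISTA) for the l1-regularised least-squares problem
            minimize 0.5 * ||A x - b||^2 + lam * ||x||_1.
            The gradient step handles the smooth term; soft-thresholding is the
            proximal operator of the l1 norm.
            """
            if step is None:
                step = 1.0 / np.linalg.norm(A, 2) ** 2     # 1 / Lipschitz constant
            x = np.zeros(A.shape[1])
            for _ in range(n_iter):
                grad = A.T @ (A @ x - b)
                z = x - step * grad
                x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # prox step
            return x

        # usage: recover a sparse signal from a few random measurements
        rng = np.random.default_rng(0)
        A = rng.standard_normal((50, 200))
        x_true = np.zeros(200)
        x_true[[3, 70, 150]] = [1.0, -2.0, 0.5]
        b = A @ x_true + 0.01 * rng.standard_normal(50)
        x_hat = ista(A, b, lam=0.1)
        print("recovered support:", np.flatnonzero(np.abs(x_hat) > 0.1))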

  4. A survey on the effect of transport method on bruises, pH and colour ...

    African Journals Online (AJOL)

    ... the relationship between L* and distance was negative. Percentage of bruised carcasses varied with method of transport: Group 1, 41.1%; Group 2, 63.1% and Group 3, 51.1%. Transport method affected bruising, pHu and colour of beef. Keywords: Bruise score, ultimate pH, DFD meat, animal welfare, pre-slaughter stress ...

  5. Examination of the equivalence of self-report survey-based paper-and-pencil and internet data collection methods.

    Science.gov (United States)

    Weigold, Arne; Weigold, Ingrid K; Russell, Elizabeth J

    2013-03-01

    Self-report survey-based data collection is increasingly carried out using the Internet, as opposed to the traditional paper-and-pencil method. However, previous research on the equivalence of these methods has yielded inconsistent findings. This may be due to methodological and statistical issues present in much of the literature, such as nonequivalent samples in different conditions due to recruitment, participant self-selection to conditions, and data collection procedures, as well as incomplete or inappropriate statistical procedures for examining equivalence. We conducted 2 studies examining the equivalence of paper-and-pencil and Internet data collection that accounted for these issues. In both studies, we used measures of personality, social desirability, and computer self-efficacy, and, in Study 2, we used personal growth initiative to assess quantitative equivalence (i.e., mean equivalence), qualitative equivalence (i.e., internal consistency and intercorrelations), and auxiliary equivalence (i.e., response rates, missing data, completion time, and comfort completing questionnaires using paper-and-pencil and the Internet). Study 1 investigated the effects of completing surveys via paper-and-pencil or the Internet in both traditional (i.e., lab) and natural (i.e., take-home) settings. Results indicated equivalence across conditions, except for auxiliary equivalence aspects of missing data and completion time. Study 2 examined mailed paper-and-pencil and Internet surveys without contact between experimenter and participants. Results indicated equivalence between conditions, except for auxiliary equivalence aspects of response rate for providing an address and completion time. Overall, the findings show that paper-and-pencil and Internet data collection methods are generally equivalent, particularly for quantitative and qualitative equivalence, with nonequivalence only for some aspects of auxiliary equivalence. PsycINFO Database Record (c) 2013 APA, all

  6. Implementation of a new rapid tissue processing method--advantages and challenges

    DEFF Research Database (Denmark)

    Munkholm, Julie; Talman, Maj-Lis; Hasselager, Thomas

    2008-01-01

    Conventional tissue processing of histologic specimens has been carried out in the same manner for many years. It is a time-consuming process involving batch production, resulting in a 1-day delay of the diagnosis. Microwave-assisted tissue processing enables a continuous high flow of histologic specimens through the processor with a processing time of as low as 1 h. In this article, we present the effects of the automated microwave-assisted tissue processor on the histomorphologic quality and the turnaround time (TAT) for histopathology reports. We present a blind comparative study regarding the histomorphologic quality of microwave-processed and conventionally processed tissue samples. A total of 333 specimens were included. The microwave-assisted processing method showed a histomorphologic quality comparable to the conventional method for a number of tissue types, including skin and specimens from...

  7. Horvitz-Thompson survey sample methods for estimating large-scale animal abundance

    Science.gov (United States)

    Samuel, M.D.; Garton, E.O.

    1994-01-01

    Large-scale surveys to estimate animal abundance can be useful for monitoring population status and trends, for measuring responses to management or environmental alterations, and for testing ecological hypotheses about abundance. However, large-scale surveys may be expensive and logistically complex. To ensure resources are not wasted on unattainable targets, the goals and uses of each survey should be specified carefully and alternative methods for addressing these objectives always should be considered. During survey design, the importance of each survey error component (spatial design, proportion of detected animals, precision in detection) should be considered carefully to produce a complete statistically based survey. Failure to address these three survey components may produce population estimates that are inaccurate (biased low), have unrealistic precision (too precise) and do not satisfactorily meet the survey objectives. Optimum survey design requires trade-offs in these sources of error relative to the costs of sampling plots and detecting animals on plots, considerations that are specific to the spatial logistics and survey methods. The Horvitz-Thompson estimators provide a comprehensive framework for considering all three survey components during the design and analysis of large-scale wildlife surveys. Problems of spatial and temporal (especially survey to survey) heterogeneity in detection probabilities have received little consideration, but failure to account for heterogeneity produces biased population estimates. The goal of producing unbiased population estimates is in conflict with the increased variation from heterogeneous detection in the population estimate. One solution to this conflict is to use an MSE-based approach to achieve a balance between bias reduction and increased variation. Further research is needed to develop methods that address spatial heterogeneity in detection, evaluate the effects of temporal heterogeneity on survey
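
    A minimal sketch of the Horvitz-Thompson expansion described above: each plot count is divided by its inclusion probability and, when detection is imperfect, by the detection probability, then summed. The plot counts and probabilities below are hypothetical.

        def horvitz_thompson(counts, inclusion_probs, detection_probs=None):
            """Horvitz-Thompson estimate of total abundance.

            counts[i] is the number of animals counted on sampled plot i,
            inclusion_probs[i] the probability that plot i entered the sample, and
            detection_probs[i] the probability an animal present on plot i was
            detected (1.0 if detection is assumed perfect).
            Each count is expanded by both probabilities and summed.
            """
            if detection_probs is None:
                detection_probs = [1.0] * len(counts)
            return sum(c / (pi * pd)
                       for c, pi, pd in zip(counts, inclusion_probs, detection_probs))

        # hypothetical survey: 4 plots drawn with unequal inclusion probabilities,
        # imperfect detection estimated from a double-observer trial
        counts = [12, 5, 0, 9]
        pi = [0.05, 0.05, 0.10, 0.10]
        pd = [0.8, 0.8, 0.9, 0.7]
        print("estimated abundance:", round(horvitz_thompson(counts, pi, pd)))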

  8. Protocol of the Belgian food consumption survey 2014: objectives, design and methods.

    Science.gov (United States)

    Bel, Sarah; Van den Abeele, Sofie; Lebacq, Thérésa; Ost, Cloë; Brocatus, Loes; Stiévenart, Charlotte; Teppers, Eveline; Tafforeau, Jean; Cuypers, Koenraad

    2016-01-01

    Dietary patterns are one of the major determinants as far as health and burden of disease is concerned. Food consumption data are essential to evaluate and develop nutrition and food safety policies. The last national food consumption survey in Belgium took place in 2004 among the Belgian population aged 15 years and older. Since dietary habits are prone to change over time a new Belgian National Food Consumption Survey (BNFCS2014) was conducted in 2014-2015. The BNFCS2014 is a cross-sectional study. A representative sample (n = 3200) of the Belgian population aged 3 to 64 years old was randomly selected from the National Population Register following a multistage stratified sampling procedure. Data collection was divided equally over the four seasons and days of the week in order to incorporate seasonal effects and day-to-day variation in food intake. Information on food intake was collected in adults with two non-consecutive 24-h dietary recalls (using the GloboDiet® software). In children food intake was collected with two non-consecutive one-day food diaries followed by a completion interview with GloboDiet. Additional data on socio-demographic characteristics, eating habits, lifestyle, food safety (at household level), physical activity and sedentary behaviour were collected with a face-to-face questionnaire using a computer-assisted personal interviewing technique. In the time between the two visits, participants were asked to complete a self-administered food frequency questionnaire and health questionnaire. Height, weight and waist circumference were measured. In addition, children and adolescents were asked to wear an accelerometer and keep a logbook for seven consecutive days to objectively measure physical activity and sedentary behaviour. The main objective of the BNFCS2014 is to evaluate the habitual food, energy and nutrient intake in the Belgian population and to compare these with recommendations from the national dietary guidelines. A second

  9. Process Research Methods and Their Application in the Didactics of Text Production and Translation

    DEFF Research Database (Denmark)

    Dam-Jensen, Helle; Heine, Carmen

    2009-01-01

    Teaching of translation and writing in the university classroom tends to focus on task knowledge by practicing text production and analyzing and discussing the quality of products. In this article, we will argue that the outcome of teaching may be increased if students are taught to see themselves not only as learners, but also as thinkers and problem solvers. This can be achieved by systematically applying knowledge from process research, as this can give insight into mental and physical processes of text production. This article provides an overview of methods commonly used in process research and discusses the pros and cons of their application in teaching of translation and writing at university levels...

  10. A mixed methods survey of social anxiety, anxiety, depression and wig use in alopecia.

    Science.gov (United States)

    Montgomery, Kerry; White, Caroline; Thompson, Andrew

    2017-05-04

    This study aimed to examine levels of social anxiety, anxiety and depression reported by people with alopecia as a result of a dermatological condition and associations with wig use. The study also sought to report on experiences of wearing wigs in social situations and the relationship with social confidence. A cross-sectional survey was sent by email to the Alopecia UK charity mailing list and advertised on social media. Inclusion criteria were a diagnosis of alopecia, aged 13 or above and sufficient English to complete the survey. Exclusion criteria included experiencing hair loss as a result of chemotherapy treatment or psychological disorder. Participants (n=338) were predominantly female (97.3%), Caucasian (93.5%) and aged between 35 and 54 years (49.4%) with a diagnosis of alopecia areata (82.6%). The Social Phobia Inventory measured symptoms of social anxiety, and the Hospital Anxiety and Depression Scale was used to measure symptoms of anxiety and depression. Survey questions were designed to measure the use of wigs. Open-ended questions enabled participants to comment on their experiences of wearing wigs. Clinically significant levels of social anxiety (47.5%), anxiety (35.5%) and depression (29%) were reported. Participants who reported worries about not wearing a wig reported significantly higher levels of depression: t(103)=3.40, p≤0.001; anxiety: t(109)=4.80, p≤0.001; and social anxiety: t(294)=3.89, p≤0.001. Wearing wigs was reported as increasing social confidence; however, the concealment it afforded was also reported as both reducing fear of negative evaluation and maintaining anxiety. Overall, 46% of participants reported that wearing a wig had a positive impact on their everyday life with negative experiences related to fears of the wig being noticed. Psychological interventions alongside wig provision would be beneficial for people living with alopecia. © Article author(s) (or their employer(s) unless otherwise stated in the text of the

  11. Survey, Statistical Analysis and Classification of Launched CubeSat Missions with Emphasis on the Attitude Control Method

    OpenAIRE

    Polat, Halis C.; Virgili-Llop, Josep; Romano, Marcello

    2016-01-01

    CubeSat missions have evolved, becoming increasingly capable and complex since their first launch. Relatively high adoption rates and advances in technology allow mission developers to choose from different orbital altitudes, CubeSat configurations, and commercial off-the-shelf (COTS) subsystems. To fulfill particular mission requirements, designers have also developed custom subsystems. In this study, a survey of the attitude control method for each individual launched CubeSat mission ...

  12. Duality of Ross Ice Shelf systems: crustal boundary, ice sheet processes and ocean circulation from ROSETTA-Ice surveys

    Science.gov (United States)

    Tinto, K. J.; Siddoway, C. S.; Padman, L.; Fricker, H. A.; Das, I.; Porter, D. F.; Springer, S. R.; Siegfried, M. R.; Caratori Tontini, F.; Bell, R. E.

    2017-12-01

    Bathymetry beneath Antarctic ice shelves controls sub-ice-shelf ocean circulation and has a major influence on the stability and dynamics of the ice sheets. Beneath the Ross Ice Shelf, the sea-floor bathymetry is a product of both tectonics and glacial processes, and is influenced by the processes it controls. New aerogeophysical surveys have revealed a fundamental crustal boundary bisecting the Ross Ice Shelf and imparting a duality to the Ross Ice Shelf systems, encompassing bathymetry, ocean circulation and ice flow history. The ROSETTA-Ice surveys were designed to increase the resolution of Ross Ice Shelf mapping from the 55 km RIGGS survey of the 1970s to a 10 km survey grid, flown over three years from New York Air National Guard LC130s. Radar, LiDAR, gravity and magnetic instruments provide a top to bottom profile of the ice shelf and the underlying seafloor, with 20 km resolution achieved in the first two survey seasons (2015 and 2016). ALAMO ocean-profiling floats deployed in the 2016 season are measuring the temperature and salinity of water entering and exiting the sub-ice water cavity. A significant east-west contrast in the character of the magnetic and gravity fields reveals that the lithospheric boundary between East and West Antarctica exists not at the base of the Transantarctic Mountains (TAM), as previously thought, but 300 km further east. The newly-identified boundary spatially coincides with the southward extension of the Central High, a rib of shallow basement identified in the Ross Sea. The East Antarctic side is characterized by lower amplitude magnetic anomalies and denser TAM-type lithosphere compared to the West Antarctic side. The crustal structure imparts a fundamental duality on the overlying ice and ocean, with deeper bathymetry and thinner ice on the East Antarctic side creating a larger sub-ice cavity for ocean circulation. The West Antarctic side has a shallower seabed, more restricted ocean access and a more complex history of

  13. Method for qualification of cementation processes and its application to a vibration mixer

    International Nuclear Information System (INIS)

    Vicente, R.; Rzyski, B.M.; Suarez, A.A.

    1987-01-01

    In this paper the definition of homogeneity is discussed and methods to measure the 'degree of heterogeneity' of waste forms are proposed. These measurements are important as aids for mixing process qualification, and as tools in quality assurance procedures and in the development of waste management standards. Homogeneity is a basic quality requirement for waste forms to be accepted in final disposal sites. It does not depend on the immobilization matrix; rather, it is one means of qualifying the immobilization process. The proposed methods were applied to a vibration-assisted mixing process and have proved to be a useful means of judging process improvements. There are many conceivable methods to evaluate the homogeneity of waste forms. Some were selected as screening tests aiming at quickly reaching a promising set of process variables. Others were selected to evaluate the degree of excellence of the process with respect to product quality. The envisaged methods were: visual inspection, the use of cement dye as a tracer, scanning of radioactive tracers, and measurements of variations of density, water absorption, porosity and mechanical strength across the waste form sample. The process variables were: waste-cement and water-cement ratios, mixer geometry, mixing time and vibration intensity. Some of the apparatus details were changed during the experimental work in order to improve product quality. Experimental methods and results were statistically analysed and compared with data obtained from samples prepared with a planetary paddle mixer, which were adopted as the homogeneity standard. (Author) [pt

  14. Clinical Reasoning: Survey of Teaching Methods, Integration, and Assessment in Entry-Level Physical Therapist Academic Education.

    Science.gov (United States)

    Christensen, Nicole; Black, Lisa; Furze, Jennifer; Huhn, Karen; Vendrely, Ann; Wainwright, Susan

    2017-02-01

    Although clinical reasoning abilities are important learning outcomes of physical therapist entry-level education, best practice standards have not been established to guide clinical reasoning curricular design and learning assessment. This research explored how clinical reasoning is currently defined, taught, and assessed in physical therapist entry-level education programs. A descriptive, cross-sectional survey was administered to physical therapist program representatives. An electronic 24-question survey was distributed to the directors of 207 programs accredited by the Commission on Accreditation in Physical Therapy Education. Descriptive statistical analysis and qualitative content analysis were performed. Post hoc demographic and wave analyses revealed no evidence of nonresponse bias. A response rate of 46.4% (n=96) was achieved. All respondents reported that their programs incorporated clinical reasoning into their curricula. Only 25% of respondents reported a common definition of clinical reasoning in their programs. Most respondents (90.6%) reported that clinical reasoning was explicit in their curricula, and 94.8% indicated that multiple methods of curricular integration were used. Instructor-designed materials were most commonly used to teach clinical reasoning (83.3%). Assessment of clinical reasoning included practical examinations (99%), clinical coursework (94.8%), written examinations (87.5%), and written assignments (83.3%). Curricular integration of clinical reasoning-related self-reflection skills was reported by 91%. A large number of incomplete surveys affected the response rate, and the program directors to whom the survey was sent may not have consulted the faculty members who were most knowledgeable about clinical reasoning in their curricula. The survey construction limited some responses and application of the results. Although clinical reasoning was explicitly integrated into program curricula, it was not consistently defined, taught, or

  15. Survey of hydrogen production and utilization methods. Volume 1: Executive summary

    Science.gov (United States)

    Gregory, D. P.; Pangborn, J. B.; Gillis, J. C.

    1975-01-01

    The use of hydrogen as a synthetic fuel is considered. Processes for the production of hydrogen are described along with the present and future industrial uses of hydrogen as a fuel and as a chemical feedstock. Novel and unconventional hydrogen-production techniques are evaluated, with emphasis placed on thermochemical and electrolytic processes. Potential uses for hydrogen as a fuel in industrial and residential applications are identified and reviewed in the context of anticipated U.S. energy supplies and demands. A detailed plan, prepared for research on and development of hydrogen as an energy carrier over the period 1975 to 1980, is included.

  16. Analytical method for the determination and a survey of parabens and their derivatives in pharmaceuticals.

    Science.gov (United States)

    Moreta, Cristina; Tena, María-Teresa; Kannan, Kurunthachalam

    2015-10-01

    Exposure of humans to parabens is a concern due to the estrogenic activity of these compounds. Parabens are widely used as preservatives in some personal care products, foodstuffs and pharmaceuticals owing to their low cost, high water solubility and broad spectrum antimicrobial properties. Despite this, little is known about the occurrence of parabens in pharmaceutical products. In this study, a method based on solid-liquid or liquid-liquid extraction (SLE or LLE), and high performance liquid chromatography (HPLC) coupled with triple quadrupole tandem mass spectrometry (QqQ or MS/MS) was developed for the determination of the six most frequently used parabens and four paraben derivatives (methyl- and ethyl-protocatechuates, and mono- and di-hydroxybenzoic acids) in pharmaceuticals. A sample-purification step involving solid phase extraction (SPE) was optimized for the analysis of solid and lipid-rich pharmaceuticals. To our knowledge, this is the first comprehensive report on the occurrence of parabens in pharmaceuticals. The developed method was applied for the analysis of 128 liquid/syrup, cream, solid, prescription or over-the-counter (OTC) drugs collected from the USA and a few other countries in Europe and Asia. Although the majority of the drugs analyzed in the study did not contain parabens, concentrations as high as 2 mg/g were found in some drugs. Methyl- and propyl-parabens were the most frequently detected compounds. 4-Hydroxybenzoic acid was the major metabolite found in pharmaceutical products. Copyright © 2015 Elsevier Inc. All rights reserved.

  17. The Open Method of Coordination and the Implementation of the Bologna Process

    Science.gov (United States)

    Veiga, Amelia; Amaral, Alberto

    2006-01-01

    In this paper the authors argue that the use of the Open Method of Coordination (OMC) in the implementation of the Bologna process presents coordination problems that do not allow for the full coherence of the results. As the process is quite complex, involving three different levels (European, national and local) and as the final actors in the…

  18. Priority survey between indicators and analytic hierarchy process analysis for green chemistry technology assessment.

    Science.gov (United States)

    Kim, Sungjune; Hong, Seokpyo; Ahn, Kilsoo; Gong, Sungyong

    2015-01-01

    This study presents the indicators and proxy variables for the quantitative assessment of green chemistry technologies and evaluates the relative importance of each assessment element by consulting experts from the fields of ecology, chemistry, safety, and public health. The results collected were subjected to an analytic hierarchy process to obtain the weights of the indicators and the proxy variables. These weights may help avoid resorting to qualitative judgements, in the absence of weights between indicators, when integrating the results of indicator-by-indicator quantitative assessment. This study points to the limitations of current quantitative assessment techniques for green chemistry technologies and seeks to present the future direction for quantitative assessment of green chemistry technologies.
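
    A minimal sketch of the weighting step described above: in an analytic hierarchy process, priority weights are taken from the principal eigenvector of a pairwise comparison matrix, and a consistency ratio checks the expert judgements. The matrix values below are illustrative, not the study's elicited comparisons.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three assessment indicators
# (e.g., ecological impact, human health, safety); values are illustrative.
A = np.array([
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
])

# AHP priority weights: normalized principal right eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI.
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
ri = 0.58  # Saaty's random consistency index for n = 3
print("weights:", weights.round(3), "consistency ratio:", round(ci / ri, 3))
```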

  19. A survey of pathogen survival during municipal solid waste and manure treatment processes. Final report

    International Nuclear Information System (INIS)

    Ware, S.A.

    1980-08-01

    Municipal solid waste (MSW) and animal manures may contain microorganisms that can cause disease in man and animals. These pathogenic microorganisms include enteric bacteria, fungi, viruses, and human and animal parasites. This report summarizes and discusses various research findings documenting the extent of pathogen survival during MSW treatment. The technologies discussed are composting, incineration, landfill, and anaerobic digestion. There is also a limited examination of the use of the oxidation ditch as a means of animal manure stabilization. High gradient magnetic separation (HGMS), and gamma radiation sterilization are mentioned as future options, especially for animal waste management. Several standard methods for the sampling, concentration, and isolation of microorganisms from raw and treated solid waste are also summarized

  20. Particle methods for simulation of subsurface multiphase fluid flow and biogeochemical processes

    International Nuclear Information System (INIS)

    Meakin, Paul; Tartakovsky, Alexandre; Scheibe, Tim; Tartakovsky, Daniel; Redden, George; Long, Philip E; Brooks, Scott C; Xu Zhijie

    2007-01-01

    A number of particle models that are suitable for simulating multiphase fluid flow and biogeochemical processes have been developed during the last few decades. Here we discuss three of them: a microscopic model - molecular dynamics; a mesoscopic model - dissipative particle dynamics; and a macroscopic model - smoothed particle hydrodynamics. Particle methods are robust and versatile, and it is relatively easy to add additional physical, chemical and biological processes into particle codes. However, the computational efficiency of particle methods is low relative to continuum methods. Multiscale particle methods and hybrid (particle-particle and particle-continuum) methods are needed to improve computational efficiency and make effective use of emerging computational capabilities. These new methods are under development
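
    As a concrete illustration of the macroscopic end of this spectrum, the sketch below shows the basic smoothed particle hydrodynamics density summation in one dimension with a cubic spline kernel. The particle arrangement and smoothing length are arbitrary choices for the example, not values taken from the paper.

```python
import numpy as np

def cubic_spline_kernel(r, h):
    """Standard 1D cubic spline SPH kernel with smoothing length h."""
    q = np.abs(r) / h
    sigma = 2.0 / (3.0 * h)  # 1D normalization constant
    w = np.zeros_like(q)
    near, far = q <= 1.0, (q > 1.0) & (q <= 2.0)
    w[near] = 1.0 - 1.5 * q[near] ** 2 + 0.75 * q[near] ** 3
    w[far] = 0.25 * (2.0 - q[far]) ** 3
    return sigma * w

# Illustrative 1D particle arrangement with equal masses on a unit interval.
x = np.linspace(0.0, 1.0, 51)
mass = np.full_like(x, 1.0 / x.size)
h = 2.0 * (x[1] - x[0])  # smoothing length

# SPH density estimate: rho_i = sum_j m_j W(x_i - x_j, h)
rho = np.array([np.sum(mass * cubic_spline_kernel(xi - x, h)) for xi in x])
print("density away from the boundaries:", rho[20:25].round(4))
```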

  1. Measuring male fertility rates in developing countries with Demographic and Health Surveys: An assessment of three methods

    Directory of Open Access Journals (Sweden)

    Bruno Schoumaker

    2017-03-01

    Full Text Available Background: Levels and patterns of male fertility are poorly documented in developing countries. Demographic accounts of male fertility focus primarily on developed countries, and where such accounts do exist for developing countries they are mainly available at the local or regional level. Objective: We show how data from Demographic and Health Surveys (DHS) can be used to compute age-specific male fertility rates. Three methods are described and compared: the own-children method, the date-of-last-birth method, and the crisscross method. Male and female fertility rates are compared using the own-children method. Results: Male fertility estimates produced using the own-children method emerge as the most trustworthy. The data needed for this method are widely available and make it possible to document male fertility in a large number of developing countries. The date-of-last-birth method also appears worthwhile, and may be especially useful for analyzing fertility differentials. The crisscross method is less reliable, but may be of interest for ages below 40. Comparisons of male and female fertility show that reproductive experiences differ across genders in most developing countries: Male fertility is substantially higher than female fertility, and males have their children later than females. Contribution: This study shows that Demographic and Health Surveys constitute a valuable and untapped source of data that can be used to document male fertility in a large number of countries. Male fertility rates are markedly different from female fertility rates in developing countries, and documenting both male and female fertility provides a more complete picture of fertility.
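
    A minimal sketch of the rate calculation underlying all three approaches once births have been attributed to fathers (as the own-children method does by matching children to co-resident fathers): age-specific fertility rates are births divided by person-years of exposure. The counts below are made up for illustration, not DHS figures.

```python
# Hypothetical tabulations: births linked to fathers in each 5-year age group
# and person-years of exposure for men in that group (not actual DHS data).
births = {"25-29": 420, "30-34": 510, "35-39": 380, "40-44": 190}
exposure = {"25-29": 2600, "30-34": 2400, "35-39": 2300, "40-44": 2100}

# Age-specific fertility rate = births / person-years of exposure.
asfr = {age: births[age] / exposure[age] for age in births}

# Contribution to the total fertility rate: rate times the 5-year group width
# (a full TFR would of course sum over all reproductive ages).
partial_tfr = 5 * sum(asfr.values())

print({age: round(rate, 3) for age, rate in asfr.items()})
print("partial TFR (ages 25-44):", round(partial_tfr, 2))
```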

  2. A Survey on Data Compression Methods for Biological Sequences

    Directory of Open Access Journals (Sweden)

    Morteza Hosseini

    2016-10-01

    Full Text Available The ever-increasing growth of the production of high-throughput sequencing data poses a serious challenge to the storage, processing and transmission of these data. As frequently stated, it is a data deluge. Compression is essential to address this challenge: it reduces storage space and processing costs, along with speeding up data transmission. In this paper, we provide a comprehensive survey of existing compression approaches that are specialized for biological data, including protein and DNA sequences. Also, we devote an important part of the paper to the approaches proposed for the compression of different file formats, such as FASTA, as well as FASTQ and SAM/BAM, which contain quality scores and metadata, in addition to the biological sequences. Then, we present a comparison of the performance of several methods, in terms of compression ratio, memory usage and compression/decompression time. Finally, we present some suggestions for future research on biological data compression.
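
    A minimal sketch of the kind of benchmark comparison the survey reports, using Python's general-purpose zlib and bz2 compressors on a toy DNA string; the specialized tools reviewed in the paper and real FASTA/FASTQ inputs would replace both in practice.

```python
import bz2
import zlib

# Toy DNA sequence; a real benchmark would read FASTA/FASTQ files instead.
seq = ("ACGTACGGTTCA" * 2000).encode("ascii")

for name, compress in (("zlib", zlib.compress), ("bz2", bz2.compress)):
    out = compress(seq)
    print(f"{name}: {len(seq)} -> {len(out)} bytes, "
          f"compression ratio {len(seq) / len(out):.1f}x")
```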

  3. Comparing two remote video survey methods for spatial predictions of the distribution and environmental niche suitability of demersal fishes.

    Science.gov (United States)

    Galaiduk, Ronen; Radford, Ben T; Wilson, Shaun K; Harvey, Euan S

    2017-12-15

    Information on habitat associations from survey data, combined with spatial modelling, allows the development of more refined species distribution modelling which may identify areas of high conservation/fisheries value and consequently improve conservation efforts. Generalised additive models were used to model the probability of occurrence of six focal species after surveys that utilised two remote underwater video sampling methods (i.e. baited and towed video). Models developed for the towed video method had consistently better predictive performance for all but one study species, although only three models had a good to fair fit, and the rest were poor fits, highlighting the challenges associated with modelling habitat associations of marine species in highly homogenous, low relief environments. Models based on the baited video dataset regularly included large-scale measures of structural complexity, suggesting fish attraction to a single focus point by bait. Conversely, models based on the towed video data often incorporated small-scale measures of habitat complexity and were more likely to reflect true species-habitat relationships. The cost associated with use of the towed video systems for surveying low-relief seascapes was also relatively low, providing additional support for considering this method for marine spatial ecological modelling.
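
    A minimal sketch of the modelling step described above: a binomial generalised additive model relating presence/absence to habitat covariates. It uses the third-party pyGAM library and synthetic data standing in for the video-survey observations; the covariate names are illustrative assumptions.

```python
import numpy as np
from pygam import LogisticGAM, s  # third-party library (pip install pygam)

rng = np.random.default_rng(0)

# Synthetic stand-in for video-survey samples: depth (m) and a habitat
# structural-complexity index, with a known occurrence relationship.
n = 500
depth = rng.uniform(5.0, 60.0, n)
complexity = rng.uniform(0.0, 1.0, n)
logit = -3.0 + 0.04 * depth + 2.5 * complexity
presence = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

X = np.column_stack([depth, complexity])

# Smooth term for each covariate, logistic (binomial) link.
gam = LogisticGAM(s(0) + s(1)).fit(X, presence)
print("predicted probability of occurrence:", gam.predict_proba(X[:5]).round(2))
```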

  4. Process and research method of radionuclide migration in high level radioactive waste geological disposal system

    International Nuclear Information System (INIS)

    Chen Rui; Zhang Zhanshi

    2014-01-01

    Radionuclides released from waste can migrate from the repository into the surrounding rock and soil. On the other hand, nuclides are also retarded by the backfill material. Radionuclide migration is the main geochemical process in waste disposal. This paper introduces various methods for radionuclide migration research and gives a brief analysis of the geochemical processes involved in radionuclide migration. Finally, two of the most important processes in radionuclide migration are presented as examples. (authors)
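
    One common way such migration-and-retardation behaviour is represented, though not necessarily the formulation used in this paper, is a one-dimensional advection-dispersion equation with a retardation factor and radioactive decay. The sketch below solves it with a simple explicit finite-difference scheme on made-up parameters, purely to illustrate the process.

```python
import numpy as np

# Illustrative parameters (not from the paper): pore velocity v (m/s),
# dispersion coefficient D (m^2/s), retardation factor R, decay constant lam (1/s).
v, D, R, lam = 1.0e-7, 1.0e-9, 50.0, 1.0e-10

nx, dx = 200, 0.05          # 10 m of backfill / host rock
dt = 1.0e7                  # s, chosen well inside the explicit stability limit
c = np.zeros(nx)
c[0] = 1.0                  # constant relative concentration at the waste form

years = 100
for _ in range(int(years * 3.15e7 / dt)):
    adv = -v * (c[1:-1] - c[:-2]) / dx                     # upwind advection
    disp = D * (c[2:] - 2.0 * c[1:-1] + c[:-2]) / dx ** 2  # dispersion
    c[1:-1] += dt * ((adv + disp) / R - lam * c[1:-1])     # retarded transport + decay
    c[0], c[-1] = 1.0, c[-2]                               # boundary conditions

print("relative concentration 1 m from the source after",
      years, "years:", round(c[int(1.0 / dx)], 4))
```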

  5. Survey of probabilistic methods in safety and risk assessment for nuclear power plant licensing

    International Nuclear Information System (INIS)

    1984-04-01

    After an overview of the goals and general methods of probabilistic approaches in nuclear safety, the main features of probabilistic safety or risk assessment (PRA) methods are discussed. In most practical applications a full-fledged PRA is not applied; rather, various levels of analysis are used, ranging from unavailability assessment of systems, through the more complex analysis of the probable core damage states, up to the assessment of the overall health effects on the total population from a certain practice. The various types of application are discussed in relation to their limitations and benefits for different stages of design or operation of nuclear power plants. This gives guidance for licensing staff to judge the usefulness of the various methods for their licensing decisions. Examples of the application of probabilistic methods in several countries are given. Two appendices, on reliability analysis and on containment and consequence analysis, provide some more details on these subjects. (author)
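
    At the lowest of the analysis levels mentioned above (system unavailability), the arithmetic reduces to combining basic-event probabilities through AND/OR gates under an independence assumption. The sketch below is a generic illustration with invented numbers, not an example taken from the report.

```python
def or_gate(*probs):
    """Probability that at least one of several independent events occurs."""
    q = 1.0
    for p in probs:
        q *= 1.0 - p
    return 1.0 - q

def and_gate(*probs):
    """Probability that all of several independent events occur."""
    q = 1.0
    for p in probs:
        q *= p
    return q

# Illustrative basic-event unavailabilities (assumed independent).
pump_a_fails = 1.0e-3
pump_b_fails = 1.0e-3
offsite_power_lost = 5.0e-4

# Top event: cooling is lost if both redundant pumps fail OR power is lost.
both_pumps_fail = and_gate(pump_a_fails, pump_b_fails)
top_event = or_gate(both_pumps_fail, offsite_power_lost)
print(f"top-event probability ~ {top_event:.2e}")
```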

  6. Plane and geodetic surveying

    CERN Document Server

    Johnson, Aylmer

    2014-01-01

    Introduction; Aim And Scope; Classification Of Surveys; The Structure Of This Book; General Principles Of Surveying; Errors; Redundancy; Stiffness; Adjustment; Planning And Record Keeping; Principal Surveying Activities; Establishing Control Networks; Mapping; Setting Out; Resectioning; Deformation Monitoring; Angle Measurement; The Surveyor's Compass; The Clinometer; The Total Station; Making Observations; Checks On Permanent Adjustments; Distance Measurement; General; Tape Measurements; Optical Methods (Tachymetry); Electromagnetic Distance Measurement (EDM); Ultrasonic Methods; GNSS; Levelling; Theory; The Instrument; Technique; Booking; Permanent Adjustmen...

  7. The measurement of magnetic properties of electrical sheet steel - survey on methods and situation of standards

    CERN Document Server

    Sievert, J

    2000-01-01

    A brief review of the different requirements for magnetic measurement techniques for material research, modelling of material properties and grading of the electrical sheet steel for trade purposes is presented. In relation to the main application of laminated electrical steel, this paper deals with AC measurement techniques. Two standard methods, Epstein frame and Single Sheet Tester (SST), producing different results, are used in parallel. This dilemma was analysed in detail. The study leads to a possible solution of the problem, i.e. the possibility of converting the results of one of the two methods into the results of the other in order to satisfy the users of the Epstein method and, at the same time, to improve the acceptance of the more economical SST method.

  8. A novel method to survey parameters of an ion beam and its interaction with a target

    Science.gov (United States)

    Long, J. D.; Yang, Z.; Li, J.; Wang, X. H.; Wang, T.; Lan, C. H.; Dong, P.; Li, X.; He, J. L.; Zheng, L.; Liu, P.

    2017-09-01

    Beam profile and composition of the pulsed ion beam from a vacuum arc source are valuable information for designing a high-intensity deuterium-tritium neutron generator. With traditional methods it is notoriously difficult to obtain this information at the same time. A novel off-line diagnostic method is presented, which can obtain the transverse beam profile with high resolution as well as the species of the ions in the beam. The method uses a high-purity silicon target to interact with the ion beam, and then uses secondary ion mass spectrometry (SIMS) to analyze the interaction zone of the target to extract the beam information. More information on the beam-target interaction can be obtained simultaneously. Proof-of-principle simulation and experimental work has demonstrated that this method is practical.

  9. Terrestrial Laser Scanning for Measuring Stream Bank Erosion within Legacy Sediments: Data Processing and Analysis Methods

    Science.gov (United States)

    Starek, M. J.; Mitasova, H.; Wegmann, K. W.

    2011-12-01

    Land clearing for agricultural purposes following European settlement of America resulted in upland erosion rates 50-400 times above long-term geologic rates in much of the North Carolina Piedmont region. A considerable amount of the eroded sediment was subsequently aggraded on floodplains and impounded in the slackwater ponds behind milldams. This trapped "legacy" sediment is commonly mistaken for natural floodplain deposition and has remained largely unrecognized as a potential source of accelerated sediment erosion contributing to modern water quality impairment. In this study, terrestrial laser scanning (TLS) is utilized to monitor stream bank evolution along a reach that has breached a former millpond. Due to the unique surface geometry and orientation of the stream bank, vegetation occlusion, and the true 3D structure of the point cloud, a systematic data processing approach is implemented to compute the change in sediment volume between repeat TLS surveys. The processing approach consists of the following five steps: 1) segmentation of the stream bank point cloud; 2) transformation of the point cloud such that the xy plane is parallel to the trend of the bank; 3) filtering of vegetation by selecting the local lowest point within each grid cell; 4) smoothing of high-frequency noise; and 5) generation of a bare earth digital elevation model (DEM). From the DEMs, change in volume was quantified for a 13 m x 3.5 m section of the stream bank, providing an estimate of erosion rates and slumping between surveys. The major mechanisms for the observed changes are freeze-thaw events and fluvial entrainment. To evaluate the surface evolution between the distinct sedimentary layers (legacy vs non-legacy) that comprise the stream bank, elevation change is modeled as a continuous trivariate function z = f(x,y,t), where x,y is horizontal location, t is time, and z is a first-surface referenced elevation. Hence, z=0 for all x,y at t=0, the time of the first survey. The filtered, transformed, and first
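
    A minimal sketch of the volume-change step that follows DEM generation: differencing gridded bank-face surfaces from two survey epochs and converting the elevation change into eroded and deposited volumes. The grids below are synthetic stand-ins for the processed TLS surfaces, and the cell size is an assumed value.

```python
import numpy as np

cell_size = 0.05              # m; assumed DEM grid resolution
cell_area = cell_size ** 2

rng = np.random.default_rng(1)

# Synthetic stand-ins for bank-face DEMs (metres) from two survey epochs;
# the second surface mostly retreats, mimicking bank erosion.
dem_t0 = rng.normal(0.0, 0.02, size=(70, 260))
dem_t1 = dem_t0 - np.abs(rng.normal(0.01, 0.005, size=dem_t0.shape))

dz = dem_t1 - dem_t0                        # elevation change between surveys
erosion = -dz[dz < 0].sum() * cell_area     # volume lost (m^3)
deposition = dz[dz > 0].sum() * cell_area   # volume gained (m^3)
print(f"erosion ~ {erosion:.3f} m^3, deposition ~ {deposition:.3f} m^3")
```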

  10. Bispectral methods of signal processing applications in radar, telecommunications and digital image restoration

    CERN Document Server

    Totsky, Alexander V; Kravchenko, Victor F

    2015-01-01

    By studying applications in radar, telecommunications and digital image restoration, this monograph discusses signal processing techniques based on bispectral methods. Improved robustness against different forms of noise as well as preservation of phase information render this method a valuable alternative to common power-spectrum analysis used in radar object recognition, digital wireless communications, and jitter removal in images.
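
    A minimal sketch of the quantity these methods are built on: a direct, FFT-based estimate of the bispectrum B(f1, f2) = E[X(f1) X(f2) X*(f1+f2)], averaged over signal segments. The segment length and the phase-coupled test signal are illustrative choices, not taken from the monograph.

```python
import numpy as np

def bispectrum(x, nfft=128):
    """Direct bispectrum estimate averaged over non-overlapping segments."""
    segments = len(x) // nfft
    B = np.zeros((nfft // 2, nfft // 2), dtype=complex)
    for k in range(segments):
        X = np.fft.fft(x[k * nfft:(k + 1) * nfft])
        for f1 in range(nfft // 2):
            for f2 in range(nfft // 2):
                B[f1, f2] += X[f1] * X[f2] * np.conj(X[(f1 + f2) % nfft])
    return B / segments

# Test signal with quadratic phase coupling (f3 = f1 + f2) plus noise.
rng = np.random.default_rng(0)
t = np.arange(4096)
x = (np.cos(2 * np.pi * 0.12 * t) + np.cos(2 * np.pi * 0.18 * t)
     + np.cos(2 * np.pi * 0.30 * t) + 0.5 * rng.standard_normal(t.size))

B = bispectrum(x)
print("peak bispectrum magnitude:", float(np.abs(B).max()))
```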

  11. Accelerator and transport line survey and alignment

    International Nuclear Information System (INIS)

    Ruland, R.E.

    1991-10-01

    This paper summarizes the survey and alignment processes of accelerators and transport lines and discusses the propagation of errors associated with these processes. The major geodetic principles governing the survey and alignment measurement space are introduced and their relationship to a lattice coordinate system shown. The paper continues with a broad overview about the activities involved in the step sequence from initial absolute alignment to final smoothing. Emphasis is given to the relative alignment of components, in particular to the importance of incorporating methods to remove residual systematic effects in surveying and alignment operations. Various approaches to smoothing used at major laboratories are discussed. 47 refs., 19 figs., 1 tab

  12. Commercial off-the-shelf software dedication process based on the commercial grade survey of supplier

    International Nuclear Information System (INIS)

    Kim, J. Y.; Lee, J. S.; Chon, S. W.; Lee, G. Y.; Park, J. K.

    2000-01-01

    The Commercial Off-The-Shelf (COTS) software dedication process can apply a combination of methods, like the hardware commercial grade item dedication process. In general, these methods are: method 1 (special test and inspection), method 2 (commercial grade survey of supplier), method 3 (source verification), and method 4 (acceptable supplier/item performance record). In this paper, the suggested procedure-oriented dedication process, based on method 2 for COTS software, is consistent with EPRI TR-106439 and NUREG/CR-6421 requirements. An additional tailoring policy based on codes and standards related to COTS software may also be found in the suggested commercial software dedication process. The suggested commercial software dedication process has been developed for commercial I and C software dedicators who perform COTS qualification according to the dedication procedure.

  13. Bibliographical survey of radiostrontium uptake capacity and processes in aquatic plants

    International Nuclear Information System (INIS)

    Pally, M.; Foulquier, L.

    1983-09-01

    This report covers 302 articles published between 1949 and 1980 on the contamination of freshwater and marine aquatic plants by radioactive strontium. For the marine and continental environments, the results of laboratory experiments on the dynamics of radiostrontium buildup and localization, concentration factors, elimination processes, the effects of biological factors and of the environment, the activity levels and concentration factors measured in areas directly and indirectly affected by waste discharges, discrimination factors and the role of plants as radiation indicators, are examined. The radioactive strontium uptake potentials are higher for freshwater plants -especially mosses and characeae- than for marine plants. In zones not directly affected by waste discharges, the maximum activity measured is 82 pCi/kg wet weight, compared with 750 pCi/kg for freshwater plants. The peak values were observed in 1964-1965. In zones directly affected by waste discharges, the activity levels range from 15 to 1700 pCi of 90 Sr per kilogram of wet weight in the marine environment, and from 20 to 207000 pCi/kg in fresh water. This work underlines the need for greater accuracy in allowing for the ecological characteristics of each site when assessing the impact of nuclear facilities, and for thoroughly correlating field observations with laboratory experiments in order to obtain a prospective view of the potentials for radioactive strontium uptake by plants according to the activity levels present in the liquid effluents [fr

  14. Advances in the biometric recognition methods: a survey on iris and fingerprint recognition

    Science.gov (United States)

    Zaeri, Naser; Alkoot, Fuad

    2010-02-01

    Human recognition based on biometrics finds many important applications in many life sectors, in particular in commerce and law enforcement. This paper aims to give a general overview of the advances in biometric recognition methods. We concentrate on the main methods and accessible ideas presented for human recognition systems based on two types of biometrics: iris and fingerprint. We present a quick overview of the landmark papers that laid the foundation in each track, and then present the latest updates and important turns and solutions developed in each track in the last few years.

  15. A SURVEY ON TIDAL ANALYSIS AND FORECASTING METHODS FOR TSUNAMI DETECTION

    Directory of Open Access Journals (Sweden)

    Diego Reforgiato Recupero

    2014-01-01

    Full Text Available Accurate analysis and forecasting of tidal level are very important tasks for human activities in oceanic and coastal areas. They can be crucial in catastrophic situations such as the occurrence of tsunamis, in order to provide rapid alerts to the populations involved and to save lives. Conventional tidal forecasting methods are based on harmonic analysis using the least squares method to determine harmonic parameters. However, a large number of parameters and long-term measured data are required for precise tidal level predictions with harmonic analysis. Furthermore, traditional harmonic methods rely on models based on the analysis of astronomical components and they can be inadequate when the contribution of non-astronomical components, such as the weather, is significant. Other alternative approaches have been developed in the literature in order to deal with these situations and provide predictions with the desired accuracy, with respect also to the length of the available tidal record. These methods include standard high- or band-pass filtering techniques, although the relatively deterministic character and large amplitude of tidal signals make special techniques, like artificial neural networks and wavelet transform analysis methods, more effective. This paper is intended to provide the communities of both researchers and practitioners with a broadly applicable, up-to-date coverage of tidal analysis and forecasting methodologies that have proven to be successful in a variety of circumstances, and that hold particular promise for success in the future. Classical and novel methods are reviewed in a systematic and consistent way, outlining their main concepts and components, similarities and differences, advantages and disadvantages.
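
    A minimal sketch of the conventional harmonic-analysis step described above: amplitudes and phases of known constituents are fitted to a water-level record by least squares. Only two constituents (M2, S2) and a synthetic hourly record are used here for illustration.

```python
import numpy as np

# Synthetic hourly water-level record containing M2 and S2 plus noise.
hours = np.arange(24 * 30, dtype=float)
freqs = {"M2": 1.0 / 12.4206, "S2": 1.0 / 12.0}   # cycles per hour
rng = np.random.default_rng(0)
level = (1.2 * np.cos(2 * np.pi * freqs["M2"] * hours - 0.7)
         + 0.4 * np.cos(2 * np.pi * freqs["S2"] * hours - 1.9)
         + 0.05 * rng.standard_normal(hours.size))

# Least-squares design matrix: mean level plus a cos/sin pair per constituent.
columns = [np.ones_like(hours)]
for f in freqs.values():
    columns += [np.cos(2 * np.pi * f * hours), np.sin(2 * np.pi * f * hours)]
A = np.column_stack(columns)

coeffs, *_ = np.linalg.lstsq(A, level, rcond=None)
for i, name in enumerate(freqs):
    a, b = coeffs[1 + 2 * i], coeffs[2 + 2 * i]
    print(f"{name}: amplitude {np.hypot(a, b):.2f} m, "
          f"phase {np.arctan2(b, a):.2f} rad")
```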

  16. A method for energy optimization and product quality improvement in manufacturing processes

    Energy Technology Data Exchange (ETDEWEB)

    Abou Khalil, Bachir; Berthou, Marc; Perrotin, Thomas [EDF R and D Les Renardieres Departement Eco-efficacite et Procedes Industriels (France); Clodic, Denis [Ecole des Mines de Paris, Centre Energetique et Procedes (France)

    2007-07-01

    Manufacturing processes are diverse by nature. Consequently, the energy efficiency of each process requires a specific analysis, which leads to significant costs that SMEs (Small and Medium-sized Enterprises) cannot always sustain. The present paper introduces a method for energy optimisation of manufacturing processes and product quality improvement during manufacturing. This innovative method is based on a 4-step analysis that allows the potential energy savings in industrial processes to be rapidly identified. The 4-step method consists in: (1) the process definition, (2) the analysis of the detailed synopsis of production lines (considering energy and mass fluxes), (3) the analysis of energy consumptions and production volumes, and (4) energy efficiency assessment by identification of energy savings and technical option proposals. The first step is based on the preliminary expertise of the considered process. Process efficiency is analysed based on the theoretical minimum energy requirement, leading to the identification of the best available technologies (BATs) for the considered process and the optimal energy efficiency. For the second and third steps, specific data of the process are collected. The process energy consumption and the production information are obtained from the production manager. When needed, measurements can be performed to complete the energy and mass balances. The actual energy efficiency of the manufacturing process is then calculated. The fourth step consists in the comparison of the different energy ratios. Based on these ratios and on the knowledge of the actual process, energy savings are evaluated, technical solutions for energy efficiency improvement are proposed and a first economic analysis is carried out.

  17. Prevalence of coronary artery disease and coronary risk factors in Kerala, South India: A population survey – Design and methods

    Directory of Open Access Journals (Sweden)

    Geevar Zachariah

    2013-05-01

    Methods: The study design was a cross-sectional population survey. We estimated the sample size based on an anticipated prevalence of CAD of 7.4% for rural and 11% for urban Kerala. The derived sample sizes for the rural and urban areas were 3000 and 2400, respectively. The urban sampling areas constituted one ward each from three municipal corporations in different parts of the state. The rural sample was drawn from two panchayats each in the same districts as the urban sample. One adult in the age group of 20–59 years was selected from each household using the Kish method. All subjects between 60 and 79 years were included from each household. A detailed questionnaire was administered to assess the risk factors, history of CAD, family history, educational status, socioeconomic status, dietary habits, physical activity and treatment for CAD; anthropometric measurements, blood pressure, electrocardiogram and fasting blood levels of glucose and lipids were recorded.
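
    A minimal sketch of the standard sample-size calculation for a prevalence survey, n = z^2 p(1-p)/d^2, of the kind that sits behind figures like those quoted above. The precision, design effect and non-response allowance used here are illustrative assumptions, not the study's actual planning parameters.

```python
from math import ceil

def prevalence_sample_size(p, d, z=1.96, design_effect=1.0, nonresponse=0.0):
    """Sample size to estimate a prevalence p within +/- d (absolute precision)."""
    n = (z ** 2) * p * (1.0 - p) / d ** 2
    n *= design_effect
    return ceil(n / (1.0 - nonresponse))

# Anticipated CAD prevalences of 7.4% (rural) and 11% (urban); the precision,
# design effect and non-response figures below are assumptions for illustration.
print("rural :", prevalence_sample_size(0.074, 0.012, design_effect=1.5, nonresponse=0.10))
print("urban :", prevalence_sample_size(0.110, 0.015, design_effect=1.5, nonresponse=0.10))
```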

  18. Condition Assessment Survey (CAS) Program. Deficiency standards and inspections methods manual: Volume 3, 0.03 Superstructure

    Energy Technology Data Exchange (ETDEWEB)

    1993-05-01

    General information is presented on asset determinant factor/CAS profile codes/CAS cost process; guide sheet tool & material listing; testing methods; inspection frequency; standard system design life tables; system work breakdown structure; and general system/material data. Deficiency standards and inspection methods are presented for beams; pre-engineered building systems; floors; roof structure; stairs; and fireproofing.

  19. METHODS IN THE POST-METHODS ERA. REPORT ON AN INTERNATIONAL SURVEY ON LANGUAGE TEACHING METHODS'

    Directory of Open Access Journals (Sweden)

    Jun Liu

    2004-06-01

    Full Text Available Do methods still have a place in 21st century language teaching? To answer this question, an international survey was conducted in the summer of 1999. A sample of 800 language teachers world-wide, randomly drawn from 17,800 TESOLers, were each given a 2-page survey. The return rate was 58.5%, with an actual usable data set of 448, which was analyzed using both descriptive and inferential statistics. Among the ten commonly recognized teaching methods surveyed, both the Communicative Language Teaching Approach and an eclectic method seem to have the highest rates of familiarity, preference, and use. But when multiple factors, such as teaching contexts, instructional settings, learners' proficiency levels, class size, teaching experience and educational backgrounds of the teachers, and the status of being a native or nonnative English speaking professional were taken into consideration, various patterns and themes emerged. One interesting finding is that Grammar Translation is still used in EFL contexts, in larger classes, and with learners at low proficiency levels, though the actual use of this method does not match teachers' stated preference for it. Based on the results of the survey, a new theoretical framework is proposed to conceptualize language teaching methods in the post-methods era.

  20. Application of remote sensing methods and GIS in erosive process investigations

    Directory of Open Access Journals (Sweden)

    Mustafić Sanja

    2007-01-01

    Full Text Available Modern geomorphological investigations of the condition and changing intensity of erosive processes should be based on the application of remote sensing methods, that is, on the processing of aerial and satellite photographs. The use of these methods is very important because it offers good possibilities for establishing the regional relations of the investigated phenomenon, as well as for estimating the spatial and temporal variability of all the physical-geographical and anthropogenic factors influencing the given process. Understanding the process of land erosion as a whole is only possible by creating a universal database and by using appropriate software, in other words by establishing a unified information system. A geographical information system, as the most effective, most complex and most integral system of information about space, enables the unification as well as the analytical and synthetic processing of all data.

  1. Detection of wood failure by image processing method: influence of algorithm, adhesive and wood species

    Science.gov (United States)

    Lanying Lin; Sheng He; Feng Fu; Xiping Wang

    2015-01-01

    Wood failure percentage (WFP) is an important index for evaluating the bond strength of plywood. Currently, the method used for detecting WFP is visual inspection, which lacks efficiency. In order to improve it, image processing methods are applied to wood failure detection. The present study used thresholding and K-means clustering algorithms in wood failure detection...
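
    A minimal sketch of how a K-means step of this kind can be used to estimate the wood failure percentage: cluster pixel intensities into two groups and report the area fraction of the brighter (torn-wood) cluster. The synthetic image below stands in for a real photograph of a sheared bond surface, and the two-cluster intensity model is an assumption.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Synthetic grayscale image of a sheared bond surface: darker adhesive regions
# and brighter torn-wood regions (a stand-in for a real photograph).
adhesive = rng.normal(80, 10, size=(120, 200))
wood = rng.normal(180, 12, size=(120, 200))
true_mask = rng.random((120, 200)) < 0.35
image = np.where(true_mask, wood, adhesive).clip(0, 255)

# K-means with two clusters on pixel intensity alone.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
    image.reshape(-1, 1))

# Treat the brighter cluster as wood failure and report its area fraction.
cluster_means = [image.reshape(-1)[labels == k].mean() for k in (0, 1)]
wood_cluster = int(np.argmax(cluster_means))
wfp = 100.0 * np.mean(labels == wood_cluster)
print(f"estimated wood failure percentage: {wfp:.1f}%")
```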

  2. Contribution to the experimental survey of the nuclear isomerism. Application of the deferred coincidences method to research and to the survey of metastable states of short period

    International Nuclear Information System (INIS)

    Ballini, R.

    1953-06-01

    Various methods of physics have brought much information on the nuclear species of which ponderable quantities can be prepared, which is the case for the stable elements and for some unstable elements, such as most natural radioelements. On the other hand, species with shorter lifetimes, and notably those brought to an excited state, are less well known, and information about them can only be obtained through the properties of the transitions they undergo when they give birth to better-known nuclear species; among these transitions are the isomeric transitions. The goal of this work is the study of isomeric transitions from metastable states of short period, in the range from the microsecond to a few milliseconds. The method of deferred coincidences was developed and applied to this end, taking advantage of multichannel selectors, in two main configurations in which the multichannel device was either a time selector or an amplitude selector. This method served to study the operation of Geiger-Muller counters and to measure with precision the period of 181Ta* under varied experimental conditions. The adopted value is 17.2 ± 0.2 μs. This work also found an immediate practical application in the detection of very small quantities of hafnium in zirconium, in which hafnium is a tenacious impurity that is difficult to analyze by ordinary means. (M.B.) [fr

  3. Department of Defense Financial Management Education and Training Programs: A Survey of Quality Assurance Methods

    Science.gov (United States)

    1992-06-01

    transformed into outputs. [Ref. 16: pp. 64-65] Stufflebeam uses a similar dichotomy to describe the evaluation process. He divides evaluation into...

  4. A Survey on Sensor Coverage and Visual Data Capturing/Processing/Transmission in Wireless Visual Sensor Networks

    Directory of Open Access Journals (Sweden)

    Florence G. H. Yap

    2014-02-01

    Full Text Available Wireless Visual Sensor Networks (WVSNs), where camera-equipped sensor nodes can capture, process and transmit image/video information, have become an important new research area. As compared to traditional wireless sensor networks (WSNs), which can only transmit scalar information (e.g., temperature), the visual data in WVSNs enable much wider applications, such as visual security surveillance and visual wildlife monitoring. However, as compared to the scalar data in WSNs, visual data are much bigger and more complicated, so intelligent schemes are required to capture, process and transmit visual data in resource-limited WVSNs (limited hardware capability and bandwidth). WVSNs introduce new multi-disciplinary research opportunities in topics that include visual sensor hardware, image and multimedia capture and processing, and wireless communication and networking. In this paper, we survey existing research efforts on visual sensor hardware, visual sensor coverage/deployment, and visual data capture/processing/transmission issues in WVSNs. We conclude that WVSN research is still at an early stage and there are still many open issues that have not been fully addressed. More novel multi-disciplinary, cross-layered, distributed and collaborative solutions should be devised to tackle these challenging issues in WVSNs.

  5. Risk of psychological ill health and methods of organisational downsizing: a cross-sectional survey in four European countries

    Directory of Open Access Journals (Sweden)

    Elena Andreeva

    2017-09-01

    Full Text Available Abstract Background The manner in which organizational downsizing is implemented can make a substantial difference as to whether the exposed workers will suffer from psychological ill health. Surprisingly, little research has directly investigated this issue. We examined the likelihood of psychological ill health associated with strategic and reactive downsizing. Methods A cross-sectional survey included 1456 respondents from France, Sweden, Hungary and the United Kingdom: 681 employees in stable workplaces (reference group) and 775 workers from downsized companies. Reactive downsizing was exemplified by the exposures to compulsory redundancies of medium to large scale resulting in job loss or surviving a layoff while staying employed in downsized organizations. The workforce exposed to strategic downsizing was represented by surplus employees who were internally redeployed and supported through their career change process within a policy context of "no compulsory redundancy". Symptoms of anxiety, depression and emotional exhaustion were assessed in telephone interviews with brief subscales from the Hospital Anxiety Scale (HADS-A), Hopkins Symptom Checklist (SCL-CD6) and Maslach Burnout Inventory (MBI-GS). Data were analyzed using logistic regression. Results We observed no increased risk of psychological ill health in the case of strategic downsizing. The number of significant associations with psychological ill health was the largest for the large-scale reactive downsizing: surviving a layoff was consistently associated with all three outcome measures; returning to work after the job loss experience was related to anxiety and depression, while persons still unemployed at interview had elevated odds of anxiety. After reactive medium-scale downsizing, unemployment at interview was the only exposure associated with anxiety and depression. Conclusions The manner in which organizational downsizing is implemented can be important for the psychological

  6. Strategic planning, implementation, and evaluation processes in hospital systems: a survey from Iran.

    Science.gov (United States)

    Sadeghifar, Jamil; Jafari, Mehdi; Tofighi, Shahram; Ravaghi, Hamid; Maleki, Mohammad Reza

    2014-09-28

    Strategic planning has been presented as an important management practice. However, evidence of its deployment in healthcare systems in low-income and middle-income countries (LMICs) is limited. This study investigated the strategic management process in Iranian hospitals. The study was conducted in 24 teaching hospitals in Tehran, Iran, from September 2012 to March 2013. The data collection instrument was a questionnaire comprising 130 items, which measured the status of formulation, implementation, and evaluation of the strategic plan, as well as its requirements, facilitators, and benefits in the studied hospitals. All the investigated hospitals had a strategic plan. The percentages obtained for the items "the rate of compliance with requirements" and "the quantity of planning facilitators" (68.75%), attention to stakeholder participation in planning (55.74%), attention to the planning components (62.22%), the status of evaluating the strategic plan (59.94%) and the benefits of strategic planning for hospitals (65.15%) were at a medium level. However, the status of implementation of the strategic plan (53.71%) was found to be weak. Significant statistical correlations were observed between the incentive for developing a strategic plan and the status of the evaluation phase (P=0.04), and between the status of the implementation phase and having a documented strategic plan (P=0.03). According to the results, it seems that the absence of appropriate internal incentives for formulating and implementing strategies led hospitals to start strategic planning mainly in accordance with the legal requirements of the Ministry of Health. Consequently, even though all the investigated hospitals had documented strategic plans, the plans have not been implemented efficiently and valid evaluation of results is yet to be achieved.

  7. Influence of Processing Method on the Mechanical and Electrical Properties of MWCNT/PET Composites

    Directory of Open Access Journals (Sweden)

    O. Rodríguez-Uicab

    2013-01-01

    Full Text Available Multiwalled carbon nanotube (MWCNT)/polyethylene terephthalate (PET) composites were prepared by three processing methods: direct extrusion (DE), melt compounding followed by extrusion (MCE), and dispersion of the MWCNTs in a solvent by sonication followed by extrusion (SSE). The mechanical properties of the MWCNT/PET composites processed by MCE increased with 0.1 wt% MWCNTs with respect to the neat PET. The electrical percolation threshold of MWCNT/PET composites processed by DE and MCE was ~1 wt% and the conductivity was higher for composites processed by MCE. Raman spectroscopy and scanning electron microscopy showed that mixing the MWCNTs by melt compounding before extruding yields better dispersion of the MWCNTs within the PET matrix. The processing method assisted by a solvent resulted in matrix plasticization.

  8. Canadian National Survey on Sun Exposure & Protective Behaviours: methods.

    Science.gov (United States)

    Lovato, C Y; Shoveller, J A; Peters, L; Rivers, J K

    1998-06-01

    This article describes the methods used for the 1996 Canadian National Survey on Sun Exposure & Protective Behaviours. A 55-item random-digit-dialling telephone household survey of people 15 years of age or more was completed in 1996. Items assessed were daily sun exposure and protective behaviours, as well as other sun-related behaviours and attitudes. Data were collected regarding sun-related behaviours during leisure, work time and winter holidays, as well as for children 12 years of age or less (as reported by parents). To test for an effect on the survey response rate, a letter of introduction was sent to 40% of the households. The survey response rate was 69% (4023 successfully completed surveys out of 5847 households included in the sample). The response rate achieved in the subset that received the introductory letter was 75%. This survey is the first to establish national population estimates for sun exposure and protective behaviours in Canada.

  9. A survey of methods and practices used to stop digit sucking in 2-5 ...

    African Journals Online (AJOL)

    Conclusion: The findings from this study show that the most common method to stop digit sucking habit was the use of adhesive plaster. Negative practices included the use of razor blade to cut the digits and the application of bitter or peppery tasting substances. The use of appliances was not common and many mothers ...

  10. A conceptual guide to detection probability for point counts and other count-based survey methods

    Science.gov (United States)

    D. Archibald McCallum

    2005-01-01

    Accurate and precise estimates of numbers of animals are vitally needed both to assess population status and to evaluate management decisions. Various methods exist for counting birds, but most of those used with territorial landbirds yield only indices, not true estimates of population size. The need for valid density estimates has spawned a number of models for...

  11. FROM “MODELS” TO “REALITY”, AND RETURN. SOME REFLECTIONS ON THE INTERACTION BETWEEN SURVEY AND INTERPRETATIVE METHODS FOR BUILT HERITAGE CONSERVATION

    Directory of Open Access Journals (Sweden)

    F. Ottoni

    2017-05-01

    Full Text Available It is well known that increasingly accurate methodologies and automatic tools are now available in the field of geometric survey and image processing, and that they constitute a fundamental instrument for cultural heritage knowledge and preservation; on the other side, very refined and precise numerical models are continuously improved and used in order to simulate the mechanical behaviour of masonry structures: both instruments and technologies are an important part of a global process of knowledge which is at the base of any conservation project for cultural heritage. Despite the high accuracy and automation level reached by both technologies and programs, the transfer of data between them is not an easy task, and defining the most reliable way to translate and exchange information without data loss is still an open issue. The goal of the present paper is to analyse the complex process of translation from the very precise (and sometimes redundant) information obtainable by modern survey methodologies for historic buildings (such as laser scanning) into the very simplified (perhaps too simplified) schemes used to understand their real structural behaviour, with the final aim of contributing to the discussion on reliable methods for improving cultural heritage knowledge through empiricism.

  12. CESAR cost-efficient methods and processes for safety-relevant embedded systems

    CERN Document Server

    Wahl, Thomas

    2013-01-01

    The book summarizes the findings and contributions of the European ARTEMIS project CESAR for improving and enabling interoperability of methods, tools, and processes to meet the demands in embedded systems development across four domains - avionics, automotive, automation, and rail. The contributions give insight into an improved engineering and safety process life-cycle for the development of safety-critical systems. They present a new concept of an engineering tool integration platform to improve the development of safety-critical embedded systems and illustrate the capacity of this framework for end-user instantiation to specific domain needs and processes. They also advance the state of the art in component-based development as well as component and system validation and verification, with tool support. Finally, they describe industry-relevant evaluated processes and methods especially designed for the embedded systems sector, as well as easily adoptable common interoperability principles for software tool integratio...

  13. DoD Information Assurance Certification and Accreditation Process (DIACAP) Survey and Decision Tree

    Science.gov (United States)

    2011-07-01

    CVC: Compliance and Validation Certification; DAA: designated accrediting authority; DATO: denial of authorization to operate; DIACAP: DoD Information Assurance Certification and Accreditation Process. The Interim DIACAP was signed 6 July 2006.

  14. A novel method for detecting and counting overlapping tracks in SSNTD by image processing techniques

    International Nuclear Information System (INIS)

    Ab Azar, N.; Babakhani, A.; Broumandnia, A.; Sepanloo, K.

    2016-01-01

    Overlapping object detection and counting is a challenge in image processing. A new method for detecting and counting overlapping circles is presented in this paper. This method is based on pattern recognition and feature extraction using "neighborhood values" in an object image by implementation of image processing techniques. The junction points are detected by assigning a value to each pixel in an image. As is shown, the neighborhood values for junction points are larger than the values for other points. This distinction of neighborhood values is the main feature that can be utilized to identify the junction points and to count the overlapping tracks. This method can be used for recognizing and counting charged particle tracks, blood cells and also cancer cells. The method is called "Track Counting based on Neighborhood Values" and is symbolized by "TCNV". - Highlights: • A new method is introduced to recognize nuclear tracks by image processing. • The method is used to identify neighborhood pixels at junction points in overlapping tracks. • Enhanced method of counting overlapping tracks. • The new counting system has linear behavior in counting tracks with densities of less than 300,000 tracks per cm². • In the new method, overlapping tracks can be recognized even when 10 or more tracks overlap.
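
    A minimal sketch of the neighborhood-value idea on a synthetic image: the binary track image is thinned to a skeleton, each skeleton pixel is scored by its number of 8-connected skeleton neighbors, and pixels scoring more than two are flagged as candidate junction points where tracks overlap. It uses scikit-image and SciPy on drawn rings, not the SSNTD data, and the junction threshold is an assumption.

```python
import numpy as np
from scipy.ndimage import convolve
from skimage.draw import circle_perimeter
from skimage.morphology import binary_dilation, skeletonize

# Synthetic binary image: two overlapping ring-shaped tracks.
img = np.zeros((200, 200), dtype=bool)
for r0, c0 in ((90, 90), (110, 118)):
    rr, cc = circle_perimeter(r0, c0, 35, shape=img.shape)
    img[rr, cc] = True
img = binary_dilation(img)  # give the rings a small thickness

# Thin to a one-pixel skeleton and compute the "neighborhood value" per pixel.
skeleton = skeletonize(img)
kernel = np.array([[1, 1, 1],
                   [1, 0, 1],
                   [1, 1, 1]])
neighborhood_value = convolve(skeleton.astype(int), kernel, mode="constant")

# Skeleton pixels with more than two neighbors are candidate junction points.
junctions = skeleton & (neighborhood_value > 2)
print("candidate junction pixels:", int(junctions.sum()))
```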

  15. The Oil Point Method - A tool for indicative environmental evaluation in material and process selection

    DEFF Research Database (Denmark)

    Bey, Niki

    2000-01-01

    … of environmental evaluation and only approximate information about the product and its life cycle. This dissertation addresses this challenge in presenting a method which is tailored to these requirements of designers - the Oil Point Method (OPM). In providing environmental key information and confining itself to three essential assessment steps, the method enables rough environmental evaluations and supports in this way material- and process-related decision-making in the early stages of design. In its overall structure, the Oil Point Method is related to Life Cycle Assessment - except for two main differences...

  16. Criteria for method selection and process design of a technical development strategy

    CSIR Research Space (South Africa)

    Joseph-Malherbe, S

    2012-08-01

    Full Text Available They often have charismatic leadership and manage the organization through selection and indoctrination of members. The organization is basically decentralized but has powerful centralized controls. • Political: This is not so much an organization... that is used to structure, plan and control the process of developing a system. It is a collection of related processes, methods (techniques) and tools. Each methodology has its strengths and weaknesses and no single methodology is necessarily suitable...

  17. Underground processing method for radiation-contaminated material and transferring method for buffer molding material

    International Nuclear Information System (INIS)

    Akasaka, Hidenari; Shimura, Satoshi; Asano, Eiichi; Yamagata, Junji; Ninomiya, Nobuo; Kawakami, Susumu.

    1995-01-01

    A bottomed molding material (buffer molding material) is formed into a bottomed cylindrical shape by solidifying powders such as bentonite under pressure into a highly dense state by cold isostatic pressing or the like, with a hole for accepting and containing a vessel for radiation-contaminated materials. The bottomed cylindrical molding material is loaded on a transfer vessel and transferred to a position near the site for underground disposal. The bottomed cylindrical molding material, with its containing hole facing upward, is buried in the disposal cavern. The container for radiation-contaminated material is loaded into the containing hole of the bottomed cylindrical molding material. A further container for radiation-contaminated materials is juxtaposed above it. Then, a bottomed cylindrical molding material with its containing hole facing downward is placed as a cap over the upwardly protruding container for radiation-contaminated material. The radiation-contaminated material is thus enclosed around its circumference by a buffer material of the same composition. (I.N.)

  18. The uranium waste fluid processing examination by liquid and liquid extraction method using the emulsion flow method

    International Nuclear Information System (INIS)

    Kanda, Nobuhiro; Daiten, Masaki; Endo, Yuji; Yoshida, Hideaki; Mita, Yutaka; Naganawa, Hirochika; Nagano, Tetsushi; Yanase, Nobuyuki

    2015-03-01

    Spent centrifuges that were used for the development of uranium enrichment technology are stored in the uranium enrichment facility at the Ningyo-toge Environmental Center, Japan Atomic Energy Agency (JAEA). In our centrifuge processing technology, radioactive material adhering to the surfaces of the inner parts of the centrifuges is separated by a wet decontamination method using an ultrasonic bath filled with dilute sulfuric acid and water, and a neutralization sediment (sludge) is generated when the radioactive waste fluid from the decontamination is processed. JAEA has been considering whether the sludge processing can be streamlined and reduced by lowering the radioactive concentration of the sludge through removal of uranium from the radioactive waste fluid. As part of these considerations, JAEA has been developing uranium extraction and separation using the emulsion flow extraction method (a concept proposed by the JAEA Nuclear Science and Engineering Center), in close coordination and cooperation between the Nuclear Science and Engineering Center and the Ningyo-toge Environmental Center, since fiscal year 2007. This report describes the outline of an application test, using actual waste fluid of dilute sulfuric acid and water, with an examination system that implements the emulsion flow extraction method. (author)

  19. Mapping Norway - A Method to Register and Survey the Status of Accessibility.

    Science.gov (United States)

    Bögelsack, Kathrin; Michaelis, Sven

    2016-01-01

    The Norwegian mapping authority has developed an app and a standard method for mapping accessibility for people with limited or no walking ability, the blind and the partially sighted in urban and recreational areas. We chose an object-oriented approach in which points, lines and polygons represent objects in the environment. All data are stored in a geospatial database, so they can be presented as a web map and analysed using GIS software. To date, more than 160 municipalities have been mapped using this method. The aim of the project is to establish a national standard for mapping and to provide a geodatabase that shows the status of accessibility throughout Norway. The data provide a useful tool for national statistics, local planning authorities and private users. First results show that accessibility is low and that Norway still faces many challenges in meeting the government's goals for Universal Design.

  20. Survey of insect fauna from plants medicinal, aromatic and seasoning and disinfestation by the process of radiation

    International Nuclear Information System (INIS)

    Reis, Fabricio Caldeira

    2013-01-01

    The present study aimed to survey the insect fauna associated with dehydrated medicinal, aromatic and seasoning plants traded in Sao Paulo city, to apply different doses of gamma radiation with the aim of disinfesting the material, and to determine the lethal dose of gamma radiation for Sphaericus gibboides. From April to May 2011, the following sample materials were collected in 10 establishments: Melissa officinalis L. (lemon balm), Mentha piperita L. (mint), Ocimum basilicum L. (basil), Origanum vulgare L. (oregano), Rosmarinus officinalis L. (rosemary), Thymus vulgaris L. (thyme), Senna alexandrina Mill (senna), Coriandrum sativum L. (coriander), Petroselinum crispum (Mill.) Fuss (parsley), Pimpinella anisum L. (anise), Baccharis trimera (Less.) DC. (carqueja), Chamomilla recutita L. (= M. recutita L.) (chamomile), Laurus nobilis L. (bay laurel) (Lauraceae), Capsicum annuum L. (sweet paprika), Bixa orellana L. (annatto) (Bixaceae) and Peumus boldus Molina (boldo). The first screening showed that none of the tested materials contained adult insects. After 45 days, 940 adult insects and larvae emerged from eggs were found. Among the substrates analyzed, Chamomilla recutita showed the highest rate of infestation, at 70.6%. Peumus boldus, Laurus nobilis, Chamomilla recutita and Capsicum annuum had the highest species diversity. Baccharis trimera, Bixa orellana, Melissa officinalis, Origanum vulgare and Coriandrum sativum showed no infestation. Lasioderma serricorne was the insect species with the largest number of individuals found (936), the highest percentage of infestation across the different materials (62.5%) and lots, and the highest occurrence (68.75%) among materials (M. piperita, S. alexandrina, P. anisum, Chamomilla recutita, P. crispum, L. nobilis, C. sativum, C. annuum, O. basilicum, P. boldus and T. vulgaris). The following materials were selected for testing disinfestation by the irradiation process: Bixa orellana, Capsicum annuum, Cassia angustifolia, Coriandrum sativum, Mentha

  1. Analyses of Methods and Algorithms for Modelling and Optimization of Biotechnological Processes

    Directory of Open Access Journals (Sweden)

    Stoyan Stoyanov

    2009-08-01

    Full Text Available A review of the problems in modeling, optimization and control of biotechnological processes and systems is given in this paper. An analysis of existing and some new practical optimization methods for searching for a global optimum, based on various advanced strategies - heuristic, stochastic, genetic and combined - is presented in the paper. Methods based on sensitivity theory, and stochastic and mixed strategies for optimization with partial knowledge about kinetic, technical and economic parameters in optimization problems, are discussed. Several approaches for multi-criteria optimization tasks are analyzed. The problems concerning optimal control of biotechnological systems are also discussed.
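
    As an illustration of the stochastic global-search strategies surveyed above, the sketch below applies a simple (1+1) evolution strategy to a hypothetical bioprocess yield model; the objective function, bounds and step size are assumptions made for the example, not material from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def yield_model(x):
    # Hypothetical yield as a function of temperature and substrate feed rate.
    temp, feed = x
    return -((temp - 32.0) ** 2) / 20.0 - ((feed - 1.5) ** 2) / 0.5 + 10.0

bounds = np.array([[25.0, 40.0],   # temperature, degC
                   [0.5, 3.0]])    # feed rate, g/L/h

# (1+1) evolution strategy: mutate the current best point and keep improvements.
best = rng.uniform(bounds[:, 0], bounds[:, 1])
best_val = yield_model(best)
for _ in range(2000):
    cand = np.clip(best + rng.normal(0.0, 0.2, size=2), bounds[:, 0], bounds[:, 1])
    val = yield_model(cand)
    if val > best_val:
        best, best_val = cand, val

print("best operating point:", best.round(3), "predicted yield:", round(best_val, 3))
```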

  2. Quality assessment of crude and processed Arecae semen based on colorimeter and HPLC combined with chemometrics methods.

    Science.gov (United States)

    Sun, Meng; Yan, Donghui; Yang, Xiaolu; Xue, Xingyang; Zhou, Sujuan; Liang, Shengwang; Wang, Shumei; Meng, Jiang

    2017-05-01

    Raw Arecae Semen, the seed of Areca catechu L., as well as Arecae Semen Tostum and Arecae semen carbonisata are traditionally processed by stir-baking for subsequent use in a variety of clinical applications. These three Arecae semen types, important Chinese herbal drugs, have been used in China and other Asian countries for thousands of years. In this study, the sensory technologies of a colorimeter and sensitive validated high-performance liquid chromatography with diode array detection were employed to discriminate raw Arecae semen and its processed drugs. The color parameters of the samples were determined by a colorimeter instrument CR-410. Moreover, the fingerprints of the four alkaloids of arecaidine, guvacine, arecoline and guvacoline were surveyed by high-performance liquid chromatography. Subsequently, Student's t test, the analysis of variance, fingerprint similarity analysis, hierarchical cluster analysis, principal component analysis, factor analysis and Pearson's correlation test were performed for final data analysis. The results obtained demonstrated a significant color change characteristic for components in raw Arecae semen and its processed drugs. Crude and processed Arecae semen could be determined based on colorimetry and high-performance liquid chromatography with a diode array detector coupled with chemometrics methods for a comprehensive quality evaluation. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
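
    A minimal sketch of the chemometric steps named in the abstract (principal component analysis and hierarchical cluster analysis), run on synthetic alkaloid peak areas in Python; the numbers and group structure are illustrative assumptions, not the paper's measurements.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# Columns: arecoline, arecaidine, guvacoline, guvacine (peak areas, arbitrary units).
raw      = rng.normal([10.0, 4.0, 3.0, 2.0], 0.4, size=(6, 4))   # raw Arecae semen
tostum   = rng.normal([ 7.0, 5.0, 2.0, 2.5], 0.4, size=(6, 4))   # stir-baked
carbonis = rng.normal([ 3.0, 6.0, 1.0, 3.0], 0.4, size=(6, 4))   # carbonisata
X = StandardScaler().fit_transform(np.vstack([raw, tostum, carbonis]))

scores = PCA(n_components=2).fit_transform(X)                    # PCA scores
clusters = fcluster(linkage(X, method="ward"), t=3, criterion="maxclust")
print(scores[:3].round(2))
print("cluster labels:", clusters)
```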

  3. Estimating interregional economic impacts : an evaluation of nonsurvey, semisurvey, and full-survey methods

    NARCIS (Netherlands)

    Oosterhaven, J; van der Knijff, EC; Eding, GJ

    Literature shows that nonsurvey input-output tables tend to produce regional multipliers with systematic upward biases. This paper explores the related, relatively uncharted territory of nonsurvey versus survey impact studies by means of a series of simulations. The base case is provided by a very
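
    For context, regional multipliers of the kind compared in this study are derived from an input-output coefficient table via the Leontief inverse; the sketch below shows that calculation for a hypothetical 3-sector table (the coefficients are assumptions, not data from the paper).

```python
import numpy as np

# Technical (intermediate-input) coefficients A[i, j]: input from sector i
# required per unit of output of sector j.
A = np.array([[0.10, 0.20, 0.05],
              [0.15, 0.05, 0.25],
              [0.05, 0.10, 0.10]])

leontief_inverse = np.linalg.inv(np.eye(3) - A)
output_multipliers = leontief_inverse.sum(axis=0)   # column sums
print("output multipliers per sector:", output_multipliers.round(3))
```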

  4. Integrated simulation method for interaction between manufacturing process and machine tool

    Science.gov (United States)

    Chen, Wanqun; Huo, Dehong; Xie, Wenkun; Teng, Xiangyu; Zhang, Jiayi

    2016-10-01

    The interaction between the machining process and the machine tool (IMPMT) plays an important role in high-precision component manufacturing. However, most research has focused on the machining process or the machine tool separately, and the interaction between them has always been overlooked. In this paper, a novel simplified method is proposed to realize the simulation of IMPMT by the combined use of the finite element method and the state space method. In this method, the transfer function of the machine tool is built as a small state space. The small state space is obtained from the complicated finite element model of the whole machine tool. Furthermore, the control system of the machine tool is integrated with the transfer function of the machine tool to generate the cutting trajectory. Then, the tool tip response under the cutting force is used to predict the machined surface. Finally, a case study is carried out for a fly-cutting machining process; the dynamic response analysis of an ultra-precision fly-cutting machine tool and the machined surface verify the effectiveness of this method. This research proposes a simplified method to study the IMPMT; the relationships between the machining process and the machine tool are established and the surface generation is obtained.
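
    A minimal sketch of the core idea of representing the machine tool as a small state space and simulating the tool-tip response to a fluctuating cutting force; the single-mode mass, damping and stiffness values are illustrative assumptions, not parameters identified from the finite element model in the paper.

```python
import numpy as np
from scipy import signal

m, c, k = 5.0, 2.0e3, 4.0e7        # modal mass (kg), damping (N s/m), stiffness (N/m)
A = [[0.0, 1.0], [-k / m, -c / m]]
B = [[0.0], [1.0 / m]]
C = [[1.0, 0.0]]                   # output: tool-tip displacement
D = [[0.0]]
tool = signal.StateSpace(A, B, C, D)

t = np.linspace(0.0, 0.02, 2000)
force = 100.0 * (1.0 + 0.3 * np.sin(2 * np.pi * 500 * t))   # fluctuating cutting force (N)
_, tip_displacement, _ = signal.lsim(tool, U=force, T=t)
print("peak tool-tip displacement (um):", round(1e6 * np.abs(tip_displacement).max(), 3))
```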

  5. Evaluation and selection of in-situ leaching mining method using analytic hierarchy process

    International Nuclear Information System (INIS)

    Zhao Heyong; Tan Kaixuan; Liu Huizhen

    2007-01-01

    According to the complicated conditions and main influencing factors of in-situ leaching mining, a model and procedures are established for the evaluation and selection of in-situ leaching mining methods based on the analytic hierarchy process. Taking a uranium mine in Xinjiang, China as an example, the application of this model is presented. The results of the analyses and calculations indicate that acid leaching is the optimal option. (authors)
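
    A minimal AHP sketch showing how criterion weights and a consistency ratio are obtained from a pairwise comparison matrix; the 3 x 3 matrix and the criteria it compares are illustrative assumptions, not the judgments used for the Xinjiang deposit.

```python
import numpy as np

# Pairwise comparisons of, e.g., geological, hydrogeological and economic criteria.
P = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(P)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

n = P.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)        # consistency index
cr = ci / 0.58                              # random index RI = 0.58 for n = 3
print("criterion weights:", weights.round(3), "consistency ratio:", round(cr, 3))
```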

  6. The Selected Method and Tools for Performance Measurement in the Green Supply Chain—Survey Analysis in Poland

    Directory of Open Access Journals (Sweden)

    Blanka Tundys

    2018-02-01

    Full Text Available The methods and tools for performance measurement and evaluation of green supply chain management are very important elements for the construction and functioning of this type of supply chain. The result is a presentation of the considerations underlying a very general model, which presents selected tools but no breakdown by individual industries. The considerations undertaken are important and have scientific added value because, in practice, a very large number of tools are usually used to assess the supply chain, and these are not always correlated or adapted to the specificity of the chain. It is worth pointing out which of the already used or completely new tools and methods will be most useful for assessing the green supply chain. The structure of the paper covers theoretical and empirical parts. It includes an introduction, our goals and hypotheses, the state of the art, methodology, empirical findings, and discussion. We present the definitional differences between green and sustainable supply chains and focus on the selection and identification of methods for a framework model for evaluating the green supply chain. In the next step, the theoretical selection of methods and tools was compared with a survey carried out in Poland. On the basis of the survey, we present the findings and discussion in this area. The main methodology includes a literature review, a survey analysis using a questionnaire and statistical tools. The survey was carried out in 2015 in sample organizations in Poland. The research results showed that organizations were aware of the environmental elements of measuring and assessing the supply chain from an environmental point of view, but their use depended on many factors: the area, the size of the organization, or the industry. If certain boundary conditions are met and the organizations are aware of the essence of environmental aspects in the chain, then they apply green measures to the supply chain. These findings

  7. A Survey on Synthesis Processes of Structured Materials for Biomedical Applications: Iron-based Magnetic Nanoparticles, Polymeric Materials and Polymerization Processes.

    Science.gov (United States)

    Neto, Weslany Silvério; Jensen, Alan Thyago; Ferreira, Gabriella Ribeiro; Valadares, Leonardo Fonseca; Gambetta, Rossano; Gonçalves, Sílvia Belém; Machado, Fabricio

    2015-01-01

    Magnetic materials based on iron oxides are extensively designed for several biomedical applications. Heterogeneous polymerization processes are powerful tools for the production of tailored micro-sized and nanosized magneto-polymeric particles. Although several polymerization processes have been adopted over the years, suspension, emulsion and miniemulsion systems deserve special attention due to their ability to produce spherical polymer particles containing magnetic nanoparticles homogeneously dispersed in thermoplastic polymer matrices. The main objective of this paper is to review the main methods of synthesis of iron-based magnetic nanoparticles and to illustrate how typical polymerization processes in different dispersion media can be successfully used to produce engineered magnetic core-shell structures. The use of suspension, emulsion and miniemulsion polymerization processes is exemplified in order to support the experimental methodologies required for the production of magnetic polymer particles intended for biomedical applications such as intravascular embolization treatments, drug delivery systems and hyperthermia treatment.

  8. Computing the Effects of Strain on Electronic States: A Survey of Methods and Issues

    Science.gov (United States)

    2012-12-01

    covered in this report. In section 6, we show computed comparisons of the different methods using models of GaAs, InAs, and aluminum arsenide (AlAs...constants called the Luttinger parameters. Kane (83) studied the band structure of indium antimonide (InSb), whose electronic band structure resembles...Band Structure of Indium Antimonide . J. of Phys. and Chem. of Solids 1957, 1, 249–261. 84. Vurgaftman, I. J.; Meyer, R.; Ram-Mohan, L. R. Band

  9. Curcumin complexation with cyclodextrins by the autoclave process: Method development and characterization of complex formation.

    Science.gov (United States)

    Hagbani, Turki Al; Nazzal, Sami

    2017-03-30

    One approach to enhance curcumin (CUR) aqueous solubility is to use cyclodextrins (CDs) to form inclusion complexes in which CUR is encapsulated as a guest molecule within the internal cavity of the water-soluble CD. Several methods have been reported for the complexation of CUR with CDs. Limited information, however, is available on the use of the autoclave process (AU) in complex formation. The aims of this work were therefore to (1) investigate and evaluate the AU cycle as a complex formation method to enhance CUR solubility; (2) compare the efficacy of the AU process with the freeze-drying (FD) and evaporation (EV) processes in complex formation; and (3) confirm CUR stability by characterizing CUR:CD complexes by NMR, Raman spectroscopy, DSC, and XRD. Significant differences were found in the saturation solubility of CUR from its complexes with CD when prepared by the three complexation methods. The AU yielded a complex with the expected chemical and physical fingerprints of a CUR:CD inclusion complex that maintained the chemical integrity and stability of CUR and provided the highest solubility of CUR in water. Physical and chemical characterization of the AU complexes confirmed the encapsulation of CUR inside the CD cavity and the transformation of the crystalline CUR:CD inclusion complex into an amorphous form. It was concluded that the autoclave process, with its short processing time, could be used as an alternative and efficient method for drug:CD complexation. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. METHODS AND PRINCIPLES OF OPTIMIZATION SPECIFIC TO THE DOMAIN OF EQUIPMENT AND MANUFACTURING PROCESSES

    Directory of Open Access Journals (Sweden)

    Radu Virgil GRIGORIU

    2011-11-01

    Full Text Available The objectives of industrial product manufacturers are generally oriented toward manufacturing high-quality products in less time and with maximum economic efficiency. These objectives can generally be achieved by optimizing the parameters of the processes and of the technological manufacturing equipment. In order to optimize these parameters it is necessary to apply a series of optimization methods and principles that allow the identification and establishment of the best solution from a variety of alternatives.

  11. Method and apparatus for rapid adjustment of process gas inventory in gaseous diffusion cascades

    International Nuclear Information System (INIS)

    1980-01-01

    A method is specified for the operation of a gaseous diffusion cascade wherein electrically driven compressors circulate a process gas through a plurality of serially connected gaseous diffusion stages to establish first and second countercurrently flowing cascade streams of process gas, one of the streams being at a relatively low pressure and enriched in a component of the process gas and the other being at a higher pressure and depleted in the same, and wherein automatic control systems maintain the stage process gas pressures by positioning process gas flow control valve openings at values which are functions of the difference between reference-signal inputs to the systems, and signal inputs proportional to the process gas pressures in the gaseous diffusion stages associated with the systems, the cascade process gas inventory being altered, while the cascade is operating, by simultaneously directing into separate process-gas freezing zones a plurality of substreams derived from one of the first and second streams at different points along the lengths thereof to solidify approximately equal weights of process gas in the zone while reducing the reference-signal inputs to maintain the positions of the control valves substantially unchanged despite the removal of process gas inventory via the substreams. (author)

  12. A Method for Sustainable Carbon Dioxide Utilization Process Synthesis and Design

    DEFF Research Database (Denmark)

    Frauzem, Rebecca; Fjellerup, Kasper; Roh, Kosan

    for the process synthesis, design and more sustainable design. Using a superstructure-based approach a network of utilization alternatives is created linking CO2 and other raw materials with various products using processing blocks. This will then be optimized and verified for sustainability. Detailed design has...... also been performed for various case studies. These case studies include multiple pathways for the production of methanol and the production of dimethyl carbonate (DMC). From detailed design and analysis, CO2 conversion processes show promise as an additional method for the sustainable reduction of CO2...

  13. Processing the ground vibration signal produced by debris flows: the methods of amplitude and impulses compared

    Science.gov (United States)

    Arattano, M.; Abancó, C.; Coviello, V.; Hürlimann, M.

    2014-12-01

    Ground vibration sensors have been increasingly used and tested, during the last few years, as devices to monitor debris flows, and they have also been proposed as one of the more reliable devices for the design of debris flow warning systems. The need to process the output of ground vibration sensors, to diminish the amount of data to be recorded, is usually due to the reduced storage capabilities and the limited power supply, normally provided by solar panels, available in the high mountain environment. Different methods can be found in the literature to process the ground vibration signal produced by debris flows. In this paper we discuss the two most commonly employed: the method of impulses and the method of amplitude. These two methods of data processing are analyzed by describing their origin and their use, presenting examples of applications and their main advantages and shortcomings. The two methods are then applied to process the ground vibration raw data produced by a debris flow that occurred in the Rebaixader Torrent (Spanish Pyrenees) in 2012. The results of this work will provide a basis for decision-making for researchers and technicians who face the task of designing a debris flow monitoring installation or debris flow warning equipment based on the use of ground vibration detectors.
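
    A minimal sketch contrasting the two data-reduction schemes discussed above on a synthetic record: per-second counts of samples exceeding a threshold (a simplified proxy for the method of impulses) versus the per-second mean amplitude; the signal, sampling rate and threshold are assumptions for illustration.

```python
import numpy as np

fs = 250                                   # sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(2)
# Synthetic ground-vibration record: background noise plus a "debris-flow" burst at 4-7 s.
sig = 0.05 * rng.normal(size=t.size)
burst = (t > 4) & (t < 7)
sig[burst] += 0.5 * rng.normal(size=burst.sum())

threshold = 0.2
per_second = sig.reshape(-1, fs)
impulses = (np.abs(per_second) > threshold).sum(axis=1)     # method of impulses (proxy)
amplitude = np.abs(per_second).mean(axis=1)                 # method of amplitude
print("impulses per second: ", impulses)
print("mean amplitude per s:", amplitude.round(3))
```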

  14. Place of modern imaging methods and their influence on the diagnostic process

    International Nuclear Information System (INIS)

    Petkov, D.; Lazarova, I.

    1991-01-01

    The main trends in the development of modern imaging diagnostic methods are presented: increasing the specificity of CT, nuclear magnetic resonance imaging, positron emission tomography, digital subtraction angiography, echography etc. based on modern technical improvements; objective representation of the physiological and biochemical deviations in particular diseases; interventional radiology; integrated application of different methods; improving the sensitivity and specificity of the methods based on developments in pharmacology (new contrast media, pharmaceuticals influencing the function of the examined organs, etc.); and the possibilities for data compilation and further computerized processing of primary data. Personal experience with the use of these methods in Bulgaria is reported. Attention is also called to the unfavourable impact of excessive technicization of the diagnostic and therapeutic process in health, deontological, economic and social respects. 15 refs

  15. Survey of organic electrolytic processes. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1979-11-01

    The basic objectives of this study were to review the literature to determine the commercial status of electroorganic processes and to estimate whether there would be significant possible energy savings by introduction of electroorganic processes to replace conventional chemical processes for production of certain large-tonnage organic chemicals. A list was compiled of the 220 organic chemicals that were produced at greater than 10,000 tons per year in 1975 in the United States. Search of the Swann and of the Fichter Bibliographies of electroorganic literature yielded references on 95 of these compounds. By application of selection rules to obtain promising candidates, nine chemicals with diverse electrochemical processes were chosen for detailed process energy calculations. Parallel calculations were made for presently-used commercial chemical routes to these products. Two of the nine electrochemical processes, adiponitrile and methyl ethyl ketone, had energy savings in comparison to the corresponding chemical processes. Other more-energy-efficient electrochemical processes are likely among the above 95, although they remain to be identified.

  16. Study of Injection Molding Warpage Using Analytic Hierarchy Process and Taguchi Method

    Directory of Open Access Journals (Sweden)

    Dyi-Cheng Chen

    2016-10-01

    Full Text Available This study integrated the Analytic Hierarchy Process and the Taguchi method to investigate injection molding warpage. The important warpage factors were selected by the Analytic Hierarchy Process (AHP): the AHP hierarchy factors were collected from documents and aggregated data, and low-weight factors were then removed by means of an expert questionnaire. Finally, the Taguchi quality engineering method was used to determine the optimized combination of injection molding factors. The paper used injection pressure, holding pressure, holding time and mold temperature in a four-factor, three-level Taguchi design. Moreover, the paper discusses the effect of each factor on the S/N ratio and uses analysis of variance to obtain the combination giving minimal warpage.
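
    A minimal sketch of the smaller-the-better S/N ratio used in Taguchi analysis of warpage, computed for three hypothetical holding-pressure levels; the warpage values are assumptions, not the paper's experimental data.

```python
import numpy as np

def sn_smaller_the_better(y):
    """Taguchi smaller-the-better signal-to-noise ratio in dB."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))

# Warpage (mm) from repeated runs at three levels of, e.g., holding pressure.
levels = {"low": [0.42, 0.45, 0.44], "mid": [0.31, 0.29, 0.33], "high": [0.36, 0.38, 0.35]}
for name, y in levels.items():
    print(name, "S/N =", round(sn_smaller_the_better(y), 2), "dB")   # larger is better
```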

  17. Acquisition and processing method for human sensorial, sensitive, motory and phonatory circuits reaction times

    International Nuclear Information System (INIS)

    Doche, Claude

    1972-01-01

    This work describes a storage and acquisition device and a processing method for human sensorial, sensitive, motor and phonatory reaction times. The circuits considered are those formed by the visual, auditory and sensory receptor organs and the motor or phonatory effector organs. The anatomo-physiological localization of these circuits allows the possibilities of the central nervous system to be assessed from different angles. The experimental population consists of normal and pathological subjects (subjects with tumoral or vascular, localized or diffuse cerebral lesions, or parkinsonian subjects). The parameter processing method is based on multivariate analysis results and allows each individual to be positioned relative to a normal individual and the weight of each circuit in this positioning to be assessed. Clinical results give this method prognostic and therapeutic interest; it seems premature, however, to speak of its diagnostic value. (author) [fr]

  18. A Delphi Method Analysis to Create an Emergency Medicine Educational Patient Satisfaction Survey

    Directory of Open Access Journals (Sweden)

    Kory S. London

    2015-12-01

    Full Text Available Introduction: Feedback on patient satisfaction (PS) as a means to monitor and improve performance in patient communication is lacking in residency training. A physician's promotion, compensation and job satisfaction may be affected by his or her individual PS scores once he or she is in practice. Many communication and satisfaction surveys exist but none focus on the emergency department setting for educational purposes. The goal of this project was to create an emergency medicine-based educational PS survey with strong evidence for content validity. Methods: We used the Delphi Method (DM) to obtain expert opinion via an iterative process of surveying. Questions were mined from four PS surveys as well as from group suggestions. The DM analysis determined the structure, content and appropriate use of the tool. The group used four-point Likert-type scales and Lynn's criteria for content validity to determine relevant questions from the stated goals. Results: Twelve recruited experts participated in a series of seven surveys to achieve consensus. A 10-question, single-page survey with an additional page of qualitative and demographic questions was selected. Thirty-one questions were judged to be relevant from an original 48-question list. Of these, the final 10 questions were chosen. The response rate for individual survey items was 99.5%. Conclusion: The DM produced a consensus survey with content validity evidence. Future work will be needed to obtain evidence for response process, internal structure and construct validity.

  19. Symbolic processing methods for 3D visual processing

    Science.gov (United States)

    Tedder, Maurice; Hall, Ernest L.

    2001-10-01

    The purpose of this paper is to describe a theory that defines an open method for solving 3D visual data processing and artificial intelligence problems that is independent of hardware or software implementation. The goal of the theory is to generalize and abstract the process of 3D visual processing so that the method can be applied to a wide variety of 3D visual processing problems. Once the theory is described a heuristic derivation is given. Symbolic processing methods can be generalized into an abstract model composed of eight basic components. The symbolic processing model components are: input data; input data interface; symbolic data library; symbolic data environment space; relationship matrix; symbolic logic driver; output data interface and output data. An obstacle detection and avoidance experiment was constructed to demonstrate the symbolic processing method. The results of the robot obstacle avoidance experiment demonstrated that the mobile robot could successfully navigate the obstacle course using symbolic processing methods for the control software. The significance of the symbolic processing approach is that the method arrived at a solution by using a more formal quantifiable process. Some of the practical applications for this theory are: 3D object recognition, obstacle avoidance, and intelligent robot control.

  20. Applying the Mixed Methods Instrument Development and Construct Validation Process: the Transformative Experience Questionnaire

    Science.gov (United States)

    Koskey, Kristin L. K.; Sondergeld, Toni A.; Stewart, Victoria C.; Pugh, Kevin J.

    2018-01-01

    Onwuegbuzie and colleagues proposed the Instrument Development and Construct Validation (IDCV) process as a mixed methods framework for creating and validating measures. Examples applying IDCV are lacking. We provide an illustrative case integrating the Rasch model and cognitive interviews applied to the development of the Transformative…

  1. The Impact of Project Work and the Writing Process Method on Writing Production

    Directory of Open Access Journals (Sweden)

    Marcela Díaz Ramírez

    2014-10-01

    Full Text Available This article presents the outcomes of an investigation whose main goal was to implement the methodology of project work and a process approach in order to improve writing production in an English class of Colombian university students since their diagnostic tests showed that their written production had the lowest score. Based on data collected, four factors were developed in the process of learning to write when project work and the writing process method are implemented: accuracy, fluency, integrative language skills, and a positive perception towards writing.

  2. Rapid and accurate processing method for amide proton exchange rate measurement in proteins

    International Nuclear Information System (INIS)

    Koskela, Harri; Heikkinen, Outi; Kilpelaeinen, Ilkka; Heikkinen, Sami

    2007-01-01

    Exchange between protein backbone amide hydrogens and water gives relevant information about solvent accessibility and protein secondary structure stability. NMR spectroscopy provides a convenient tool to study these dynamic processes with saturation transfer experiments. Processing of this type of NMR spectra has traditionally required peak integration followed by exponential fitting, which can be tedious with large data sets. We propose here a computer-aided method that applies the inverse Laplace transform in the exchange rate measurement. With this approach, the determination of exchange rates can be automated, and reliable results can be acquired rapidly without a need for manual processing.
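
    For reference, the traditional processing step that the proposed inverse-Laplace approach replaces is an exponential fit of peak intensity versus saturation-transfer delay; the sketch below shows such a fit with scipy, using made-up delays, intensities and a simple single-exponential model.

```python
import numpy as np
from scipy.optimize import curve_fit

def transfer_model(t, i0, k):
    """Peak intensity decaying with apparent exchange rate k (1/s)."""
    return i0 * np.exp(-k * t)

mixing_times = np.array([0.01, 0.02, 0.05, 0.1, 0.2, 0.4, 0.8])      # s (assumed)
intensities = np.array([0.95, 0.90, 0.78, 0.61, 0.37, 0.14, 0.02])   # normalized (assumed)

popt, _ = curve_fit(transfer_model, mixing_times, intensities, p0=[1.0, 1.0])
print("apparent exchange rate k = %.2f 1/s" % popt[1])
```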

  3. Comparative exergy analyses of Jatropha curcas oil extraction methods: Solvent and mechanical extraction processes

    International Nuclear Information System (INIS)

    Ofori-Boateng, Cynthia; Keat Teong, Lee; JitKang, Lim

    2012-01-01

    Highlights: ► Exergy analysis detects locations of resource degradation within a process. ► Solvent extraction is six times more exergetically destructive than mechanical extraction. ► Mechanical extraction of jatropha oil is 95.93% exergetically efficient. ► Solvent extraction of jatropha oil is 79.35% exergetically efficient. ► Exergy analysis of oil extraction processes allows room for improvement. - Abstract: Vegetable oil extraction processes are found to be energy intensive. Thermodynamically, any energy-intensive process is considered to degrade the most useful part of energy that is available to produce work. This study uses literature values to compare the efficiencies and the degradation of the useful energy within Jatropha curcas oil during oil extraction, taking into account solvent and mechanical extraction methods. According to this study, J. curcas seed, on processing into J. curcas oil, is upgraded by mechanical extraction but degraded by solvent extraction processes. For mechanical extraction, the total internal exergy destroyed is 3006 MJ, which is about six times less than that for solvent extraction (18,072 MJ) per ton of J. curcas oil produced. The pretreatment processes of the J. curcas seeds recorded a total internal exergy destruction of 5768 MJ, accounting for 24% of the total internal exergy destroyed for the solvent extraction process and 66% for mechanical extraction. The exergetic efficiencies recorded are 79.35% and 95.93% for the solvent and mechanical extraction processes of J. curcas oil, respectively. Hence, mechanical oil extraction processes are exergetically more efficient than solvent extraction processes. Possible improvement methods are also elaborated in this study.
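
    A minimal sketch of the exergetic-efficiency bookkeeping behind such comparisons: efficiency is the fraction of the input exergy that is not destroyed. The exergy-input figures below are illustrative back-calculated assumptions chosen only so that the reported destructions roughly reproduce the reported efficiencies; they are not the study's inventory data.

```python
def exergy_efficiency(exergy_in_mj, exergy_destroyed_mj):
    """Exergetic efficiency = useful exergy out / exergy in."""
    return (exergy_in_mj - exergy_destroyed_mj) / exergy_in_mj

mechanical = exergy_efficiency(exergy_in_mj=74000.0, exergy_destroyed_mj=3006.0)
solvent = exergy_efficiency(exergy_in_mj=87500.0, exergy_destroyed_mj=18072.0)
print("mechanical: %.2f %%, solvent: %.2f %%" % (100 * mechanical, 100 * solvent))
```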

  4. Method development and survey of Sudan I-IV in palm oil and chilli spices in the Washington, DC, area.

    Science.gov (United States)

    Genualdi, Susie; MacMahon, Shaun; Robbins, Katherine; Farris, Samantha; Shyong, Nicole; DeJager, Lowri

    2016-01-01

    Sudan I, II, III and IV dyes are banned for use as food colorants in the United States and European Union because they are toxic and carcinogenic. These dyes have been illegally used as food additives in products such as chilli spices and palm oil to enhance their red colour. From 2003 to 2005, the European Union made a series of decisions requiring chilli spices and palm oil imported to the European Union to contain analytical reports declaring them free of Sudan I-IV. In order for the USFDA to investigate the adulteration of palm oil and chilli spices with unapproved colour additives in the United States, a method was developed for the extraction and analysis of Sudan dyes in palm oil, and previous methods were validated for Sudan dyes in chilli spices. Both LC-DAD and LC-MS/MS methods were examined for their limitations and effectiveness in identifying adulterated samples. Method validation was performed for both chilli spices and palm oil by spiking samples known to be free of Sudan dyes at concentrations close to the limit of detection. Reproducibility, matrix effects, and selectivity of the method were also investigated. Additionally, for the first time a survey of palm oil and chilli spices was performed in the United States, specifically in the Washington, DC, area. Illegal dyes, primarily Sudan IV, were detected in palm oil at concentrations from 150 to 24 000 ng ml(-1). Low concentrations (< 21 µg kg(-1)) of Sudan dyes were found in 11 out of 57 spices and are most likely a result of cross-contamination during preparation and storage and not intentional adulteration.

  5. Methods and procedures of mathematic self-tuition process for technicians training

    Directory of Open Access Journals (Sweden)

    Martínez E. C.

    2013-07-01

    Full Text Available The paper describes the mathematics self-tuition process for technicians' training. Once a theoretical framework had been built up, the process of self-tuition in mathematics was modeled and introduced at experimental scale. The structure of the process is fully described, together with the connections between subsystems and components. The methods and procedures of self-tuition are also described. The guiding principle is that designing a mathematics self-tuition process requires didactic procedures illustrating how professional technical problems may be contextualized for planning, organizing, performing and controlling the study of mathematics. The feasibility appraisal of the model proved that it favors the learning of technical contents on the basis of strengthening a mathematical culture. Key words: self-tuition process, technicians' training, self-control

  6. Validation of Alternative In Vitro Methods to Animal Testing: Concepts, Challenges, Processes and Tools.

    Science.gov (United States)

    Griesinger, Claudius; Desprez, Bertrand; Coecke, Sandra; Casey, Warren; Zuang, Valérie

    This chapter explores the concepts, processes, tools and challenges relating to the validation of alternative methods for toxicity and safety testing. In general terms, validation is the process of assessing the appropriateness and usefulness of a tool for its intended purpose. Validation is routinely used in various contexts in science, technology, the manufacturing and services sectors. It serves to assess the fitness-for-purpose of devices, systems, software up to entire methodologies. In the area of toxicity testing, validation plays an indispensable role: "alternative approaches" are increasingly replacing animal models as predictive tools and it needs to be demonstrated that these novel methods are fit for purpose. Alternative approaches include in vitro test methods, non-testing approaches such as predictive computer models up to entire testing and assessment strategies composed of method suites, data sources and decision-aiding tools. Data generated with alternative approaches are ultimately used for decision-making on public health and the protection of the environment. It is therefore essential that the underlying methods and methodologies are thoroughly characterised, assessed and transparently documented through validation studies involving impartial actors. Importantly, validation serves as a filter to ensure that only test methods able to produce data that help to address legislative requirements (e.g. EU's REACH legislation) are accepted as official testing tools and, owing to the globalisation of markets, recognised on international level (e.g. through inclusion in OECD test guidelines). Since validation creates a credible and transparent evidence base on test methods, it provides a quality stamp, supporting companies developing and marketing alternative methods and creating considerable business opportunities. Validation of alternative methods is conducted through scientific studies assessing two key hypotheses, reliability and relevance of the

  7. Vision and Control for UAVs: A Survey of General Methods and of Inexpensive Platforms for Infrastructure Inspection.

    Science.gov (United States)

    Máthé, Koppány; Buşoniu, Lucian

    2015-06-25

    Unmanned aerial vehicles (UAVs) have gained significant attention in recent years. Low-cost platforms using inexpensive sensor payloads have been shown to provide satisfactory flight and navigation capabilities. In this report, we survey vision and control methods that can be applied to low-cost UAVs, and we list some popular inexpensive platforms and application fields where they are useful. We also highlight the sensor suites used where this information is available. We overview, among others, feature detection and tracking, optical flow and visual servoing, low-level stabilization and high-level planning methods. We then list popular low-cost UAVs, selecting mainly quadrotors. We discuss applications, restricting our focus to the field of infrastructure inspection. Finally, as an example, we formulate two use-cases for railway inspection, a less explored application field, and illustrate the usage of the vision and control techniques reviewed by selecting appropriate ones to tackle these use-cases. To select vision methods, we run a thorough set of experimental evaluations.

  8. Spacecraft attitude control systems with dynamic methods and structures for processing star tracker signals

    Science.gov (United States)

    Liu, Yong (Inventor); Wu, Yeong-Wei Andy (Inventor); Li, Rongsheng (Inventor)

    2001-01-01

    Methods are provided for dynamically processing successively-generated star tracker data frames and associated valid flags to generate processed star tracker signals that have reduced noise and a probability greater than a selected probability P.sub.slctd of being valid. These methods maintain accurate spacecraft attitude control in the presence of spurious inputs (e.g., impinging protons) that corrupt collected charges in spacecraft star trackers. The methods of the invention enhance the probability of generating valid star tracker signals because they respond to a current frame probability P.sub.frm by dynamically selecting the largest valid frame combination whose combination probability P.sub.cmb satisfies a selected probability P.sub.slctd. Noise is thus reduced while the probability of finding a valid frame combination is enhanced. Spacecraft structures are also provided for practicing the methods of the invention.

  9. Survey and critique of quantitative methods for the appraisal of mineral resources. Progress report

    International Nuclear Information System (INIS)

    Harris, D.P.

    1976-01-01

    Two major categories of appraisal methods (models) for regional mineral resources are identified by virtue of the manner in which mineral endowment is treated in the appraisal: implicit and explicit models. Implicit models do not identify nor specify the mineral endowment model. Mineral resources are inferred to exist as required to fulfill economic or secular relationships. Econometric models of mineral supply and Hubbert's time-rate trend projection are varieties of implicit models. Explicit mineral resource models separate the economic and endowment models and state the endowment model explicitly. Explicit models describe mineral endowment as a function of some physical aspect of the earth's crust, such as geology, volume of rock, density of mineral occurrences, and crustal abundance of an element. Economic factors are introduced subsequent to the appraisal of endowment either as an explicit model which interacts with the deposits inferred by the endowment model, or as a simple adjustment made directly on some aggregate measure of endowment

  10. Influence of productivity and processing method on physicochemical characteristics of white button mushrooms in Brazil.

    Science.gov (United States)

    Zied, Diego Cunha; Penachio, Sara Maciel; Dias, Eustáquio Souza; de Almeida Minhoni, Marli Teixeira; Ferraz, Rafael Augusto; Vieites, Rogério Lopes

    2014-11-01

    The white button mushroom is the edible fungus most commonly cultivated and commercialized in Brazil and worldwide. This work assesses the productivity of the different strains ABI 07/06 and ABI 06/05 of Agaricus bisporus grown under the conditions normally employed by growers in the southeast of Brazil, and the influence of four different chemical conservation methods on the physicochemical characteristics and storage properties of the fruit bodies. The productivities of strains ABI 07/06 and ABI 06/05 of white button mushrooms were found to be comparable. The colorimetric characteristics and chemical compositions (fat, fiber and protein contents) of the mushroom strains were similar, and these parameters were not influenced significantly by the conservation processes. Texture was negatively affected by all processing methods employed. It was concluded that chemical methods of processing mushrooms were not fully effective and novel alternative technologies should be considered by mushroom processors in Brazil. Some methods of mushroom storage using chemicals such as sodium metabisulfite are harmful to the human organism, so processing using autoclaving may be the best form of conservation of canned mushrooms. © 2014 Society of Chemical Industry.

  11. A study of transient jet and spray using a Schlieren method and digital image processing

    Energy Technology Data Exchange (ETDEWEB)

    Paulsen, H.

    1995-12-31

    This thesis discusses visualization and image-based data acquisition and analysis of a transient gas jet, an evaporating spray and a burning jet in an attempt to find a method for measuring the transient behaviour of these phenomena, which influence the combustion process in diesel engines. The experimental approach is based on visualization of the injection process inside a constant-volume air chamber. The influence of different experimental conditions such as fuel type, injection conditions, and temperature and pressure of the chamber is investigated. To control the dynamics of the injection, a constant-pressure injection system is used. The dynamics of the fuel injection system itself is not discussed. A full-field classical Schlieren technique is used, and the data are recorded by means of a CCD camera and frame-grabber combination. The method has the unusual property of being particularly useful for measurements on a dynamic system, since the gradients in the refractive index used by the Schlieren method are enhanced by the dynamics. The method was used to measure local gas concentration inside a room-temperature methane gas jet, and vapour-phase concentration in an evaporating propane spray. The system was also used to measure the local temperature of a burning methane jet based on the calculated density distribution. 47 refs., 92 figs., 3 tabs.

  12. Evaluation and analysis method for natural gas hydrate storage and transportation processes

    International Nuclear Information System (INIS)

    Hao Wenfeng; Wang Jinqu; Fan Shuanshi; Hao Wenbin

    2008-01-01

    An evaluation and analysis method is presented to investigate an approach to scaling up a hydration reactor and to solve some economic problems by looking at the natural gas hydrate storage and transportation process as a whole. Experiments with the methane hydration process are used to evaluate the whole natural gas hydrate storage and transportation process. The specific contents and conclusions are as follows. First, batch stirring effects and load coefficients are studied in a semi-continuous stirred-tank reactor. Results indicate that batch stirring and appropriate load coefficients are effective in improving hydrate storage capacity. In the experiments, appropriate values for stirring velocity, stirring time and load coefficient were found to be 320 rpm, 30 min and 0.289, respectively. Second, the throughput and energy consumption of the reactor for producing methane hydrates are calculated by mass and energy balance. Results show that the throughput of this reactor is 1.06 kg/d, with a product containing 12.4% methane gas. Energy consumption is 0.19 kJ while methane hydrates containing 1 kJ of heat are produced. Third, an energy consumption evaluation parameter is introduced to provide a single energy consumption evaluation rule for different hydration reactors. Parameter analyses indicate that process simplicity or process integration can decrease energy consumption. If the experimental gas comes from a small-scale natural gas field and the energy consumption is 0.02 kJ when methane hydrates containing 1 kJ of heat are produced, then the decrease is 87.9%. Moreover, the energy consumption evaluation parameter, used as an economic criterion, is converted into a process evaluation parameter. Analyses indicate that the process evaluation parameter is relevant to the technology level and resource consumption of a system, which makes it applicable to economic analysis and venture forecasting for optimal capital utilization.

  13. Survey of sampling-based methods for uncertainty and sensitivity analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Jay Dean; Helton, Jon Craig; Sallaberry, Cedric J. PhD. (.; .); Storlie, Curt B. (Colorado State University, Fort Collins, CO)

    2006-06-01

    Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (1) Definition of probability distributions to characterize epistemic uncertainty in analysis inputs, (2) Generation of samples from uncertain analysis inputs, (3) Propagation of sampled inputs through an analysis, (4) Presentation of uncertainty analysis results, and (5) Determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top down coefficient of concordance, and variance decomposition.
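
    A minimal sketch of one sampling-based workflow reviewed here: Latin hypercube sampling of two uncertain inputs, propagation through a toy model, and rank (Spearman) correlations as sensitivity measures; the model, input ranges and sample size are illustrative assumptions (and the qmc module requires a recent SciPy).

```python
import numpy as np
from scipy.stats import qmc, spearmanr

# Latin hypercube sample of two uncertain inputs, scaled to their assumed ranges.
unit = qmc.LatinHypercube(d=2, seed=3).random(n=200)
x = qmc.scale(unit, l_bounds=[0.5, 10.0], u_bounds=[1.5, 50.0])
x1, x2 = x[:, 0], x[:, 1]

# Toy model: strongly nonlinear in x1, weakly linear in x2, plus noise.
y = x1 ** 2 + 0.02 * x2 + np.random.default_rng(3).normal(0.0, 0.05, size=200)

print("rank correlation y~x1:", round(spearmanr(x1, y).correlation, 2))
print("rank correlation y~x2:", round(spearmanr(x2, y).correlation, 2))
```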

  14. Survey of sampling-based methods for uncertainty and sensitivity analysis

    International Nuclear Information System (INIS)

    Helton, J.C.; Johnson, J.D.; Sallaberry, C.J.; Storlie, C.B.

    2006-01-01

    Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (i) definition of probability distributions to characterize epistemic uncertainty in analysis inputs, (ii) generation of samples from uncertain analysis inputs, (iii) propagation of sampled inputs through an analysis, (iv) presentation of uncertainty analysis results, and (v) determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two-dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top down coefficient of concordance, and variance decomposition

  15. Methods, media and systems for managing a distributed application running in a plurality of digital processing devices

    Science.gov (United States)

    Laadan, Oren; Nieh, Jason; Phung, Dan

    2012-10-02

    Methods, media and systems for managing a distributed application running in a plurality of digital processing devices are provided. In some embodiments, a method includes running one or more processes associated with the distributed application in virtualized operating system environments on a plurality of digital processing devices, suspending the one or more processes, and saving network state information relating to network connections among the one or more processes. The method further includes storing process information relating to the one or more processes, recreating the network connections using the saved network state information, and restarting the one or more processes using the stored process information.

  16. A measurement method for micro 3D shape based on grids-processing and stereovision technology

    International Nuclear Information System (INIS)

    Li, Chuanwei; Xie, Huimin; Liu, Zhanwei

    2013-01-01

    An integrated measurement method for micro 3D surface shape by a combination of stereovision technology in a scanning electron microscope (SEM) and grids-processing methodology is proposed. The principle of the proposed method is introduced in detail. By capturing two images of the tested specimen with grids on the surface at different tilt angles in an SEM, the 3D surface shape of the specimen can be obtained. Numerical simulation is applied to analyze the feasibility of the proposed method. A validation experiment is performed here. The surface shape of the metal-wire/polymer-membrane structures with thermal deformation is reconstructed. By processing the surface grids of the specimen, the out-of-plane displacement field of the specimen surface is also obtained. Compared with the measurement results obtained by a 3D digital microscope, the experimental error of the proposed method is discussed (paper)

  17. Survey of metallurgical recycling processes. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Pemsler, J.P.

    1979-03-01

    In the year 2000, the US will consume about 3.2 x 10/sup 15/ Btu to produce the seven major nonferrous metals Al, Cu, Zn, Pb, Ni, Mg, and Ti. Of this amount, 82% will be used in the production of Al. It is projected that 0.6 x 10/sup 15/ Btu will be saved by the recycle of secondary metals. Major opportunities for increasing the extent of recycle and thereby increasing the energy savings are discussed. An inherent feature in the energistics of recycle is that physical processes such as magnetic separation, density separations, melting, and in some instances vaporization are far less energy intensive than are chemical processes associated with dissolution and electrowinning. It is in the domain of scrap of complex composition and physical form, difficult to handle by existing technology, that opportunities exist for new chemical recycle technology. Recycle of scrap metal of adequate grade is currently achieved through pyrometallurgical processes which, in many cases, are not very energy intensive as compared with hydrometallurgical processes. Preliminary flowsheets are presented for the recovery of value metals from batteries considered for use in vehicular propulsion and load leveling applications. The battery types examined are lead/acid, nickel/zinc, nickel/iron, zinc/chlorine, lithium-aluminum/iron sulfide, and sodium/sulfur. A flow sheet has been outlined for an integrated hydrometallurgical process to treat low-grade copper scrap. A fully integrated hydrometallurgical process is outlined, and costs and energy consumption are derived, for recovering zinc metal from electric furnace flue dusts. Costs and energy are high and the process does not appear to warrant development at this time. Improvement in the recycle of magnesium is associated primarily with improved recycle in the Al industry where Mg is an important alloy additive. Ni and Ti recycle are associated with improved collection and sorting of stainless steel and specialty alloys.

  18. Study on highly efficient seismic data acquisition and processing methods based on sparsity constraint

    Science.gov (United States)

    Wang, H.; Chen, S.; Tao, C.; Qiu, L.

    2017-12-01

    High-density, high-fold and wide-azimuth seismic data acquisition methods are widely used to cope with increasingly sophisticated exploration targets. Acquisition periods are becoming longer and acquisition costs higher. We carry out a study of highly efficient seismic data acquisition and processing methods based on sparse representation theory (or compressed sensing theory) and achieve some innovative results. The theoretical principles of highly efficient acquisition and processing are studied. We first set out a sparse representation theory based on the wave equation. Then we study highly efficient seismic sampling methods and present an optimized piecewise-random sampling method based on sparsity prior information. Finally, a reconstruction strategy with a sparsity constraint is developed; a two-step recovery approach combining a sparsity-promoting method and the hyperbolic Radon transform is also put forward. These three aspects constitute the enhanced theory of highly efficient seismic data acquisition. The specific implementation strategies of highly efficient acquisition and processing are studied according to the highly efficient acquisition theory expounded in paragraph 2. First, we propose a method for designing the highly efficient acquisition network with the help of the optimized piecewise-random sampling method. Second, we propose two types of highly efficient seismic data acquisition methods based on (1) single sources and (2) blended (or simultaneous) sources. Third, the reconstruction procedures corresponding to the above two types of highly efficient seismic data acquisition methods are proposed to obtain the seismic data on the regular acquisition network. The impact of blended shooting on the imaging result is also discussed. In the end, we implement numerical tests based on the Marmousi model. The achieved results show: (1) the theoretical framework of highly efficient seismic data acquisition and processing
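
    A minimal sketch of sparsity-constrained reconstruction of irregularly sampled data, in the spirit of the recovery step described above: a POCS-style loop that re-inserts the observed samples and thresholds the Fourier coefficients with a decreasing threshold. The 1-D signal, sampling ratio and threshold schedule are illustrative assumptions, not the seismic workflow's actual operators.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 256
t = np.arange(n)
true = np.cos(2 * np.pi * 0.05 * t) + 0.5 * np.cos(2 * np.pi * 0.12 * t)

mask = rng.random(n) < 0.4                 # keep roughly 40% of the samples
observed = true * mask

x = np.zeros(n)
for it in range(100):
    x = x + mask * (observed - x)          # re-insert the observed samples
    coeffs = np.fft.fft(x)
    lam = 0.3 * np.abs(coeffs).max() * (1.0 - it / 100.0)   # decreasing threshold
    coeffs[np.abs(coeffs) < lam] = 0.0     # keep only the strongest coefficients
    x = np.real(np.fft.ifft(coeffs))

rms = float(np.sqrt(np.mean((x - true) ** 2)))
print("reconstruction rms error:", round(rms, 3))
```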

  19. Applying sample survey methods to clinical trials data.

    Science.gov (United States)

    LaVange, L M; Koch, G G; Schwartz, T A

    This paper outlines the utility of statistical methods for sample surveys in analysing clinical trials data. Sample survey statisticians face a variety of complex data analysis issues deriving from the use of multi-stage probability sampling from finite populations. One such issue is that of clustering of observations at the various stages of sampling. Survey data analysis approaches developed to accommodate clustering in the sample design have more general application to clinical studies in which repeated measures structures are encountered. Situations where these methods are of interest include multi-visit studies where responses are observed at two or more time points for each patient, multi-period cross-over studies, and epidemiological studies for repeated occurrences of adverse events or illnesses. We describe statistical procedures for fitting multiple regression models to sample survey data that are more effective for repeated measures studies with complicated data structures than the more traditional approaches of multivariate repeated measures analysis. In this setting, one can specify a primary sampling unit within which repeated measures have intraclass correlation. This intraclass correlation is taken into account by sample survey regression methods through robust estimates of the standard errors of the regression coefficients. Regression estimates are obtained from model fitting estimation equations which ignore the correlation structure of the data (that is, computing procedures which assume that all observational units are independent or are from simple random samples). The analytic approach is straightforward to apply with logistic models for dichotomous data, proportional odds models for ordinal data, and linear models for continuously scaled data, and results are interpretable in terms of population average parameters. Through the features summarized here, the sample survey regression methods have many similarities to the broader family of
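
    The regression approach summarized here (model-fitting estimating equations with an independence working assumption and robust standard errors that account for intraclass correlation within the primary sampling unit) is closely related to generalized estimating equations. A minimal sketch in Python with statsmodels, fitted to simulated multi-visit data, is given below; the dataset, variable names and effect sizes are invented for illustration.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        # Simulated multi-visit trial: 100 patients, 3 visits each, binary response.
        rng = np.random.default_rng(0)
        n_pat, n_vis = 100, 3
        df = pd.DataFrame({
            "patient": np.repeat(np.arange(n_pat), n_vis),
            "visit": np.tile(np.arange(n_vis), n_pat),
            "treat": np.repeat(rng.integers(0, 2, n_pat), n_vis),
        })
        # A patient-level random effect induces intraclass correlation within patients.
        u = np.repeat(rng.normal(0, 1, n_pat), n_vis)
        logit = -0.5 + 0.8 * df["treat"] + 0.2 * df["visit"] + u
        df["response"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

        # Logistic model fitted with an independence working correlation; the robust
        # (sandwich) covariance accounts for clustering of visits within each patient.
        model = smf.gee("response ~ treat + visit", groups="patient", data=df,
                        family=sm.families.Binomial(),
                        cov_struct=sm.cov_struct.Independence())
        result = model.fit()
        print(result.summary())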

  20. Drying of water based foundry coatings: Innovative test, process design and optimization methods

    DEFF Research Database (Denmark)

    Di Muoio, Giovanni Luca; Johansen, Bjørn Budolph

    Director Bjørn Budolph Johansen has been the company supervisor from March 2012 to June 2014. In this Industrial PhD Thesis we present the main results of several tests and simulations carried out from 2011 to 2014 at Global Castings A/S (former Vestas Wind Systems A/S) and at the Technical University...... capacity goals there is a need to understand how to design, control and optimize drying processes. The main focus of this project was on the critical parameters and properties to be controlled in production in order to achieve a stable and predictable drying process. We propose for each of these parameters...... on real industrial cases. These tools have been developed in order to simulate and optimize the drying process and reduce drying time and power consumption as well as production process design time and cost of expensive drying equipment. Results show that test methods from other industries can be used...

  1. NUTRITIONAL VALUE AND METHODS OF THE TECHNOLOGICAL PROCESSING OF PELED (COREGONUS PELED GMELIN) (REVIEW)

    Directory of Open Access Journals (Sweden)

    O. Nazarov

    2016-06-01

    Full Text Available Purpose. To investigate peled as a food product and raw material for processing, and to analyze traditional methods of its technological processing. Findings. The paper contains an analysis of the chemical composition of peled meat and its differences from other fish of pond aquaculture in Ukraine. Based on the biochemical composition of the meat of peled reared under pond aquaculture conditions, in particular the contents of fats, proteins and moisture, peled belongs to the category of fish of medium to high fat content with medium protein content, and, based on its water-protein, fat-protein and water-fat balance and its amino-acid composition (amino-acid score), to fish of increased nutritional value and assimilability. Unlike the cyprinids that are the usual objects of pond aquaculture, the general indices of the biochemical composition and the peculiarities of the anatomical structure of peled, as a coregonid, contribute to organoleptic features of native origin that are inherent to gourmet products of traditional processing. It was found that, unlike in other coregonids, the biochemical indices of peled meat that define the type, direction and regime of its processing, first of all the contents of fat, protein and moisture, are relatively stable across different age groups under pond aquaculture conditions and change less during the biological cycle. The main product requirements for the methods of technological processing of peled, namely drying, smoking and salting, are summarized. Full technological schemes of peled processing by traditional methods, taking into account the biochemical peculiarities of the raw material and the requirements for the finished product, are presented and analyzed. Practical value. The summarized information is useful for further development of domestic aquaculture and processing. Different indices of biochemical composition and high output indices of peled meat

  2. Methods for processing and analysis functional and anatomical brain images: computerized tomography, emission tomography and nuclear resonance imaging

    International Nuclear Information System (INIS)

    Mazoyer, B.M.

    1988-01-01

    The various methods for brain image processing and analysis are presented and compared. The following topics are developed: the physical basis of brain image comparison (nature and formation of signals; intrinsic performance of the methods; image characteristics); mathematical methods for image processing and analysis (filtering, functional parameter extraction, morphological analysis, robotics and artificial intelligence); methods for anatomical localization (neuro-anatomy atlas, proportional stereotaxic atlas, numerized atlas); methodology of cerebral image superposition (normalization, retiming); image networks. [fr]

  3. Survey of numerical methods for compressible fluids

    Energy Technology Data Exchange (ETDEWEB)

    Sod, G A

    1977-06-01

    The finite difference methods of Godunov, Hyman, Lax-Wendroff (two-step), MacCormack, Rusanov, the upwind scheme, the hybrid scheme of Harten and Zwas, the antidiffusion method of Boris and Book, and the artificial compression method of Harten are compared with the random choice method known as Glimm's method. The methods are used to integrate the one-dimensional equations of gas dynamics for an inviscid fluid. The results are compared and demonstrate that Glimm's method has several advantages. 16 figs., 4 tables.
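
    For readers unfamiliar with the benchmark behind such comparisons, the sketch below solves the one-dimensional inviscid gas-dynamics equations on the classic Sod shock-tube initial data with a first-order Lax-Friedrichs scheme (Python/NumPy). It is purely illustrative: Lax-Friedrichs is simpler and far more diffusive than the schemes compared in the report, and the grid size, CFL number and output time are chosen arbitrarily.

        import numpy as np

        gamma = 1.4
        nx, dx, cfl = 400, 1.0 / 400, 0.5

        # Sod shock-tube initial data: (rho, u, p) = (1, 0, 1) on the left half, (0.125, 0, 0.1) on the right.
        rho = np.where(np.arange(nx) < nx // 2, 1.0, 0.125)
        u = np.zeros(nx)
        p = np.where(np.arange(nx) < nx // 2, 1.0, 0.1)
        U = np.array([rho, rho * u, p / (gamma - 1) + 0.5 * rho * u**2])   # conserved variables

        def flux(U):
            rho, mom, E = U
            vel = mom / rho
            pres = (gamma - 1) * (E - 0.5 * rho * vel**2)
            return np.array([mom, mom * vel + pres, (E + pres) * vel])

        t, t_end = 0.0, 0.2
        while t < t_end:
            rho, mom, E = U
            vel = mom / rho
            pres = (gamma - 1) * (E - 0.5 * rho * vel**2)
            dt = min(cfl * dx / np.max(np.abs(vel) + np.sqrt(gamma * pres / rho)), t_end - t)
            F = flux(U)
            # Lax-Friedrichs update on interior cells; the waves do not reach the ends by t_end.
            U[:, 1:-1] = 0.5 * (U[:, 2:] + U[:, :-2]) - 0.5 * dt / dx * (F[:, 2:] - F[:, :-2])
            t += dt

        print("density profile at t = 0.2 (every 40th cell):", np.round(U[0, ::40], 3))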

  4. Water vapour sorption and humidity - a survey on measuring methods and standards

    OpenAIRE

    Robens, Erich; Rübner, Katrin; Klobes, Peter; Balköse, Devrim

    2011-01-01

    Under environmental conditions water exists in all three classical states of matter: solid, liquid and gas. The water molecule is non-linear and therefore polar. In comparison with other liquids water has anomalous features; about 63 exceptional properties are recorded. This article starts with reviewing properties of water, typical occurrences and definitions such as relative and absolute humidity and moisture content. Water is present everywhere in nature and engineering; it may be hel...

  5. A method to eliminate refraction artifacts in EM1002 multibeam echosounder system (Swath bathymetry and seabed surveys of EEZ)

    Digital Repository Service at National Institute of Oceanography (India)

    Fernandes, W.A.

    NIO/TR-01/2011: A Method to Eliminate Refraction Artifacts in the EM1002 Multibeam Echosounder System (Swath Bathymetry and Seabed Surveys of the Exclusive Economic Zone), by William A. Fernandes. ... are extracted. This information is written in a defined packet-type format (binary) to an output file.

  6. Black-tailed and white-tailed jackrabbits in the American West: History, ecology, ecological significance, and survey methods

    Science.gov (United States)

    Simes, Matthew; Longshore, Kathleen M.; Nussear, Kenneth E.; Beatty, Greg L.; Brown, David E.; Esque, Todd C.

    2015-01-01

    Across the western United States, Leporidae are the most important prey item in the diet of Golden Eagles (Aquila chrysaetos). Leporids inhabiting the western United States include black-tailed (Lepus californicus) and white-tailed jackrabbits (Lepus townsendii) and various species of cottontail rabbit (Sylvilagus spp.). Jackrabbits (Lepus spp.) are particularly important components of the ecological and economic landscape of western North America because their abundance influences the reproductive success and population trends of predators such as coyotes (Canis latrans), bobcats (Lynx rufus), and a number of raptor species. Here, we review literature pertaining to black-tailed and white-tailed jackrabbits comprising over 170 published journal articles, notes, technical reports, conference proceedings, academic theses and dissertations, and other sources dating from the late 19th century to the present. Our goal is to present information to assist those in research and management, particularly with regard to protected raptor species (e.g., Golden Eagles), mammalian predators, and ecological monitoring. We classified literature sources as (1) general information on jackrabbit species, (2) black-tailed or (3) white-tailed jackrabbit ecology and natural history, or (4) survey methods. These categories, especially 2, 3, and 4, were further subdivided as appropriate. The review also produced several tables on population trends, food habits, densities within various habitats, and jackrabbit growth and development. Black-tailed and white-tailed jackrabbits are ecologically similar in general behaviors, use of forms, parasites, and food habits, and they are prey to similar predators; but they differ in their preferred habitats. While the black-tailed jackrabbit inhabits agricultural land, deserts, and shrublands, the white-tailed jackrabbit is associated with prairies, alpine tundra, and sagebrush-steppe. Frequently considered abundant, jackrabbit numbers in western North

  7. Survey and adjustment methods applied on an 11 axes high performance reflectometer for synchrotron radiation

    Energy Technology Data Exchange (ETDEWEB)

    Eggenstein, F., E-mail: Frank.Eggenstein@helmholtz-berlin.de; Bischoff, P.; Schäfers, F.; Schroeter, T.; Senf, F.; Sokolov, A.; Zeschke, T.; Erko, A. [Helmholtz-Zentrum Berlin, Albert-Einstein-Str. 15, Berlin, Germany, D-12489 (Germany)

    2016-07-27

    At BESSY-II a new UV- and XUV-optics beamline [1] has recently been set up with an in-house developed versatile reflectometer [2], [3], [4] for at-wavelength metrology on reflective and diffractive optical elements of up to 4 kg mass. High precision measurements of the reflection and polarization properties are made feasible by a 360° azimuthal rotation of the sample around the beam of light, where samples can be adjusted reproducibly with a novel UHV-Tripod within arc sec and μm precision. The azimuthal rotation requires an extremely high precision adjustment of the goniometer axis with respect to the incident light beam. Here we describe sophisticated methods with which we achieve nearly perfect agreement of the azimuthal rotation axis and the synchrotron beam in the 30 arc sec range. By using geodetic instruments (laser tracker, theodolite, autocollimator) the quality of the reflectometer UHV-mechanics has been characterized with respect to stiffness and radial run-out with highest precision [5].

  8. Survey of Soybean Insect Pollinators: Community Identification and Sampling Method Analysis.

    Science.gov (United States)

    Gill, K A; O'Neal, M E

    2015-06-01

    Soybean, Glycine max (L.) Merrill, flowers can be a source of nectar and pollen for honey bees, Apis mellifera L. (Hymenoptera: Apidae), wild social and solitary bees (Hymenoptera: Apoidea), and flower-visiting flies (Diptera). Our objectives were to describe the pollinator community in soybean fields, determine which sampling method is most appropriate for characterizing their abundance and diversity, and gain insight into which pollinator taxa may contact soybean pollen. We compared modified pan traps (i.e., bee bowls), yellow sticky traps, and sweep nets for trapping pollinators in Iowa soybean fields when soybeans were blooming (i.e., reproductive stages R1-R6) during 2011 and 2012. When all trap type captures were combined, we collected 5,368 individuals and at least 50 species. Per trap type, the most pollinators were captured in bee bowls (3,644 individuals, 44 species), yellow sticky traps (1,652 individuals, 32 species), and sweep nets (66 individuals, 10 species). The most abundant species collected include Agapostemon virescens F. and Lasioglossum (Dialictus) species (Hymenoptera: Halictidae), Melissodes bimaculata Lepeletier (Hymenoptera: Apidae), and Toxomerus marginatus Say (Diptera: Syrphidae). To determine if these pollinators were foraging on soybean flowers, we looked for soybean pollen on the most abundant bee species collected that had visible pollen loads. We found soybean pollen alone or intermixed with pollen grains from other plant species on 29 and 38% of the bees examined in 2011 and 2012, respectively. Our data suggest a diverse community of pollinators-composed of mostly native, solitary bees-visit soybean fields and forage on their flowers within Iowa. © The Authors 2015. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  9. Transgender-inclusive measures of sex/gender for population surveys: Mixed-methods evaluation and recommendations.

    Science.gov (United States)

    Bauer, Greta R; Braimoh, Jessica; Scheim, Ayden I; Dharma, Christoffer

    2017-01-01

    Given that an estimated 0.6% of the U.S. population is transgender (trans) and that large health disparities for this population have been documented, government and research organizations are increasingly expanding measures of sex/gender to be trans inclusive. Options suggested for trans community surveys, such as expansive check-all-that-apply gender identity lists and write-in options that offer maximum flexibility, are generally not appropriate for broad population surveys. These require limited questions and a small number of categories for analysis. Limited evaluation has been undertaken of trans-inclusive population survey measures for sex/gender, including those currently in use. Using an internet survey and follow-up of 311 participants, and cognitive interviews from a maximum-diversity sub-sample (n = 79), we conducted a mixed-methods evaluation of two existing measures: a two-step question developed in the United States and a multidimensional measure developed in Canada. We found very low levels of item missingness, and no indicators of confusion on the part of cisgender (non-trans) participants for both measures. However, a majority of interview participants indicated problems with each question item set. Agreement between the two measures in assessment of gender identity was very high (K = 0.9081), but gender identity was a poor proxy for other dimensions of sex or gender among trans participants. Issues to inform measure development or adaptation that emerged from analysis included dimensions of sex/gender measured, whether non-binary identities were trans, Indigenous and cultural identities, proxy reporting, temporality concerns, and the inability of a single item to provide a valid measure of sex/gender. Based on this evaluation, we recommend that population surveys meant for multi-purpose analysis consider a new Multidimensional Sex/Gender Measure for testing that includes three simple items (one asked only of a small sub-group) to assess gender

  10. Transgender-inclusive measures of sex/gender for population surveys: Mixed-methods evaluation and recommendations.

    Directory of Open Access Journals (Sweden)

    Greta R Bauer

    Full Text Available Given that an estimated 0.6% of the U.S. population is transgender (trans) and that large health disparities for this population have been documented, government and research organizations are increasingly expanding measures of sex/gender to be trans inclusive. Options suggested for trans community surveys, such as expansive check-all-that-apply gender identity lists and write-in options that offer maximum flexibility, are generally not appropriate for broad population surveys. These require limited questions and a small number of categories for analysis. Limited evaluation has been undertaken of trans-inclusive population survey measures for sex/gender, including those currently in use. Using an internet survey and follow-up of 311 participants, and cognitive interviews from a maximum-diversity sub-sample (n = 79), we conducted a mixed-methods evaluation of two existing measures: a two-step question developed in the United States and a multidimensional measure developed in Canada. We found very low levels of item missingness, and no indicators of confusion on the part of cisgender (non-trans) participants for both measures. However, a majority of interview participants indicated problems with each question item set. Agreement between the two measures in assessment of gender identity was very high (K = 0.9081), but gender identity was a poor proxy for other dimensions of sex or gender among trans participants. Issues to inform measure development or adaptation that emerged from analysis included dimensions of sex/gender measured, whether non-binary identities were trans, Indigenous and cultural identities, proxy reporting, temporality concerns, and the inability of a single item to provide a valid measure of sex/gender. Based on this evaluation, we recommend that population surveys meant for multi-purpose analysis consider a new Multidimensional Sex/Gender Measure for testing that includes three simple items (one asked only of a small sub-group) to

  11. Computational analysis in epilepsy neuroimaging: A survey of features and methods.

    Science.gov (United States)

    Kini, Lohith G; Gee, James C; Litt, Brian

    2016-01-01

    Epilepsy affects 65 million people worldwide, a third of whom have seizures that are resistant to anti-epileptic medications. Some of these patients may be amenable to surgical therapy or treatment with implantable devices, but this usually requires delineation of discrete structural or functional lesion(s), which is challenging in a large percentage of these patients. Advances in neuroimaging and machine learning allow semi-automated detection of malformations of cortical development (MCDs), a common cause of drug resistant epilepsy. A frequently asked question in the field is what techniques currently exist to assist radiologists in identifying these lesions, especially subtle forms of MCDs such as focal cortical dysplasia (FCD) Type I and low grade glial tumors. Below we introduce some of the common lesions encountered in patients with epilepsy and the common imaging findings that radiologists look for in these patients. We then review and discuss the computational techniques introduced over the past 10 years for quantifying and automatically detecting these imaging findings. Due to large variations in the accuracy and implementation of these studies, specific techniques are traditionally used at individual centers, often guided by local expertise, as well as selection bias introduced by the varying prevalence of specific patient populations in different epilepsy centers. We discuss the need for a multi-institutional study that combines features from different imaging modalities as well as computational techniques to definitively assess the utility of specific automated approaches to epilepsy imaging. We conclude that sharing and comparing these different computational techniques through a common data platform provides an opportunity to rigorously test and compare the accuracy of these tools across different patient populations and geographical locations. We propose that these kinds of tools, quantitative imaging analysis methods and open data platforms for

  12. Computational analysis in epilepsy neuroimaging: A survey of features and methods

    Directory of Open Access Journals (Sweden)

    Lohith G. Kini

    2016-01-01

    Advances in neuroimaging and machine learning allow semi-automated detection of malformations of cortical development (MCDs), a common cause of drug resistant epilepsy. A frequently asked question in the field is what techniques currently exist to assist radiologists in identifying these lesions, especially subtle forms of MCDs such as focal cortical dysplasia (FCD) Type I and low grade glial tumors. Below we introduce some of the common lesions encountered in patients with epilepsy and the common imaging findings that radiologists look for in these patients. We then review and discuss the computational techniques introduced over the past 10 years for quantifying and automatically detecting these imaging findings. Due to large variations in the accuracy and implementation of these studies, specific techniques are traditionally used at individual centers, often guided by local expertise, as well as selection bias introduced by the varying prevalence of specific patient populations in different epilepsy centers. We discuss the need for a multi-institutional study that combines features from different imaging modalities as well as computational techniques to definitively assess the utility of specific automated approaches to epilepsy imaging. We conclude that sharing and comparing these different computational techniques through a common data platform provides an opportunity to rigorously test and compare the accuracy of these tools across different patient populations and geographical locations. We propose that these kinds of tools, quantitative imaging analysis methods and open data platforms for aggregating and sharing data and algorithms, can play a vital role in reducing the cost of care, the risks of invasive treatments, and improve overall outcomes for patients with epilepsy.

  13. A survey of upwind methods for flows with equilibrium and non-equilibrium chemistry and thermodynamics

    Science.gov (United States)

    Grossman, B.; Garrett, J.; Cinnella, P.

    1989-01-01

    Several versions of flux-vector split and flux-difference split algorithms were compared with regard to general applicability and complexity. Test computations were performed using curve-fit equilibrium air chemistry for an M = 5 high-temperature inviscid flow over a wedge and an M = 24.5 inviscid flow over a blunt cylinder; for these cases, little difference in accuracy was found among the versions of the same flux-split algorithm. For flows with nonequilibrium chemistry, the effects of the thermodynamic model on the development of flux-vector split and flux-difference split algorithms were investigated using an equilibrium model, a general nonequilibrium model, and a simplified model based on vibrational relaxation. Several numerical examples are presented, including nonequilibrium air chemistry in a high-temperature shock tube and nonequilibrium hydrogen-air chemistry in a supersonic diffuser.

  14. Survey of pesticide residues in table grapes: Determination of processing factors, intake and risk assessment

    DEFF Research Database (Denmark)

    Poulsen, Mette Erecius; Hansen, H.K.; Sloth, Jens Jørgen

    2007-01-01

    , no significant effect was found for organophosphorus pesticides and pyrethroids, whereas the number of samples with residues of benzilates, phenylamids and triazoles was insufficient to demonstrate any significant effects. An intake calculation showed that the average intake from Italian grapes was 3.9 μg day...

  15. Categorization of Survey Text Utilizing Natural Language Processing and Demographic Filtering

    Science.gov (United States)

    2017-09-01

    [Only fragments of this thesis are indexed here: part of a term-to-category substitution table (e.g. 'spouse', 'mother', 'mom', 'dad', 'father', 'kid', 'kids' mapped to family; 'job', 'employment' mapped to career) and part of a reference to a text-mining textbook ("...analysis: A practical introduction to information retrieval and text mining. New York, NY: ACM Books"); no abstract text is recoverable.]

  16. Investigation of test methods, material properties, and processes for solar cell encapsulants. Annual report

    Energy Technology Data Exchange (ETDEWEB)

    Willis, P. B.; Baum, B.; White, R. A.

    1978-06-01

    Springborn Laboratories is engaged in a study evaluating potentially useful encapsulating materials for the encapsulation task of the Low-Cost Solar Array project (LSA) funded by the Department of Energy. The goal of this program is to identify, evaluate, and recommend encapsulant materials (other than glass) and processes for the production of cost-effective, long-life photovoltaic solar modules. The results of an investigation of solar module encapsulation systems applicable to the Low-Cost Solar Array project 1986 cost and performance goals are presented. The 1986 cost goal for a 20-year-life solar cell module is $0.50 per watt or $5 per square foot (in 1975 dollars). Out of this cost goal, $0.25 per square foot is currently allocated for the encapsulation in terms of raw materials, exclusive of labor. Assuming the flat-plate collector to be the most efficient module design, six basic construction elements were identified and their specific uses in module construction defined. In order to generate a comparative analysis, a uniform costing basis was established for each element. Extensive surveys into commercially available materials were then conducted in order to identify either general classes or specific products suitable for use in each construction element. The survey results were also useful in revealing price ranges for classes of materials and estimating the cost allocation for each element within the encapsulation cost goal.

  17. Survey of pesticide residues in table grapes: Determination of processing factors, intake and risk assessment

    DEFF Research Database (Denmark)

    Poulsen, Mette Erecius; Hansen, H.K.; Sloth, Jens Jørgen

    2007-01-01

    The differences in residue pattern between Italy and South Africa, the main exporters of table grapes to the Danish market, were investigated. The results showed no major differences with respect to the number of samples with residues, with residues being found in 54-78% of the samples. Exceedances......, no significant effect was found for organophosphorus pesticides and pyrethroids, whereas the number of samples with residues of benzilates, phenylamids and triazoles was insufficient to demonstrate any significant effects. An intake calculation showed that the average intake from Italian grapes was 3.9 μg day......(-1) for pesticides and 21 μg day(-1) for copper. Correspondingly, the intakes from South African grapes were 2.6 and 5.7 μg day(-1) respectively. When the total exposure of pesticides from grapes were related to acceptable daily intake, expressed as the sum of Hazard Quotients, the exposure were...

  18. Automated Processing of Zebrafish Imaging Data: A Survey

    Science.gov (United States)

    Dickmeis, Thomas; Driever, Wolfgang; Geurts, Pierre; Hamprecht, Fred A.; Kausler, Bernhard X.; Ledesma-Carbayo, María J.; Marée, Raphaël; Mikula, Karol; Pantazis, Periklis; Ronneberger, Olaf; Santos, Andres; Stotzka, Rainer; Strähle, Uwe; Peyriéras, Nadine

    2013-01-01

    Due to the relative transparency of its embryos and larvae, the zebrafish is an ideal model organism for bioimaging approaches in vertebrates. Novel microscope technologies allow the imaging of developmental processes in unprecedented detail, and they enable the use of complex image-based read-outs for high-throughput/high-content screening. Such applications can easily generate Terabytes of image data, the handling and analysis of which becomes a major bottleneck in extracting the targeted information. Here, we describe the current state of the art in computational image analysis in the zebrafish system. We discuss the challenges encountered when handling high-content image data, especially with regard to data quality, annotation, and storage. We survey methods for preprocessing image data for further analysis, and describe selected examples of automated image analysis, including the tracking of cells during embryogenesis, heartbeat detection, identification of dead embryos, recognition of tissues and anatomical landmarks, and quantification of behavioral patterns of adult fish. We review recent examples for applications using such methods, such as the comprehensive analysis of cell lineages during early development, the generation of a three-dimensional brain atlas of zebrafish larvae, and high-throughput drug screens based on movement patterns. Finally, we identify future challenges for the zebrafish image analysis community, notably those concerning the compatibility of algorithms and data formats for the assembly of modular analysis pipelines. PMID:23758125

  19. A Survey of Alcohol Law Instructors' and Students' Perceptions on Social Learning and Training Methods

    Science.gov (United States)

    Altamirano, Jesus Manuel

    2013-01-01

    Alcoholic beverages in the State of Arizona are regulated by the Arizona Department of Liquor Licenses and Control (ADLLC). Education programs in the alcohol industry must align with the needs of students working in the industry and with the criteria set forth by the ADLLC. Prior research has concentrated on irresponsible alcohol consumption…

  20. The Employee Attitude Survey 2000: Perspectives on Its Process and Utility

    National Research Council Canada - National Science Library

    Hackworth, Carla

    2003-01-01

    .... Respondents represented all supervisory levels (i.e., non-supervisor, 46%; supervisor, 13%; and manager, 41%). Approximately two-thirds of the respondents indicated that they had served as a POC at some level of the agency...

  1. Controlled decomposition and oxidation: A treatment method for gaseous process effluents

    Science.gov (United States)

    Mckinley, Roger J. B., Sr.

    1990-01-01

    The safe disposal of effluent gases produced by the electronics industry deserves special attention. Due to the hazardous nature of many of the materials used, it is essential to control and treat the reactants and reactant by-products as they are exhausted from the process tool and prior to their release into the manufacturing facility's exhaust system and the atmosphere. Controlled decomposition and oxidation (CDO) is one method of treating effluent gases from thin film deposition processes. CDO equipment applications, field experience, and results of the use of CDO equipment and technological advances gained from the field experiences are discussed.

  2. Model Based Beamforming and Bayesian Inversion Signal Processing Methods for Seismic Localization of Underground Source

    DEFF Research Database (Denmark)

    Oh, Geok Lian

    properties such as the elastic wave speeds and soil densities. One processing method is to cast the estimation problem as an inverse problem and solve for the unknown material parameters. The forward models for the seismic signals used in the literature include ray tracing methods that consider only...... density values of the discretized ground medium, which leads to time-consuming computations and instability of the inversion process. In addition, the geophysics inverse problem is generally ill-posed because the forward model is not exact and introduces errors. The Bayesian inversion method, through...... the probability density function, permits the incorporation of a priori information about the parameters and also allows for the incorporation of theoretical errors. This opens up the possibility of applying the inverse paradigm to real-world geophysics inversion problems. In this PhD study, the Bayesian...
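
    To make the Bayesian idea concrete, the hypothetical sketch below estimates a single wave speed from noisy travel-time picks with a random-walk Metropolis sampler written in Python/NumPy. The forward model (straight-ray travel times), the uniform prior bounds and the noise level are invented for the example and are not taken from the thesis.

        import numpy as np

        rng = np.random.default_rng(1)

        # Toy forward model: travel time = distance / wave speed.
        distances = np.array([50.0, 120.0, 200.0, 350.0])       # metres (assumed geometry)
        true_speed = 1500.0                                      # m/s (assumed)
        sigma_noise = 0.005                                      # s, assumed measurement error
        observed = distances / true_speed + rng.normal(0, sigma_noise, distances.size)

        def log_posterior(speed):
            if not (300.0 < speed < 5000.0):                     # uniform prior on plausible speeds
                return -np.inf
            residual = observed - distances / speed
            return -0.5 * np.sum((residual / sigma_noise) ** 2)  # Gaussian likelihood (up to a constant)

        # Random-walk Metropolis sampling of the posterior.
        samples, speed, logp = [], 1000.0, log_posterior(1000.0)
        for _ in range(20000):
            proposal = speed + rng.normal(0, 25.0)
            logp_prop = log_posterior(proposal)
            if np.log(rng.uniform()) < logp_prop - logp:
                speed, logp = proposal, logp_prop
            samples.append(speed)

        post = np.array(samples[5000:])                          # discard burn-in
        print(f"posterior mean speed: {post.mean():.1f} m/s, 95% interval: "
              f"[{np.percentile(post, 2.5):.1f}, {np.percentile(post, 97.5):.1f}]")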

  3. Gemini NIFS survey of feeding and feedback processes in nearby active galaxies - I. Stellar kinematics

    Science.gov (United States)

    Riffel, Rogemar A.; Storchi-Bergmann, Thaisa; Riffel, Rogerio; Dahmer-Hahn, Luis G.; Diniz, Marlon R.; Schönell, Astor J.; Dametto, Natacha Z.

    2017-09-01

    We use the Gemini Near-Infrared Integral Field Spectrograph (NIFS) to map the stellar kinematics of the inner few hundred parsecs of a sample of 16 nearby Seyfert galaxies, at a spatial resolution of tens of parsecs and spectral resolution of 40 km s⁻¹. We find that the line-of-sight (LOS) velocity fields for most galaxies are well reproduced by rotating disc models. The kinematic position angle (PA) derived for the LOS velocity field is consistent with the large-scale photometric PA. The residual velocities are correlated with the hard X-ray luminosity, suggesting that more luminous active galactic nuclei have a larger impact on the surrounding stellar dynamics. The central velocity dispersion values are usually higher than the rotation velocity amplitude, which we attribute to the strong contribution of bulge kinematics in these inner regions. For 50 per cent of the galaxies, we find an inverse correlation between the velocities and the h3 Gauss-Hermite moment, implying red wings on the blueshifted side and blue wings on the redshifted side of the velocity field, attributed to the motion of bulge stars lagging the rotation. Two of the 16 galaxies (NGC 5899 and Mrk 1066) show an S-shaped zero-velocity line, attributed to the gravitational potential of a nuclear bar. Velocity dispersion (σ) maps show rings of low-σ values (˜50-80 km s⁻¹) for four objects and 'patches' of low σ for six galaxies at 150-250 pc from the nucleus, attributed to young/intermediate-age stellar populations.

  4. Evaluation of silage-fed biogas process performance using microbiological and kinetic methods

    Energy Technology Data Exchange (ETDEWEB)

    Jarvis, Aa.

    1996-10-01

    In this study, different kinetic and microbiological methods were used to evaluate the growth and activity of key groups of bacteria degrading ley silage in one-phase and two-phase biogas processes. Emphasis was placed on studying the dynamic behaviour of different trophic groups resulting from the initiation of liquid recirculation in the processes. The microbiological methods included microscopy and most probable number (MPN) counts with different substrates. The kinetic methods included measurements of specific methanogenic activity (SMA) with acetate and H₂/CO₂ as substrates, batch assays with trace element additions and measurement of conversion rates of mannitol and lactate in the digesters. In general, the initiation of liquid recirculation at first promoted the growth and/or activity of several trophic groups of bacteria, such as butyrate and propionate degraders and acetotrophic and hydrogenotrophic methanogens in the liquefaction/acidogenesis reactors of the two-phase processes. This was probably mainly due to the increased pH. However, after some time of liquid recirculation, an inhibition of some bacterial groups occurred, such as propionate degraders and methanogens in the methanogenic reactors of two-phase processes. This was probably due to increased concentrations of salts and free ammonia. The batch assays proved to be valuable tools in process optimization by the addition of trace elements. Here, the addition of cobalt significantly increased methane production from acetate. In this study, a more comprehensive understanding of the process behaviour in response to the initiation of liquid recirculation was achieved which could not have been obtained by only monitoring routine parameters such as pH, methane production and concentrations of organic acids and salts. 120 refs, 4 figs, 1 tab

  5. Improvement of mechanical and thermal properties of carbon nanotube composites through nanotube functionalization and processing methods

    International Nuclear Information System (INIS)

    Sahoo, Nanda Gopal; Cheng, Henry Kuo Feng; Cai Junwei; Li Lin; Chan, Siew Hwa; Zhao Jianhong; Yu Suzhu

    2009-01-01

    The effect of multi-walled carbon nanotubes (MWCNTs) and processing methods on the morphological, dynamic mechanical, mechanical, thermal and electrical properties of MWCNT/nylon 6 (PA6) composites has been investigated. The MWCNTs have been functionalized covalently and noncovalently for better dispersion in the polymer matrix. A homogeneous dispersion of MWCNTs was achieved in the PA6 matrix as evidenced by scanning electron microscopy. The strong interaction between the functionalized MWCNTs and the PA6 matrix greatly enhanced the dispersion as well as the interfacial adhesion. As a result, the overall mechanical performance of the composites could be improved. The incorporation of the MWCNTs effectively enhanced the crystallization of the PA6 matrix through heterogeneous nucleation. The present investigation revealed that the mechanical, thermal as well as electrical properties of MWCNT-filled polymer composites were strongly dependent on the state of dispersion, mixing and processing conditions, and interaction with the polymeric matrix.

  6. [Analysis of chondroitin sulfate content of Cervi Cornu Pantotrichum with different processing methods and different parts].

    Science.gov (United States)

    Gong, Rui-Ze; Wang, Yan-Hua; Sun, Yin-Shi

    2018-02-01

    The differences and the variations of chondroitin sulfate content in different parts of Cervi Cornu Pantotrichum (CCP) with different processing methods were investigated. The chondroitin sulfate from velvet was extracted by the dilute alkali-concentrated salt method. Next, the chondroitin sulfate was digested by chondroitinase ABC. The contents of total chondroitin sulfate and chondroitin sulfates A, B and C in the samples were determined by high performance liquid chromatography (HPLC). The content of chondroitin sulfate in wax, powder, gauze and bone slices of CCP with freeze-drying processing is 14.13, 11.99, 1.74 and 0.32 g·kg⁻¹, respectively. The content of chondroitin sulfate in wax, powder, gauze and bone slices of CCP with boiling processing is 10.71, 8.97, 2.21 and 1.40 g·kg⁻¹, respectively. The content of chondroitin sulfate in wax, powder, gauze and bone slices of CCP without blood is 12.47, 9.47, 2.64 and 0.07 g·kg⁻¹, respectively. And the content of chondroitin sulfate in wax, powder, gauze and bone slices of CCP with blood is 8.22, 4.39, 0.87 and 0.28 g·kg⁻¹, respectively. The results indicated that the chondroitin sulfate content differed significantly between processing methods. The content of chondroitin sulfate in CCP with freeze-drying is higher than that in CCP with boiling processing. The content of chondroitin sulfate in CCP without blood is higher than that in CCP with blood. The chondroitin sulfate content in different parts of the velvet with the same processing method was ranked from high to low as: wax slices, powder, gauze slices, bone slices. Copyright© by the Chinese Pharmaceutical Association.

  7. Obstetric care and method of delivery in Mexico: results from the 2012 National Health and Nutrition Survey.

    Directory of Open Access Journals (Sweden)

    Ileana Heredia-Pi

    Full Text Available OBJECTIVE: To identify the current clinical, socio-demographic and obstetric factors associated with the various types of delivery strategies in Mexico. MATERIALS AND METHODS: This is a cross-sectional study based on the 2012 National Health and Nutrition Survey (ENSANUT) of 6,736 women aged 12 to 49 years. Delivery types discussed in this paper include vaginal delivery, emergency cesarean section and planned cesarean section. Using bivariate analyses, sub-population group differences were identified. Logistic regression models were applied, including both binary and multinomial outcome variables from the survey. The logistic regression results identify those covariates associated with the type of delivery. RESULTS: 53.1% of institutional births in the period 2006 through 2012 were vaginal deliveries; 46.9% were either planned or emergency cesarean sections. The highest rates of this procedure were among women who reported a complication during delivery (OR: 4.21; 95%CI: 3.66-4.84), women between the ages of 35 and 49 at the time of their last child birth (OR: 2.54; 95%CI: 2.02-3.20) and women receiving care through private healthcare providers during delivery (OR: 2.36; 95%CI: 1.84-3.03). CONCLUSIONS: The existence of different socio-demographic and obstetric profiles among women who receive care for vaginal or cesarean delivery is supported by the findings of the present study. The frequency of vaginal delivery is higher in indigenous women, when the care provider is public, and in women with two or more children at the time of the most recent child birth. Planned cesarean deliveries are positively associated with years of schooling, a higher socioeconomic level, and higher age. The occurrence of emergency cesarean sections is elevated in women with a diagnosis of a health issue during pregnancy or delivery, and it is reduced in highly marginalized settings.

  8. Obstetric Care and Method of Delivery in Mexico: Results from the 2012 National Health and Nutrition Survey

    Science.gov (United States)

    Heredia-Pi, Ileana; Servan-Mori, Edson E.; Wirtz, Veronika J.; Avila-Burgos, Leticia; Lozano, Rafael

    2014-01-01

    Objective To identify the current clinical, socio-demographic and obstetric factors associated with the various types of delivery strategies in Mexico. Materials and Methods This is a cross-sectional study based on the 2012 National Health and Nutrition Survey (ENSANUT) of 6,736 women aged 12 to 49 years. Delivery types discussed in this paper include vaginal delivery, emergency cesarean section and planned cesarean section. Using bivariate analyses, sub-population group differences were identified. Logistic regression models were applied, including both binary and multinomial outcome variables from the survey. The logistic regression results identify those covariates associated with the type of delivery. Results 53.1% of institutional births in the period 2006 through 2012 were vaginal deliveries; 46.9% were either planned or emergency cesarean sections. The highest rates of this procedure were among women who reported a complication during delivery (OR: 4.21; 95%CI: 3.66–4.84), women between the ages of 35 and 49 at the time of their last child birth (OR: 2.54; 95%CI: 2.02–3.20) and women receiving care through private healthcare providers during delivery (OR: 2.36; 95%CI: 1.84–3.03). Conclusions The existence of different socio-demographic and obstetric profiles among women who receive care for vaginal or cesarean delivery is supported by the findings of the present study. The frequency of vaginal delivery is higher in indigenous women, when the care provider is public, and in women with two or more children at the time of the most recent child birth. Planned cesarean deliveries are positively associated with years of schooling, a higher socioeconomic level, and higher age. The occurrence of emergency cesarean sections is elevated in women with a diagnosis of a health issue during pregnancy or delivery, and it is reduced in highly marginalized settings. PMID:25101781
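
    The record does not give the model specification, but the kind of analysis it describes (binary and multinomial logistic regression on delivery type) can be sketched generically. The example below fits a multinomial logit to an invented miniature dataset in Python with statsmodels; the variable names, effect sizes and data are fabricated for illustration, and a real survey analysis would also use the ENSANUT sampling weights and design, which are omitted here.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        # Invented miniature dataset standing in for a survey extract.
        rng = np.random.default_rng(0)
        n = 1000
        df = pd.DataFrame({
            "age_35_49": rng.integers(0, 2, n),
            "complication": rng.integers(0, 2, n),
            "private_provider": rng.integers(0, 2, n),
        })
        # Delivery type: 0 = vaginal, 1 = emergency cesarean, 2 = planned cesarean.
        score_emerg = -1.0 + 1.4 * df["complication"] + 0.5 * df["age_35_49"]
        score_plan = -1.5 + 0.9 * df["age_35_49"] + 0.8 * df["private_provider"]
        utilities = np.column_stack([np.zeros(n), score_emerg, score_plan])
        probs = np.exp(utilities) / np.exp(utilities).sum(axis=1, keepdims=True)
        df["delivery"] = [rng.choice(3, p=p) for p in probs]

        X = sm.add_constant(df[["age_35_49", "complication", "private_provider"]])
        fit = sm.MNLogit(df["delivery"], X).fit(disp=0)
        print(np.exp(fit.params))   # odds ratios relative to vaginal delivery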

  9. Survey of potential chlorine production processes. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1979-04-01

    This report is part of the ongoing study of industrial electrochemical processes for the purpose of identifying methods of improving energy efficiencies. A computerized literature search of past and current chlorine generation methods was performed to identify basic chlorine production processes. Over 200 pertinent references are cited involving 20 separate and distinct chlorine processes. Each basic process is evaluated for its engineering and economic viability and energy efficiency. A flow diagram is provided for each basic process. Four criteria are used to determine the most promising processes: raw material availability, type and amount of energy required, by-product demand/disposal and status of development. The most promising processes are determined to be the membrane process (with and without catalytic electrodes), Kel-Chlor, Mobay (direct electrolysis of hydrogen chloride), the Shell process (catalytic oxidation of hydrogen chloride) and oxidation of ammonium chloride. Each of these processes is further studied to determine what activities may be pursued.

  10. Methods of assessing functioning of organizational and economic mechanism during innovation process implementation

    Directory of Open Access Journals (Sweden)

    Blinkov Maksim

    2017-01-01

    Full Text Available This paper proposes methods of assessing the efficiency of the organizational and economic mechanism of an industrial enterprise when implementing innovation processes. These methods allow continuous monitoring at all stages of innovation process implementation, lead to a reduction in the costs of innovation activity and enable maximum use of the creative potential of enterprise personnel. The significance and attractiveness of the method lie in the fact that it can be applied by industrial companies in any market field, regardless of the lifecycle stage of the goods, company and/or innovation process under study, because the composition and number of specific indicators can be adjusted by the work group both before the study and in the course of the company's innovation activities (at any stage of their implementation). The multi-sided approach proposed for assessing the efficiency of the organizational and economic mechanism of an industrial enterprise when implementing innovation processes ensures a full and accurate assessment of the impact of individual factors on the final result.

  11. Water conservation and reuse using the Water Sources Diagram method for batch process: case studies

    Directory of Open Access Journals (Sweden)

    Fernando Luiz Pellegrini Pessoa

    2012-04-01

    Full Text Available Water resources management has become an important factor for the sustainability of industrial processes, since there is a growing need for methodologies aimed at the conservation and rational use of water. The objective of this work was to apply the heuristic-algorithmic method called the Water Sources Diagram (WSD), which is used to define the target of minimum water consumption, to batch processes. Scenarios with reuse of streams were generated and evaluated by applying the method to data on water quantity and contaminant concentrations in the operations. Two case studies are presented, showing the reduction of water consumption, wastewater generation and final treatment costs, as well as the investment in storage tanks. The scenarios proved very promising, achieving reductions of up to 45% in water consumption and wastewater generation and of around 37% in the cost of storage tanks, without the need for regeneration processes. Thus, the WSD method proved to be a relevant and flexible alternative among systemic tools aimed at minimizing water consumption in industrial processes, playing an important role within a water resources management program.

  12. The automated data processing architecture for the GPI Exoplanet Survey

    Science.gov (United States)

    Wang, Jason J.; Perrin, Marshall D.; Savransky, Dmitry; Arriaga, Pauline; Chilcote, Jeffrey K.; De Rosa, Robert J.; Millar-Blanchaer, Maxwell A.; Marois, Christian; Rameau, Julien; Wolff, Schuyler G.; Shapiro, Jacob; Ruffio, Jean-Baptiste; Graham, James R.; Macintosh, Bruce

    2017-09-01

    The Gemini Planet Imager Exoplanet Survey (GPIES) is a multi-year direct imaging survey of 600 stars to discover and characterize young Jovian exoplanets and their environments. We have developed an automated data architecture to process and index all data related to the survey uniformly. An automated and flexible data processing framework, which we term the GPIES Data Cruncher, combines multiple data reduction pipelines together to intelligently process all spectroscopic, polarimetric, and calibration data taken with GPIES. With no human intervention, fully reduced and calibrated data products are available less than an hour after the data are taken to expedite follow-up on potential objects of interest. The Data Cruncher can run on a supercomputer to reprocess all GPIES data in a single day as improvements are made to our data reduction pipelines. A backend MySQL database indexes all files, which are synced to the cloud, and a front-end web server allows for easy browsing of all files associated with GPIES. To help observers, quicklook displays show reduced data as they are processed in real-time, and chatbots on Slack post observing information as well as reduced data products. Together, the GPIES automated data processing architecture reduces our workload, provides real-time data reduction, optimizes our observing strategy, and maintains a homogeneously reduced dataset to study planet occurrence and instrument performance.
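
    As an illustration of the ingest-reduce-index pattern described above, the sketch below is a deliberately tiny, hypothetical Python loop that watches a directory, runs a placeholder reduction step, and records each product in a local SQLite index. The directory names, schema and polling approach are invented for the example; this is a simplification, not the GPIES Data Cruncher code, which uses its own reduction pipelines, a MySQL backend and event-driven infrastructure.

        import sqlite3
        import time
        from pathlib import Path

        # Hypothetical miniature of an "ingest, reduce, index" loop; names and schema are invented.
        INCOMING = Path("incoming")          # raw frames are assumed to appear here
        REDUCED = Path("reduced")
        db = sqlite3.connect("index.db")
        db.execute("CREATE TABLE IF NOT EXISTS files (name TEXT PRIMARY KEY, status TEXT, reduced_at REAL)")

        def reduce_frame(raw: Path) -> Path:
            # Placeholder for a real reduction pipeline: here we only copy the bytes.
            out = REDUCED / raw.name
            out.write_bytes(raw.read_bytes())
            return out

        def poll_once():
            for raw in sorted(INCOMING.glob("*.fits")):
                if db.execute("SELECT 1 FROM files WHERE name = ?", (raw.name,)).fetchone():
                    continue                 # already reduced and indexed
                reduce_frame(raw)
                db.execute("INSERT INTO files VALUES (?, ?, ?)", (raw.name, "reduced", time.time()))
                db.commit()
                print("reduced and indexed", raw.name)

        if __name__ == "__main__":
            INCOMING.mkdir(exist_ok=True)
            REDUCED.mkdir(exist_ok=True)
            while True:                      # simple polling stand-in for an event-driven framework
                poll_once()
                time.sleep(10)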

  13. Study and Testing of Processing Trajectory Measurement Method of Flexible Workpiece

    Directory of Open Access Journals (Sweden)

    Yaohua Deng

    2013-01-01

    Full Text Available Flexible workpieces include materials such as elastic splines, textile fabrics, and polyurethane sponge. Because the processing trajectory is composed of small arc or small line-segment primitives and the flexible workpiece deforms during processing, the captured image of the processing trajectory is not clear, the edge of the processing image has locally uneven grey levels, and the boundary pixels between the processing trajectory edge and the background are not distinct. This paper takes the corner search of the processing trajectory as its starting point. A slope-angle curve for the starting and terminal points of each primitive is designed, and a search algorithm is put forward that uses Daubechies(4) as the wavelet operator to perform a multi-scale wavelet transform of the slope-angle curve; whether a point of the curve is a corner point is determined by judging whether the wavelet transform exhibits an extremum there, based on the wavelet edge modulus-maxima extraction principle. Finally, a decomposition/reconstruction design method for FIR filters based on the wavelet transform of the processing image is proposed. An eight-tap transposed FIR filter is used to design the Daubechies(4) decomposition and a reconfigurable computing IP core. The total time consumed by the IP-core wavelet decomposition increases by only 5.561% in comparison with a PC. The trajectory angle relative error is 2.2%, and the average measurement time is 212.38 ms.
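
    The corner-detection idea (a multi-scale Daubechies-4 transform of the slope-angle curve, with corners flagged where the detail coefficients show modulus maxima) can be sketched in a few lines of Python with PyWavelets. The synthetic trajectory, threshold rule and scale count below are invented for illustration and do not reproduce the paper's algorithm or its FIR/IP-core implementation.

        import numpy as np
        import pywt  # PyWavelets

        # Synthetic slope-angle curve of a trajectory: a straight segment, a sharp corner
        # (step in the angle) at index 128, another straight segment, then a smooth arc, plus noise.
        n = 256
        angle = np.concatenate([
            np.full(128, 0.0),                       # straight primitive
            np.full(64, np.pi / 4),                  # straight primitive after the sharp corner
            np.linspace(np.pi / 4, np.pi / 2, 64),   # arc primitive: the angle changes smoothly
        ])
        angle += np.random.default_rng(0).normal(0.0, 0.01, n)

        # Multi-scale stationary wavelet transform with the Daubechies-4 wavelet.
        coeffs = pywt.swt(angle, "db4", level=3)
        details = [np.abs(d) for _, d in coeffs]     # detail-coefficient magnitudes at each scale

        # Corner candidates: samples whose detail magnitude stands out at every scale.
        flags = [d > 5 * np.median(d) for d in details]
        candidates = np.flatnonzero(np.all(flags, axis=0))
        candidates = candidates[(candidates > 8) & (candidates < n - 8)]   # drop periodic-boundary artefacts
        print("corner candidates near indices:", candidates)   # expected: a small cluster around 128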

  14. Method and Process Development of Advanced Atmospheric Plasma Spraying for Thermal Barrier Coatings

    Science.gov (United States)

    Mihm, Sebastian; Duda, Thomas; Gruner, Heiko; Thomas, Georg; Dzur, Birger

    2012-06-01

    Over the last few years, global economic growth has triggered a dramatic increase in the demand for resources, resulting in a steady rise in prices for energy and raw materials. In the gas turbine manufacturing sector, process optimizations of cost-intensive production steps involve a heightened potential for savings and form the basis for securing future competitive advantages in the market. In this context, the atmospheric plasma spraying (APS) process for thermal barrier coatings (TBC) has been optimized. A constraint for the optimization of the APS coating process is the use of the existing coating equipment. Furthermore, the current coating quality and characteristics must not change so as to avoid new qualification and testing. Using experience in APS and empirically gained data, the process optimization plan included the variation of e.g. the plasma gas composition and flow-rate, the electrical power, the arrangement and angle of the powder injectors in relation to the plasma jet, the grain size distribution of the spray powder and the plasma torch movement procedures such as spray distance, offset and iteration. In particular, plasma properties (enthalpy, velocity and temperature), powder injection conditions (injection point, injection speed, grain size and distribution) and the coating lamination (coating pattern and spraying distance) are examined. The optimized process and resulting coating were compared to the current situation using several diagnostic methods. The improved process significantly reduces costs and achieves the requirement of comparable coating quality. Furthermore, a contribution was made towards better comprehension of the APS of ceramics and the definition of a better method for future process developments.

  15. Digital signal processing with kernel methods

    CERN Document Server

    Rojo-Alvarez, José Luis; Muñoz-Marí, Jordi; Camps-Valls, Gustavo

    2018-01-01

    A realistic and comprehensive review of joint approaches to machine learning and signal processing algorithms, with application to communications, multimedia, and biomedical engineering systems Digital Signal Processing with Kernel Methods reviews the milestones in the mixing of classical digital signal processing models and advanced kernel machines statistical learning tools. It explains the fundamental concepts from both fields of machine learning and signal processing so that readers can quickly get up to speed in order to begin developing the concepts and application software in their own research. Digital Signal Processing with Kernel Methods provides a comprehensive overview of kernel methods in signal processing, without restriction to any application field. It also offers example applications and detailed benchmarking experiments with real and synthetic datasets throughout. Readers can find further worked examples with Matlab source code on a website developed by the authors. * Presents the necess...
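
    As a minimal taste of the kind of kernel machinery such a text covers (and not an example taken from the book), the sketch below uses kernel ridge regression with an RBF kernel, via scikit-learn, as a nonparametric filter for a noisy sampled signal; the signal, noise level and hyperparameters are chosen arbitrarily.

        import numpy as np
        from sklearn.kernel_ridge import KernelRidge

        # Noisy samples of a smooth signal; kernel ridge regression with an RBF kernel
        # acts here as a nonlinear, nonparametric denoising filter.
        rng = np.random.default_rng(0)
        t = np.linspace(0, 1, 200)[:, None]          # column vector of sample times
        clean = np.sin(2 * np.pi * 3 * t).ravel()
        noisy = clean + rng.normal(0, 0.3, clean.size)

        model = KernelRidge(kernel="rbf", alpha=0.1, gamma=100.0)
        model.fit(t, noisy)
        denoised = model.predict(t)

        print("noisy RMSE:   ", np.sqrt(np.mean((noisy - clean) ** 2)))
        print("denoised RMSE:", np.sqrt(np.mean((denoised - clean) ** 2)))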

  16. Process improvement methods increase the efficiency, accuracy, and utility of a neurocritical care research repository.

    Science.gov (United States)

    O'Connor, Sydney; Ayres, Alison; Cortellini, Lynelle; Rosand, Jonathan; Rosenthal, Eric; Kimberly, W Taylor

    2012-08-01

    Reliable and efficient data repositories are essential for the advancement of research in Neurocritical care. Various factors, such as the large volume of patients treated within the neuro ICU, their differing length and complexity of hospital stay, and the substantial amount of desired information can complicate the process of data collection. We adapted the tools of process improvement to the data collection and database design of a research repository for a Neuroscience intensive care unit. By the Shewhart-Deming method, we implemented an iterative approach to improve the process of data collection for each element. After an initial design phase, we re-evaluated all data fields that were challenging or time-consuming to collect. We then applied root-cause analysis to optimize the accuracy and ease of collection, and to determine the most efficient manner of collecting the maximal amount of data. During a 6-month period, we iteratively analyzed the process of data collection for various data elements. For example, the pre-admission medications were found to contain numerous inaccuracies after comparison with a gold standard (sensitivity 71% and specificity 94%). Also, our first method of tracking patient admissions and discharges contained higher than expected errors (sensitivity 94% and specificity 93%). In addition to increasing accuracy, we focused on improving efficiency. Through repeated incremental improvements, we reduced the number of subject records that required daily monitoring from 40 to 6 per day, and decreased daily effort from 4.5 to 1.5 h/day. By applying process improvement methods to the design of a Neuroscience ICU data repository, we achieved a threefold improvement in efficiency and increased accuracy. Although individual barriers to data collection will vary from institution to institution, a focus on process improvement is critical to overcoming these barriers.
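
    The accuracy figures quoted above (e.g. sensitivity 71% and specificity 94% for pre-admission medications) are standard comparisons of a repository field against a gold-standard review. Purely for reference, a minimal Python sketch of that calculation on invented values:

        def sensitivity_specificity(recorded, gold):
            # Compare a repository field against a gold-standard chart review.
            # Both inputs are sequences of booleans (e.g. "medication X present pre-admission").
            tp = sum(r and g for r, g in zip(recorded, gold))
            tn = sum(not r and not g for r, g in zip(recorded, gold))
            fp = sum(r and not g for r, g in zip(recorded, gold))
            fn = sum(not r and g for r, g in zip(recorded, gold))
            return tp / (tp + fn), tn / (tn + fp)

        # Toy example with invented values.
        recorded = [True, True, False, False, True, False, True, False]
        gold     = [True, False, False, False, True, True, True, False]
        sens, spec = sensitivity_specificity(recorded, gold)
        print(f"sensitivity {sens:.0%}, specificity {spec:.0%}")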

  17. BUSINESS PROCESS REENGINEERING AS THE METHOD OF PROCESS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    O. Honcharova

    2013-09-01

    The article is devoted to the analysis of the process management approach. The main interpretations of process management are reviewed, and definitions of process and process management are given. Methods of business process improvement are analysed, among them fast analysis solution technology (FAST), benchmarking, reprojecting (process redesign) and reengineering. The main results of applying business process improvement are described in terms of reduced cycle time, costs and errors. The tasks and main stages of business process reengineering are outlined, and its key efficiency results and success factors are identified.

  18. Signal processing method and system for noise removal and signal extraction

    Science.gov (United States)

    Fu, Chi Yung; Petrich, Loren

    2009-04-14

    A signal processing method and system combining smooth level wavelet pre-processing together with artificial neural networks all in the wavelet domain for signal denoising and extraction. Upon receiving a signal corrupted with noise, an n-level decomposition of the signal is performed using a discrete wavelet transform to produce a smooth component and a rough component for each decomposition level. The nth-level smooth component is then input into a corresponding neural network pre-trained to filter out noise in that component by pattern recognition in the wavelet domain. Additional rough components, beginning at the highest level, may also be retained and input into corresponding neural networks pre-trained to filter out noise in those components also by pattern recognition in the wavelet domain. In any case, an inverse discrete wavelet transform is performed on the combined output from all the neural networks to recover a clean signal back in the time domain.
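
    A minimal sketch of the overall flow (n-level DWT, per-subband processing, inverse DWT) is given below. It assumes the PyWavelets package and substitutes soft thresholding for the patent's pre-trained neural networks, so it illustrates the structure rather than the claimed method itself.

```python
import numpy as np
import pywt  # PyWavelets, assumed available

def wavelet_denoise(signal: np.ndarray, wavelet: str = "db4", level: int = 4) -> np.ndarray:
    """n-level DWT, per-subband processing, inverse DWT back to the time domain."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)   # [smooth_n, rough_n, ..., rough_1]
    # The patent routes each retained subband through a pre-trained neural network;
    # here a universal soft threshold stands in for that learned filter.
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise estimate from the finest rough band
    thr = sigma * np.sqrt(2 * np.log(signal.size))
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: signal.size]

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 1024)
noisy = np.sin(2 * np.pi * 7 * t) + 0.4 * rng.standard_normal(t.size)
clean = wavelet_denoise(noisy)
```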

  19. Collaboration processes and perceived effectiveness of integrated care projects in primary care: a longitudinal mixed-methods study.

    Science.gov (United States)

    Valentijn, Pim P; Ruwaard, Dirk; Vrijhoef, Hubertus J M; de Bont, Antoinette; Arends, Rosa Y; Bruijnzeels, Marc A

    2015-10-09

    Collaborative partnerships are considered an essential strategy for integrating local disjointed health and social services. Currently, little evidence is available on how integrated care arrangements between professionals and organisations are achieved through the evolution of collaboration processes over time. The first aim was to develop a typology of integrated care projects (ICPs) based on the final degree of integration as perceived by multiple stakeholders. The second aim was to study how types of integration differ in changes of collaboration processes over time and in final perceived effectiveness. A longitudinal mixed-methods study design based on two data sources (surveys and interviews) was used to identify the perceived degree of integration and patterns in collaboration among 42 ICPs in primary care in The Netherlands. We used cluster analysis to identify distinct subgroups of ICPs based on the final perceived degree of integration from a professional, organisational and system perspective. With the use of ANOVAs, the subgroups were contrasted based on: 1) changes in collaboration processes over time (shared ambition, interests and mutual gains, relationship dynamics, organisational dynamics and process management) and 2) final perceived effectiveness (i.e. rated success) at the professional, organisational and system levels. The ICPs were classified into three subgroups: 'United Integration Perspectives (UIP)', 'Disunited Integration Perspectives (DIP)' and 'Professional-oriented Integration Perspectives (PIP)'. ICPs within the UIP subgroup showed the strongest increase in trust-based (mutual gains and relationship dynamics) as well as control-based (organisational dynamics and process management) collaboration processes and had the highest overall effectiveness rates. By contrast, ICPs within the DIP subgroup declined on the collaboration processes and had the lowest overall effectiveness rates. ICPs within the PIP subgroup increased in control

  20. A Delphi Method Analysis to Create an Emergency Medicine Educational Patient Satisfaction Survey.

    Science.gov (United States)

    London, Kory S; Singal, Bonita; Fowler, Jennifer; Prepejchal, Rebecca; Simmons, Stefanie; Finefrock, Douglas

    2015-12-01

    Feedback on patient satisfaction (PS) as a means to monitor and improve performance in patient communication is lacking in residency training. A physician's promotion, compensation and job satisfaction may be affected by their individual PS scores once they are in practice. Many communication and satisfaction surveys exist, but none focus on the emergency department setting for educational purposes. The goal of this project was to create an emergency medicine-based educational PS survey with strong evidence for content validity. We used the Delphi Method (DM) to obtain expert opinion via an iterative process of surveying. Questions were mined from four PS surveys as well as from group suggestions. The DM analysis determined the structure, content and appropriate use of the tool. The group used four-point Likert-type scales and Lynn's criteria for content validity to determine relevant questions from the stated goals. Twelve recruited experts participated in a series of seven surveys to achieve consensus. A 10-question, single-page survey with an additional page of qualitative and demographic questions was selected. Thirty-one questions were judged to be relevant from an original 48-question list; of these, the final 10 questions were chosen. The response rate for individual survey items was 99.5%. The DM produced a consensus survey with content validity evidence. Future work will be needed to obtain evidence for response process, internal structure and construct validity.
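
    Lynn's criteria are commonly operationalized through the item-level content validity index (I-CVI): the share of experts rating an item 3 or 4 on a four-point relevance scale, with a cutoff of roughly 0.78 often used for panels of this size. A small sketch with hypothetical panel ratings, not the study's data:

```python
def item_cvi(ratings: list[int]) -> float:
    """Item-level content validity index: share of experts rating the item 3 or 4
    on a 4-point relevance scale (Lynn's approach)."""
    return sum(r >= 3 for r in ratings) / len(ratings)

# Hypothetical ratings from a 12-expert panel for two candidate questions.
panel = {"q1": [4, 4, 3, 4, 3, 4, 4, 3, 4, 4, 3, 4],
         "q2": [2, 3, 4, 2, 3, 2, 3, 2, 4, 3, 2, 3]}
for item, ratings in panel.items():
    cvi = item_cvi(ratings)
    print(item, round(cvi, 2), "retain" if cvi >= 0.78 else "revise/drop")
```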

  1. Applying lean methods to improve quality and safety in surgical sterile instrument processing.

    Science.gov (United States)

    Blackmore, C Craig; Bishop, Robbi; Luker, Samuel; Williams, Barbara L

    2013-03-01

    Surgical instrument processing is critical to safe, high-quality surgical care but has received little attention in the medical literature. Typical hospitals have inventories in the tens of thousands of surgical instruments organized into thousands of instrument sets. The use of these instruments for multiple procedures per day leads to millions of instrument sets being reprocessed yearly in a single hospital. Errors in the processing of sterile instruments may lead to increased operative times and costs, as well as potentially contributing to surgical infections and perioperative morbidity. At Virginia Mason Medical Center (Seattle), a quality monitoring approach was developed to identify and categorize errors in sterile instrument processing, through use of a daily defect sheet. Lean methods were used to improve the quality of surgical instrument processing through redefining operator roles, alteration of the workspace, mistake-proofing, quality monitoring, staff training, and continuous feedback. To study the effectiveness of the quality improvement project, a before/after comparison of prospectively collected sterile processing error rates during a 37-month time frame was performed. Before the intervention, instrument processing errors occurred in 3.0% of surgical cases, decreasing to 1.5% at the final follow-up. Instrument processing errors are a barrier to the highest quality and safety in surgical care but are amenable to substantial improvement using Lean techniques.
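
    The reported before/after error rates (3.0% versus 1.5%) can be compared with a simple two-proportion z-test. The case counts in the sketch below are hypothetical, since the abstract does not report denominators:

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z(x1: int, n1: int, x2: int, n2: int) -> tuple[float, float]:
    """Two-sided z-test for the difference of two proportions (pooled estimate)."""
    p1, p2, p = x1 / n1, x2 / n2, (x1 + x2) / (n1 + n2)
    z = (p1 - p2) / sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical case counts consistent with the reported 3.0% and 1.5% error rates.
z, p = two_proportion_z(x1=300, n1=10_000, x2=150, n2=10_000)
print(f"z={z:.2f}, p={p:.4f}")
```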

  2. Design and Validation of the Quantum Mechanics Conceptual Survey

    Science.gov (United States)

    McKagan, S. B.; Perkins, K. K.; Wieman, C. E.

    2010-01-01

    The Quantum Mechanics Conceptual Survey (QMCS) is a 12-question survey of students' conceptual understanding of quantum mechanics. It is intended to be used to measure the relative effectiveness of different instructional methods in modern physics courses. In this paper, we describe the design and validation of the survey, a process that included…

  3. Systematic process synthesis and design methods for cost effective waste minimization

    International Nuclear Information System (INIS)

    Biegler, L.T.; Grossman, I.E.; Westerberg, A.W.

    1995-01-01

    We present progress on our work to develop synthesis methods that aid in the design of cost-effective approaches to waste minimization. Work continues on combining the approaches of Douglas and coworkers and of Grossmann and coworkers into a hierarchical framework in which bounding information allows it to fit within a mixed-integer programming formulation. We also continue work on the synthesis of reactors and of flexible separation processes. In the first case, we strive for methods that reduce the production of potential pollutants, while in the second we look for ways to recover and recycle solvents.

  4. Design and methodology of a mixed methods follow-up study to the 2014 Ghana Demographic and Health Survey

    OpenAIRE

    Staveteig, Sarah; Aryeetey, Richmond; Anie-Ansah, Michael; Ahiadeke, Clement; Ortiz, Ladys

    2017-01-01

    ABSTRACT Background: The intended meaning behind responses to standard questions posed in large-scale health surveys are not always well understood. Systematic follow-up studies, particularly those which pose a few repeated questions followed by open-ended discussions, are well positioned to gauge stability and consistency of data and to shed light on the intended meaning behind survey responses. Such follow-up studies require extensive coordination and face challenges in protecting responden...

  5. An assessment of the feasibility of developing and implementing an automated pavement distress survey system incorporating digital image processing.

    Science.gov (United States)

    1997-01-01

    The rational allocation of pavement maintenance resources requires the periodic assessment of the condition of all pavements. Traditional manual pavement distress surveys, which are based on visual inspection, are labor intensive, slow, and expensive...

  6. Modeling thermal inkjet and cell printing process using modified pseudopotential and thermal lattice Boltzmann methods

    Science.gov (United States)

    Sohrabi, Salman; Liu, Yaling

    2018-03-01

    Pseudopotential lattice Boltzmann methods (LBMs) can simulate a phase transition in high-density-ratio multiphase flow systems. If coupled with thermal LBMs through an equation of state, they can be used to study instantaneous phase transition phenomena with a high temperature gradient, where only one set of formulations in an LBM system can handle liquid, vapor, phase transition, and heat transport. However, at lower temperatures an unrealistic spurious current at the interface introduces instability and limits its application in real flow systems. In this study, we propose new modifications to the LBM system to minimize the spurious current, which enables us to study nucleation dynamics at room temperature. To demonstrate the capabilities of this approach, the thermal ejection process is modeled as one example of a complex flow system. In an inkjet printer, a thermal pulse instantly heats up the liquid in a microfluidic chamber and nucleates a vapor bubble, providing the pressure pulse necessary to eject droplets at high speed. Our modified method can present a more realistic model of the explosive vaporization process since it can also capture a high temperature/density gradient at the nucleation region. Thermal inkjet technology has been successfully applied for printing cells, but cells are susceptible to mechanical damage or death as they squeeze out of the nozzle head. To study cell deformation, a spring network model representing the cells is connected to the LBM through the immersed boundary method. Looking into the strain and stress distribution of a cell membrane in its most deformed state, it is found that a high stretching rate effectively increases the rupture tension. In other words, membrane deformation energy is released through the creation of multiple smaller nanopores rather than big pores. Overall, by concurrently simulating multiphase flow, phase transition, heat transfer, and cell deformation in one unified LB platform, we are able to provide a better insight into the
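
    For orientation, the sketch below computes only the standard Shan-Chen interparticle force on a D2Q9 lattice, the ingredient that pseudopotential LBMs add to the basic scheme. It is not the authors' modified formulation, nor does it include the thermal coupling or the immersed-boundary cell model; the interaction strength and reference density are assumed values.

```python
import numpy as np

# D2Q9 lattice directions and weights.
E = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
W = np.array([4/9, 1/9, 1/9, 1/9, 1/9, 1/36, 1/36, 1/36, 1/36])

def shan_chen_force(rho: np.ndarray, G: float = -5.0, rho0: float = 1.0) -> np.ndarray:
    """Standard Shan-Chen interparticle force F = -G psi(x) sum_i w_i psi(x + e_i) e_i
    on a periodic 2-D grid, with the usual exponential pseudopotential psi."""
    psi = rho0 * (1.0 - np.exp(-rho / rho0))
    force = np.zeros(rho.shape + (2,))
    for (ex, ey), w in zip(E[1:], W[1:]):                      # skip the rest vector
        psi_shift = np.roll(np.roll(psi, -ex, axis=0), -ey, axis=1)   # psi at x + e_i
        force[..., 0] += w * psi_shift * ex
        force[..., 1] += w * psi_shift * ey
    return -G * psi[..., None] * force

rho = 0.5 + 0.1 * np.random.default_rng(2).random((64, 64))    # toy density field
F = shan_chen_force(rho)
```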

  7. Methods and processes of developing the Strengthening the Reporting of Observational Studies in Epidemiology - Veterinary (STROBE-Vet) Statement

    DEFF Research Database (Denmark)

    Sargeant, J. M.; O'Connor, A. M.; Dohoo, I. R.

    2016-01-01

    unique reporting requirements for observational studies in veterinary medicine related to health, production, welfare, and food safety. Design: Consensus meeting of experts. Setting: Mississauga, Canada. Participants: Seventeen experts from North America, Europe, and Australia. Methods: Experts completed...... a pre-meeting survey about whether items in the STROBE statement should be added to or modified to address unique issues related to observational studies in animal species with health, production, welfare, or food-safety outcomes. During the meeting, each STROBE item was discussed to determine whether...... used for other extensions of the STROBE statement. The use of this STROBE statement extension should improve reporting of observational studies in veterinary research by recognizing unique features of observational studies involving food-producing and companion animals, products of animal origin...

  8. Scheduling of Changes in Complex Engineering Design Process via Genetic Algorithm and Elementary Effects Method

    Directory of Open Access Journals (Sweden)

    Yuliang Li

    2014-06-01

    Engineering design changes constantly occur in a complex engineering design process. Designers have to put an appropriate procedure in place to handle these changes in order to realize successful product development in a timely and cost-effective manner. When many change propagation paths are present, selection of the best change evolution paths and distribution of change results to downstream tasks become critical to the progress management of the project. In this paper, based on the available change propagation simulation algorithm, a global sensitivity analysis method known as elementary effects (EE) is employed to rank the importance of each potential propagation path and of the design dependencies involved in the process. Further, an EE-based heuristic design dependency encoding method is applied to the genetic algorithm, which is then adopted to schedule the change updating process. Finally, the optimal results obtained by the complete search and by the heuristic dependency encoding method are compared to illustrate the improvements and effectiveness of the latter.
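
    A minimal version of the elementary-effects (Morris screening) calculation is sketched below. The model function and parameter ranges are placeholders, not the paper's change-propagation model; only the ranking idea (a larger mean absolute effect means a more influential input) carries over.

```python
import numpy as np

def elementary_effects(f, bounds, r: int = 20, seed: int = 0) -> np.ndarray:
    """Mean absolute elementary effect per input (simplified one-at-a-time screening).
    bounds: (k, 2) array of [low, high] per input; r: number of random base points."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    k = lo.size
    delta = 0.1 * (hi - lo)                          # fixed step per input
    ee = np.zeros((r, k))
    for j in range(r):
        x = lo + rng.random(k) * (hi - lo - delta)   # keep x + delta inside the range
        fx = f(x)
        for i in range(k):
            xp = x.copy()
            xp[i] += delta[i]
            ee[j, i] = (f(xp) - fx) / delta[i]
    return np.abs(ee).mean(axis=0)                   # mu*: ranks input importance

# Placeholder model: change-propagation "cost" as a function of three design parameters.
mu_star = elementary_effects(lambda x: x[0] ** 2 + 3 * x[1] + 0.1 * x[2],
                             bounds=[(0, 1), (0, 1), (0, 1)])
print(mu_star)   # larger values -> more influential dependency
```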

  9. Image processing methods for the structural detection and gradation of placental villi.

    Science.gov (United States)

    Swiderska-Chadaj, Zaneta; Markiewicz, Tomasz; Koktysz, Robert; Cierniak, Szczepan

    2017-08-03

    The context-based examination of stained tissue specimens is one of the most important procedures in histopathological practice. The development of image processing methods allows for the automation of this process. We propose a method for the automatic segmentation of placental structures and the assessment of edema present in placental structures from a spontaneous miscarriage. The presented method is based on texture analysis, mathematical morphology, and region growing operations that are applicable to the heterogeneous microscopic images representing histological slides of the placenta. The results presented in this study were obtained using a set of 50 images of single villi originating from 13 histological slides and were compared with the manual evaluation of a pathologist. In the presented experiments, various structures, such as villi, villous mesenchyme, trophoblast, collagen, and vessels, have been recognized. Moreover, the gradation of villous edema into three classes (no villous edema, moderate villous edema, and massive villous edema) has been conducted. Villi were correctly identified in 98.21% of cases, villous mesenchyme in 83.95%, and the villous evaluation was correct in 74% of cases for the edema degree and in 86% for the number of vessels. The presented segmentation method may serve as a support for current manual diagnosis methods and reduce the bias related to the individual, subjective assessment of experts. Copyright © 2017 Elsevier Ltd. All rights reserved.
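
    Of the operations mentioned, region growing is the easiest to show compactly. The sketch below implements only that step, with a simple intensity-tolerance criterion in plain NumPy; the paper's pipeline additionally relies on texture analysis and mathematical morphology.

```python
import numpy as np
from collections import deque

def region_grow(image: np.ndarray, seed: tuple[int, int], tol: float) -> np.ndarray:
    """Grow a region from `seed`, adding 4-connected pixels whose intensity lies
    within `tol` of the running region mean. Returns a boolean mask."""
    mask = np.zeros(image.shape, dtype=bool)
    queue = deque([seed])
    mask[seed] = True
    total, count = float(image[seed]), 1
    while queue:
        y, x = queue.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if 0 <= ny < image.shape[0] and 0 <= nx < image.shape[1] and not mask[ny, nx]:
                if abs(float(image[ny, nx]) - total / count) <= tol:
                    mask[ny, nx] = True
                    total += float(image[ny, nx])
                    count += 1
                    queue.append((ny, nx))
    return mask

# Tiny synthetic example: a bright square on a dark background.
img = np.zeros((50, 50))
img[10:30, 10:30] = 1.0
villus_mask = region_grow(img, seed=(15, 15), tol=0.2)
```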

  10. Survey Efficiency of Ultraviolet and Zinc Oxide Process (UV/ZnO) for Removal of Diazinon Pesticide from Aqueous Solution

    Directory of Open Access Journals (Sweden)

    Mohammad Hadi Dehghani

    2015-03-01

    The presence of persistent organic pollutants and toxics (e.g., pesticides) in ground, surface, and drinking water resources, combined with the inability of conventional treatment methods to remove these pollutants, has led to the development of advanced oxidation processes. Nowadays, nanophotocatalyst processes are considered clean and environmentally friendly treatment methods that can be used extensively for removing contaminants. The objective of the present study was to determine the efficiency of the ultraviolet and zinc oxide (UV/ZnO) process in the removal of the diazinon pesticide from aqueous solutions. For the purposes of this study, samples were adjusted in a batch reactor at five different detention times. The pH levels used were 3, 7, and 9. Irradiation was performed using a 125 W medium-pressure mercury lamp. The diazinon concentrations of the samples were 100 and 500 µg/L and the concentrations of zinc oxide nanoparticles were 50, 100, and 150 mg/L. The highest degradation efficiency was observed at pH 7 (mean = 80.92 ± 30.3), while the lowest was observed at pH 3 (mean = 67.11 ± 24.49). Results showed that the optimal concentration of the nanoparticles (6-12 nm) was 100 mg/L.
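
    Degradation in such photocatalytic systems is often summarized with pseudo-first-order kinetics, ln(C0/C) = k_app·t. The concentration-time data in the sketch below are hypothetical, not measurements from this study:

```python
import numpy as np

# Hypothetical concentration-time data (µg/L) for one UV/ZnO run.
t = np.array([0, 10, 20, 30, 45, 60], dtype=float)        # minutes
c = np.array([100, 74, 55, 41, 27, 18], dtype=float)

# Pseudo-first-order model: ln(C0/C) = k_app * t  ->  k_app is the slope of a linear fit.
y = np.log(c[0] / c)
k_app = np.polyfit(t, y, 1)[0]
half_life = np.log(2) / k_app
print(f"k_app ≈ {k_app:.4f} 1/min, half-life ≈ {half_life:.1f} min")
```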

  11. Statistical methods to assess and control processes and products during nuclear fuel fabrication

    International Nuclear Information System (INIS)

    Weidinger, H.

    1999-01-01

    Very good statistical tools and techniques are available today to assess the quality and reliability of the fabrication process as the original source of good and reliable quality in the fabricated products. Quality control charts of different types play a key role, and the high capability of modern electronic data acquisition technologies provides, at least potentially, high efficiency in the more or less online application of these methods. These techniques focus mainly on the stability and reliability of the fabrication process. In addition, relatively simple statistical tools are available to assess the capability of fabrication processes, assuming they are stable, to fulfill the product specifications. All these techniques can only result in as good a product as the product design is able to describe the product requirements necessary for good performance. Therefore it is essential that product design is strictly and closely performance oriented. However, performance orientation is only successful through an open and effective cooperation with the customer who uses or applies those products. During the last one to two decades in the West, a multi-vendor strategy has been developed by the utilities, sometimes leading to three different fuel vendors for one reactor core. This development resulted in better economic conditions for the user but did not necessarily increase the vendors' openness toward the utilities using their fuel. The responsibility of the utilities increased considerably in ensuring an adequate quality of the fuel they received. As a matter of fact, sometimes the utilities had to pay a high price because of unexpected performance problems. Thus the utilities are now learning that they need to increase their knowledge and experience in the area of nuclear fuel quality management and technology. This process started some time ago in the West; however, it now also reaches the utilities in the eastern countries. (author)
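
    The simplest of the control charts mentioned, an individuals/moving-range (I-MR) chart, can be computed in a few lines. The measurements below are hypothetical pellet diameters, not plant data:

```python
import numpy as np

# Hypothetical pellet-diameter measurements from a fabrication line (mm).
x = np.array([8.19, 8.21, 8.20, 8.22, 8.18, 8.20, 8.23, 8.19, 8.21, 8.20])

mr = np.abs(np.diff(x))                 # moving ranges between consecutive samples
x_bar, mr_bar = x.mean(), mr.mean()
ucl = x_bar + 2.66 * mr_bar             # 2.66 = 3/d2 with d2 = 1.128 for subgroups of 2
lcl = x_bar - 2.66 * mr_bar
out_of_control = np.where((x > ucl) | (x < lcl))[0]
print(f"centre={x_bar:.3f}, UCL={ucl:.3f}, LCL={lcl:.3f}, flagged indices={out_of_control}")
```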

  12. Random Qualitative Validation: A Mixed-Methods Approach to Survey Validation

    Science.gov (United States)

    Van Duzer, Eric

    2012-01-01

    The purpose of this paper is to introduce the process and value of Random Qualitative Validation (RQV) in the development and interpretation of survey data. RQV is a method of gathering clarifying qualitative data that improves the validity of the quantitative analysis. This paper is concerned with validity in relation to the participants'…

  13. Method and apparatus for rapid adjustment of process gas inventory in gaseous diffusion cascades

    International Nuclear Information System (INIS)

    Dyer, R.H.; Fowler, A.H.; Vanstrum, P.R.

    1977-01-01

    The invention relates to an improved method and system for making relatively large and rapid adjustments in the process gas inventory of an electrically powered gaseous diffusion cascade in order to accommodate scheduled changes in the electrical power available for cascade operation. In the preferred form of the invention, the cascade is readied for a decrease in electrical input by simultaneously withdrawing substreams of the cascade B stream into respective process-gas-freezing and storage zones while decreasing the datum-pressure inputs to the positioning systems for the cascade control valves in proportion to the weight of process gas so removed. Consequently, the control valve positions are substantially unchanged by the reduction in inventory, and there is minimal disturbance of the cascade isotopic gradient. The cascade is readied for restoration of the power cut by simultaneously evaporating the solids in the freezing zones to regenerate the process gas substreams and introducing them to the cascade A stream while increasing the aforementioned datum pressure inputs in proportion to the weight of process gas so returned. In the preferred form of the system for accomplishing these operations, heat exchangers are provided for freezing, storing, and evaporating the various substreams. Preferably, the heat exchangers are connected to use existing cascade auxiliary systems as a heat sink. A common control is employed to adjust and coordinate the necessary process gas transfers and datum pressure adjustments.

  14. Detection and localization of leak of pipelines of RBMK reactor. Methods of processing of acoustic noise

    International Nuclear Information System (INIS)

    Tcherkaschov, Y.M.; Strelkov, B.P.; Chimanski, S.B.; Lebedev, V.I.; Belyanin, L.A.

    1997-01-01

    To detect leaks in the inlet and outlet pipelines of the RBMK reactor, a method based on the detection and monitoring of acoustic leak signals was designed. This report reviews the methods for processing and analysing acoustic noise. These methods were included in the software of the leak detection system and are used to solve the following problems: leak detection by the sound pressure level method under powerful background noise and strong signal attenuation; early detection of a small leak by a high-sensitivity correlation method; localization of the sound source under strong signal reflection by a correlation method and a sound pressure method; and evaluation of the leak size from the analysis of the sound level and the location of the sound source. The performance of the considered techniques is illustrated with test results from a fragment of the leak detection system. The tests were carried out at the Leningrad NPP operating at power levels of 460, 700, 890 and 1000 MWe. 16 figs
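
    The correlation method referred to above locates a leak from the arrival-time difference between two acoustic sensors. The sketch below assumes a known sensor spacing and propagation speed and uses synthetic signals; it shows the cross-correlation step only, not the full RBMK system:

```python
import numpy as np

def locate_leak(s1: np.ndarray, s2: np.ndarray, fs: float, L: float, v: float) -> float:
    """Distance of the leak from sensor 1 on a pipe section of length L between two sensors.
    tau = t1 - t2 is estimated from the cross-correlation peak; d1 = (L + v * tau) / 2."""
    corr = np.correlate(s1 - s1.mean(), s2 - s2.mean(), mode="full")
    lag = int(np.argmax(corr)) - (len(s2) - 1)   # positive if the signal reaches sensor 1 later
    tau = lag / fs
    return (L + v * tau) / 2.0

# Synthetic test: the same burst reaches sensor 2 fifty samples after sensor 1,
# so the leak should be located 17 m from sensor 1 on a 40 m section (v = 1200 m/s).
fs, L, v = 10_000.0, 40.0, 1_200.0               # Hz, m, m/s (assumed values)
burst = np.random.default_rng(3).standard_normal(200)
s1 = np.concatenate([np.zeros(100), burst, np.zeros(300)])
s2 = np.concatenate([np.zeros(150), burst, np.zeros(250)])
print(f"estimated distance from sensor 1: {locate_leak(s1, s2, fs, L, v):.2f} m")
```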

  15. Application of risk analysis and quality control methods for improvement of lead molding process

    Directory of Open Access Journals (Sweden)

    H. Gołaś

    2016-10-01

    The aim of the paper is to highlight the significance of applying risk analysis and quality control methods for the improvement of the parameters of a lead molding process. For this reason, a Fault Mode and Effect Analysis (FMEA) was developed in the conceptual stage of the new product TC-G100-NR. However, the final product was faulty (a complete lack of adhesion of the brass insert to the lead) despite the previously defined potential problem and its preventive action. This led to the identification of root causes, corrective actions and changes of production parameters, and it showed how these methods, the level of their organization, and systematic, rigorous study affect the molding process parameters.
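
    FMEA prioritizes failure modes through the risk priority number, RPN = severity × occurrence × detection, each scored on a 1-10 scale. The entries below are hypothetical, not the TC-G100-NR analysis:

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    name: str
    severity: int    # 1-10
    occurrence: int  # 1-10
    detection: int   # 1-10 (10 = hardest to detect)

    @property
    def rpn(self) -> int:
        return self.severity * self.occurrence * self.detection

# Hypothetical failure modes for a molded lead part with a brass insert.
modes = [FailureMode("insert/lead adhesion failure", 8, 4, 6),
         FailureMode("porosity in casting", 6, 5, 4),
         FailureMode("dimensional deviation", 5, 3, 3)]
for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"{m.name:35s} RPN={m.rpn}")
```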

  16. Method and equipment for treating waste water resulting from the technological testing processes of NPP equipment

    International Nuclear Information System (INIS)

    Radulescu, M. C.; Valeca, S.; Iorga, C.

    2016-01-01

    Modern methods and technologies, coupled with advanced equipment for treating residual substances resulting from technological processes, are mandatory measures for all industrial facilities. The correct management of used working agents and of all wastes resulting from the different technological processes (preparation, use, collection, neutralization, discharge) is intended to reduce, and ultimately eliminate, their potential negative impact on the environment. The high-pressure and high-temperature testing stands at INR, intended for the functional testing of nuclear components (fuel bundles, fuelling machines, etc.), fall under these measures since they use oils, chemically treated demineralized water, greases, etc. This paper focuses on the method and equipment used at INR Pitesti for the chemical treatment of demineralized waters, as well as the equipment for collecting, neutralizing and discharging them after use. (authors)

  17. Work in process level definition: a method based on computer simulation and electre tri

    Directory of Open Access Journals (Sweden)

    Isaac Pergher

    2014-09-01

    This paper proposes a method for defining the levels of work in process (WIP) in production environments managed by constant work in process (CONWIP) policies. The proposed method combines the approaches of computer simulation and Electre TRI to support estimation of the adequate level of WIP and is presented in eighteen steps. The paper also presents an application example carried out at a metalworking company. The research method is based on computer simulation, supported by quantitative data analysis. The main contribution of the paper is its provision of a structured way to define inventories according to demand. With this method, the authors hope to contribute to the establishment of better capacity plans in production environments.
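
    As background (not part of the paper's eighteen-step method), Little's law, WIP = throughput × cycle time, is often used to seed candidate WIP levels before refining them by simulation. A minimal sketch with hypothetical figures:

```python
def conwip_wip_estimate(throughput_per_hour: float, cycle_time_hours: float) -> float:
    """Little's law: WIP = throughput x cycle time. A common starting point for
    candidate CONWIP card counts before refining them by simulation."""
    return throughput_per_hour * cycle_time_hours

# Hypothetical metalworking line: 12 parts/hour demand, 2.5 h average cycle time.
print(conwip_wip_estimate(12, 2.5))   # -> 30 jobs of work in process
```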

  18. Stochastic processes, multiscale modeling, and numerical methods for computational cellular biology

    CERN Document Server

    2017-01-01

    This book focuses on the modeling and mathematical analysis of stochastic dynamical systems along with their simulations. The collected chapters will review fundamental and current topics and approaches to dynamical systems in cellular biology. This text aims to develop improved mathematical and computational methods with which to study biological processes. At the scale of a single cell, stochasticity becomes important due to low copy numbers of biological molecules, such as mRNA and proteins, that take part in biochemical reactions driving cellular processes. When trying to describe such biological processes, the traditional deterministic models are often inadequate, precisely because of these low copy numbers. This book presents stochastic models, which are necessary to account for small particle numbers and extrinsic noise sources. The complexity of these models depends upon whether the biochemical reactions are diffusion-limited or reaction-limited. In the former case, one needs to adopt the framework of s...
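
    The low-copy-number regime described here is typically simulated with Gillespie's stochastic simulation algorithm. The sketch below runs it for a single birth-death (production/degradation) species with assumed rate constants; it illustrates the class of methods the book treats and is not an excerpt from it:

```python
import numpy as np

def gillespie_birth_death(k_prod: float, k_deg: float, x0: int, t_end: float, seed: int = 0):
    """Exact SSA for production (rate k_prod) and degradation (rate k_deg * x) of one species."""
    rng = np.random.default_rng(seed)
    t, x = 0.0, x0
    times, counts = [t], [x]
    while t < t_end:
        a1, a2 = k_prod, k_deg * x          # reaction propensities
        a0 = a1 + a2
        if a0 == 0:
            break
        t += rng.exponential(1.0 / a0)      # waiting time to the next reaction
        x += 1 if rng.random() < a1 / a0 else -1
        times.append(t)
        counts.append(x)
    return np.array(times), np.array(counts)

# Assumed rates: 10 transcripts/min produced, 0.5/min degradation, starting from zero.
times, counts = gillespie_birth_death(k_prod=10.0, k_deg=0.5, x0=0, t_end=60.0)
print(counts[-1])   # fluctuates around the steady-state mean k_prod / k_deg = 20
```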

  19. Presentation of a method for consequence modeling and quantitative risk assessment of fire and explosion in process industry (Case study: Hydrogen Production Process)

    Directory of Open Access Journals (Sweden)

    M J Jafari

    2013-05-01

    Conclusion: Since the proposed method is applicable in all phases of process or system design