WorldWideScience

Sample records for program analytical segmentation

  1. What are Segments in Google Analytics

    Science.gov (United States)

    Segments find all sessions that meet a specific condition. You can then apply this segment to any report in Google Analytics (GA). Segments are a way of identifying sessions and users while filters identify specific events, like pageviews.

  2. Creating Web Area Segments with Google Analytics

    Science.gov (United States)

    Segments allow you to quickly access data for a predefined set of Sessions or Users, such as government or education users, or sessions in a particular state. You can then apply this segment to any report within the Google Analytics (GA) interface.

  3. Joint shape segmentation with linear programming

    KAUST Repository

    Huang, Qixing

    2011-01-01

    We present an approach to segmenting shapes in a heterogeneous shape database. Our approach segments the shapes jointly, utilizing features from multiple shapes to improve the segmentation of each. The approach is entirely unsupervised and is based on an integer quadratic programming formulation of the joint segmentation problem. The program optimizes over possible segmentations of individual shapes as well as over possible correspondences between segments from multiple shapes. The integer quadratic program is solved via a linear programming relaxation, using a block coordinate descent procedure that makes the optimization feasible for large databases. We evaluate the presented approach on the Princeton segmentation benchmark and show that joint shape segmentation significantly outperforms single-shape segmentation techniques. © 2011 ACM.
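
    The LP-relaxation idea can be sketched compactly: candidate segments are chosen by 0/1 indicators, every face must be covered exactly once, and the binary constraint is relaxed to the unit interval. The toy cost model and names below are assumptions for illustration, not the authors' code.

        import numpy as np
        from scipy.optimize import linprog

        # Toy joint-segmentation core: columns are candidate segments, rows are faces.
        # Each face must be covered by exactly one chosen segment; the 0/1 choice
        # variables are relaxed to 0 <= x <= 1, as in an LP relaxation.
        cover = np.array([[1, 0, 1, 0],
                          [1, 1, 0, 0],
                          [0, 1, 0, 1],
                          [0, 0, 1, 1]])
        cost = np.array([1.0, 1.2, 0.9, 1.1])  # per-segment cost, e.g. from shape features

        res = linprog(c=cost, A_eq=cover, b_eq=np.ones(4), bounds=(0, 1))
        print(res.x)  # fractional selections; rounding/coordinate descent would follow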

  4. Joint shape segmentation with linear programming

    KAUST Repository

    Huang, Qixing; Koltun, Vladlen; Guibas, Leonidas

    2011-01-01

    …The integer quadratic program is solved via a linear programming relaxation, using a block coordinate descent procedure that makes the optimization feasible for large databases. We evaluate the presented approach on the Princeton segmentation benchmark and show that joint shape segmentation significantly outperforms single-shape segmentation techniques.

  5. Programming system for analytic geometry

    International Nuclear Information System (INIS)

    Raymond, Jacques

    1970-01-01

    After outlining the characteristics of computing centres that are poorly suited to engineering work, notably the time consumed by the various tasks involved in developing software (assembly, compilation, link editing, loading, running), and identifying constraints specific to engineering, the author identifies the characteristics a programming system should have to suit engineering tasks. He discusses existing conversational systems, their programming languages, and their main drawbacks. He then presents a system that aims to facilitate programming and to address problems of analytic geometry and trigonometry.

  6. The MSCA Program: Developing Analytic Unicorns

    Science.gov (United States)

    Houghton, David M.; Schertzer, Clint; Beck, Scott

    2018-01-01

    Marketing analytics students who can communicate effectively with decision makers are in high demand. These "analytic unicorns" are hard to find. The Master of Science in Customer Analytics (MSCA) degree program at Xavier University seeks to fill that need. In this paper, we discuss the process of creating the MSCA program. We outline…

  7. Visual analytics for the exploration and assessment of segmentation errors

    NARCIS (Netherlands)

    Raidou, R.G.; Marcelis, F.J.J.; Breeuwer, M.; Gröller, M.E.; Vilanova Bartroli, A.

    2016-01-01

    Several diagnostic and treatment procedures require the segmentation of anatomical structures from medical images. However, the automatic model-based methods that are often employed may produce inaccurate segmentations. These, if used as input for diagnosis or treatment, can have detrimental effects…

  8. Analytic word recognition without segmentation based on Markov random fields

    NARCIS (Netherlands)

    Coisy, C.; Belaid, A.

    2004-01-01

    In this paper, a method for analytic handwritten word recognition based on causal Markov random fields is described. The word models are HMMs in which each state corresponds to a letter; each letter is modelled by an NSHP-HMM (Markov field). Global models are built dynamically and used for recognition…

  9. Writing analytic element programs in Python.

    Science.gov (United States)

    Bakker, Mark; Kelson, Victor A

    2009-01-01

    The analytic element method is a mesh-free approach for modeling ground water flow at both the local and the regional scale. With the advent of the Python object-oriented programming language, it has become relatively easy to write analytic element programs. In this article, an introduction is given of the basic principles of the analytic element method and of the Python programming language. A simple, yet flexible, object-oriented design is presented for analytic element codes using multiple inheritance. New types of analytic elements may be added without the need for any changes in the existing part of the code. The presented code may be used to model flow to wells (with either a specified discharge or drawdown) and streams (with a specified head). The code may be extended by any hydrogeologist with a healthy appetite for writing computer code to solve more complicated ground water flow problems. Copyright © 2009 The Author(s). Journal Compilation © 2009 National Ground Water Association.
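
    The object-oriented design the abstract describes lends itself to a very small sketch: each element contributes a closed-form potential and the solution is their superposition. This is an illustration under assumed names and a confined-flow simplification, not the authors' published code.

        import numpy as np

        class Element:
            """Base class: every analytic element contributes a discharge potential."""
            def potential(self, x, y):
                raise NotImplementedError

        class Well(Element):
            def __init__(self, xw, yw, Q):
                self.xw, self.yw, self.Q = xw, yw, Q
            def potential(self, x, y):
                # Logarithmic potential of a well with discharge Q
                r = np.hypot(x - self.xw, y - self.yw)
                return self.Q / (2.0 * np.pi) * np.log(r)

        class UniformFlow(Element):
            def __init__(self, qx):
                self.qx = qx
            def potential(self, x, y):
                return -self.qx * x

        def head(elements, x, y, T=100.0):
            # Superposition of all elements, divided by transmissivity T (confined flow)
            return sum(e.potential(x, y) for e in elements) / T

        model = [UniformFlow(qx=2.0), Well(0.0, 0.0, Q=500.0)]
        print(head(model, 50.0, 25.0))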

  10. 5 keys to business analytics program success

    CERN Document Server

    Boyer, John; Green, Brian; Harris, Tracy; Van De Vanter, Kay

    2012-01-01

    With business analytics becoming increasingly strategic to all types of organizations, and with many companies struggling to create a meaningful impact with this emerging technology, this work, based on the combined experience of 10 organizations that display excellence and expertise on the subject, shares best practices, discusses the management aspects and sociology that drive success, and uncovers the five key aspects behind the success of some of the top business analytics programs in the industry. Readers will learn about numerous topics, including how to create and manage a changing…

  11. Analytical program: 1975 Bikini radiological survey

    International Nuclear Information System (INIS)

    Mount, M.E.; Robison, W.L.; Thompson, S.E.; Hamby, K.O.; Prindle, A.L.; Levy, H.B.

    1976-01-01

    The analytical program for samples of soil, vegetation, and animal tissue collected during the June 1975 field survey of Bikini and Eneu islands is described. The phases of this program are discussed in chronological order: initial processing of samples, gamma spectrometry, and wet chemistry. Included are discussions of quality control programs, reproducibility of measurements, and comparisons of gamma spectrometry with wet chemistry determinations of ²⁴¹Am. Wet chemistry results are used to examine differences in Pu:Am ratios and Pu-isotope ratios as a function of the type of sample and the location where samples were collected.

  12. Differential segmentation responses to an alcohol social marketing program.

    Science.gov (United States)

    Dietrich, Timo; Rundle-Thiele, Sharyn; Schuster, Lisa; Drennan, Judy; Russell-Bennett, Rebekah; Leo, Cheryl; Gullo, Matthew J; Connor, Jason P

    2015-10-01

    This study seeks to establish whether meaningful subgroups exist within a 14-16 year old adolescent population and if these segments respond differently to the Game On: Know Alcohol (GOKA) intervention, a school-based alcohol social marketing program. This study is part of a larger cluster randomized controlled evaluation of the GOKA program implemented in 14 schools in 2013/2014. TwoStep cluster analysis was conducted to segment 2,114 high school adolescents (14-16 years old) on the basis of 22 demographic, behavioral, and psychographic variables. Program effects on knowledge, attitudes, behavioral intentions, social norms, alcohol expectancies, and drinking refusal self-efficacy of identified segments were subsequently examined. Three segments were identified: (1) Abstainers, (2) Bingers, and (3) Moderate Drinkers. Program effects varied significantly across segments. The strongest positive change effects post-participation were observed for Bingers, while mixed effects were evident for Moderate Drinkers and Abstainers. These findings provide preliminary empirical evidence supporting the application of social marketing segmentation in alcohol education programs. Development of targeted programs that meet the unique needs of each of the three identified segments will extend the social marketing footprint in alcohol education. Copyright © 2015 Elsevier Ltd. All rights reserved.
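
    The study segmented respondents with SPSS TwoStep clustering; as a rough stand-in, k-means on standardized survey variables conveys the mechanics (synthetic placeholder data, not the study's):

        import numpy as np
        from sklearn.preprocessing import StandardScaler
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(0)
        X = rng.normal(size=(2114, 22))  # placeholder for the 22 survey variables

        labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(
            StandardScaler().fit_transform(X))
        # Program effects (knowledge, attitudes, intentions, ...) would then be
        # compared across the resulting segment labels.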

  13. Pre-analytical and analytical variation of drug determination in segmented hair using ultra-performance liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Nielsen, Marie Katrine Klose; Johansen, Sys Stybe; Linnet, Kristian

    2014-01-01

    Assessment of total uncertainty of analytical methods for the measurements of drugs in human hair has mainly been derived from the analytical variation. However, in hair analysis several other sources of uncertainty will contribute to the total uncertainty. Particularly, in segmental hair analysis pre-analytical variations associated with the sampling and segmentation may be significant factors in the assessment of the total uncertainty budget. The aim of this study was to develop and validate a method for the analysis of 31 common drugs in hair using ultra-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) with focus on the assessment of both the analytical and pre-analytical sampling variations. The validated method was specific, accurate (80-120%), and precise (CV≤20%) across a wide linear concentration range from 0.025 to 25 ng/mg for most compounds. The analytical variation was estimated to be less than 15% for almost all compounds. The method was successfully applied to 25 segmented hair specimens from deceased drug addicts showing a broad pattern of poly-drug use. The pre-analytical sampling variation was estimated from the genuine duplicate measurements of two bundles of hair collected from each subject after subtraction of the analytical component. For the most frequently detected analytes, the pre-analytical variation was estimated to be 26-69%. Thus, the pre-analytical variation was 3- to 7-fold larger than the analytical variation (7-13%) and hence the dominant component in the total variation (29-70%). The present study demonstrated the importance of including the pre-analytical variation in the assessment of the total uncertainty budget and in the setting of the 95%-uncertainty interval (±2CVT). Excluding the pre-analytical sampling variation could significantly affect the interpretation of results from segmental hair analysis. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
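
    The uncertainty combination behind these figures is a root sum of squares, CV_T = sqrt(CV_A^2 + CV_P^2); a worked check with illustrative picks from the reported ranges:

        import math

        cv_analytical = 0.13     # 13%, upper end of the reported analytical CVs
        cv_preanalytical = 0.26  # 26%, lower end of the reported pre-analytical CVs

        cv_total = math.sqrt(cv_analytical**2 + cv_preanalytical**2)
        print(round(cv_total, 2))                       # 0.29, lower end of the 29-70% range
        print(f"95% interval: +/- {2 * cv_total:.0%}")  # the +/- 2*CV_T interval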

  14. New digital demodulator with matched filters and curve segmentation techniques for BFSK demodulation: Analytical description

    Directory of Open Access Journals (Sweden)

    Jorge Torres Gómez

    2015-09-01

    The present article concerns the digital demodulation of Binary Frequency Shift Keying (BFSK). The objective of this research is to obtain a new processing method for demodulating BFSK signals that reduces hardware complexity in comparison with other reported methods. The solution proposed here makes use of matched filter theory and curve segmentation algorithms. This paper describes the integration and configuration of a sampler correlator and curve segmentation blocks to obtain a digital receiver for proper demodulation of the received signal. The proposed solution is shown to strongly reduce hardware complexity. This part presents the analytical description of the proposed solution and covers in detail the elements needed for properly configuring the system. A second part presents the FPGA implementation of the system and the simulation results that validate overall performance.
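
    A minimal sketch of a correlation (matched-filter) BFSK receiver, assuming ideal symbol timing; parameter values and names are illustrative, not those of the article's hardware design:

        import numpy as np

        def bfsk_demodulate(samples, fs, f0, f1, sps):
            """Correlate each symbol window against the two tone hypotheses."""
            t = np.arange(sps) / fs
            ref0 = np.exp(-2j * np.pi * f0 * t)  # matched filter for a '0' tone
            ref1 = np.exp(-2j * np.pi * f1 * t)  # matched filter for a '1' tone
            bits = []
            for k in range(0, len(samples) - sps + 1, sps):
                seg = samples[k:k + sps]
                bits.append(int(abs(seg @ ref1) > abs(seg @ ref0)))
            return bits

        fs, sps = 48_000, 480  # 100 symbols/s, illustrative
        t = np.arange(3 * sps) / fs
        sig = np.cos(2 * np.pi * np.repeat([1200, 2200, 1200], sps) * t)
        print(bfsk_demodulate(sig, fs, 1200, 2200, sps))  # [0, 1, 0]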

  15. A Big Data Analytics Methodology Program in the Health Sector

    Science.gov (United States)

    Lawler, James; Joseph, Anthony; Howell-Barber, H.

    2016-01-01

    The benefits of Big Data Analytics are cited frequently in the literature. However, the difficulties of implementing Big Data Analytics can limit the number of organizational projects. In this study, the authors evaluate business, procedural and technical factors in the implementation of Big Data Analytics, applying a methodology program. Focusing…

  16. Segmented fuel irradiation program: investigation on advanced materials

    International Nuclear Information System (INIS)

    Uchida, H.; Goto, K.; Sabate, R.; Abeta, S.; Baba, T.; Matias, E. de; Alonso, J.

    1999-01-01

    The Segmented Fuel Irradiation Program, started in 1991, is a collaboration between the Japanese organisations Nuclear Power Engineering Corporation (NUPEC), the Kansai Electric Power Co., Inc. (KEPCO), representing other Japanese utilities, and Mitsubishi Heavy Industries, Ltd. (MHI); and the Spanish organisations Empresa Nacional de Electricidad, S.A. (ENDESA), representing A.N. Vandellos 2, and Empresa Nacional Uranio, S.A. (ENUSA); with the collaboration of Westinghouse. The objective of the Program is to make a substantial contribution to the development of advanced cladding and fuel materials for better performance at high burn-up and under operational power transients. For this Program, segmented fuel rods were selected as the most appropriate vehicle to accomplish the aforementioned objective. Thus, a large number of fuel and cladding combinations are provided while minimising the total amount of new material, at the same time facilitating an eventual irradiation extension in a test reactor. The Program consists of three major phases: Phase I, design, licensing, fabrication and characterisation of the assemblies carrying the segmented rods (1991-1994); Phase II, base irradiation of the assemblies at Vandellos 2 NPP, and on-site examination at the end of four cycles (1994-1999); Phase III, ramp testing at the Studsvik facilities and hot cell PIE (1996-2001). The main fuel design features whose effects on fuel behaviour are being analysed are: alloy composition (MDA and ZIRLO vs. Zircaloy-4); tubing texture; pellet grain size. The Program is progressing satisfactorily as planned. The base irradiation was completed in the first quarter of 1999, and so far, tests and inspections already carried out are providing useful information on the behaviour of the new materials. Also, the Program is delivering a well-characterised fuel material, irradiated in a commercial reactor, which can be further used in other fuel behaviour experiments. The paper presents the main…

  17. Analytic central path, sensitivity analysis and parametric linear programming

    NARCIS (Netherlands)

    A.G. Holder; J.F. Sturm; S. Zhang (Shuzhong)

    1998-01-01

    In this paper we consider properties of the central path and the analytic center of the optimal face in the context of parametric linear programming. We first show that if the right-hand side vector of a standard linear program is perturbed, then the analytic center of the optimal face…

  18. Lymph node segmentation by dynamic programming and active contours.

    Science.gov (United States)

    Tan, Yongqiang; Lu, Lin; Bonde, Apurva; Wang, Deling; Qi, Jing; Schwartz, Lawrence H; Zhao, Binsheng

    2018-03-03

    Enlarged lymph nodes are indicators of cancer staging, and the change in their size is a reflection of treatment response. Automatic lymph node segmentation is challenging, as the boundary can be unclear and the surrounding structures complex. This work communicates a new three-dimensional algorithm for the segmentation of enlarged lymph nodes. The algorithm requires a user to draw a region of interest (ROI) enclosing the lymph node. Rays are cast from the center of the ROI, and the intersections of the rays and the boundary of the lymph node form a triangle mesh. The intersection points are determined by dynamic programming. The triangle mesh initializes an active contour which evolves to a low-energy boundary. Three radiologists independently delineated the contours of 54 lesions from 48 patients. The Dice coefficient was used to evaluate the algorithm's performance. The mean Dice coefficient between the computer and the majority-vote results was 83.2%. The mean Dice coefficients between the three radiologists' manual segmentations were 84.6%, 86.2%, and 88.3%. The performance of this segmentation algorithm suggests its potential clinical value for quantifying enlarged lymph nodes. © 2018 American Association of Physicists in Medicine.

  19. ATLAST ULE mirror segment performance analytical predictions based on thermally induced distortions

    Science.gov (United States)

    Eisenhower, Michael J.; Cohen, Lester M.; Feinberg, Lee D.; Matthews, Gary W.; Nissen, Joel A.; Park, Sang C.; Peabody, Hume L.

    2015-09-01

    The Advanced Technology Large-Aperture Space Telescope (ATLAST) is a concept for a 9.2 m aperture space-borne observatory operating across the UV/optical/NIR spectrum. The primary mirror for ATLAST is a segmented architecture with picometer-class wavefront stability. Due to its extraordinarily low coefficient of thermal expansion, a leading candidate for the primary mirror substrate is Corning's ULE® titania-silicate glass. The ATLAST ULE® mirror substrates will be maintained at 'room temperature' during on-orbit flight operations, minimizing the need for compensation of mirror deformation between the manufacturing and operational temperatures. This approach requires active thermal management to maintain operational temperature while on orbit. Furthermore, the active thermal control must be sufficiently stable to prevent time-varying thermally induced distortions in the mirror substrates. This paper describes a conceptual thermal management system for the ATLAST 9.2 m segmented mirror architecture that maintains the wavefront stability to less than 10 picometers RMS per 10 minutes. Thermal and finite element models, analytical techniques, accuracies involved in solving the mirror figure errors, and early findings from the thermal and thermal-distortion analyses are presented.
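
    A back-of-envelope check shows why milli-Kelvin control is the right scale for picometer-class stability; all numbers below are assumptions for illustration, not values from the paper:

        alpha = 30e-9  # ULE CTE bound near room temperature, 1/K (assumed)
        L = 1.2        # characteristic segment dimension, m (assumed)
        dT = 1e-3      # thermal drift over 10 minutes, K (assumed)

        print(f"{alpha * L * dT * 1e12:.0f} pm")  # ~36 pm of dimensional change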

  20. Pre-analytical and analytical variation of drug determination in segmented hair using ultra-performance liquid chromatography-tandem mass spectrometry

    DEFF Research Database (Denmark)

    Nielsen, Marie Katrine Klose; Johansen, Sys Stybe; Linnet, Kristian

    2014-01-01

    …The analytical variation was estimated to be less than 15% for almost all compounds. The method was successfully applied to 25 segmented hair specimens from deceased drug addicts showing a broad pattern of poly-drug use. The pre-analytical sampling variation was estimated from the genuine duplicate measurements of two bundles of hair collected from each subject…

  1. FASP, an analytic resource appraisal program for petroleum play analysis

    Science.gov (United States)

    Crovelli, R.A.; Balay, R.H.

    1986-01-01

    An analytic probabilistic methodology for resource appraisal of undiscovered oil and gas resources in play analysis is presented in a FORTRAN program termed FASP. This play-analysis methodology is a geostochastic system for petroleum resource appraisal in explored as well as frontier areas. An established geologic model considers both the uncertainty of the presence of the assessed hydrocarbon and its amount if present. The program FASP produces resource estimates of crude oil, nonassociated gas, dissolved gas, and gas for a geologic play in terms of probability distributions. The analytic method is based upon conditional probability theory and many laws of expectation and variance. © 1986.
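
    The "laws of expectation and variance" approach can be illustrated for a play resource R = B·V, where B is a Bernoulli presence indicator and V the (independent) amount if present; the moments below are assumed for illustration:

        def play_moments(p, mean_v, var_v):
            """Analytic mean and variance of R = B * V, with B ~ Bernoulli(p)."""
            e_v2 = var_v + mean_v**2      # E[V^2]
            mean_r = p * mean_v           # E[R] = p * E[V]
            var_r = p * e_v2 - mean_r**2  # Var[R] = E[R^2] - E[R]^2
            return mean_r, var_r

        print(play_moments(p=0.3, mean_v=100.0, var_v=40.0**2))  # (30.0, 2580.0)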

  2. Exact analytical modeling of magnetic vector potential in surface inset permanent magnet DC machines considering magnet segmentation

    Science.gov (United States)

    Jabbari, Ali

    2018-01-01

    Surface inset permanent magnet DC machines can be used as an alternative in automation systems due to their high efficiency and robustness. Magnet segmentation is a common technique for mitigating pulsating torque components in permanent magnet machines. An accurate computation of the air-gap magnetic field distribution is necessary in order to calculate machine performance. An exact analytical method for magnetic vector potential calculation in surface inset permanent magnet machines considering magnet segmentation is proposed in this paper. The analytical method is based on the resolution of the Laplace and Poisson equations, as well as Maxwell's equations, in polar coordinates using the sub-domain method. One of the main contributions of the paper is to derive an expression for the magnetic vector potential in the segmented PM region by using hyperbolic functions. The developed method is applied to the performance computation of two prototype surface inset segmented magnet motors under open-circuit and on-load conditions. The results of these models are validated against the finite element method (FEM).

  3. Hanford performance evaluation program for Hanford site analytical services

    International Nuclear Information System (INIS)

    Markel, L.P.

    1995-09-01

    The U.S. Department of Energy (DOE) Order 5700.6C, Quality Assurance, and Title 10 of the Code of Federal Regulations, Part 830.120, Quality Assurance Requirements, state that it is the responsibility of DOE contractors to ensure that "quality is achieved and maintained by those who have been assigned the responsibility for performing the work." The Hanford Analytical Services Quality Assurance Plan (HASQAP) is designed to meet the needs of the Richland Operations Office (RL) for maintaining a consistent level of quality for the analytical chemistry services provided by contractor and commercial analytical laboratory operations. Therefore, services supporting Hanford environmental monitoring, environmental restoration, and waste management analytical services shall meet appropriate quality standards. This performance evaluation program will monitor the quality standards of all analytical laboratories supporting the Hanford Site, including on-site and off-site laboratories. The monitoring and evaluation of laboratory performance can be completed by the use of several tools. This program will discuss the tools that will be utilized for laboratory performance evaluations. Revision 0 will primarily focus on presently available programs using readily available performance evaluation materials provided by DOE, EPA, or commercial sources. Discussion of project-specific PE materials and evaluations will be described in Section 9.0 and Appendix A.

  4. Shielded analytical laboratory activities supporting waste isolation programs

    International Nuclear Information System (INIS)

    McCown, J.J.

    1985-08-01

    The Shielded Analytical Laboratory (SAL) is a six-cell, manipulator-equipped facility which was built in 1962 as an addition to the 325 Radiochemistry Bldg. in the 300 Area at Hanford. The facility provides the capability for handling a wide variety of radioactive materials and performing chemical dissolutions, separations, and analyses on nuclear fuels, components, waste forms, and materials from R and D programs.

  5. Research on analytical model and design formulas of permanent magnetic bearings based on Halbach array with arbitrary segmented magnetized angle

    International Nuclear Information System (INIS)

    Wang, Nianxian; Wang, Dongxiong; Chen, Kuisheng; Wu, Huachun

    2016-01-01

    The bearing capacity of permanent magnetic bearings (PMBs) can be improved efficiently by using Halbach array magnetization. However, an analytical model of Halbach array PMBs with arbitrary segmented magnetization angle has not previously been developed, and the absence of such a model and of design formulas has limited the application of Halbach array PMBs. In this research, Halbach array PMBs with arbitrary segmented magnetization angle are studied. The magnetization model of the bearings is established, and the magnetic field distribution of the permanent magnet array is derived using the scalar magnetic potential model. On this basis, the bearing force and bearing stiffness models of the PMBs are established using the virtual displacement method. The influence of the number of magnet-ring pairs per magnetization cycle and of the structural parameters of the PMBs on the maximal bearing capacity and support stiffness is studied, and reference factors for the PMB design process are given. Finally, the theoretical model and the conclusions are verified by finite element analysis.

  6. Determination Public Acceptance Segmentation for Nuclear Power Program Interest

    International Nuclear Information System (INIS)

    Syirrazie Che Soh; Aini Wahidah Abdul Wahab

    2012-01-01

    This paper focuses on segmentation among interdisciplinary groups of the public. This discussion is a preliminary stage in ensuring that the right strategies are implemented to gain public interest in, and acceptance of, developing a nuclear power plant. The strategies are implemented based on the differing interests of the different groups of the public, and may increase the level of public acceptance of developing a nuclear power plant. (author)

  7. Segmentation of Portuguese customers’ expectations from fitness programs

    Directory of Open Access Journals (Sweden)

    Ricardo Gouveia Rodrigues

    2017-10-01

    Expectations towards fitness exercises are the major factor in customer satisfaction in the service sector in question. The purpose of this study is to present a segmentation framework for fitness customers, based on their individual expectations. The survey was designed and validated to evaluate individual expectations towards exercises. The study included a randomly recruited sample of 723 subjects (53% males; 47% females; 42.1±19.7 years). Factor analysis and cluster analysis with Ward's cluster method with squared Euclidean distance were used to analyse the data obtained. Four components were extracted (performance, enjoyment, beauty, and health), explaining 68.7% of the total variance, and three distinct segments were found: Exercise Lovers (n=312), Disinterested (n=161), and Beauty Seekers (n=250). All the factors identified make a significant contribution to differentiating the clusters, the first and third clusters being most similar. The segmentation framework obtained, based on customer expectations, allows better understanding of customers' profiles, thus helping the fitness industry develop services more suitable for each type of customer. A follow-up study was conducted 5 years later and the results concur with the initial study.

  8. Automatic segmentation of closed-contour features in ophthalmic images using graph theory and dynamic programming

    Science.gov (United States)

    Chiu, Stephanie J.; Toth, Cynthia A.; Bowes Rickman, Catherine; Izatt, Joseph A.; Farsiu, Sina

    2012-01-01

    This paper presents a generalized framework for segmenting closed-contour anatomical and pathological features using graph theory and dynamic programming (GTDP). More specifically, the GTDP method previously developed for quantifying retinal and corneal layer thicknesses is extended to segment objects such as cells and cysts. The presented technique relies on a transform that maps closed-contour features in the Cartesian domain into lines in the quasi-polar domain. The features of interest are then segmented as layers via GTDP. Application of this method to segment closed-contour features in several ophthalmic image types is shown. Quantitative validation experiments for retinal pigmented epithelium cell segmentation in confocal fluorescence microscopy images attest to the accuracy of the presented technique. PMID:22567602
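
    The core trick, unwrapping around a seed so a closed contour becomes a "layer" and then finding a minimum-cost left-to-right path, can be sketched as follows (an illustration of the idea, not the published GTDP implementation):

        import numpy as np

        def dp_min_path(cost):
            """Minimum-cost path across the columns of a (radius x angle) cost image,
            allowing the row to move by at most one pixel per column."""
            n_r, n_t = cost.shape
            acc = cost.copy().astype(float)
            back = np.zeros((n_r, n_t), dtype=int)
            for j in range(1, n_t):
                for i in range(n_r):
                    lo, hi = max(0, i - 1), min(n_r, i + 2)
                    k = lo + int(np.argmin(acc[lo:hi, j - 1]))
                    acc[i, j] = cost[i, j] + acc[k, j - 1]
                    back[i, j] = k
            i = int(np.argmin(acc[:, -1]))
            path = [i]
            for j in range(n_t - 1, 0, -1):
                i = back[i, j]
                path.append(i)
            return path[::-1]  # one radius per angle; mapping back to x, y closes the contour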

  9. Evaluation of analytical results on DOE Quality Assessment Program Samples

    International Nuclear Information System (INIS)

    Jaquish, R.E.; Kinnison, R.R.; Mathur, S.P.; Sastry, R.

    1985-01-01

    Criteria were developed for evaluating the participants' analytical results in the DOE Quality Assessment Program (QAP). Historical data from previous QAP studies were analyzed using descriptive statistical methods to determine the interlaboratory precision that had been attained. Performance criteria used in other similar programs were also reviewed. Using these data, precision values and control limits were recommended for each type of analysis performed in the QA program. Results of the analyses performed by the QAP participants on the November 1983 samples were statistically analyzed and evaluated. The Environmental Measurements Laboratory (EML) values were used as the known values, and 3-sigma precision values were used as control limits. Results were submitted by 26 participating laboratories for 49 different radionuclide-media combinations. The participants reported 419 results and of these, 350 or 84% were within control limits. Special attention was given to the data from gamma spectral analysis of air filters and water samples. Both normal probability and box plots were prepared for each nuclide to help evaluate the distribution of the data. Results that were outside the expected range were identified, and suggestions were made that laboratories check calculations and procedures for these results.
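
    The acceptance test described here is simple to express; a sketch with the EML result as the known value (toy numbers, not the study's data):

        def within_control(reported, known, cv):
            """3-sigma acceptance window around the known (EML) value; cv = sigma/known."""
            return abs(reported - known) <= 3 * cv * known

        results = [(9.6, 10.0, 0.05), (12.1, 10.0, 0.05), (10.9, 10.0, 0.05)]
        n_ok = sum(within_control(r, k, cv) for r, k, cv in results)
        print(f"{n_ok}/{len(results)} within control limits")  # 2/3 in this toy set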

  10. Fully automated segmentation of left ventricle using dual dynamic programming in cardiac cine MR images

    Science.gov (United States)

    Jiang, Luan; Ling, Shan; Li, Qiang

    2016-03-01

    Cardiovascular diseases are becoming a leading cause of death all over the world. The cardiac function could be evaluated by global and regional parameters of left ventricle (LV) of the heart. The purpose of this study is to develop and evaluate a fully automated scheme for segmentation of LV in short axis cardiac cine MR images. Our fully automated method consists of three major steps, i.e., LV localization, LV segmentation at end-diastolic phase, and LV segmentation propagation to the other phases. First, the maximum intensity projection image along the time phases of the midventricular slice, located at the center of the image, was calculated to locate the region of interest of LV. Based on the mean intensity of the roughly segmented blood pool in the midventricular slice at each phase, end-diastolic (ED) and end-systolic (ES) phases were determined. Second, the endocardial and epicardial boundaries of LV of each slice at ED phase were synchronously delineated by use of a dual dynamic programming technique. The external costs of the endocardial and epicardial boundaries were defined with the gradient values obtained from the original and enhanced images, respectively. Finally, with the advantages of the continuity of the boundaries of LV across adjacent phases, we propagated the LV segmentation from the ED phase to the other phases by use of dual dynamic programming technique. The preliminary results on 9 clinical cardiac cine MR cases show that the proposed method can obtain accurate segmentation of LV based on subjective evaluation.

  11. BeamOptics. A program for analytical beam optics

    International Nuclear Information System (INIS)

    Autin, B.; Carli, C.; D'Amico, T.; Groebner, O.; Martini, M.; Wildner, E.

    1998-01-01

    Analytical beam optics deals with the basic properties of the magnetic modules which compose particle accelerators in the same way as light optics was developed for telescopes, microscopes, or other instruments. The difference between photon and charged-particle optics lies in the nature of the field which acts upon the particle. The magnets of accelerators do not have the rotational symmetry of glass lenses and the computational problems are much more difficult. For this reason, the symbolic program BeamOptics has been written to assist the user in finding the parameters of systems whose complexity is better treated by computer than by hand. Symbolic results may be hard to interpret. Thin-lens models have been adopted because their description is algebraic and emphasis has been put on the existence of solutions, the number of solutions, and simple yet unknown special schemes. The program can also be applied to real machines with long elements. In that case, it works with numerical data but the results are accessible through continuous functions which provide the machine parameters at arbitrary positions along the reference orbit. The code is organized to be implemented in accelerator controls and has functions to correct all the first-order perturbations using a universal procedure. (orig.)
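
    A taste of symbolic thin-lens optics in the same spirit, here with sympy rather than the BeamOptics package itself (the doublet example is an assumption for illustration):

        import sympy as sp

        f1, f2, d = sp.symbols('f1 f2 d', positive=True)

        def thin_lens(f):
            return sp.Matrix([[1, 0], [-1 / f, 1]])

        def drift(L):
            return sp.Matrix([[1, L], [0, 1]])

        # Two thin lenses separated by a drift, as a transfer matrix product
        M = thin_lens(f2) * drift(d) * thin_lens(f1)
        f_eff = sp.simplify(-1 / M[1, 0])
        print(f_eff)  # f1*f2/(f1 + f2 - d), the classic two-lens formula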

  12. Visual programming for next-generation sequencing data analytics.

    Science.gov (United States)

    Milicchio, Franco; Rose, Rebecca; Bian, Jiang; Min, Jae; Prosperi, Mattia

    2016-01-01

    High-throughput or next-generation sequencing (NGS) technologies have become an established and affordable experimental framework in biological and medical sciences for all basic and translational research. Processing and analyzing NGS data is challenging. NGS data are big, heterogeneous, sparse, and error prone. Although a plethora of tools for NGS data analysis has emerged in the past decade, (i) software development is still lagging behind data generation capabilities, and (ii) there is a 'cultural' gap between the end user and the developer. Generic software template libraries specifically developed for NGS can help in dealing with the former problem, whilst coupling template libraries with visual programming may help with the latter. Here we scrutinize the state-of-the-art low-level software libraries implemented specifically for NGS and graphical tools for NGS analytics. An ideal developing environment for NGS should be modular (with a native library interface), scalable in computational methods (i.e. serial, multithread, distributed), transparent (platform-independent), interoperable (with external software interface), and usable (via an intuitive graphical user interface). These characteristics should facilitate both the run of standardized NGS pipelines and the development of new workflows based on technological advancements or users' needs. We discuss in detail the potential of a computational framework blending generic template programming and visual programming that addresses all of the current limitations. In the long term, a proper, well-developed (although not necessarily unique) software framework will bridge the current gap between data generation and hypothesis testing. This will eventually facilitate the development of novel diagnostic tools embedded in routine healthcare.

  13. Coordinated experimental/analytical program for investigating margins to failure of Category I reinforced concrete structures

    International Nuclear Information System (INIS)

    Endebrock, E.; Dove, R.; Anderson, C.A.

    1981-01-01

    The material presented in this paper deals with a coordinated experimental/analytical program designed to provide information needed for making margins-to-failure assessments of seismic Category I reinforced concrete structures. The experimental program is emphasized, and background information that led to this particular experimental approach is presented. Analytical tools being developed to supplement the experimental program are discussed. 16 figures

  14. Cavity contour segmentation in chest radiographs using supervised learning and dynamic programming

    Energy Technology Data Exchange (ETDEWEB)

    Maduskar, Pragnya, E-mail: pragnya.maduskar@radboudumc.nl; Hogeweg, Laurens; Sánchez, Clara I.; Ginneken, Bram van [Diagnostic Image Analysis Group, Radboud University Medical Center, Nijmegen, 6525 GA (Netherlands); Jong, Pim A. de [Department of Radiology, University Medical Center Utrecht, 3584 CX (Netherlands); Peters-Bax, Liesbeth [Department of Radiology, Radboud University Medical Center, Nijmegen, 6525 GA (Netherlands); Dawson, Rodney [University of Cape Town Lung Institute, Cape Town 7700 (South Africa); Ayles, Helen [Department of Infectious and Tropical Diseases, London School of Hygiene and Tropical Medicine, London WC1E 7HT (United Kingdom)

    2014-07-15

    Purpose: Efficacy of tuberculosis (TB) treatment is often monitored using chest radiography. Monitoring the size of cavities in pulmonary tuberculosis is important, as the size predicts severity of the disease and its persistence under therapy predicts relapse. The authors present a method for automatic cavity segmentation in chest radiographs. Methods: A two-stage method is proposed to segment the cavity borders, given a user-defined seed point close to the center of the cavity. First, a supervised learning approach is employed to train a pixel classifier using texture and radial features to identify the border pixels of the cavity. A likelihood value of belonging to the cavity border is assigned to each pixel by the classifier. The authors experimented with four different classifiers: k-nearest neighbor (kNN), linear discriminant analysis (LDA), GentleBoost (GB), and random forest (RF). Next, the constructed likelihood map was used as an input cost image in the polar-transformed image space for dynamic programming to trace the optimal maximum-cost path. This constructed path corresponds to the segmented cavity contour in image space. Results: The method was evaluated on 100 chest radiographs (CXRs) containing 126 cavities. The reference segmentation was manually delineated by an experienced chest radiologist. An independent observer (a chest radiologist) also delineated all cavities to estimate interobserver variability. The Jaccard overlap measure Ω was computed between the reference segmentation and the automatic segmentation, and between the reference segmentation and the independent observer's segmentation, for all cavities. A median overlap Ω of 0.81 (0.76 ± 0.16) and 0.85 (0.82 ± 0.11) was achieved between the reference segmentation and the automatic segmentation, and between the segmentations by the two radiologists, respectively. The best reported mean contour distance and Hausdorff distance between the reference and the automatic segmentation were…

  15. Cavity contour segmentation in chest radiographs using supervised learning and dynamic programming

    International Nuclear Information System (INIS)

    Maduskar, Pragnya; Hogeweg, Laurens; Sánchez, Clara I.; Ginneken, Bram van; Jong, Pim A. de; Peters-Bax, Liesbeth; Dawson, Rodney; Ayles, Helen

    2014-01-01

    Purpose: Efficacy of tuberculosis (TB) treatment is often monitored using chest radiography. Monitoring the size of cavities in pulmonary tuberculosis is important, as the size predicts severity of the disease and its persistence under therapy predicts relapse. The authors present a method for automatic cavity segmentation in chest radiographs. Methods: A two-stage method is proposed to segment the cavity borders, given a user-defined seed point close to the center of the cavity. First, a supervised learning approach is employed to train a pixel classifier using texture and radial features to identify the border pixels of the cavity. A likelihood value of belonging to the cavity border is assigned to each pixel by the classifier. The authors experimented with four different classifiers: k-nearest neighbor (kNN), linear discriminant analysis (LDA), GentleBoost (GB), and random forest (RF). Next, the constructed likelihood map was used as an input cost image in the polar-transformed image space for dynamic programming to trace the optimal maximum-cost path. This constructed path corresponds to the segmented cavity contour in image space. Results: The method was evaluated on 100 chest radiographs (CXRs) containing 126 cavities. The reference segmentation was manually delineated by an experienced chest radiologist. An independent observer (a chest radiologist) also delineated all cavities to estimate interobserver variability. The Jaccard overlap measure Ω was computed between the reference segmentation and the automatic segmentation, and between the reference segmentation and the independent observer's segmentation, for all cavities. A median overlap Ω of 0.81 (0.76 ± 0.16) and 0.85 (0.82 ± 0.11) was achieved between the reference segmentation and the automatic segmentation, and between the segmentations by the two radiologists, respectively. The best reported mean contour distance and Hausdorff distance between the reference and the automatic segmentation were…

  16. Defining Audience Segments for Extension Programming Using Reported Water Conservation Practices

    Science.gov (United States)

    Monaghan, Paul; Ott, Emily; Wilber, Wendy; Gouldthorpe, Jessica; Racevskis, Laila

    2013-01-01

    A tool from social marketing can help Extension agents understand distinct audience segments among their constituents. Defining targeted audiences for Extension programming is a first step to influencing behavior change among the public. An online survey was conducted using an Extension email list for urban households receiving a monthly lawn and…

  17. Improved dynamic-programming-based algorithms for segmentation of masses in mammograms

    International Nuclear Information System (INIS)

    Dominguez, Alfonso Rojas; Nandi, Asoke K.

    2007-01-01

    In this paper, two new boundary-tracing algorithms for segmentation of breast masses are presented. These new algorithms are based on the dynamic-programming-based boundary tracing (DPBT) algorithm proposed by Timp and Karssemeijer [Med. Phys. 31, 958-971 (2004)]. The DPBT algorithm contains two main steps: (1) construction of a local cost function, and (2) application of dynamic programming to the selection of the optimal boundary based on the local cost function. The validity of some assumptions used in the design of the DPBT algorithm is tested in this paper using a set of 349 mammographic images. Based on the results of the tests, modifications to the computation of the local cost function have been designed and have resulted in the Improved-DPBT (IDPBT) algorithm. A procedure for the dynamic selection of the strength of the components of the local cost function is presented that makes these parameters independent of the image dataset. Incorporation of this dynamic selection procedure has produced another new algorithm which we have called ID2PBT. Methods for the determination of some other parameters of the DPBT algorithm that were not covered in the original paper are presented as well. The merits of the new IDPBT and ID2PBT algorithms are demonstrated experimentally by comparison against the DPBT algorithm. The segmentation results are evaluated based on the area overlap measure and other segmentation metrics. Both of the new algorithms outperform the original DPBT; the improvements in performance are more noticeable around the values of the segmentation metrics corresponding to the highest segmentation accuracy, i.e., the new algorithms produce more optimally segmented regions, rather than a pronounced increase in the average quality of all the segmented regions.

  18. One size (never) fits all: segment differences observed following a school-based alcohol social marketing program.

    Science.gov (United States)

    Dietrich, Timo; Rundle-Thiele, Sharyn; Leo, Cheryl; Connor, Jason

    2015-04-01

    According to commercial marketing theory, a market orientation leads to improved performance. Drawing on the social marketing principles of segmentation and audience research, the current study seeks to identify segments and examine their responses to a school-based alcohol social marketing program. A sample of 371 year 10 students (aged 14-16 years; 51.4% boys) participated in a prospective (pre-post) multisite alcohol social marketing program. The Game On: Know Alcohol (GO:KA) program included six student-centered, interactive lessons to teach adolescents about alcohol and strategies to abstain from or moderate drinking. A repeated-measures design was used. Baseline demographics, drinking attitudes, drinking intentions, and alcohol knowledge were cluster analyzed to identify segments. Change on key program outcome measures and satisfaction with program components were assessed by segment. Three segments were identified: (1) Skeptics, (2) Risky Males, and (3) Good Females. Segments 2 and 3 showed the greatest change in drinking attitudes and intentions. Good Females reported the highest satisfaction with all program components and Skeptics the lowest. Three segments, each differing on psychographic and demographic variables, exhibited different change patterns following participation in GO:KA. Post hoc analysis identified that satisfaction with program components differed by segment, offering opportunities for further research. © 2015, American School Health Association.

  19. Dynamic programming in parallel boundary detection with application to ultrasound intima-media segmentation.

    Science.gov (United States)

    Zhou, Yuan; Cheng, Xinyao; Xu, Xiangyang; Song, Enmin

    2013-12-01

    Segmentation of the carotid artery intima-media in longitudinal ultrasound images, for measuring its thickness to predict cardiovascular diseases, can be simplified as detecting two nearly parallel boundaries within a certain distance range, when plaque with irregular shapes is not considered. In this paper, we improve the implementation of two dynamic programming (DP) based approaches to parallel boundary detection, dual dynamic programming (DDP) and piecewise linear dual dynamic programming (PL-DDP). Then, a novel DP-based approach, dual line detection (DLD), which translates the original 2-D curve position to a 4-D parameter space representing two line segments in a local image segment, is proposed to solve the problem while maintaining efficiency and rotation invariance. To apply the DLD to ultrasound intima-media segmentation, it is embedded in a framework that employs an edge map obtained from multiplication of the responses of two edge detectors with different scales, and a coupled snake model that simultaneously deforms the two contours to maintain parallelism. The experimental results on synthetic images and carotid arteries of clinical ultrasound images indicate improved performance of the proposed DLD compared to DDP and PL-DDP, with respect to accuracy and efficiency. Copyright © 2013 Elsevier B.V. All rights reserved.

  20. Learning Analytics: Potential for Enhancing School Library Programs

    Science.gov (United States)

    Boulden, Danielle Cadieux

    2015-01-01

    Learning analytics has been defined as the measurement, collection, analysis, and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs. The potential use of data and learning analytics in educational contexts has caught the attention of educators and…

  1. Endocardium and Epicardium Segmentation in MR Images Based on Developed Otsu and Dynamic Programming

    Directory of Open Access Journals (Sweden)

    Shengzhou XU

    2014-03-01

    In order to accurately extract the endocardium and epicardium of the left ventricle from cardiac magnetic resonance (MR) images, a method based on a developed Otsu algorithm and dynamic programming is proposed. First, regions with high gray values are divided into several left ventricle candidate regions by the developed Otsu algorithm, which is based on constraining the search range of the ideal segmentation threshold. Then, the left ventricular blood pool is selected from the candidate regions and its convex hull is taken as the endocardium. The epicardium is derived by applying a dynamic programming method to find a closed path with minimum local cost. The local cost function of the dynamic programming method consists of two factors: boundary gradient and shape features. In order to improve the accuracy of segmentation, a non-maxima gradient suppression technique is adopted to obtain the boundary gradient. Experimental results on 138 MR images show that the proposed method has high accuracy and robustness.
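
    A condensed sketch of the endocardium step (Otsu threshold, blood-pool selection, convex hull); the center-distance selection rule is a simplification assumed here, not the paper's exact criterion:

        import numpy as np
        from skimage.filters import threshold_otsu
        from skimage.measure import label, regionprops
        from skimage.morphology import convex_hull_image

        def endocardium(img):
            bw = img > threshold_otsu(img)  # high-gray candidate regions
            lab = label(bw)
            center = np.array(img.shape) / 2.0
            # Simplification: take the candidate region closest to the image center
            pool = min(regionprops(lab),
                       key=lambda r: np.linalg.norm(np.array(r.centroid) - center))
            return convex_hull_image(lab == pool.label)  # hull approximates the endocardium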

  2. ORBITALES. A program for the calculation of wave functions with an analytical central potential

    International Nuclear Information System (INIS)

    Yunta Carretero; Rodriguez Mayquez, E.

    1974-01-01

    This paper describes the objective, basis, FORTRAN implementation, and use of the program ORBITALES. The program calculates atomic wave functions in the case of an analytical central potential. (Author) 8 refs

  3. One Size (Never) Fits All: Segment Differences Observed Following a School-Based Alcohol Social Marketing Program

    Science.gov (United States)

    Dietrich, Timo; Rundle-Thiele, Sharyn; Leo, Cheryl; Connor, Jason

    2015-01-01

    Background: According to commercial marketing theory, a market orientation leads to improved performance. Drawing on the social marketing principles of segmentation and audience research, the current study seeks to identify segments to examine responses to a school-based alcohol social marketing program. Methods: A sample of 371 year 10 students…

  4. ZFITTER - an analytical program for fermion-pair production

    International Nuclear Information System (INIS)

    Riemann, T.

    1992-10-01

    I discuss the semi-analytical codes which have been developed for the Z line-shape analysis at LEP I. They are applied for a model-independent and, when using a weak library, a Standard Model interpretation of the data. Some of them are applicable for New Physics searches. The package ZFITTER serves as an example, and comparisons of the codes are discussed. The degrees of freedom of the line shape and of asymmetries are made explicit. (orig.)

  5. System and Method for Providing a Climate Data Analytic Services Application Programming Interface Distribution Package

    Science.gov (United States)

    Schnase, John L. (Inventor); Duffy, Daniel Q. (Inventor); Tamkin, Glenn S. (Inventor)

    2016-01-01

    A system, method and computer-readable storage devices for providing a climate data analytic services application programming interface distribution package. The example system can provide various components. The system provides a climate data analytic services application programming interface library that enables software applications running on a client device to invoke the capabilities of a climate data analytic service. The system provides a command-line interface that provides a means of interacting with a climate data analytic service by issuing commands directly to the system's server interface. The system provides sample programs that call on the capabilities of the application programming interface library and can be used as templates for the construction of new client applications. The system can also provide test utilities, build utilities, service integration utilities, and documentation.
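
    The flavor of such a distribution package can be conveyed by a hypothetical client sketch; the endpoint, routes, and field names below are invented for illustration and do not reproduce the actual interface:

        import requests

        BASE_URL = "https://climate.example.gov/api"  # hypothetical service endpoint

        def submit_analysis(collection, variable, operation, start, end):
            """Ask the server to run an analytic operation (e.g., a temporal average)."""
            resp = requests.post(f"{BASE_URL}/order", json={
                "collection": collection, "variable": variable,
                "operation": operation, "start": start, "end": end,
            })
            resp.raise_for_status()
            return resp.json()["order_id"]

        def fetch_result(order_id, path):
            resp = requests.get(f"{BASE_URL}/download/{order_id}", stream=True)
            resp.raise_for_status()
            with open(path, "wb") as f:
                for chunk in resp.iter_content(chunk_size=1 << 16):
                    f.write(chunk)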

  6. Web Analytics

    Science.gov (United States)

    EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.

  7. The accelerated site technology deployment program presents the segmented gate system

    International Nuclear Information System (INIS)

    Patteson, Raymond; Maynor, Doug; Callan, Connie

    2000-01-01

    The Department of Energy (DOE) is working to accelerate the acceptance and application of innovative technologies that improve the way the nation manages its environmental remediation problems. The DOE Office of Science and Technology established the Accelerated Site Technology Deployment (ASTD) Program to help accelerate the acceptance and implementation of new and innovative soil and ground water remediation technologies. Coordinated by the Department of Energy's Idaho Office, the ASTD Program reduces many of the classic barriers to the deployment of new technologies by involving government, industry, and regulatory agencies in the assessment, implementation, and validation of innovative technologies. The paper uses the example of the Segmented Gate System (SGS) to illustrate how the ASTD Program works. The SGS was used to cost-effectively separate clean and contaminated soil for four different radionuclides: plutonium, uranium, thorium, and cesium. Based on those results, it has been proposed to use the SGS at seven other DOE sites across the country.

  8. A new 2D segmentation method based on dynamic programming applied to computer aided detection in mammography

    International Nuclear Information System (INIS)

    Timp, Sheila; Karssemeijer, Nico

    2004-01-01

    Mass segmentation plays a crucial role in computer-aided diagnosis (CAD) systems for classification of suspicious regions as normal, benign, or malignant. In this article we present a robust and automated segmentation technique, based on dynamic programming, to segment mass lesions from surrounding tissue. In addition, we propose an efficient algorithm to guarantee that resulting contours are closed. The segmentation method based on dynamic programming was quantitatively compared with two other automated segmentation methods (region growing and the discrete contour model) on a dataset of 1210 masses. For each mass an overlap criterion was calculated to determine the similarity with manual segmentation. The mean overlap percentage for dynamic programming was 0.69; for the other two methods it was 0.60 and 0.59, respectively. The difference in overlap percentage was statistically significant. To study the influence of the segmentation method on the performance of a CAD system, two additional experiments were carried out. The first experiment studied the detection performance of the CAD system for the different segmentation methods. Free-response receiver operating characteristic analysis showed that the detection performance was nearly identical for the three segmentation methods. In the second experiment the ability of the classifier to discriminate between malignant and benign lesions was studied. For region-based evaluation, the area Az under the receiver operating characteristic curve was 0.74 for dynamic programming, 0.72 for the discrete contour model, and 0.67 for region growing. The difference in Az values obtained by the dynamic programming method and region growing was statistically significant. The differences between the other methods were not significant.

  9. The use of mixed-integer programming for inverse treatment planning with pre-defined field segments

    International Nuclear Information System (INIS)

    Bednarz, Greg; Michalski, Darek; Houser, Chris; Huq, M. Saiful; Xiao Ying; Rani, Pramila Anne; Galvin, James M.

    2002-01-01

    Complex intensity patterns generated by traditional beamlet-based inverse treatment plans are often very difficult to deliver. In the approach presented in this work the intensity maps are controlled by pre-defining field segments to be used for dose optimization. A set of simple rules was used to define a pool of allowable delivery segments, and the mixed-integer programming (MIP) method was used to optimize segment weights. The optimization problem was formulated by combining real variables describing segment weights with a set of binary variables used to enumerate voxels in targets and critical structures. The MIP method was compared to the previously used Cimmino projection algorithm. The field segmentation approach was compared to an inverse planning system with traditional beamlet-based beam intensity optimization. In four complex cases of oropharyngeal cancer, the segmental inverse planning produced treatment plans that competed with traditional beamlet-based IMRT plans. The mixed-integer programming provided a mechanism for the imposition of dose-volume constraints and allowed identification of the optimal solution for feasible problems. Additional advantages of the segmental technique presented here are simplified dosimetry, quality assurance, and treatment delivery. (author)
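
    The formulation can be sketched with continuous segment weights plus binary per-voxel indicators enforcing a dose-volume constraint, here via scipy's MILP interface (scipy >= 1.9); dose matrices and limits are toy assumptions, not the paper's data:

        import numpy as np
        from scipy.optimize import milp, LinearConstraint, Bounds

        def optimize_segment_weights(D_tgt, D_oar, d_presc, d_max, frac, M=1e3):
            """D_tgt/D_oar: dose per unit weight from each segment to target/OAR voxels."""
            n_seg, n_oar = D_tgt.shape[1], D_oar.shape[0]
            c = np.concatenate([D_oar.sum(axis=0), np.zeros(n_oar)])  # minimize OAR dose
            cons = [
                # Target coverage: D_tgt @ w >= d_presc
                LinearConstraint(np.hstack([D_tgt, np.zeros((D_tgt.shape[0], n_oar))]),
                                 lb=d_presc, ub=np.inf),
                # OAR voxel i may exceed d_max only if its binary flag z_i = 1 (big-M)
                LinearConstraint(np.hstack([D_oar, -M * np.eye(n_oar)]),
                                 lb=-np.inf, ub=d_max),
                # Dose-volume constraint: at most a fraction frac of OAR voxels flagged
                LinearConstraint(np.concatenate([np.zeros(n_seg), np.ones(n_oar)])[None, :],
                                 lb=0, ub=frac * n_oar),
            ]
            integrality = np.concatenate([np.zeros(n_seg), np.ones(n_oar)])
            bounds = Bounds(0, np.concatenate([np.full(n_seg, np.inf), np.ones(n_oar)]))
            res = milp(c=c, constraints=cons, integrality=integrality, bounds=bounds)
            return res.x[:n_seg] if res.success else None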

  10. Actinide analytical program for characterization of Hanford waste

    International Nuclear Information System (INIS)

    Johnson, S.J.; Winters, W.I.

    1977-01-01

    The objective of this program has been to develop faster, more accurate methods for the concentration and determination of actinides at their maximum permissible concentration (MPC) levels in a controlled zone. These analyses are needed to characterize various forms of Hanford high-rad waste and to support characterization of products and effluents from new waste management processes. The most acceptable methods developed for the determination of 239Pu, 238Pu, 237Np, 241Am, and 243Cm employ solvent extraction with the addition of tracer isotopes. Plutonium and neptunium are extracted from acidified waste solutions into Aliquat-336. Americium and curium are then extracted from the waste solution at the same acidity into dihexyl-N,N-diethylcarbamylmethylenephosphonate (DHDECMP). After back extraction into an aqueous matrix, these actinides are electrodeposited on steel disks for alpha energy analysis. Total uranium and total thorium are also isolated by solvent extraction and determined spectrophotometrically.

  11. 3D analytical field calculation using triangular magnet segments applied to a skewed linear permanent magnet actuator

    NARCIS (Netherlands)

    Janssen, J.L.G.; Paulides, J.J.H.; Lomonova, E.

    2010-01-01

    This paper presents novel analytical expressions which describe the 3D magnetic field of arbitrarily magnetized triangular-shaped charged surfaces. These versatile expressions are suitable for modeling triangular-shaped permanent magnets and can be expanded to any polyhedral shape. Many applications are

  12. 3D Analytical field calculation using triangular magnet segments applied to a skewed linear permanent magnet actuator

    NARCIS (Netherlands)

    Janssen, J.L.G.; Paulides, J.J.H.; Lomonova, E.

    2009-01-01

    This paper presents novel analytical expressions which describe the 3D magnetic field of arbitrarily magnetized triangular-shaped charged surfaces. These versatile expressions are suitable for modeling triangular-shaped permanent magnets and can be expanded to any polyhedral shape. Many applications are

  13. LASL analytical chemistry program for fissionable materials safeguards

    International Nuclear Information System (INIS)

    Jackson, D.D.; Marsh, S.F.

    1979-01-01

    Gas-solid reactions at elevated temperature, used previously to convert uranium in refractory forms to species readily soluble in acid, are being applied to thorium materials. A microgram-sensitive spectrophotometric method was developed for determining uranium, and the LASL Automated Spectrophotometer has been modified to use it. The instrument now is functional for determining milligram amounts of plutonium, and milligram and microgram amounts of uranium. Construction of an automated controlled-potential-coulometric analyzer has been completed. It is giving design performance of 0.1% relative standard deviation for the determination of plutonium, using a method developed especially for the instrument. A method has been developed for the microcomplexometric titration of uranium in its stable (VI) oxidation state. A color probe analyzer assembled for this titration has also been used for microcomplexometric titration of thorium. The present status of reference materials prepared for NBS and for the SALE program, as well as examples of working reference materials prepared for use with nondestructive analyzers, is given. The interlaboratory measured value of the 239Pu half-life is 24,119 y. A just-completed measurement gives the 241Pu half-life as 14.38 y. Measurement of the 240Pu half-life is in progress.

  14. The Solid* toolset for software visual analytics of program structure and metrics comprehension: From research prototype to product

    NARCIS (Netherlands)

    Reniers, Dennie; Voinea, Lucian; Ersoy, Ozan; Telea, Alexandru

    2014-01-01

    Software visual analytics (SVA) tools combine static program analysis and fact extraction with information visualization to support program comprehension. However, building efficient and effective SVA tools is highly challenging, as it involves extensive software development in program analysis,

  15. Standard guide for establishing a quality assurance program for analytical chemistry laboratories within the nuclear industry

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2006-01-01

    1.1 This guide covers the establishment of a quality assurance (QA) program for analytical chemistry laboratories within the nuclear industry. Reference to key elements of ANSI/ISO/ASQC Q9001, Quality Systems, provides guidance to the functional aspects of analytical laboratory operation. When implemented as recommended, the practices presented in this guide will provide a comprehensive QA program for the laboratory. The practices are grouped by functions, which constitute the basic elements of a laboratory QA program. 1.2 The essential, basic elements of a laboratory QA program appear in the following order: Organization (Section 5); Quality Assurance Program (Section 6); Training and Qualification (Section 7); Procedures (Section 8); Laboratory Records (Section 9); Control of Records (Section 10); Control of Procurement (Section 11); Control of Measuring Equipment and Materials (Section 12); Control of Measurements (Section 13); Deficiencies and Corrective Actions (Section 14).

  16. Quality assurance of analytical, scientific, and design computer programs for nuclear power plants

    International Nuclear Information System (INIS)

    1994-06-01

    This Standard applies to the design and development, modification, documentation, execution, and configuration management of computer programs used to perform analytical, scientific, and design computations during the design and analysis of safety-related nuclear power plant equipment, systems, structures, and components as identified by the owner. 2 figs

  17. Quality assurance of analytical, scientific, and design computer programs for nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-06-01

    This Standard applies to the design and development, modification, documentation, execution, and configuration management of computer programs used to perform analytical, scientific, and design computations during the design and analysis of safety-related nuclear power plant equipment, systems, structures, and components as identified by the owner. 2 figs.

  18. Online Programs as Tools to Improve Parenting: A meta-analytic review

    NARCIS (Netherlands)

    Prof. Dr. Jo J.M.A Hermanns; Prof. Dr. Ruben R.G. Fukkink; dr. Christa C.C. Nieuwboer

    2013-01-01

    Background. A number of parenting programs, aimed at improving parenting competencies, have recently been adapted or designed with the use of online technologies. Although web-based services have been claimed to hold promise for parent support, a meta-analytic review of online parenting interventions

  19. Online programs as tools to improve parenting: A meta-analytic review

    NARCIS (Netherlands)

    Nieuwboer, C.C.; Fukkink, R.G.; Hermanns, J.M.A.

    2013-01-01

    Background: A number of parenting programs, aimed at improving parenting competencies, have recently been adapted or designed with the use of online technologies. Although web-based services have been claimed to hold promise for parent support, a meta-analytic review of online parenting

  20. Handling movement epenthesis and hand segmentation ambiguities in continuous sign language recognition using nested dynamic programming.

    Science.gov (United States)

    Yang, Ruiduo; Sarkar, Sudeep; Loeding, Barbara

    2010-03-01

    We consider two crucial problems in continuous sign language recognition from unaided video sequences. At the sentence level, we consider the movement epenthesis (me) problem and at the feature level, we consider the problem of hand segmentation and grouping. We construct a framework that can handle both of these problems based on an enhanced, nested version of the dynamic programming approach. To address movement epenthesis, a dynamic programming (DP) process employs a virtual me option that does not need explicit models. We call this the enhanced level building (eLB) algorithm. This formulation also allows the incorporation of grammar models. Nested within this eLB is another DP that handles the problem of selecting among multiple hand candidates. We demonstrate our ideas on four American Sign Language data sets with simple background, with the signer wearing short sleeves, with complex background, and across signers. We compared the performance with Conditional Random Fields (CRF) and Latent Dynamic-CRF-based approaches. The experiments show more than 40 percent improvement over CRF or LDCRF approaches in terms of the frame labeling rate. We show the flexibility of our approach when handling a changing context. We also find a 70 percent improvement in sign recognition rate over the unenhanced DP matching algorithm that does not accommodate the me effect.
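
    A drastically simplified sketch of the level-building idea (fill a fixed number of segment levels by dynamic programming over possible end frames, letting a constant-cost "me" wildcard compete against explicit sign models) is given below. The per-frame cost tables and the me penalty are placeholders, not the paper's trained models:

    ```python
    import math

    def level_build(frame_costs, me_cost, n_levels):
        """Toy level-building dynamic program.

        frame_costs[s][t] is the cost of sign model s explaining frame t
        (placeholder numbers standing in for real model scores); a virtual
        'me' label explains any frame at constant cost me_cost. Returns the
        cheapest way to split the frames into exactly n_levels segments,
        each labeled with a sign index or 'me'.
        """
        signs = list(range(len(frame_costs)))
        n_frames = len(frame_costs[0])

        def seg_cost(label, a, b):  # cost of frames a..b-1 under one label
            if label == "me":
                return me_cost * (b - a)
            return sum(frame_costs[label][t] for t in range(a, b))

        best = {0: (0.0, [])}  # frames consumed so far -> (cost, labels)
        for _ in range(n_levels):
            nxt = {}
            for a, (cost, labels) in best.items():
                for b in range(a + 1, n_frames + 1):
                    for lab in signs + ["me"]:
                        cand = cost + seg_cost(lab, a, b)
                        if b not in nxt or cand < nxt[b][0]:
                            nxt[b] = (cand, labels + [lab])
            best = nxt
        return best.get(n_frames, (math.inf, []))

    # Two toy sign models over six frames; frames 2 and 3 fit neither model,
    # so the DP absorbs them with the cheaper 'me' wildcard.
    costs = [[0.1, 0.1, 5.0, 5.0, 4.0, 4.0],
             [4.0, 4.0, 5.0, 5.0, 0.1, 0.1]]
    print(level_build(costs, me_cost=0.5, n_levels=3))  # -> (1.4, [0, 'me', 1])
    ```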

  1. Waste Tank Organic Safety Program: Analytical methods development. Progress report, FY 1994

    International Nuclear Information System (INIS)

    Campbell, J.A.; Clauss, S.A.; Grant, K.E.

    1994-09-01

    The objectives of this task are to develop and document extraction and analysis methods for organics in waste tanks, and to extend these methods to the analysis of actual core samples to support the Waste Tank Organic Safety Program. This report documents progress at Pacific Northwest Laboratory during FY 1994 on methods development, the analysis of waste from Tank 241-C-103 (Tank C-103) and Tank T-111, and the transfer of documented, developed analytical methods to personnel in the Analytical Chemistry Laboratory (ACL) and the 222-S laboratory. This report is intended as an annual report, not a completed work.

  2. Requirements for quality control of analytical data for the Environmental Restoration Program

    International Nuclear Information System (INIS)

    Engels, J.

    1992-12-01

    The Environmental Restoration (ER) Program was established for the investigation and remediation of inactive US Department of Energy (DOE) sites and facilities that have been declared surplus in terms of their previous uses. The purpose of this document is to specify ER requirements for quality control (QC) of analytical data. Activities throughout all phases of the investigation may affect the quality of the final data product and thus are subject to control specifications. Laboratory control is emphasized in this document; field concerns will be addressed in a companion document. Energy Systems, in its role of technical coordinator and at the request of DOE-OR, extends the application of these requirements to all participants in ER activities. Because every instance and concern may not be addressed in this document, participants are encouraged to discuss any questions with the ER Quality Assurance (QA) Office, the Analytical Environmental Support Group (AESG), or the Analytical Project Office (APO).

  3. A general strategy for performing temperature-programming in high performance liquid chromatography--prediction of segmented temperature gradients.

    Science.gov (United States)

    Wiese, Steffen; Teutenberg, Thorsten; Schmidt, Torsten C

    2011-09-28

    In the present work it is shown that the linear elution strength (LES) model, which was adapted from temperature-programmed gas chromatography (GC), can also be employed to predict retention times for segmented temperature gradients in liquid chromatography (LC) with high accuracy, based on temperature-gradient input data. The LES model assumes that retention times for isothermal separations can be predicted based on two temperature gradients, and it is employed to calculate the retention factor of an analyte when the start temperature of the temperature gradient is changed. In this study it was investigated whether this approach can also be employed in LC. It was shown that this approximation cannot be transferred to temperature-programmed LC over a temperature range from 60°C up to 180°C: major relative errors up to 169.6% were observed for isothermal retention factor predictions. In order to predict retention times for temperature gradients with different start temperatures in LC, another relationship is required to describe the influence of temperature on retention. Therefore, retention times for isothermal separations based on isothermal input runs were predicted using a plot of the natural logarithm of the retention factor vs. the inverse temperature and a plot of the natural logarithm of the retention factor vs. temperature. It could be shown that a plot of ln k vs. T yields more reliable isothermal/isocratic retention time predictions than the commonly employed plot of ln k vs. 1/T. Hence, in order to predict retention times for temperature gradients with different start temperatures in LC, two temperature-gradient and two isothermal measurements were employed. In this case, retention times can be predicted with a maximal relative error of 5.5% (average relative error: 2.9%). In comparison, if the start temperature of the simulated temperature gradient is equal to the start temperature of the input data, only two temperature-gradient input runs are required.
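
    The prediction step described above, fitting ln k against T from isothermal input runs and reading off retention factors at other temperatures, reduces to a linear fit. A minimal sketch with invented retention factors (not the paper's data):

    ```python
    import numpy as np

    # Two isothermal input runs: temperatures (deg C) and retention factors k.
    # The values are invented for illustration.
    T_in = np.array([60.0, 180.0])
    k_in = np.array([12.0, 1.5])

    # Fit ln k = a + b*T, the relationship the study found more reliable
    # for temperature-programmed LC than the van 't Hoff-style ln k vs. 1/T.
    b, a = np.polyfit(T_in, np.log(k_in), 1)

    def k_pred(T):
        """Predicted isothermal retention factor at temperature T."""
        return np.exp(a + b * T)

    for T in (80, 120, 160):
        print(T, round(float(k_pred(T)), 2))
    ```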

  4. Analytical SN solutions in heterogeneous slabs using symbolic algebra computer programs

    International Nuclear Information System (INIS)

    Warsa, J.S.

    2002-01-01

    A modern symbolic algebra computer program, MAPLE, is used to compute the well-known analytical discrete ordinates, or S_N, solutions in one-dimensional slab geometry. Symbolic algebra programs compute the solutions with arbitrary precision and are free of spatial discretization error, so they can be used to investigate new discretizations for one-dimensional, slab-geometry S_N methods. Pointwise scalar flux solutions are computed for several sample calculations of interest. Sample MAPLE command scripts are provided to illustrate how easily the theory can be translated into a working solution and serve as a complete tool capable of computing analytical S_N solutions for mono-energetic, one-dimensional transport problems.
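
    The MAPLE scripts themselves are not reproduced in this record. As a rough analogue of the approach (a symbolic, discretization-free solution of a slab-geometry discrete-ordinates problem), the sketch below solves the two-ordinate (S_2) mono-energetic case with Python's sympy; the cross sections are arbitrary illustrative values:

    ```python
    import sympy as sp

    x = sp.symbols('x')
    mu = 1 / sp.sqrt(3)                    # S_2 Gauss quadrature point
    sig_t, sig_s = 1, sp.Rational(9, 10)   # total/scattering cross sections (arbitrary)

    psi_p = sp.Function('psi_p')           # angular flux in the +mu direction
    psi_m = sp.Function('psi_m')           # angular flux in the -mu direction
    scat = sig_s / 2 * (psi_p(x) + psi_m(x))   # isotropic scattering source

    eqs = [
        sp.Eq( mu * psi_p(x).diff(x) + sig_t * psi_p(x), scat),
        sp.Eq(-mu * psi_m(x).diff(x) + sig_t * psi_m(x), scat),
    ]

    # General solution, exact and free of spatial discretization error;
    # boundary/source conditions would then fix the integration constants.
    sol = sp.dsolve(eqs, [psi_p(x), psi_m(x)])
    sp.pprint(sol)
    ```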

  5. Optimization of hot water transport and distribution networks by analytical method: OPTAL program

    International Nuclear Information System (INIS)

    Barreau, Alain; Caizergues, Robert; Moret-Bailly, Jean

    1977-06-01

    This report presents optimization studies of hot water transport and distribution networks by minimizing operating cost. Analytical optimization is used: Lagrange's method of undetermined multipliers. The optimum diameter of each pipe is calculated for minimum network operating cost. The characteristics of OPTAL, the computer program used for the calculations, are given in this report. An example network of 52 branches and 27 customers is calculated and described. Results are discussed. [fr]
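
    The report itself is not reproduced here, but the named method (Lagrange's method of undetermined multipliers applied to pipe diameters) can be illustrated with a toy two-pipe network: minimize pipe cost subject to a fixed total head-loss budget. The cost model and all constants are invented, and the multiplier solution is found numerically rather than analytically:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Toy two-pipe network (all constants invented): head loss per pipe is
    # modeled as K/D^5 and capital cost grows linearly with diameter D.
    L = np.array([400.0, 250.0])     # pipe lengths (m)
    K = np.array([2.0e-3, 1.5e-3])   # hydraulic constants
    c_pipe = 120.0                   # cost per metre of pipe per metre of diameter
    H_max = 5.0                      # allowed total head loss (m)

    def cost(D):
        return c_pipe * float(np.sum(L * D))

    def head(D):
        return float(np.sum(K / D**5))

    # Numerical stand-in for the analytical multiplier solution: at the
    # optimum, the multiplier equalizes marginal cost per unit head loss
    # across pipes, which is exactly what Lagrange's method yields.
    res = minimize(cost, x0=[0.2, 0.2], method="SLSQP",
                   bounds=[(0.05, 1.0)] * 2,
                   constraints=[{"type": "eq", "fun": lambda D: H_max - head(D)}])
    print(np.round(res.x, 3), round(res.fun, 1))  # optimal diameters and cost
    ```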

  6. Application of the analytical tree technique to evaluating the effectiveness of radiological protection programs

    International Nuclear Information System (INIS)

    Perez Gonzalez, F.; Perez Velazquez, R.S.; Fornet Rodriguez, O.; Mustelier Hechevarria, A.; Miller Clemente, A.

    1998-01-01

    Following IAEA recommendations, this work applies the analytical tree technique as an instrument for evaluating the effectiveness of occupational radiological protection programs. It describes how the technique has been assimilated and converted into a daily working instrument for evaluating safety conditions in institutions that apply nuclear techniques, with a view to their authorization by the regulatory body.

  7. Laboratory quality assurance and its role in the safeguards analytical laboratory evaluation (SALE) program

    International Nuclear Information System (INIS)

    Delvin, W.L.; Pietri, C.E.

    1981-07-01

    Since the late 1960's, strong emphasis has been given to quality assurance in the nuclear industry, particularly to that part involved in nuclear reactors. This emphasis has had impact on the analytical chemistry laboratory because of the importance of analytical measurements in the certification and acceptance of materials used in the fabrication and construction of reactor components. Laboratory quality assurance, in which the principles of quality assurance are applied to laboratory operations, has a significant role to play in processing, fabrication, and construction programs of the nuclear industry. That role impacts not only process control and material certification, but also safeguards and nuclear materials accountability. The implementation of laboratory quality assurance is done through a program plan that specifies how the principles of quality assurance are to be applied. Laboratory quality assurance identifies weaknesses and deficiencies in laboratory operations and provides confidence in the reliability of laboratory results. Such confidence in laboratory measurements is essential to the proper evaluation of laboratories participating in the Safeguards Analytical Laboratory Evaluation (SALE) Program

  8. A prequalifying program for evaluating the analytical performance of commercial laboratories

    International Nuclear Information System (INIS)

    Reith, C.C.; Bishop, C.T.

    1987-01-01

    Soil and water samples were spiked with known activities of radionuclides and sent to seven commercial laboratories that had expressed an interest in analyzing environmental samples for the Waste Isolation Pilot Plant (WIPP). This Prequalifying Program was part of the selection process for an analytical subcontractor for a three-year program of baseline radiological surveillance around the WIPP site. Both media were spiked at three different activity levels with several transuranic radionuclides, as well as tritium, fission products, and activation products. Laboratory performance was evaluated by calculating relative error for each radionuclide in each sample, assigning grade values, and compiling grades into report cards for each candidate. Results for the five laboratories completing the Prequalifying Program were pooled to reveal differing degrees of difficulty among the treatments and radionuclides. Interlaboratory comparisons revealed systematic errors in the performance of one candidate. The final report cards contained clear differences among overall grades for the five laboratories, enabling analytical performance to be used as a quantitative criterion in the selection of an analytical subcontractor. (author)

  9. Flammable gas safety program. Analytical methods development: FY 1993 progress report

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, J.A.; Clauss, S.; Grant, K.; Hoopes, V.; Lerner, B.; Lucke, R.; Mong, G.; Rau, J.; Steele, R.

    1994-01-01

    This report describes the status of developing analytical methods to account for the organic constituents in Hanford waste tanks, with particular emphasis on those tanks that have been assigned to the Flammable Gas Watch List. Six samples of core segments from Tank 101-SY, obtained during the window E core sampling, have been analyzed for organic constituents. Four of the samples were from the upper region, or convective layer, of the tank and two were from the lower, nonconvective layer. The samples were analyzed for chelators, chelator fragments, and several carboxylic acids by derivatization gas chromatography/mass spectrometry (GC/MS). The major components detected were ethylenediaminetetraacetic acid (EDTA), nitroso-iminodiacetic acid (NIDA), nitrilotriacetic acid (NTA), citric acid (CA), succinic acid (SA), and ethylenediaminetriacetic acid (ED3A). The chelator of highest concentration was EDTA in all six samples analyzed. Liquid chromatography (LC) was used to quantitate low molecular weight acids (LMWA) including oxalic, formic, glycolic, and acetic acids, which are present in the waste as acid salts. From 23 to 61% of the total organic carbon (TOC) in the samples analyzed was accounted for by these acids. Oxalate constituted approximately 40% of the TOC in the nonconvective layer samples. Oxalate was found to be approximately 3 to 4 times higher in concentration in the nonconvective layer than in the convective layer. During FY 1993, LC methods for analyzing LMWA and two chelators, N-(2-hydroxyethyl) ethylenediaminetriacetic acid and EDTA, were transferred to personnel in the Analytical Chemistry Laboratory and the 222-S laboratory.

  10. Analytical Services Fiscal Year 1996 Multi-year Program Plan Fiscal Year Work Plan WBS 1.5.1, Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-09-01

    This document contains the Fiscal Year 1996 Work Plan and Multi-Year Program Plan for the Analytical Services Program at the Hanford Reservation in Richland, Washington. The Analytical Services Program provides vital support to the Hanford Site mission and provides technically sound, defensible, cost effective, high quality analytical chemistry data for the site programs. This report describes the goals and strategies for continuance of the Analytical Services Program through fiscal year 1996 and beyond.

  11. Analytical Services Fiscal Year 1996 Multi-year Program Plan Fiscal Year Work Plan WBS 1.5.1, Revision 1

    International Nuclear Information System (INIS)

    1995-09-01

    This document contains the Fiscal Year 1996 Work Plan and Multi-Year Program Plan for the Analytical Services Program at the Hanford Reservation in Richland, Washington. The Analytical Services Program provides vital support to the Hanford Site mission and provides technically sound, defensible, cost effective, high quality analytical chemistry data for the site programs. This report describes the goals and strategies for continuance of the Analytical Services Program through fiscal year 1996 and beyond

  12. A Comparison of Two Commercial Volumetry Software Programs in the Analysis of Pulmonary Ground-Glass Nodules: Segmentation Capability and Measurement Accuracy

    Science.gov (United States)

    Kim, Hyungjin; Lee, Sang Min; Lee, Hyun-Ju; Goo, Jin Mo

    2013-01-01

    Objective: To compare the segmentation capability of the 2 currently available commercial volumetry software programs with specific segmentation algorithms for pulmonary ground-glass nodules (GGNs) and to assess their measurement accuracy. Materials and Methods: In this study, 55 patients with 66 GGNs underwent unenhanced low-dose CT. GGN segmentation was performed by using 2 volumetry software programs (LungCARE, Siemens Healthcare; LungVCAR, GE Healthcare). Successful nodule segmentation was assessed visually and morphologic features of GGNs were evaluated to determine factors affecting segmentation by both types of software. In addition, the measurement accuracy of the software programs was investigated by using an anthropomorphic chest phantom containing simulated GGNs. Results: The successful nodule segmentation rate was significantly higher in LungCARE (90.9%) than in LungVCAR (72.7%) (p = 0.012). Vascular attachment was a negatively influencing morphologic feature of nodule segmentation for both software programs. As for measurement accuracy, mean relative volume measurement errors in nodules ≥ 10 mm were 14.89% with LungCARE and 19.96% with LungVCAR. The mean relative attenuation measurement errors in nodules ≥ 10 mm were 3.03% with LungCARE and 5.12% with LungVCAR. Conclusion: LungCARE shows significantly higher segmentation success rates than LungVCAR. Measurement accuracy of volume and attenuation of GGNs is acceptable in GGNs ≥ 10 mm by both software programs. PMID:23901328

  13. A comparison of two commercial volumetry software programs in the analysis of pulmonary ground-glass nodules: Segmentation capability and measurement accuracy

    International Nuclear Information System (INIS)

    Kim, Hyung Jin; Park, Chang Min; Lee, Sang Min; Lee, Hyun Joo; Goo, Jin Mo

    2013-01-01

    To compare the segmentation capability of the 2 currently available commercial volumetry software programs with specific segmentation algorithms for pulmonary ground-glass nodules (GGNs) and to assess their measurement accuracy. In this study, 55 patients with 66 GGNs underwent unenhanced low-dose CT. GGN segmentation was performed by using 2 volumetry software programs (LungCARE, Siemens Healthcare; LungVCAR, GE Healthcare). Successful nodule segmentation was assessed visually and morphologic features of GGNs were evaluated to determine factors affecting segmentation by both types of software. In addition, the measurement accuracy of the software programs was investigated by using an anthropomorphic chest phantom containing simulated GGNs. The successful nodule segmentation rate was significantly higher in LungCARE (90.9%) than in LungVCAR (72.7%) (p = 0.012). Vascular attachment was a negatively influencing morphologic feature of nodule segmentation for both software programs. As for measurement accuracy, mean relative volume measurement errors in nodules ≥ 10 mm were 14.89% with LungCARE and 19.96% with LungVCAR. The mean relative attenuation measurement errors in nodules ≥ 10 mm were 3.03% with LungCARE and 5.12% with LungVCAR. LungCARE shows significantly higher segmentation success rates than LungVCAR. Measurement accuracy of volume and attenuation of GGNs is acceptable in GGNs ≥ 10 mm by both software programs.

  14. A comparison of two commercial volumetry software programs in the analysis of pulmonary ground-glass nodules: Segmentation capability and measurement accuracy

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyung Jin; Park, Chang Min; Lee, Sang Min; Lee, Hyun Joo; Goo, Jin Mo [Dept. of Radiology, Seoul National University College of Medicine, and Institute of Radiation Medicine, Seoul National University Medical Research Center, Seoul (Korea, Republic of)

    2013-08-15

    To compare the segmentation capability of the 2 currently available commercial volumetry software programs with specific segmentation algorithms for pulmonary ground-glass nodules (GGNs) and to assess their measurement accuracy. In this study, 55 patients with 66 GGNs underwent unenhanced low-dose CT. GGN segmentation was performed by using 2 volumetry software programs (LungCARE, Siemens Healthcare; LungVCAR, GE Healthcare). Successful nodule segmentation was assessed visually and morphologic features of GGNs were evaluated to determine factors affecting segmentation by both types of software. In addition, the measurement accuracy of the software programs was investigated by using an anthropomorphic chest phantom containing simulated GGNs. The successful nodule segmentation rate was significantly higher in LungCARE (90.9%) than in LungVCAR (72.7%) (p = 0.012). Vascular attachment was a negatively influencing morphologic feature of nodule segmentation for both software programs. As for measurement accuracy, mean relative volume measurement errors in nodules ≥ 10 mm were 14.89% with LungCARE and 19.96% with LungVCAR. The mean relative attenuation measurement errors in nodules ≥ 10 mm were 3.03% with LungCARE and 5.12% with LungVCAR. LungCARE shows significantly higher segmentation success rates than LungVCAR. Measurement accuracy of volume and attenuation of GGNs is acceptable in GGNs ≥ 10 mm by both software programs.

  15. A Comparison of Four Software Programs for Implementing Decision Analytic Cost-Effectiveness Models.

    Science.gov (United States)

    Hollman, Chase; Paulden, Mike; Pechlivanoglou, Petros; McCabe, Christopher

    2017-08-01

    The volume and technical complexity of both academic and commercial research using decision analytic modelling has increased rapidly over the last two decades. The range of software programs used for their implementation has also increased, but it remains true that a small number of programs account for the vast majority of cost-effectiveness modelling work. We report a comparison of four software programs: TreeAge Pro, Microsoft Excel, R and MATLAB. Our focus is on software commonly used for building Markov models and decision trees to conduct cohort simulations, given their predominance in the published literature around cost-effectiveness modelling. Our comparison uses three qualitative criteria as proposed by Eddy et al.: "transparency and validation", "learning curve" and "capability". In addition, we introduce the quantitative criterion of processing speed. We also consider the cost of each program to academic users and commercial users. We rank the programs based on each of these criteria. We find that, whilst Microsoft Excel and TreeAge Pro are good programs for educational purposes and for producing the types of analyses typically required by health technology assessment agencies, the efficiency and transparency advantages of programming languages such as MATLAB and R become increasingly valuable when more complex analyses are required.

  16. New analytical methodology to reach the actinide determination accuracy (± 2%) required by the OSMOSE program

    Energy Technology Data Exchange (ETDEWEB)

    Boyer-Deslys, V.; Combaluzier, T.; Dalier, V.; Martin, J.C.; Viallesoubranne, C. [DRCP/SE2A/LAMM, CEA/VALRHO - Marcoule, BP 17171, 30207 Bagnols-sur-Ceze (France); Crozet, M. [LEHA, CEA/VALRHO - Marcoule, BP 17171, 30207 Bagnols-sur-Ceze (France)

    2008-07-01

    This article describes the analytical procedure optimized by LAMM (Laboratory for Analysis and Materials Methodology) in order to characterize the actinide-doped pellets used in the Osmose (Oscillation in Minerve of isotopes in eupraxis spectra) program (developed for transmutation reactor physics). Osmose aims at providing precise experimental data (absorption cross sections) for heavy nuclides (atomic mass from 232 to 245). This procedure requires the use of the analytical equipment and expertise of the LAMM: TIMS (Thermal Ionization Mass Spectrometer), ICP (Inductively Coupled Plasma) QMS (Quadrupole Mass Spectrometer), SFMS (Sector Field Mass Spectrometer), AES (Atomic Emission Spectrometer), alpha spectrometry and photo-gravimetric analysis. These techniques have met all the specification requirements: extended uncertainties (k=2) below ± 2% on the uranium and dopant concentrations, the impurity concentration and the americium-241 concentration.

  17. Evaluating the use of programming games for building early analytical thinking skills

    Directory of Open Access Journals (Sweden)

    H. Tsalapatas

    2015-11-01

    Analytical thinking is a transversal skill that helps learners excel academically independently of theme area. It is in high demand in the world of work, especially in innovation-related sectors. It involves finding a viable solution to a problem by identifying goals, parameters, and resources available for deployment. These are strategy elements in game play. They further constitute good practices in programming. This work evaluates how serious games based on visual programming, as a solution synthesis tool within exploration, inquiry, and collaboration, can help learners build structured mindsets. It analyses how a visual programming environment that supports experimentation for building intuition on potential solutions to logical puzzles, and then encourages learners to synthesize a solution interactively, helps learners through gaming principles to build self-esteem in their problem-solving ability, to develop algorithmic thinking capacity, and to stay engaged in learning.

  18. Community-Based Mental Health and Behavioral Programs for Low-Income Urban Youth: A Meta-Analytic Review

    Science.gov (United States)

    Farahmand, Farahnaz K.; Duffy, Sophia N.; Tailor, Megha A.; Dubois, David L.; Lyon, Aaron L.; Grant, Kathryn E.; Zarlinski, Jennifer C.; Masini, Olivia; Zander, Keith J.; Nathanson, Alison M.

    2012-01-01

    A meta-analytic review of 33 studies and 41 independent samples was conducted of the effectiveness of community-based mental health and behavioral programs for low-income urban youth. Findings indicated positive effects, with an overall mean effect of 0.25 at post-test. While this is comparable to previous meta-analytic intervention research with…

  19. Evaluation of Respiratory Protection Program in Petrochemical Industries: Application of Analytic Hierarchy Process

    Directory of Open Access Journals (Sweden)

    Hadi Kolahi

    2018-03-01

    Background: Respiratory protection equipment (RPE) is the last resort for controlling exposure to workplace air pollutants. A comprehensive respiratory protection program (RPP) ensures that RPE is selected, used, and cared for properly. Therefore, RPP must be well integrated into occupational health and safety requirements. In this study, we evaluated the implementation of RPP in Iranian petrochemical industries to identify the solutions required to improve the current status of respiratory protection. Methods: This cross-sectional study was conducted among 24 petrochemical industries in Iran. The survey instrument was a checklist extracted from the Occupational Safety and Health Administration respiratory protection standard. An index, the Respiratory Protection Program Index (RPPI), was developed and weighted by the analytic hierarchy process to determine the compliance rate (CR) of provided respiratory protection measures with the RPP standard. Data analysis was performed using Excel 2010. Results: The most important element of RPP, according to experts, was respiratory hazard evaluation. The average value of RPPI in the petrochemical plants was 49 ± 15%. The RPP elements with the highest and lowest CR were RPE selection and medical evaluation, respectively. Conclusion: None of the studied petrochemical industries implemented RPP completely. This can lead to employees' overexposure to hazardous workplace air contaminants. Increasing awareness of employees and employers through training is suggested by this study to improve such conditions. Keywords: analytic hierarchy process, petrochemical industries, respiratory protection program
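
    The analytic hierarchy process step mentioned in the Methods, deriving element weights from a pairwise comparison matrix via its principal eigenvector and checking consistency, can be sketched briefly. The 3x3 comparison matrix below is invented, not the study's:

    ```python
    import numpy as np

    # Hypothetical pairwise comparisons for three RPP elements on the Saaty
    # 1-9 scale: hazard evaluation vs. RPE selection vs. medical evaluation.
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)              # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                             # normalized priority weights

    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)     # consistency index
    cr = ci / 0.58                           # Saaty random index for n = 3 is 0.58
    print(np.round(w, 3), round(cr, 3))      # weights and consistency ratio
    ```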

  20. Program to develop analytical tools for environmental and safety assessment of nuclear material shipping container systems

    International Nuclear Information System (INIS)

    Butler, T.A.

    1978-11-01

    This paper describes a program for developing analytical techniques to evaluate the response of nuclear material shipping containers to severe accidents. Both lumped-mass and finite element techniques are employed to predict shipping container and shipping container-carrier response to impact. The general impact problem is computationally expensive because of its nonlinear, three-dimensional nature. This expense is minimized by using approximate models to parametrically identify critical cases before more exact analyses are performed. The computer codes developed for solving the problem are being experimentally substantiated with test data from full-scale and scale-model container drop tests. 6 figures, 1 table

  1. Generalized Analytical Program of Thyristor Phase Control Circuit with Series and Parallel Resonance Load

    OpenAIRE

    Nakanishi, Sen-ichiro; Ishida, Hideaki; Himei, Toyoji

    1981-01-01

    A systematic analytical method is required for the ac phase control circuit by means of an inverse-parallel thyristor pair which has a series and parallel L-C resonant load, because the phase control action causes abnormal and interesting phenomena, such as an extreme increase of voltage and current, a unique increase and decrease of the contained higher harmonics, and a wide variation of power factor. In this paper, the program for the analysis of the thyristor phase control circuit with...

  2. Analytical methods manual for the Mineral Resource Surveys Program, U.S. Geological Survey

    Science.gov (United States)

    Arbogast, Belinda F.

    1996-01-01

    The analytical methods validated by the Mineral Resource Surveys Program, Geologic Division, are the subject of this manual. This edition replaces the methods portion of Open-File Report 90-668 published in 1990. Newer methods may be used which have been approved by the quality assurance (QA) project and are on file with the QA coordinator. This manual is intended primarily for use by laboratory scientists; it can also assist laboratory users in evaluating the data they receive. The analytical methods are written in a step-by-step approach so that they may be used as a training tool and provide detailed documentation of the procedures for quality assurance. A "Catalog of Services" is available for customer (submitter) use, with brief listings of: the element(s)/species determined, method of determination, reference to cite, contact person, summary of the technique, and analyte concentration range. For a copy please contact the Branch office at (303) 236-1800 or fax (303) 236-3200.

  3. Development and implementation of information systems for the DOE's National Analytical Management Program (NAMP)

    International Nuclear Information System (INIS)

    Streets, W. E.

    1999-01-01

    The Department of Energy (DOE) faces a challenging environmental management effort, including environmental protection, environmental restoration, waste management, and decommissioning. This effort requires extensive sampling and analysis to determine the type and level of contamination and the appropriate technology for cleanup, and to verify compliance with environmental regulations. Data obtained from these sampling and analysis activities are used to support environmental management decisions. Confidence in the data is critical, having legal, regulatory, and therefore, economic impact. To promote quality in the planning, management, and performance of these sampling and analysis operations, DOE's Office of Environmental Management (EM) has established the National Analytical Management Program (NAMP). With a focus on reducing the estimated costs of over $200M per year for EM's analytical services, NAMP has been charged with developing products that will decrease the costs for DOE complex-wide environmental management while maintaining quality in all aspects of the analytical data generation. As part of this thrust to streamline operations, NAMP is developing centralized information systems that will allow DOE complex personnel to share information about EM contacts at the various sites, pertinent methodologies for environmental restoration and waste management, costs of analyses, and performance of contracted laboratories

  4. Use of the analytical tree technique to develop a radiological protection program

    International Nuclear Information System (INIS)

    Domenech N, H.; Jova S, L.

    1996-01-01

    The results obtained by the Cuban Center for Radiological Protection and Hygiene in using the analytical tree technique to develop its general operational radiation protection program are presented. Through this method, factors such as the organization of the radiation protection services, the provision of administrative requirements, the existing general laboratory requirements, the availability of resources, and the current documentation were evaluated. Main components considered include: complete normative and regulatory documentation; automatic radiological protection data management; the scope of 'on-the-job' and radiological protection training for the personnel; prior radiological appraisal of the safety performance of the work; and the application of dose constraints for the personnel and the public. The detailed development of the program allowed the basic aims of its maintenance and improvement to be identified. (authors). 3 refs.

  5. Examinations on Applications of Manual Calculation Programs on Lung Cancer Radiation Therapy Using Analytical Anisotropic Algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jung Min; Kim, Dae Sup; Hong, Dong Ki; Back, Geum Mun; Kwak, Jung Won [Dept. of Radiation Oncology, , Seoul (Korea, Republic of)

    2012-03-15

    MU verification programs based on the Pencil Beam Convolution (PBC) algorithm produce MU errors when they are used to verify lung radiation treatment plans calculated with the Analytical Anisotropic Algorithm (AAA). In this study, we examined methods for verifying treatment plans calculated with AAA. Using the Eclipse treatment planning system (Version 8.9, Varian, USA), each of 57 fields from 7 cases of lung stereotactic body radiation therapy (SBRT) was calculated with both the PBC and AAA dose calculation algorithms. The MUs of the established plans were compared with the MUs from commonly used manual calculation programs, and the errors were analyzed against 4 variables that can affect them: field size, lung path distance of the radiation, tumor path distance of the radiation, and effective depth. Errors for the PBC algorithm were 0.2±1.0%, and errors for AAA were 3.5±2.8%. Analysis of the 4 variables showed that the error increased with the lung path distance (correlation coefficient 0.648, P=0.000), from which an MU correction factor, A.E. = 0.00903 × L.P. + 0.02048 (A.E. = absolute error, L.P. = lung path distance), was derived. Applying this correction factor to the manual calculation program reduced the errors from 3.5±2.8% to within 0.4±2.0%. This study showed that manual calculation program errors increase as the lung path distance of the radiation increases, and that MUs calculated with AAA can be verified by a simple method, the MU correction factor.

  6. Examinations on Applications of Manual Calculation Programs on Lung Cancer Radiation Therapy Using Analytical Anisotropic Algorithm

    International Nuclear Information System (INIS)

    Kim, Jung Min; Kim, Dae Sup; Hong, Dong Ki; Back, Geum Mun; Kwak, Jung Won

    2012-01-01

    MU verification programs based on the Pencil Beam Convolution (PBC) algorithm produce MU errors when they are used to verify lung radiation treatment plans calculated with the Analytical Anisotropic Algorithm (AAA). In this study, we examined methods for verifying treatment plans calculated with AAA. Using the Eclipse treatment planning system (Version 8.9, Varian, USA), each of 57 fields from 7 cases of lung stereotactic body radiation therapy (SBRT) was calculated with both the PBC and AAA dose calculation algorithms. The MUs of the established plans were compared with the MUs from commonly used manual calculation programs, and the errors were analyzed against 4 variables that can affect them: field size, lung path distance of the radiation, tumor path distance of the radiation, and effective depth. Errors for the PBC algorithm were 0.2±1.0%, and errors for AAA were 3.5±2.8%. Analysis of the 4 variables showed that the error increased with the lung path distance (correlation coefficient 0.648, P=0.000), from which an MU correction factor, A.E. = 0.00903 × L.P. + 0.02048 (A.E. = absolute error, L.P. = lung path distance), was derived. Applying this correction factor to the manual calculation program reduced the errors from 3.5±2.8% to within 0.4±2.0%. This study showed that manual calculation program errors increase as the lung path distance of the radiation increases, and that MUs calculated with AAA can be verified by a simple method, the MU correction factor.
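
    A minimal sketch of how such a linear correction factor might be applied, assuming A.E. is the fractional MU error predicted from the lung path distance L.P. in centimeters (the units, names, and sign convention are my reading of the abstract, not the paper's code):

    ```python
    def mu_error(lung_path_cm: float) -> float:
        """Predicted fractional MU error from lung path distance (fit from the abstract)."""
        return 0.00903 * lung_path_cm + 0.02048

    def corrected_mu(manual_mu: float, lung_path_cm: float) -> float:
        """Scale a manually calculated MU down by the predicted over-estimate."""
        return manual_mu / (1.0 + mu_error(lung_path_cm))

    # Example: a 200 MU field whose beam traverses 8 cm of lung.
    print(round(corrected_mu(200.0, 8.0), 1))
    ```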

  7. A program wide framework for evaluating data driven teaching and learning - earth analytics approaches, results and lessons learned

    Science.gov (United States)

    Wasser, L. A.; Gold, A. U.

    2017-12-01

    There is a deluge of earth systems data available to address cutting-edge science problems, yet specific skills are required to work with these data. The Earth Analytics education program, a core component of Earth Lab at the University of Colorado - Boulder, is building a data-intensive program that provides training in realms including 1) interdisciplinary communication and collaboration, 2) earth science domain knowledge including geospatial science and remote sensing, and 3) reproducible, open science workflows ("earth analytics"). The earth analytics program includes an undergraduate internship, undergraduate and graduate level courses, and a professional certificate/degree program. All programs share the goal of preparing a STEM workforce for successful earth-analytics-driven careers. We are developing a program-wide evaluation framework that assesses the effectiveness of data-intensive instruction combined with domain science learning, to better understand and improve data-intensive teaching approaches using blends of online, in situ, asynchronous, and synchronous learning. We are using targeted search engine optimization (SEO) to increase visibility and in turn program reach. Finally, our design targets longitudinal program impacts on participant career tracks over time. Here we present results from evaluation of both an interdisciplinary undergraduate/graduate-level earth analytics course and an undergraduate internship. Early results suggest that a blended approach to learning and teaching, combining synchronous in-person teaching and active hands-on classroom learning with asynchronous learning in the form of online materials, leads to student success. Further, we will present our model for longitudinal tracking of participants' career focus over time to better understand long-term program impacts. We also demonstrate the impact of SEO on online content reach and program visibility.

  8. Analytical approaches used in stream benthic macroinvertebrate biomonitoring programs of State agencies in the United States

    Science.gov (United States)

    Carter, James L.; Resh, Vincent H.

    2013-01-01

    Biomonitoring programs based on benthic macroinvertebrates are well established worldwide. Their value, however, depends on the appropriateness of the analytical techniques used. All United States State benthic macroinvertebrate biomonitoring programs were surveyed regarding the purposes of their programs, quality-assurance and quality-control procedures used, habitat and water-chemistry data collected, treatment of macroinvertebrate data prior to analysis, statistical methods used, and data-storage considerations. State regulatory mandates (59 percent of programs), biotic index development (17 percent), and Federal requirements (15 percent) were the most frequently reported purposes of State programs, with the specific tasks of satisfying the requirements for 305b/303d reports (89 percent), establishment and monitoring of total maximum daily loads, and developing biocriteria being the purposes most often mentioned. Most states establish reference sites (81 percent), but classify them using State-specific methods. The technique most often used to determine the appropriateness of a reference site was Best Professional Judgment (86 percent of these states). Macroinvertebrate samples are almost always collected by using a D-frame net, and duplicate samples are collected from approximately 10 percent of sites for quality assurance and quality control purposes. Most programs have macroinvertebrate samples processed by contractors (53 percent) and have identifications confirmed by a second taxonomist (85 percent). All States collect habitat data, with most using the Rapid Bioassessment Protocol visual-assessment approach, which requires ~1 h/site. Dissolved oxygen, pH, and conductivity are measured in more than 90 percent of programs. Wide variation exists in which taxa are excluded from analyses and in the level of taxonomic resolution used. Species traits, such as functional feeding groups, are commonly used (96 percent), as are tolerance values for organic pollution.

  9. Quality assurance programs developed and implemented by the US Department of Energy's Analytical Services Program for environmental restoration and waste management activities

    Energy Technology Data Exchange (ETDEWEB)

    Lillian, D.; Bottrell, D. [Dept. of Energy, Germantown, MD (United States)

    1993-12-31

    The U.S. Department of Energy's (DOE's) Office of Environmental Restoration and Waste Management (EM) has been tasked with addressing environmental contamination and waste problems facing the Department. A key element of any environmental restoration or waste management program is environmental data. An effective and efficient sampling and analysis program is required to generate credible environmental data. The bases for DOE's EM Analytical Services Program (ASP) are contained in the charter and commitments in Secretary of Energy Notice SEN-13-89, EM program policies and requirements, and commitments to Congress and the Office of Inspector General (IG). The Congressional commitment by DOE to develop and implement an ASP was in response to concerns raised by the Chairman of the Congressional Environment, Energy, and Natural Resources Subcommittee, and the Chairman of the Congressional Oversight and Investigations Subcommittee of the Committee on Energy and Commerce, regarding the production of analytical data. The development and implementation of an ASP also satisfies the IG's audit report recommendations on environmental analytical support, including development and implementation of a national strategy for acquisition of quality sampling and analytical services. These recommendations were endorsed in Departmental positions, which further emphasize the importance of the ASP to EM's programs. In September 1990, EM formed the Laboratory Management Division (LMD) in the Office of Technology Development to provide the programmatic direction needed to establish and operate an EM-wide ASP program. In January 1992, LMD issued the "Analytical Services Program Five-Year Plan." This document described LMD's strategy to ensure the production of timely, cost-effective, and credible environmental data. This presentation describes the overall LMD Analytical Services Program and, specifically, the various QA programs.

  10. Quality assurance programs developed and implemented by the US Department of Energy's Analytical Services Program for environmental restoration and waste management activities

    International Nuclear Information System (INIS)

    Lillian, D.; Bottrell, D.

    1993-01-01

    The U.S. Department of Energy's (DOE's) Office of Environmental Restoration and Waste Management (EM) has been tasked with addressing environmental contamination and waste problems facing the Department. A key element of any environmental restoration or waste management program is environmental data. An effective and efficient sampling and analysis program is required to generate credible environmental data. The bases for DOE's EM Analytical Services Program (ASP) are contained in the charter and commitments in Secretary of Energy Notice SEN-13-89, EM program policies and requirements, and commitments to Congress and the Office of Inspector General (IG). The Congressional commitment by DOE to develop and implement an ASP was in response to concerns raised by the Chairman of the Congressional Environment, Energy, and Natural Resources Subcommittee, and the Chairman of the Congressional Oversight and Investigations Subcommittee of the Committee on Energy and Commerce, regarding the production of analytical data. The development and implementation of an ASP also satisfies the IG's audit report recommendations on environmental analytical support, including development and implementation of a national strategy for acquisition of quality sampling and analytical services. These recommendations were endorsed in Departmental positions, which further emphasize the importance of the ASP to EM's programs. In September 1990, EM formed the Laboratory Management Division (LMD) in the Office of Technology Development to provide the programmatic direction needed to establish and operate an EM-wide ASP program. In January 1992, LMD issued the "Analytical Services Program Five-Year Plan." This document described LMD's strategy to ensure the production of timely, cost-effective, and credible environmental data. This presentation describes the overall LMD Analytical Services Program and, specifically, the various QA programs.

  11. Analytical scale purification of zirconia colloidal suspension using field programmed sedimentation field flow fractionation.

    Science.gov (United States)

    Van-Quynh, Alexandra; Blanchart, Philippe; Battu, Serge; Clédat, Dominique; Cardot, Philippe

    2006-03-03

    Sedimentation field flow fractionation was used to obtain purified fractions from a polydisperse zirconia colloidal suspension, with the potential purpose of optical material hybrid coating. The zirconia particle size ranged from 50/70 nm to 1000 nm. It exhibited a log-Gaussian particle size distribution (in mass or volume) and a 115% polydispersity index (P.I.). Time-dependent eluted fractions of the original zirconia colloidal suspension were collected. The particle size distribution of each fraction was determined with scanning electron microscopy and a Coulter sub-micron particle sizer (CSPS). These orthogonal techniques generated similar data. From fraction average elution times and granulometry measurements, it was shown that zirconia colloids are eluted according to the Brownian elution mode. The four collected fractions have Gaussian-like distributions and respective average sizes and polydispersity indices of 153 nm (P.I. = 34.7%), 188 nm (P.I. = 27.9%), 228 nm (P.I. = 22.6%), and 276 nm (P.I. = 22.3%). These data demonstrate the strong size selectivity of SdFFF operated with a programmed field of exponential profile for sorting particles in the sub-micron range. Using this technique, the analytical production of zirconia of given average size and reduced polydispersity is possible.

  12. Segmenting the Adult Education Market.

    Science.gov (United States)

    Aurand, Tim

    1994-01-01

    Describes market segmentation and how the principles of segmentation can be applied to the adult education market. Indicates that applying segmentation techniques to adult education programs results in programs that are educationally and financially satisfying and serve an appropriate population. (JOW)

  13. On the Multilevel Nature of Meta-Analysis: A Tutorial, Comparison of Software Programs, and Discussion of Analytic Choices.

    Science.gov (United States)

    Pastor, Dena A; Lazowski, Rory A

    2018-01-01

    The term "multilevel meta-analysis" is encountered not only in applied research studies, but in multilevel resources comparing traditional meta-analysis to multilevel meta-analysis. In this tutorial, we argue that the term "multilevel meta-analysis" is redundant since all meta-analysis can be formulated as a special kind of multilevel model. To clarify the multilevel nature of meta-analysis the four standard meta-analytic models are presented using multilevel equations and fit to an example data set using four software programs: two specific to meta-analysis (metafor in R and SPSS macros) and two specific to multilevel modeling (PROC MIXED in SAS and HLM). The same parameter estimates are obtained across programs underscoring that all meta-analyses are multilevel in nature. Despite the equivalent results, not all software programs are alike and differences are noted in the output provided and estimators available. This tutorial also recasts distinctions made in the literature between traditional and multilevel meta-analysis as differences between meta-analytic choices, not between meta-analytic models, and provides guidance to inform choices in estimators, significance tests, moderator analyses, and modeling sequence. The extent to which the software programs allow flexibility with respect to these decisions is noted, with metafor emerging as the most favorable program reviewed.
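
    For readers without the four packages at hand, the core computation the tutorial fits in each program, a random-effects model with study as the grouping level, can be sketched with a DerSimonian-Laird estimate. The effect sizes and sampling variances below are invented:

    ```python
    import numpy as np

    # Invented per-study effect sizes and sampling variances.
    y = np.array([0.30, 0.12, 0.45, 0.26, 0.05])
    v = np.array([0.02, 0.03, 0.05, 0.01, 0.04])

    # Fixed-effect weights and the Q heterogeneity statistic.
    w = 1 / v
    q = np.sum(w * (y - np.sum(w * y) / np.sum(w)) ** 2)

    # DerSimonian-Laird between-study variance (the level-2 variance in
    # multilevel terms), truncated at zero.
    k = len(y)
    tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

    # Random-effects pooled estimate and its standard error.
    w_re = 1 / (v + tau2)
    mu = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1 / np.sum(w_re))
    print(round(tau2, 4), round(mu, 3), round(se, 3))
    ```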

  14. Learning About Love: A Meta-Analytic Study of Individually-Oriented Relationship Education Programs for Adolescents and Emerging Adults.

    Science.gov (United States)

    Simpson, David M; Leonhardt, Nathan D; Hawkins, Alan J

    2018-03-01

    Despite recent policy initiatives and substantial federal funding of individually oriented relationship education programs for youth, there have been no meta-analytic reviews of this growing field. This meta-analytic study draws on 17 control-group studies and 13 one-group/pre-post studies to evaluate the effectiveness of relationship education programs on adolescents' and emerging adults' relationship knowledge, attitudes, and skills. Overall, control-group studies produced a medium effect (d = .36); one-group/pre-post studies also produced a medium effect (d = .47). However, the lack of studies with long-term follow-ups of relationship behaviors in the young adult years is a serious weakness in the field, limiting what we can say about the value of these programs for helping youth achieve their aspirations for healthy romantic relationships and stable marriages.

  15. The PLUS family: A set of computer programs to evaluate analytical solutions of the diffusion equation and thermoelasticity

    International Nuclear Information System (INIS)

    Montan, D.N.

    1987-02-01

    This report is intended to describe, document and provide instructions for the use of new versions of a set of computer programs commonly referred to as the PLUS family. These programs were originally designed to numerically evaluate simple analytical solutions of the diffusion equation. The new versions include linear thermo-elastic effects from thermal fields calculated by the diffusion equation. After the older versions of the PLUS family were documented a year ago, it was realized that the techniques employed in the programs were well suited to the addition of linear thermo-elastic phenomena. This has been implemented and this report describes the additions. 3 refs., 14 figs
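
    As a minimal sketch of the kind of simple analytical diffusion-equation solution such programs evaluate (not the PLUS code itself), the temperature rise from an instantaneous point heat source in an infinite medium is ΔT = Q exp(-r^2/4αt) / (ρ c (4παt)^(3/2)). The material values below are illustrative, roughly granite-like, and not taken from the report.

        import numpy as np

        def point_source_temperature(r, t, Q, rho, c, alpha):
            """Temperature rise (K) at radius r (m) and time t (s) from an
            instantaneous point heat source Q (J) in an infinite medium."""
            return Q / (rho * c * (4.0 * np.pi * alpha * t) ** 1.5) \
                * np.exp(-r ** 2 / (4.0 * alpha * t))

        # Illustrative rock-like properties (assumptions, not from the report)
        rho, c, k = 2700.0, 800.0, 3.0        # kg/m^3, J/(kg K), W/(m K)
        alpha = k / (rho * c)                 # thermal diffusivity, m^2/s
        print(point_source_temperature(r=1.0, t=86400.0, Q=1.0e8,
                                       rho=rho, c=c, alpha=alpha))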

  16. ORBITALES. A program for the calculation of wave functions with an analytical central potential; ORBITALES. Programa de calculo de Funciones de Onda para una Potencial Central Analitico

    Energy Technology Data Exchange (ETDEWEB)

    Carretero, Yunta; Rodriguez Mayquez, E

    1974-07-01

    This paper describes the objective, theoretical basis, FORTRAN implementation, and use of the program ORBITALES. The program calculates atomic wave functions for an analytical central potential. (Author) 8 refs.

  17. Comparability between NQA-1 and the QA programs for analytical laboratories within the nuclear industry and EPA hazardous waste laboratories

    International Nuclear Information System (INIS)

    English, S.L.; Dahl, D.R.

    1989-01-01

    There is increasing cooperation between the Department of Energy (DOE), Department of Defense (DOD), and the Environmental Protection Agency (EPA) in the activities associated with monitoring and clean-up of hazardous wastes. Pacific Northwest Laboratory (PNL) examined the quality assurance/quality control programs that the EPA requires of the private sector when performing routine analyses of hazardous wastes, to determine how, or whether, the requirements correspond with PNL's QA program based upon NQA-1. This paper presents the similarities and differences between NQA-1 and the QA programs identified in ASTM C1009-83, Establishing a QA Program for Analytical Chemistry Laboratories within the Nuclear Industry; EPA QAMS-005/80, Interim Guidelines and Specifications for Preparing Quality Assurance Project Plans, which is referenced in Statements of Work for CERCLA analytical activities; and Chapter 1 of SW-846, which is used in analyses of RCRA samples. The EPA QA programs for hazardous waste analyses are easily encompassed within an already established NQA-1 QA program. A few new terms are introduced and there is an increased emphasis upon QC/verification, but the basic concepts are largely common to all the programs.

  18. Multidendritic sensory neurons in the adult Drosophila abdomen: origins, dendritic morphology, and segment- and age-dependent programmed cell death

    Directory of Open Access Journals (Sweden)

    Sugimura Kaoru

    2009-10-01

    Background: For the establishment of functional neural circuits that support a wide range of animal behaviors, initial circuits formed in early development have to be reorganized. One way to achieve this is local remodeling of the circuitry hardwiring. To genetically investigate the underlying mechanisms of this remodeling, one model system employs a major group of Drosophila multidendritic sensory neurons - the dendritic arborization (da) neurons - which exhibit dramatic dendritic pruning and subsequent growth during metamorphosis. The 15 da neurons are identified in each larval abdominal hemisegment and are classified into four categories - classes I to IV - in order of increasing size of their receptive fields and/or arbor complexity at the mature larval stage. Our knowledge regarding the anatomy and developmental basis of adult da neurons is still fragmentary. Results: We identified multidendritic neurons in the adult Drosophila abdomen, visualized the dendritic arbors of the individual neurons, and traced the origins of those cells back to the larval stage. There were six da neurons in abdominal hemisegment 3 or 4 (A3/4) of the pharate adult and the adult just after eclosion, five of which were persistent larval da neurons. We quantitatively analyzed dendritic arbors of three of the six adult neurons and examined expression in the pharate adult of key transcription factors that result in the larval class-selective dendritic morphologies. The 'baseline design' of A3/4 in the adult was further modified in a segment-dependent and age-dependent manner. One of our notable findings is that a larval class I neuron, ddaE, completed dendritic remodeling in A2 to A4 and then underwent caspase-dependent cell death within 1 week after eclosion, while homologous neurons in A5 and in more posterior segments degenerated at pupal stages. Another finding is that the dendritic arbor of a class IV neuron, v'ada, was immediately reshaped during post

  19. European specialist porphyria laboratories: diagnostic strategies, analytical quality, clinical interpretation, and reporting as assessed by an external quality assurance program.

    Science.gov (United States)

    Aarsand, Aasne K; Villanger, Jørild H; Støle, Egil; Deybach, Jean-Charles; Marsden, Joanne; To-Figueras, Jordi; Badminton, Mike; Elder, George H; Sandberg, Sverre

    2011-11-01

    The porphyrias are a group of rare metabolic disorders whose diagnosis depends on identification of specific patterns of porphyrin precursor and porphyrin accumulation in urine, blood, and feces. Diagnostic tests for porphyria are performed by specialized laboratories in many countries. Data regarding the analytical and diagnostic performance of these laboratories are scarce. We distributed 5 sets of multispecimen samples from different porphyria patients accompanied by clinical case histories to 18-21 European specialist porphyria laboratories/centers as part of a European Porphyria Network organized external analytical and postanalytical quality assessment (EQA) program. The laboratories stated which analyses they would normally have performed given the case histories and reported results of all porphyria-related analyses available, interpretative comments, and diagnoses. Reported diagnostic strategies initially showed considerable diversity, but the number of laboratories applying adequate diagnostic strategies increased during the study period. We found an average interlaboratory CV of 50% (range 12%-152%) for analytes in absolute concentrations. Result normalization by forming ratios to the upper reference limits did not reduce this variation. Sixty-five percent of reported results were within biological variation-based analytical quality specifications. Clinical interpretation of the obtained analytical results was accurate, and most laboratories established the correct diagnosis in all distributions. Based on a case-based EQA scheme, variations were apparent in analytical and diagnostic performance between European specialist porphyria laboratories. Our findings reinforce the use of EQA schemes as an essential tool to assess both analytical and diagnostic processes and thereby to improve patient care in rare diseases.

  20. Implementation of evidence-based home visiting programs aimed at reducing child maltreatment: A meta-analytic review.

    Science.gov (United States)

    Casillas, Katherine L; Fauchier, Angèle; Derkash, Bridget T; Garrido, Edward F

    2016-03-01

    In recent years there has been an increase in the popularity of home visitation programs as a means of addressing risk factors for child maltreatment. The evidence supporting the effectiveness of these programs from several meta-analyses, however, is mixed. One potential explanation for this inconsistency explored in the current study involves the manner in which these programs were implemented. In the current study we reviewed 156 studies associated with 9 different home visitation program models targeted to caregivers of children between the ages of 0 and 5. Meta-analytic techniques were used to determine the impact of 18 implementation factors (e.g., staff selection, training, supervision, fidelity monitoring, etc.) and four study characteristics (publication type, target population, study design, comparison group) in predicting program outcomes. Results from analyses revealed that several implementation factors, including training, supervision, and fidelity monitoring, had a significant effect on program outcomes, particularly child maltreatment outcomes. Study characteristics, including the program's target population and the comparison group employed, also had a significant effect on program outcomes. Implications of the study's results for those interested in implementing home visitation programs are discussed. A careful consideration and monitoring of program implementation is advised as a means of achieving optimal study results. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Schedule Analytics

    Science.gov (United States)

    2016-04-30

    Warfare, Naval Sea Systems Command Acquisition Cycle Time: Defining the Problem (David Tate, Institute for Defense Analyses); Schedule Analytics (Jennifer... The research comprised the following high-level steps: identify and review primary data sources... However, detailed reviews of the OMB IT Dashboard data revealed that schedule data is highly aggregated. Program start date and program end date

  2. Experience with Dismantling of the Analytic Cell in the JRTF Decommissioning Program

    International Nuclear Information System (INIS)

    Annoh, Akio; Nemoto, Koichi; Tajiri, Hideo; Saito, Keiichiro; Miyajima, Kazutoshi; Myodo, Masato

    2003-01-01

    The analytic cell was mainly used for process-control analysis of the reprocessing process and for the measurement of fuel burnup ratio in JAERI's Reprocessing Test Facility (JRTF). The analytic cell was heavily shielded and equipped with a conveyor, and was alpha- and beta/gamma-contaminated. For dismantling of analytic cells, it is very important to establish a method to remove the heavy shield safely and to reduce exposure. First, a greenhouse was set up to prevent the spread of contamination; then the analytic cell was dismantled. Depending on the contamination conditions, the workers wore protective clothing such as air-ventilated suits to prevent internal exposure, and vinyl chloride aprons or lead aprons to reduce external exposure. From the work carried out, various data such as the manpower needed for the activities, the collective external dose to workers, the amount of radioactive waste, and the relation between the weight of the shield and its dismantling efficiency were obtained and entered into a database. The method of dismantling and the experience with dismantling the analytic cell in the JRTF, carried out during 2001 and 2002, are described in this paper.

  3. Flammable gas safety program. Analytical methods development: FY 1994 progress report

    International Nuclear Information System (INIS)

    Campbell, J.A.; Clauss, S.; Grant, K.; Hoopes, V.; Lerner, B.; Lucke, R.; Mong, G.; Rau, J.; Wahl, K.; Steele, R.

    1994-09-01

    This report describes the status of developing analytical methods to account for the organic components in Hanford waste tanks, with particular focus on tanks assigned to the Flammable Gas Watch List. The methods that have been developed are illustrated by their application to samples obtained from Tank 241-SY-101 (Tank 101-SY)

  4. Data Acquisition Programming (LabVIEW): An Aid to Teaching Instrumental Analytical Chemistry.

    Science.gov (United States)

    Gostowski, Rudy

    A course was developed at Austin Peay State University (Tennessee) which offered an opportunity for hands-on experience with the essential components of modern analytical instruments. The course aimed to provide college students with the skills necessary to construct a simple model instrument, including the design and fabrication of electronic…

  5. Effectiveness of mentoring programs for youth: a meta-analytic review.

    Science.gov (United States)

    DuBois, David L; Holloway, Bruce E; Valentine, Jeffrey C; Cooper, Harris

    2002-04-01

    We used meta-analysis to review 55 evaluations of the effects of mentoring programs on youth. Overall, findings provide evidence of only a modest or small benefit of program participation for the average youth. Program effects are enhanced significantly, however, when greater numbers of both theory-based and empirically based "best practices" are utilized and when strong relationships are formed between mentors and youth. Youth from backgrounds of environmental risk and disadvantage appear most likely to benefit from participation in mentoring programs. Outcomes for youth at-risk due to personal vulnerabilities have varied substantially in relation to program characteristics, with a noteworthy potential evident for poorly implemented programs to actually have an adverse effect on such youth. Recommendations include greater adherence to guidelines for the design and implementation of effective mentoring programs as well as more in-depth assessment of relationship and contextual factors in the evaluation of programs.

  6. Supporting Imagers' VOICE: A National Training Program in Comparative Effectiveness Research and Big Data Analytics.

    Science.gov (United States)

    Kang, Stella K; Rawson, James V; Recht, Michael P

    2017-12-05

    Provided methodologic training, more imagers can contribute to the evidence basis on improved health outcomes and value in diagnostic imaging. The Value of Imaging Through Comparative Effectiveness Research Program was developed to provide hands-on, practical training in five core areas for comparative effectiveness and big biomedical data research: decision analysis, cost-effectiveness analysis, evidence synthesis, big data principles, and applications of big data analytics. The program's mixed format consists of web-based modules for asynchronous learning as well as in-person sessions for practical skills and group discussion. Seven diagnostic radiology subspecialties and cardiology are represented in the first group of program participants, showing the collective potential for greater depth of comparative effectiveness research in the imaging community. Copyright © 2017 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  7. AI based HealthCare Platform for Real Time, Predictive and Prescriptive Analytics using Reactive Programming

    Science.gov (United States)

    Kaur, Jagreet; Singh Mann, Kulwinder, Dr.

    2018-01-01

    AI in healthcare needs to bring real, actionable, and individualized insights in real time to patients and doctors to support treatment decisions. A patient-centred platform is needed for integrating EHR data, patient data, prescriptions, monitoring, and clinical research data. This paper proposes a generic architecture for an AI-based healthcare analytics platform using the open-source technologies Apache Beam, Apache Flink, Apache Spark, Apache NiFi, Kafka, Tachyon, GlusterFS, and the NoSQL stores Elasticsearch and Cassandra. The paper shows the importance of applying AI-based predictive and prescriptive analytics techniques in the health sector. The system will be able to extract useful knowledge that helps in decision making and medical monitoring in real time through intelligent process analysis and big data processing.
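
    As a heavily hedged illustration of the streaming-ingestion step such an architecture implies, here is a minimal Python sketch consuming a hypothetical vitals topic with the kafka-python client. The topic name, broker address, message fields, and alert rule are all assumptions, and a real deployment would route messages into the Beam/Flink/Spark pipelines named above rather than a trivial rule.

        import json
        from kafka import KafkaConsumer  # kafka-python package

        # Hypothetical topic and broker; field names are illustrative only.
        consumer = KafkaConsumer(
            "patient-vitals",
            bootstrap_servers="localhost:9092",
            value_deserializer=lambda m: json.loads(m.decode("utf-8")),
        )

        for message in consumer:
            vitals = message.value
            # Trivial rule standing in for a real predictive model.
            if vitals.get("heart_rate", 0) > 130:
                print(f"alert: patient {vitals.get('patient_id')} tachycardic")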

  8. The National Shipbuilding Research Program. Development of a Quick TBT Analytical Method

    Science.gov (United States)

    2000-08-16

    Development of a Quick TBT Analytical Method. Executive Summary: Concern about the toxic effects of tributyltin has caused the... Antifouling Paints on the Environment: Tributyltin (TBT) has been shown to be highly toxic to certain aquatic organisms at concentrations measured in the... paints, developed in the 1960s, contain the organotin tributyltin (TBT), which has been proven to cause deformations in oysters and sex changes in

  9. Predicting Success: How Predictive Analytics Are Transforming Student Support and Success Programs

    Science.gov (United States)

    Boerner, Heather

    2015-01-01

    Every year, Lone Star College in Texas hosts a "Men of Honor" program to provide assistance and programming to male students, but particularly those who are Hispanic and black, in hopes their academic performance will improve. Lone Star might have kept directing its limited resources toward these students--and totally missed the subset…

  10. Integrating the Full Range of Security Cooperation Programs into Air Force Planning: An Analytic Primer

    Science.gov (United States)

    2011-01-01

    Exchange Program (IEP) Authority: 10 U.S.C. §2358, "Research and development projects". Processes and agreements: IEP agreements with the... 2015.4, "Defense Research, Development, Test and Evaluation (RDT&E) Information Exchange Program (IEP)"; DoDD 5134.1, "Under Secretary of Defense for

  11. Complex of GRAD programs for analytical calculation of radiation defects generation in solids

    International Nuclear Information System (INIS)

    Suvorov, A.L.; Zabolotnyj, V.T.; Babaev, V.P.

    1989-01-01

    A complex of programs for the analytical calculation of radiation-defect generation (GRAD) in solids is suggested, covering defect recombination during cascade-region relaxation and post-irradiation annealing, mass transport by atomic collisions in the bulk (mixing) and through the surface (sputtering), and changes in structural-phase state and properties. The package occupies less than 10 KBytes and can be run on a computer of any type. Satisfactory agreement with a wider range of experimental data than that covered by traditional models is obtained. 27 refs.; 2 figs.

  12. Quality assurance program for surveillance of fast reactor mixed oxide fuel analytical chemistry

    International Nuclear Information System (INIS)

    Rein, J.E.; Zeigler, R.K.; Waterbury, G.R.; McClung, W.E.; Praetorius, P.R.; Delvin, W.L.

    1976-01-01

    An effective quality assurance program for the chemical analysis of nuclear fuel is essential to assure that the fuel will meet the strict chemical specifications required for optimum reactor performance. Such a program has been in operation since 1972 for the fuels manufactured for the Fast Flux Test Facility. This program, through the use of common quality control and calibration standards, has consistently provided high levels of agreement among laboratories in all areas of analysis. The paper presented gives a summary of the chemical specifications for the fuel and source material, an outline of the requirements for laboratory qualifications and the preparation of calibration and quality control materials, general administration details of the plan, and examples where the program has been useful in solving laboratory problems

  13. A Concept of Constructing a Common Information Space for High Tech Programs Using Information Analytical Systems

    Science.gov (United States)

    Zakharova, Alexandra A.; Kolegova, Olga A.; Nekrasova, Maria E.

    2016-04-01

    The paper deals with issues in program management for engineering innovative products. Existing project management tools were analyzed. The aim is to develop a decision support system that takes into account the features of program management for high-tech products: research intensity, a high level of technical risk, unpredictable results due to the impact of various external factors, and the involvement of several implementing agencies. The need for involving experts and using intelligent techniques for information processing is demonstrated. A conceptual model of a common information space to support communication between members of the collaboration on high-tech programs has been developed. The structure and objectives of the information analysis system “Geokhod” were formulated to implement the conceptual model of the common information space in the program “Development and production of new class mining equipment - Geokhod”.

  14. A development of simulation and analytical program for through-diffusion experiments for a single layer of diffusion media

    International Nuclear Information System (INIS)

    Sato, Haruo

    2001-01-01

    A program (TDROCK1.FOR) for simulation and analysis of through-diffusion experiments for a single layer of diffusion media was developed. The program was written in Pro-Fortran, a language well suited to scientific and technical calculations, and a relatively simple explicit difference method was adopted for the analysis. In the analysis, the solute concentration in the tracer cell as a function of time, which could not be treated to date, can be supplied as input, and the program calculates the decrease in solute concentration with time due to diffusion from the tracer cell to the measurement cell, the solute concentration distribution in the porewater of the diffusion media, and the solute concentration in the measurement cell as a function of time. In addition, the solution volumes in both cells and the diameter and thickness of the diffusion media are variable input conditions. The simulation program could well explain measured results, reproducing the measurement-cell concentration as a function of time for cases in which the apparent and effective diffusion coefficients were already known. On this basis, the availability and applicability of this program to actual analysis and simulation were confirmed. This report describes the theoretical treatment of through-diffusion experiments for a single layer of diffusion media, the analytical model, an example source program, and the manual. (author)
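
    A minimal Python sketch of the same kind of explicit difference scheme: a 1D slab between a tracer cell and a measurement cell, with the reservoir concentrations updated from the Fickian flux at each face. All parameters are illustrative assumptions; this is not the TDROCK1.FOR code.

        import numpy as np

        # Illustrative parameters (assumptions, not from the report)
        L, N = 0.005, 50                 # slab thickness (m), grid cells
        De, alpha = 1e-10, 0.3           # effective diffusivity (m^2/s), capacity factor
        Da = De / alpha                  # apparent diffusivity
        A, V1, V2 = 1e-4, 1e-4, 1e-4     # slab area (m^2), cell volumes (m^3)

        dx = L / N
        dt = 0.4 * dx**2 / Da            # explicit-scheme stability: dt <= dx^2 / (2 Da)
        C = np.zeros(N + 1)              # porewater concentration profile
        C1, C2 = 1.0, 0.0                # tracer / measurement cell concentrations

        for step in range(100000):
            C[0], C[-1] = C1, C2                       # reservoir boundary conditions
            lap = (C[2:] - 2 * C[1:-1] + C[:-2]) / dx**2
            C[1:-1] += dt * Da * lap                   # explicit update inside the slab
            # Fick's-law flux at each face drives the reservoir concentrations
            J_in = -De * (C[1] - C[0]) / dx            # flux entering the slab at x = 0
            J_out = -De * (C[-1] - C[-2]) / dx         # flux leaving the slab at x = L
            C1 -= dt * A * J_in / V1
            C2 += dt * A * J_out / V2

        print(f"tracer cell: {C1:.4f}, measurement cell: {C2:.6f}")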

  15. Active Segmentation.

    Science.gov (United States)

    Mishra, Ajay; Aloimonos, Yiannis

    2009-01-01

    The human visual system observes and understands a scene/image by making a series of fixations. Every fixation point lies inside a particular region of arbitrary shape and size in the scene, which can be either an object or just a part of it. We define as a basic segmentation problem the task of segmenting the region containing the fixation point. Segmenting the region containing the fixation is equivalent to finding the enclosing contour - a connected set of boundary edge fragments in the edge map of the scene - around the fixation. This enclosing contour should be a depth boundary. We present here a novel algorithm that finds this bounding contour and achieves the segmentation of one object, given the fixation. The proposed segmentation framework combines monocular cues (color/intensity/texture) with stereo and/or motion, in a cue-independent manner. The semantic robots of the immediate future will be able to use this algorithm to automatically find objects in any environment. The capability of automatically segmenting objects in their visual field can bring visual processing to the next level. Our approach is different from current approaches: while existing work attempts to segment the whole scene at once into many areas, we segment only one image region, specifically the one containing the fixation point. Experiments with real imagery collected by our active robot and from known databases demonstrate the promise of the approach.
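
    As a simplified stand-in for the paper's enclosing-contour optimization, a minimal Python sketch that returns the connected non-edge region containing the fixation point (a flood fill bounded by edge pixels). This illustrates the problem statement, not the authors' algorithm.

        import numpy as np
        from scipy import ndimage

        def region_containing_fixation(edge_map, fixation):
            """Return the connected non-edge region that contains the fixation
            point, treating edge pixels as boundaries (simplified stand-in for
            an enclosing-contour search)."""
            free = ~edge_map                      # non-boundary pixels
            labels, _ = ndimage.label(free)       # 4-connected components
            return labels == labels[fixation]     # mask of the fixated region

        # Toy edge map: a closed square "contour" with the fixation inside it
        edges = np.zeros((10, 10), dtype=bool)
        edges[2, 2:8] = edges[7, 2:8] = True
        edges[2:8, 2] = edges[2:8, 7] = True
        mask = region_containing_fixation(edges, fixation=(4, 4))
        print(mask.astype(int))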

  16. Model studies on segmental movement in lumbar spine using a semi-automated program for volume fusion.

    Science.gov (United States)

    Svedmark, P; Weidenhielm, L; Nemeth, G; Tullberg, T; Noz, M E; Maguire, G Q; Zeleznik, M P; Olivecrona, H

    2008-01-01

    To validate a new non-invasive CT method for measuring segmental translations in lumbar spine in a phantom using plastic vertebrae with tantalum markers and human vertebrae. One hundred and four CT volumes were acquired of a phantom incorporating three lumbar vertebrae. Lumbar segmental translation was simulated by altering the position of one vertebra in all three cardinal axes between acquisitions. The CT volumes were combined into 64 case pairs, simulating lumbar segmental movement of up to 3 mm between acquisitions. The relative movement between the vertebrae was evaluated visually and numerically using a volume fusion image post-processing tool. Results were correlated to direct measurements of the phantom. On visual inspection, translation of at least 1 mm or more could be safely detected and correlated with separation between the vertebrae in three dimensions. There were no significant differences between plastic and human vertebrae. Numerically, the accuracy limit for all the CT measurements of the 3D segmental translations was 0.56 mm (median: 0.12; range: -0.76 to +0.49 mm). The accuracy for the sagittal axis was 0.45 mm (median: 0.10; range: -0.46 to +0.62 mm); the accuracy for the coronal axis was 0.46 mm (median: 0.09; range: -0.66 to +0.69 mm); and the accuracy for the axial axis was 0.45 mm (median: 0.05; range: -0.72 to + 0.62 mm). The repeatability, calculated over 10 cases, was 0.35 mm (median: 0.16; range: -0.26 to +0.30 mm). The accuracy of this non-invasive method is better than that of current routine methods for detecting segmental movements. The method allows both visual and numerical evaluation of such movements. Further studies are needed to validate this method in patients.

  17. What Can Schools, Colleges, and Youth Programs Do with Predictive Analytics? Practitioner Brief

    Science.gov (United States)

    Balu, Rekha; Porter, Kristin

    2017-01-01

    Many low-income young people are not reaching important milestones for success (for example, completing a program or graduating from school on time). But the social-service organizations and schools that serve them often struggle to identify who is at more or less risk. These institutions often either over- or underestimate risk, missing…

  18. Segmentation: Identification of consumer segments

    DEFF Research Database (Denmark)

    Høg, Esben

    2005-01-01

    It is very common to categorise people, especially in the advertising business. Traditional marketing theory has also taken up consumer segments as a favorite topic. Segmentation is closely related to the broader concept of classification. From a historical point of view, classification has its origin in other sciences, for example biology, anthropology, etc. From an economic point of view, it is called segmentation when specific scientific techniques are used to classify consumers into different characteristic groupings. What is the purpose of segmentation? For example, to be able to obtain a basic understanding of grouping people. Advertising agencies may use segmentation to target advertisements, while food companies may use segmentation to develop products for various groups of consumers. MAPP has for example investigated the positioning of fish in relation to other food products...

  19. Advanced organic analysis and analytical methods development: FY 1995 progress report. Waste Tank Organic Safety Program

    International Nuclear Information System (INIS)

    Wahl, K.L.; Campbell, J.A.; Clauss, S.A.

    1995-09-01

    This report describes the work performed during FY 1995 by Pacific Northwest Laboratory in developing and optimizing analysis techniques for identifying organics present in Hanford waste tanks. The main focus was to provide a means for rapidly obtaining the most useful information concerning the organics present in tank waste, with minimal sample handling and with minimal waste generation. One major focus has been to optimize analytical methods for organic speciation. Select methods, such as atmospheric pressure chemical ionization mass spectrometry and matrix-assisted laser desorption/ionization mass spectrometry, were developed to increase the speciation capabilities, while minimizing sample handling. A capillary electrophoresis method was developed to improve separation capabilities while minimizing additional waste generation. In addition, considerable emphasis has been placed on developing a rapid screening tool, based on Raman and infrared spectroscopy, for determining organic functional group content when complete organic speciation is not required. This capability would allow for a cost-effective means to screen the waste tanks to identify tanks that require more specialized and complete organic speciation to determine tank safety

  20. Segmental Vitiligo.

    Science.gov (United States)

    van Geel, Nanja; Speeckaert, Reinhart

    2017-04-01

    Segmental vitiligo is characterized by its early onset, rapid stabilization, and unilateral distribution. Recent evidence suggests that segmental and nonsegmental vitiligo could represent variants of the same disease spectrum. Observational studies with respect to its distribution pattern point to a possible role of cutaneous mosaicism, whereas the original stated dermatomal distribution seems to be a misnomer. Although the exact pathogenic mechanism behind the melanocyte destruction is still unknown, increasing evidence has been published on the autoimmune/inflammatory theory of segmental vitiligo. Copyright © 2016 Elsevier Inc. All rights reserved.

  1. Programs and analytical methods for the U.S. Geological Survey acid-rain quality-assurance project. Water Resources Investigation

    International Nuclear Information System (INIS)

    See, R.B.; Willoughby, T.C.; Brooks, M.H.; Gordon, J.D.

    1990-01-01

    The U.S. Geological Survey operates four programs to provide external quality-assurance of wet deposition monitoring by the National Atmospheric Deposition Program and the National Trends Network. An intersite-comparison program assesses the precision and bias of onsite determinations of pH and specific conductance made by site operators. A blind-audit program is used to assess the effect of routine sample-handling procedures and transportation on the precision and bias of wet-deposition data. An interlaboratory-comparison program is used to assess analytical results from three or more laboratories, which routinely analyze wet-deposition samples from the major North American networks, to determine if comparability exists between laboratory analytical results and to provide estimates of the analytical precision of each laboratory. A collocated-sampler program is used to estimate the precision of wet/dry precipitation sampling throughout the National Atmospheric Deposition Program and the National Trends Network, to assess the variability of diverse spatial arrays, and to evaluate the impact of violations of specific site criteria. The report documents the procedures and analytical methods used in these four quality-assurance programs

  2. Industrial Guidelines for Undertaking a Hard-Core Employment Program: An Analytic Case Study of the Experience of an Urban Industrial Organization.

    Science.gov (United States)

    Feifer, Irwin; And Others

    Based on an analytically evaluative case study of a New York City furniture department store's experiences with a Manpower Administration contract, this report deals with the development and progress of the program as analyzed by one investigator through interviews with almost all of the participants in the program. As a result of the study,…

  3. Hazardous Waste Remedial Actions Program requirements for quality control of analytical data

    International Nuclear Information System (INIS)

    Miller, M.S.; Zolyniak, J.W.

    1988-08-01

    The Hazardous Waste Remedial Action Program (HAZWRAP) is involved in performing field investigations and sample analysis pursuant to the NCP for the Department of Energy and other federal agencies. The purpose of this document is to specify the requirements for the control of the accuracy, precision and completeness of the samples, and data from the point of collection through analysis. The requirements include data reduction and reporting of the resulting environmentally related data. Because every instance and concern may not be addressed in this document, HAZWRAP subcontractors are encouraged to discuss any questions with the HAZWRAP Project Manager hereafter identified as the Project Manager

  4. Analytic programming with FMRI data: a quick-start guide for statisticians using R.

    Science.gov (United States)

    Eloyan, Ani; Li, Shanshan; Muschelli, John; Pekar, Jim J; Mostofsky, Stewart H; Caffo, Brian S

    2014-01-01

    Functional magnetic resonance imaging (fMRI) is a thriving field that plays an important role in medical imaging analysis, biological and neuroscience research and practice. This manuscript gives a didactic introduction to the statistical analysis of fMRI data using the R project, along with the relevant R code. The goal is to give statisticians who would like to pursue research in this area a quick tutorial for programming with fMRI data. References of relevant packages and papers are provided for those interested in more advanced analysis.
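
    The tutorial's focus is R; as a language-neutral illustration, here is a minimal Python/numpy sketch of the massively univariate general linear model that underlies most task-fMRI analyses, run on simulated data. A real analysis would first load NIfTI volumes (for example with a package such as nibabel); the design and noise model here are deliberately simplistic assumptions.

        import numpy as np

        rng = np.random.default_rng(0)
        T, V = 120, 1000                       # time points, voxels

        # Simulated boxcar task regressor plus an intercept column
        task = np.tile(np.repeat([0.0, 1.0], 10), 6)
        X = np.column_stack([np.ones(T), task])

        # Simulated data: a subset of voxels responds to the task
        beta_true = np.zeros(V); beta_true[:100] = 0.8
        Y = X @ np.vstack([np.full(V, 100.0), beta_true]) + rng.normal(0, 1, (T, V))

        # Massively univariate GLM: one least-squares fit per voxel
        beta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
        resid = Y - X @ beta_hat
        dof = T - X.shape[1]
        sigma2 = (resid ** 2).sum(axis=0) / dof
        se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
        t_stat = beta_hat[1] / se              # t-map for the task regressor
        print(t_stat[:5])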

  5. Rasplav. A unique OECD/Russian experimental/analytical program in severe accident management/mitigation

    International Nuclear Information System (INIS)

    Speis, T.P.; Behbahani, A.

    1995-01-01

    In 1994 the Nuclear Energy Agency of the Organization for Economic Cooperation and Development (NEA-OECD) sponsored the RASPLAV project in Russia to carry out an integrated program of experiments and analyses addressing the conditions under which a degraded/molten core can be retained inside the reactor pressure vessel via cooling of the vessel from outside. The background, the objectives, the associated technical issues, the utilization of the results, and the benefits to the participating countries are discussed. The project involves Russian partnership with the most advanced OECD member countries and is carried out at the Kurchatov Institute in Russia. (author). 1 fig.

  6. Development of novel segmented-plate linearly tunable MEMS capacitors

    International Nuclear Information System (INIS)

    Shavezipur, M; Khajepour, A; Hashemi, S M

    2008-01-01

    In this paper, novel MEMS capacitors with flexible moving electrodes and high linearity and tunability are presented. The moving plate is divided into small, rigid segments connected to one another by connecting beams at their end nodes. Under each node there is a rigid step which selectively limits the vertical displacement of the node. A lumped model is developed to analytically solve the governing equations of the coupled structural-electrostatic physics with mechanical contact. Using the analytical solver, an optimization program finds the set of step heights that provides the highest linearity. Analytical and finite element analyses of two capacitors with three- and six-segment plates confirm that the segmentation technique considerably improves linearity while the tunability remains as high as that of a conventional parallel-plate capacitor. Moreover, since the new designs require customized fabrication processes, a modified capacitor with flexible steps designed for PolyMUMPs is introduced to demonstrate the applicability of the proposed technique to standard processes. Dimensional optimization of the modified design results in a combination of high linearity and tunability. Constraining the displacement of the moving plate can be extended to more complex geometries to obtain smooth and highly linear responses.
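
    A minimal Python sketch of the kind of lumped balance such a solver iterates for one rigid segment: a linear spring force against the parallel-plate electrostatic force, with the step limiting the travel. All dimensions and the stiffness are hypothetical, not taken from the paper. When the electrostatic force exceeds what the spring can balance, the bisection converges to the step height, i.e. the segment lands on its step, which is the contact behavior the design exploits.

        eps0 = 8.854e-12
        A, g, k_spring, h_step = 1e-7, 2e-6, 5.0, 0.8e-6   # m^2, m, N/m, m (hypothetical)

        def deflection(V):
            """Equilibrium gap reduction of one rigid segment: spring force
            balancing the electrostatic attraction, capped by the step."""
            lo, hi = 0.0, h_step
            for _ in range(60):                 # bisection on f(x) = k*x - F_es(x)
                x = 0.5 * (lo + hi)
                f = k_spring * x - eps0 * A * V**2 / (2.0 * (g - x) ** 2)
                lo, hi = (lo, x) if f > 0 else (x, hi)
            return 0.5 * (lo + hi)

        for V in (0.0, 2.0, 5.0, 10.0):
            x = deflection(V)
            C = eps0 * A / (g - x)              # per-segment capacitance
            print(f"V = {V:4.1f} V  x = {x*1e9:7.2f} nm  C = {C*1e15:6.2f} fF")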

  7. Program Performance Assessment System (PPAS). External reviewers' report of the consultants' meeting on analytical quality control services

    International Nuclear Information System (INIS)

    2001-01-01

    In reviewing the recommendations of previous Consultants' Meetings concerning the AQCS program, it is apparent that there has been clear and consistent agreement on what the objectives of the AQCS activities should be. The mission statement as given in the Agency's 'Blue Book 1997-1998' states: 'To assist analytical laboratories in Member States in maintaining/improving the quality of their analytical measurements, to achieve internationally acceptable levels of quality assurance and to develop and supply appropriate reference standards to achieve these objectives'. In concert with this mission statement, the consultants have endorsed an elaboration of these objectives for both the Agency's laboratories and Member State laboratories, as outlined in the 1994 Consultants' Report (Kona, HI, USA), which includes: the improvement of the reliability of results for the intended purposes; the enhancement of the comparability of results from one measurement laboratory to another; the attainment of compatibility of results in the physical and chemical sciences, with specific coverage of international standards for food and agriculture, human health, environment, industry, earth sciences, radiation safety, and safeguards activities; the demonstration of quality measurement systems sufficient for laboratory/analyst accreditation or acceptance; and the establishment of traceability of radioactivity measurements and chemical analyses to the international SI system of measurements.

  8. Current status of JAERI program on development of ultra-trace-analytical technology for safeguards environmental samples

    International Nuclear Information System (INIS)

    Adachi, T.; Usuda, S.; Watanabe, K.

    2001-01-01

    Full text: In order to contribute to the strengthened safeguards system based on Program 93+2 of the IAEA, the Japan Atomic Energy Research Institute (JAERI) is developing analytical technology for ultra-trace amounts of nuclear materials in environmental samples, and has constructed the CLEAR facility (Clean Laboratory for Environmental Analysis and Research) for this purpose. The development of the technology is being carried out, in existing laboratories for the time being, in the following fields: screening, bulk analysis, and particle analysis. The screening aims at estimating the amounts of nuclear materials in environmental samples to be introduced into the clean rooms, and is the first step in avoiding cross-contamination among the samples and contamination of the clean rooms themselves. In addition to ordinary radiation spectrometry, a Compton suppression technique was applied to low-energy γ- and X-ray measurements, and sufficient reduction in background level has been demonstrated. Another technique under examination is the imaging-plate method, a kind of autoradiography suitable for determining the radioactive-particle distribution in the samples as well as for semiquantitative determination. As for the bulk analysis, efforts are focused for the time being on uranium in swipe samples. Preliminary examination for optimization of sample pre-treatment conditions is in progress. At present, ashing by the low-temperature plasma method gives better results than high-temperature ashing or acid leaching. For the isotopic-ratio measurement, the instrumental performance of inductively-coupled plasma mass spectrometry (ICP-MS) is mainly examined, because sample preparation for ICP-MS is simpler than that for thermal ionization mass spectrometry (TIMS). It was found by our measurements that the swipe material (TexWipe TX304, usually used by the IAEA) contains a non-negligible uranium blank with large variation (2-6 ng/sheet). This would introduce significant uncertainty in the trace analysis. JAERI

  9. An automated data handling process integrating spreadsheets and word processors with analytical programs

    International Nuclear Information System (INIS)

    Fisher, G.F.; Bennett, L.G.I.

    1994-01-01

    A data handling process utilizing software programs that are commercially available for use on MS-DOS microcomputers was developed to reduce the time, energy and labour required to tabulate the final results of trace analyses. The elimination of hand computations reduced the possibility of transcription errors since, once the γ-ray spectrum analysis results are obtained and saved to the hard disk of a microcomputer, they can be manipulated very easily with little possibility of distortion. The 8-step process selected the best concentration value for each element of interest based upon its associated peak area. Calculated concentration values were automatically compared against the sample's determination limit. Unsatisfactory values were flagged for later review and adjustment by the user. In the final step, a file was created which identified the samples with their appropriate particulars (i.e. source, sample, date, etc.) and displayed the trace element concentrations. This final file contained a fully formatted summary table listing all of the samples' results and particulars, such that it could be printed or imported into a word processor for inclusion in a report. In the illustrated application of analyzing wear debris in oil-lubricated systems, over 13,000 individual numbers were processed to arrive at final concentration estimates of 19 trace elements in 80 samples. The system works very well for the elements that were analyzed in this investigation. The usefulness of commercially available spreadsheets and word processors for this task was demonstrated. (author) 5 refs.; 2 figs.; 5 tabs
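
    A minimal Python sketch of the flagging step described above: each element's concentration is compared against the sample's determination limit and unsatisfactory values are marked for later review. The element names and numbers are invented for illustration.

        # Hypothetical results for one oil sample:
        # (element, concentration, determination limit), all in ug/g
        results = [("Fe", 12.4, 0.5), ("Cu", 0.3, 0.8),
                   ("Ni", 2.1, 0.4), ("Cr", 0.1, 0.6)]

        for element, conc, limit in results:
            status = "ok" if conc >= limit else "FLAG: below determination limit"
            print(f"{element:2s} {conc:6.2f} ug/g  {status}")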

  10. Analysis of linear measurements on 3D surface models using CBCT data segmentation obtained by automatic standard pre-set thresholds in two segmentation software programs: an in vitro study.

    Science.gov (United States)

    Poleti, Marcelo Lupion; Fernandes, Thais Maria Freire; Pagin, Otávio; Moretti, Marcela Rodrigues; Rubira-Bullen, Izabel Regina Fischer

    2016-01-01

    The aim of this in vitro study was to evaluate the reliability and accuracy of linear measurements on three-dimensional (3D) surface models obtained by standard pre-set thresholds in two segmentation software programs. Ten mandibles with 17 silica markers were scanned at 0.3-mm voxel size in the i-CAT Classic (Imaging Sciences International, Hatfield, PA, USA). Twenty linear measurements were carried out twice by two observers on the 3D surface models: in Dolphin Imaging 11.5 (Dolphin Imaging & Management Solutions, Chatsworth, CA, USA), using two filters (Translucent and Solid-1), and in InVesalius 3.0.0 (Centre for Information Technology Renato Archer, Campinas, SP, Brazil). The physical measurements were made twice by another observer using a digital caliper on the dry mandibles. Excellent intra- and inter-observer reliability for the markers, physical measurements, and 3D surface models was found (intra-class correlation coefficient (ICC) and Pearson's r ≥ 0.91). The linear measurements on 3D surface models by the Dolphin and InVesalius software programs were accurate (Dolphin Solid-1 > InVesalius > Dolphin Translucent). The highest absolute and percentage errors were obtained for the variables R1-R1 (1.37 mm) and MF-AC (2.53%) in the Dolphin Translucent and InVesalius software, respectively. Linear measurements on 3D surface models obtained by standard pre-set thresholds in the Dolphin and InVesalius software programs are reliable and accurate compared with physical measurements. Studies that evaluate the reliability and accuracy of 3D models are necessary to ensure error predictability and to establish diagnosis, treatment plan, and prognosis in a more realistic way.
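
    A minimal Python sketch of one common reliability statistic behind such comparisons, the two-way mixed, consistency ICC(3,1) of Shrout and Fleiss. The abstract does not state which ICC form was used, and the measurement pairs below are made up.

        import numpy as np

        def icc_3_1(X):
            """ICC(3,1), two-way mixed, consistency (Shrout & Fleiss, 1979).
            X: n_targets x k_raters matrix of measurements."""
            n, k = X.shape
            grand = X.mean()
            bms = k * ((X.mean(axis=1) - grand) ** 2).sum() / (n - 1)  # between targets
            jms = n * ((X.mean(axis=0) - grand) ** 2).sum() / (k - 1)  # between raters
            sst = ((X - grand) ** 2).sum()
            ems = (sst - (n - 1) * bms - (k - 1) * jms) / ((n - 1) * (k - 1))
            return (bms - ems) / (bms + (k - 1) * ems)

        # Made-up example: 6 distances measured by two observers (mm)
        X = np.array([[10.1, 10.2], [15.3, 15.1], [8.9, 9.0],
                      [12.4, 12.6], [20.2, 20.1], [17.8, 17.7]])
        print(f"ICC(3,1) = {icc_3_1(X):.3f}")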

  11. Development of an automated 3D segmentation program for volume quantification of body fat distribution using CT

    International Nuclear Information System (INIS)

    Ohshima, Shunsuke; Yamamoto, Shuji; Yamaji, Taiki

    2008-01-01

    The objective of this study was to develop a computing tool for fully automatic segmentation of body fat distributions on volumetric CT images. We developed an algorithm to automatically identify the body perimeter and the inner contour that separates visceral fat from subcutaneous fat. Diaphragmatic surfaces can be extracted by model-based segmentation matched to the bottom surface of the lung in CT images, to determine the upper limit of the abdomen. The functions for quantitative evaluation of abdominal obesity or obesity-related metabolic syndrome were implemented on a prototype three-dimensional (3D) image processing workstation. The volumetric ratios of visceral fat to total fat and of visceral fat to subcutaneous fat can be calculated for each subject. Additionally, color intensity mapping of the subcutaneous and visceral fat layers makes the risk of abdominal obesity quite obvious in the 3D surface display. Preliminary results have been useful in medical checkups and have improved the efficiency of obesity assessment throughout the whole range of the abdomen with 3D visualization and analysis. (author)
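
    A minimal Python sketch of the core quantification: fat voxels isolated with a typical CT Hounsfield window (about -190 to -30 HU) and split into visceral and subcutaneous compartments by an inner-contour mask. The window, masks, and data are illustrative assumptions, not the authors' algorithm.

        import numpy as np

        def fat_volumes(ct_hu, inner_mask, voxel_volume_ml, lo=-190, hi=-30):
            """Split fat voxels (HU window) into visceral (inside the inner
            contour) and subcutaneous (outside it); return volumes in mL."""
            fat = (ct_hu >= lo) & (ct_hu <= hi)
            visceral = np.count_nonzero(fat & inner_mask) * voxel_volume_ml
            subcut = np.count_nonzero(fat & ~inner_mask) * voxel_volume_ml
            return visceral, subcut

        # Toy volume: random HU values and a crude central "inner" region
        rng = np.random.default_rng(1)
        ct = rng.integers(-250, 100, size=(40, 64, 64))
        inner = np.zeros_like(ct, dtype=bool); inner[:, 16:48, 16:48] = True
        v, s = fat_volumes(ct, inner, voxel_volume_ml=0.001)
        print(f"visceral {v:.1f} mL, subcutaneous {s:.1f} mL, V/S = {v/s:.2f}")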

  12. CONCH: A Visual Basic program for interactive processing of ion-microprobe analytical data

    Science.gov (United States)

    Nelson, David R.

    2006-11-01

    A Visual Basic program for flexible, interactive processing of ion-microprobe data acquired for quantitative trace element, 26Al-26Mg, 53Mn-53Cr, 60Fe-60Ni and U-Th-Pb geochronology applications is described. Default but editable run-tables enable software identification of the secondary ion species analyzed and characterization of the standard used. Counts obtained for each species may be displayed in plots against analysis time and edited interactively. Count outliers can be automatically identified via a set of editable count-rejection criteria and displayed for assessment. Standard analyses are distinguished from Unknowns by matching of the analysis label with a string specified in the Set-up dialog, and processed separately. A generalized routine writes background-corrected count rates, ratios and uncertainties, plus weighted means and uncertainties for Standards and Unknowns, to a spreadsheet that may be saved as a text-delimited file. Specialized routines process trace-element concentration, 26Al-26Mg, 53Mn-53Cr, 60Fe-60Ni, and Th-U disequilibrium analysis types, and U-Th-Pb isotopic data obtained for zircon, titanite, perovskite, monazite, xenotime and baddeleyite. Correction to measured Pb-isotopic, Pb/U and Pb/Th ratios for the presence of common Pb may be made using measured 204Pb counts, or the 207Pb or 208Pb counts following subtraction from these of the radiogenic component. Common-Pb corrections may be made automatically, using a (user-specified) common-Pb isotopic composition appropriate for that on the sample surface, or for that incorporated within the mineral at the time of its crystallization, depending on whether the 204Pb count rate determined for the Unknown is substantially higher than the average 204Pb count rate for all session standards. Pb/U inter-element fractionation corrections are determined using an interactive ln-ln plot of common-Pb-corrected 206Pb/238U ratios against any nominated fractionation-sensitive species pair
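
    A minimal Python sketch of the 204Pb-based common-Pb correction step described above: the common fraction of measured 206Pb is estimated from the measured 204Pb/206Pb ratio and an assumed common-Pb 206Pb/204Pb composition. The default value below is a typical crustal model value (Stacey-Kramers-type), an illustration rather than anything prescribed by CONCH.

        def radiogenic_206(pb206_cps, pb204_cps, common_206_204=18.7):
            """Radiogenic 206Pb count rate after subtracting the common-Pb
            component inferred from measured 204Pb (the assumed common-Pb
            206/204 ratio is illustrative, not CONCH's default)."""
            if pb206_cps <= 0:
                return 0.0
            f206 = (pb204_cps / pb206_cps) * common_206_204  # common-206Pb fraction
            return pb206_cps * max(0.0, 1.0 - f206)

        # Hypothetical count rates (counts per second)
        print(radiogenic_206(pb206_cps=5000.0, pb204_cps=2.0))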

  13. Mixed segmentation

    DEFF Research Database (Denmark)

    Hansen, Allan Grutt; Bonde, Anders; Aagaard, Morten

    content analysis and audience segmentation in a single-source perspective. The aim is to explain and understand target groups in relation to, on the one hand, emotional response to commercials or other forms of audio-visual communication and, on the other hand, living preferences and personality traits...

  14. Integrated multi-choice goal programming and multi-segment goal programming for supplier selection considering imperfect-quality and price-quantity discounts in a multiple sourcing environment

    Science.gov (United States)

    Chang, Ching-Ter; Chen, Huang-Mu; Zhuang, Zheng-Yun

    2014-05-01

    Supplier selection (SS) is a multi-criteria and multi-objective problem, in which multi-segment (e.g. imperfect-quality discount (IQD) and price-quantity discount (PQD)) and multi-aspiration-level problems may be significantly important; however, little attention has been given to dealing with both of them simultaneously. This study proposes a model integrating multi-choice goal programming and multi-segment goal programming to solve the above-mentioned problems, with the following main contributions: (1) it allows decision-makers to set multiple aspiration levels on the right-hand side of each goal to suit real-world situations; (2) the PQD and IQD conditions are considered in the proposed model simultaneously; and (3) the proposed model can solve a SS problem with n suppliers, where each supplier offers m IQDs with r PQD intervals, requiring only ? extra binary variables. The usefulness of the proposed model is explained using a real case. The results indicate that the proposed model not only can deal with a SS problem with multi-segment and multi-aspiration levels, but also can help the decision-maker to find the appropriate order quantities for each supplier by considering cost, quality and delivery.
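
    A minimal Python sketch of the goal-programming core, using the PuLP modeling library: order quantities for two hypothetical suppliers are chosen so that total cost deviates as little as possible from an aspiration level. The paper's full model layers multi-choice aspiration levels and the IQD/PQD segments (via extra binary variables) on top of this deviation-minimization structure; none of the numbers below come from the paper.

        from pulp import LpProblem, LpVariable, LpMinimize, lpSum

        # Hypothetical data: unit costs and a total-demand requirement
        cost = {"S1": 10.0, "S2": 12.0}
        demand = 100
        cost_goal = 1050.0                     # aspiration level for total cost

        prob = LpProblem("supplier_selection_gp", LpMinimize)
        q = {s: LpVariable(f"q_{s}", lowBound=0) for s in cost}
        d_plus = LpVariable("over_cost", lowBound=0)    # overshoot of the goal
        d_minus = LpVariable("under_cost", lowBound=0)  # undershoot of the goal

        prob += d_plus + d_minus                        # minimize total deviation
        prob += lpSum(q[s] for s in cost) == demand     # meet demand exactly
        prob += lpSum(cost[s] * q[s] for s in cost) + d_minus - d_plus == cost_goal

        prob.solve()
        print({s: q[s].value() for s in cost}, "over:", d_plus.value())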

  15. N286.7-99, A Canadian standard specifying software quality management system requirements for analytical, scientific, and design computer programs and its implementation at AECL

    International Nuclear Information System (INIS)

    Abel, R.

    2000-01-01

    Analytical, scientific, and design computer programs (referred to in this paper as 'scientific computer programs') are developed for use in a large number of ways by the user-engineer to support and prove engineering calculations and assumptions. These computer programs are subject to frequent modifications inherent in their application and are often used for critical calculations and analysis relative to safety and functionality of equipment and systems. N286.7-99(4) was developed to establish appropriate quality management system requirements to deal with the development, modification, and application of scientific computer programs. N286.7-99 provides particular guidance regarding the treatment of legacy codes

  16. Mentoring Programs to Affect Delinquency and Associated Outcomes of Youth At-Risk: A Comprehensive Meta-Analytic Review

    Science.gov (United States)

    Tolan, Patrick H.; Henry, David B.; Schoeny, Michael S.; Lovegrove, Peter; Nichols, Emily

    2013-01-01

    Objectives: To conduct a meta-analytic review of selective and indicated mentoring interventions for effects on delinquency and key associated outcomes (aggression, drug use, academic functioning) for youth at risk. We also undertook the first systematic evaluation of intervention implementation features and organization, and tested for effects of theorized key processes of mentor program effects. Methods: Campbell Collaboration review inclusion criteria and procedures were used to search and evaluate the literature. Criteria included a sample defined as at risk for delinquency due to individual behavior such as aggression or conduct problems, or environmental characteristics such as residence in a high-crime community. Studies were required to be of random-assignment or strong quasi-experimental design. Of 163 identified studies published 1970-2011, 46 met criteria for inclusion. Results: Mean effect sizes were significant and positive for each outcome category (ranging from d = .11 for academic achievement to d = .29 for aggression). Heterogeneity in effect sizes was noted for all four outcomes. Stronger effects resulted when mentor motivation was professional development, but not from other implementation features. Significant improvements in effects were found when advocacy and emotional-support mentoring processes were emphasized. Conclusions: This popular approach has significant impact on delinquency and associated outcomes for youth at risk for delinquency. While there is evidence that some features relate to effects, the body of literature is remarkably lacking in details about specific program features and procedures. This persistent state of limited reporting seriously impedes understanding of how mentoring is beneficial and the ability to maximize its utility. PMID:25386111

  17. Prototype implementation of segment assembling software

    Directory of Open Access Journals (Sweden)

    Pešić Đorđe

    2018-01-01

    IT education is very important, and a lot of effort is put into the development of tools for helping students to acquire programming knowledge and for helping teachers to automate the examination process. This paper describes a prototype of the program segment assembling software used in the context of making tests in the field of algorithmic complexity. The proposed program segment assembling model uses rules and templates. A template is a simple program segment. A rule defines the combining method and any data dependencies. One example of program segment assembly by the proposed system is given. The graphical user interface is also described.
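
    A minimal Python sketch of the template-and-rule idea: templates are simple program segments, and a rule specifies the combining order plus provided/required variables. All names here are illustrative, not taken from the prototype.

        # Templates: named, simple program segments (here, lines of pseudo-Python).
        templates = {
            "init":   "total = 0",
            "loop":   "for x in data:\n    total += x",
            "report": "print(total)",
        }

        # A rule: the combining order plus produced/required-variable dependencies.
        rule = {
            "order": ["init", "loop", "report"],
            "provides": {"init": {"total"}, "loop": {"total"}},
            "requires": {"loop": {"total"}, "report": {"total"}},
        }

        def assemble(templates, rule):
            """Concatenate segments in rule order, checking that every required
            variable was provided by an earlier segment."""
            available, parts = set(), []
            for name in rule["order"]:
                missing = rule["requires"].get(name, set()) - available
                if missing:
                    raise ValueError(f"segment '{name}' missing inputs: {missing}")
                available |= rule["provides"].get(name, set())
                parts.append(templates[name])
            return "\n".join(parts)

        print(assemble(templates, rule))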

  18. Tank 241-S-102, Core 232 analytical results for the final report

    Energy Technology Data Exchange (ETDEWEB)

    STEEN, F.H.

    1998-11-04

    This document is the analytical laboratory report for tank 241-S-102 push mode core segments collected between March 5, 1998 and April 2, 1998. The segments were subsampled and analyzed in accordance with the Tank 241-S-102 Retained Gas Sampler System Sampling and Analysis Plan (TSAP) (McCain, 1998), Letter of Instruction for Compatibility Analysis of Samples from Tank 241-S-102 (LOI) (Thompson, 1998) and the Data Quality Objectives for Tank Farms Waste Compatibility Program (DQO) (Mulkey and Miller, 1998). The analytical results are included in the data summary table (Table 1).

  19. Information-Analytic support of the programs of eliminating the consequences of the Chernobyl accident: gained experience and its future application

    International Nuclear Information System (INIS)

    Arutyunyan, R.V.; Bolshov, L.A.; Linge, I.I.; Abalkina, I.L.; Simonov, A.V.; Pavlovsky, O.A.

    1996-01-01

    In the initial stage of eliminating the consequences of the Chernobyl accident, the role of system-analytic and information support in the decision-making process for protection of the population and rehabilitation of territories was, to a certain extent, underestimated. Starting from 1991, activity in system-analytic support was part of the USSR (later, Russian) state programs. This activity covered three directions: development of the central bank of generalized data on radiation catastrophes; development, implementation, and maintenance of the informational control system for the Federal bodies; and computer-system integration

  20. ADVANCED CLUSTER BASED IMAGE SEGMENTATION

    Directory of Open Access Journals (Sweden)

    D. Kesavaraja

    2011-11-01

    This paper presents efficient and portable implementations of a useful image segmentation technique based on a faster variant of the conventional connected-components algorithm, which we call Parallel Components. Many medical practitioners need image segmentation as a service for various purposes and expect it to run fast and securely, yet conventional segmentation algorithms, despite ongoing research, are often slow. We therefore propose a cluster-computing environment for parallel image segmentation to provide faster results. The paper describes a real-time implementation of distributed image segmentation on a cluster of nodes. We demonstrate the effectiveness and feasibility of our method on a set of medical CT scan images. Our general framework is a single-address-space, distributed-memory programming model. We use efficient techniques for distributing and coalescing data as well as efficient combinations of task and data parallelism. The image segmentation algorithm makes use of an efficient cluster process which uses a novel approach for parallel merging. Our experimental results are consistent with the theoretical analysis and show faster execution times for segmentation compared with the conventional method. Our test data are various CT scan images from a medical database. More efficient implementations of image segmentation will likely result in even faster execution times.
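
    As a single-machine stand-in for the distributed pipeline (omitting the cluster distribution and parallel merging the paper is about), a minimal Python sketch of the connected-components labeling step using scipy.ndimage on a synthetic image.

        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(7)
        image = rng.random((128, 128))

        # Threshold, then label 8-connected foreground components
        binary = image > 0.8
        structure = np.ones((3, 3), dtype=int)           # 8-connectivity
        labels, n_components = ndimage.label(binary, structure=structure)

        # Keep only component sizes for simple post-labeling cleanup
        sizes = ndimage.sum(binary, labels, index=range(1, n_components + 1))
        print(f"{n_components} components; largest = {int(sizes.max())} px")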

  1. Optimally segmented magnetic structures

    DEFF Research Database (Denmark)

    Insinga, Andrea Roberto; Bahl, Christian; Bjørk, Rasmus

    We present a semi-analytical algorithm for magnet design problems, which calculates the optimal way to subdivide a given design region into uniformly magnetized segments. The availability of powerful rare-earth magnetic materials such as Nd-Fe-B has broadened the range of applications of permanent magnets[1][2]. However, the powerful rare-earth magnets are generally expensive, so both the scientific and industrial communities have devoted a lot of effort into developing suitable design methods. Even so, many magnet optimization algorithms are based on heuristic approaches[3]... is not available. We will illustrate the results for magnet design problems from different areas, such as electric motors/generators (as the example in the picture), beam focusing for particle accelerators and magnetic refrigeration devices.

  2. Brookhaven segment interconnect

    International Nuclear Information System (INIS)

    Morse, W.M.; Benenson, G.; Leipuner, L.B.

    1983-01-01

    We have performed a high energy physics experiment using a multisegment Brookhaven FASTBUS system. The system was composed of three crate segments and two cable segments. We discuss the segment interconnect module which permits communication between the various segments

  3. Analytical studies by activation. Part A and B: Counting of short half-life radio-nuclides. Part C: Analytical programs for decay curves

    International Nuclear Information System (INIS)

    Junod, E.

    1966-03-01

    Part A and B: Since a radio-nuclide of short half-life is characterized essentially by the decrease in its activity even while it is being measured, the report begins by recalling the basic relationships linking the half-life, the counting time, the counting rate and the number of particles recorded. The second part is devoted to the problem of corrections for counting losses due to the idle period of multichannel analyzers. Exact correction formulae have been drawn up for the case where the short half-life radio-nuclide is pure or contains only a long half-life radio-nuclide. By comparison, charts have been drawn up showing the approximations given by the so-called 'active time' counting and by the counting involving the real time associated with a measurement of the overall idle period, this latter method proving to be more valid than the former. A method is given for reducing the case of a complex mixture to that of a two-component mixture. Part C: The problems connected with the qualitative and quantitative analysis of the decay curves of a mixture of radioactive sources, of which at least one has a short half-life, are presented. A mathematical description is given of six basic processes for which some elements of Fortran programs are proposed. Two supplementary programs are drawn up for giving an overall treatment of problems of dosage in activation analysis: one on the basis of a simultaneous irradiation of the sample and of one or several known samples, the other with separate irradiation of the unknown and known samples, a dosimeter (activation, or external) being used for normalizing the irradiation flux conditions. (author) [fr]
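
    For a worked example of the basic relationship recalled in Parts A and B, a short Python check with invented values: for a pure short-lived nuclide with initial count rate A0 and decay constant lam = ln 2 / T1/2, the counts accumulated over a counting time t are N = (A0/lam)(1 - e^(-lam*t)).

      # Counts recorded from a decaying source over a counting time t,
      # compared with the no-decay product A0 * t.  All values hypothetical.
      import math

      def accumulated_counts(a0, half_life, t):
          lam = math.log(2) / half_life          # decay constant (1/s)
          return (a0 / lam) * (1.0 - math.exp(-lam * t))

      a0, half_life = 1000.0, 30.0               # counts/s, seconds
      for t in (10.0, 30.0, 120.0):
          n = accumulated_counts(a0, half_life, t)
          print(f"t = {t:6.1f} s -> {n:10.1f} counts (vs {a0 * t:.0f} without decay)")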

  4. O papel dos programas interlaboratoriais para a qualidade dos resultados analíticos (Interlaboratorial programs for improving the quality of analytical results)

    Directory of Open Access Journals (Sweden)

    Queenie Siu Hang Chui

    2004-12-01

    Interlaboratorial programs are conducted for a number of purposes: to identify problems related to the calibration of instruments, to assess the degree of equivalence of analytical results among several laboratories, to attribute quantity values and their uncertainties in the development of a certified reference material, and to verify the performance of laboratories, as in proficiency testing, a key quality-assurance technique which is sometimes used in conjunction with accreditation. Several statistical tools are employed to assess the analytical results of laboratories participating in an intercomparison program, among them the z-score technique, the ellipse of confidence, and the Grubbs and Cochran tests. This work presents the experience of coordinating an intercomparison exercise to determine Ca, Al, Fe, Ti and Mn as impurities in samples of chemical-grade silicon metal prepared as a candidate reference material.
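
    A minimal sketch of the z-score technique mentioned above (Python; the assigned value, standard deviation, and laboratory results are invented): each laboratory's result x is scored as z = (x - X)/sigma, with |z| <= 2 conventionally rated satisfactory and |z| >= 3 unsatisfactory.

      # Proficiency-testing z-scores for a handful of hypothetical labs.
      def z_score(x, assigned, sigma):
          return (x - assigned) / sigma

      results = {"lab A": 101.2, "lab B": 96.4, "lab C": 109.8}   # e.g. mg/kg Fe
      assigned, sigma = 100.0, 2.5
      for lab, x in results.items():
          z = z_score(x, assigned, sigma)
          flag = ("satisfactory" if abs(z) <= 2
                  else "questionable" if abs(z) < 3 else "unsatisfactory")
          print(f"{lab}: z = {z:+.2f} ({flag})")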

  5. A Goal Programming R&D (Research and Development) Project Funding Model of the U.S. Army Strategic Defense Command Using the Analytic Hierarchy Process.

    Science.gov (United States)

    1987-09-01

    A Goal Programming R&D Project Funding Model of the U.S. Army Strategic Defense Command Using the Analytic Hierarchy Process, by Steven M. Anderson; Naval Postgraduate School, Monterey, CA; September 1987.

  6. Teachers as Producers of Data Analytics: A Case Study of a Teacher-Focused Educational Data Science Program

    Science.gov (United States)

    McCoy, Chase; Shih, Patrick C.

    2016-01-01

    Educational data science (EDS) is an emerging, interdisciplinary research domain that seeks to improve educational assessment, teaching, and student learning through data analytics. Teachers have been portrayed in the EDS literature as users of pre-constructed data dashboards in educational technologies, with little consideration given to them as…

  7. Letter of Intent for River Protection Project (RPP) Characterization Program: Process Engineering and Hanford Analytical Services and Characterization Project Operations and Quality Assurance

    International Nuclear Information System (INIS)

    ADAMS, M.R.

    2000-01-01

    The level of success achieved by the River Protection Project (RPP) Characterization Project is determined by the effectiveness of several organizations across RPP working together. The requirements, expectations, interrelationships, and performance criteria for each of these organizations were examined in order to understand the performances necessary to achieve characterization objectives. This Letter of Intent documents the results of that examination. It formalizes the details of interfaces, working agreements, and requirements for obtaining and transferring tank waste samples from the Tank Farm System (RPP Process Engineering, Characterization Project Operations, and RPP Quality Assurance) to the characterization laboratory complex (222-S Laboratory, Waste Sampling and Characterization Facility, and the Hanford Analytical Services Program), and for the laboratory complex's analysis and reporting of analytical results.

  8. A Meta-Analytic Review of the Effectiveness of Behavioural Early Intervention Programs for Children with Autistic Spectrum Disorders

    Science.gov (United States)

    Makrygianni, Maria K.; Reed, Phil

    2010-01-01

    The effectiveness of behavioural intervention programs for children with Autistic Spectrum Disorders was addressed by a meta-analysis, which reviewed 14 studies. The findings suggest that the behavioural programs are effective in improving several developmental aspects in the children, in terms of their treatment gains, and also relative to…

  9. Using analytical tools for decision-making and program planning in natural resources: breaking the fear barrier

    Science.gov (United States)

    David L. Peterson; Daniel L. Schmoldt

    1999-01-01

    The National Park Service and other public agencies are increasing their emphasis on inventory and monitoring (I&M) programs to obtain the information needed to infer changes in resource conditions and trigger management responses.A few individuals on a planning team can develop I&M programs, although a focused workshop is more effective.Workshops are...

  10. Applying Learning Analytics for Improving Students Engagement and Learning Outcomes in an MOOCS Enabled Collaborative Programming Course

    Science.gov (United States)

    Lu, Owen H. T.; Huang, Jeff C. H.; Huang, Anna Y. Q.; Yang, Stephen J. H.

    2017-01-01

    As information technology continues to evolve rapidly, programming skills become increasingly crucial. To build strong programming skills, training must begin before college or even senior high school. However, when developing comprehensive training programs, the learning and teaching processes must be considered. In order to…

  11. Educational intervention together with an on-line quality control program achieve recommended analytical goals for bedside blood glucose monitoring in a 1200-bed university hospital.

    Science.gov (United States)

    Sánchez-Margalet, Víctor; Rodriguez-Oliva, Manuel; Sánchez-Pozo, Cristina; Fernández-Gallardo, María Francisca; Goberna, Raimundo

    2005-01-01

    Portable meters for blood glucose concentrations are used at the patient's bedside, as well as by patients for self-monitoring of blood glucose. Even though most devices have important technological advances that decrease operator error, the analytical goals proposed for the performance of glucose meters have recently been changed by the American Diabetes Association (ADA). Our aim was to enable nurses in a 1200-bed University Hospital to achieve the recommended analytical goals, so that we could improve the quality of diabetes care. We used portable glucose meters connected on-line to the laboratory after an educational program for nurses with responsibilities in point-of-care testing. We evaluated the system by assessing the total error of the glucometers using high- and low-level glucose control solutions. In a period of 6 months, we collected data from 5642 control samples obtained by 14 devices (Precision PCx) directly from the control program (QC manager). The average total error for the low-level glucose control (2.77 mmol/l) was 6.3% (range 5.5-7.6%), and even lower for the high-level glucose control (16.66 mmol/l), at 4.8% (range 4.1-6.5%). In conclusion, the performance of glucose meters used in our University Hospital with more than 1000 beds not only improved after the intervention, but the meters achieved the analytical goals of the suggested ADA/National Academy of Clinical Biochemistry criteria for total error (<7.9% in the range 2.77-16.66 mmol/l glucose) and the optimal total error for high glucose concentrations of <5%, which will improve the quality of care of our patients.
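
    The record does not give the exact total-error formula used; one common estimate combines bias and imprecision from QC data as TE% = |bias%| + 1.96 x CV%. A hedged Python sketch with invented control measurements:

      # Total analytical error from quality-control data (illustrative only).
      import statistics

      def total_error_percent(measurements, target):
          mean = statistics.mean(measurements)
          bias_pct = abs(mean - target) / target * 100.0
          cv_pct = statistics.stdev(measurements) / mean * 100.0
          return bias_pct + 1.96 * cv_pct

      low_control = [2.70, 2.81, 2.75, 2.79, 2.73, 2.84]   # mmol/l, hypothetical
      print(f"TE = {total_error_percent(low_control, 2.77):.1f}% (goal < 7.9%)")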

  12. Analytic trigonometry

    CERN Document Server

    Bruce, William J; Maxwell, E A; Sneddon, I N

    1963-01-01

    Analytic Trigonometry details the fundamental concepts and underlying principle of analytic geometry. The title aims to address the shortcomings in the instruction of trigonometry by considering basic theories of learning and pedagogy. The text first covers the essential elements from elementary algebra, plane geometry, and analytic geometry. Next, the selection tackles the trigonometric functions of angles in general, basic identities, and solutions of equations. The text also deals with the trigonometric functions of real numbers. The fifth chapter details the inverse trigonometric functions

  13. Analytical Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The Analytical Lab specializes in Oil and Hydraulic Fluid Analysis, Identification of Unknown Materials, Engineering Investigations, Qualification Testing (to support...

  14. Data Analytics vs. Data Science: A Study of Similarities and Differences in Undergraduate Programs Based on Course Descriptions

    Science.gov (United States)

    Aasheim, Cheryl L.; Williams, Susan; Rutner, Paige; Gardiner, Adrian

    2015-01-01

    The rate at which data is produced and accumulated today is greater than at any point in history with little prospect of slowing. As organizations attempt to collect and analyze this data, there is a tremendous unmet demand for appropriately skilled knowledge workers. In response, universities are developing degree programs in data science and…

  15. The Representation of Iran’s Nuclear Program in British Newspaper Editorials: A Critical Discourse Analytic Perspective

    Directory of Open Access Journals (Sweden)

    Mahmood Reza Atai

    2013-09-01

    In this study, Van Dijk's (1998) model of CDA was utilized to examine the representation of Iran's nuclear program in editorials published by British newscasting companies. The analysis of the editorials was carried out at the two levels of headlines and full text stories with regard to the linguistic features of lexical choices, nominalization, passivization, overcompleteness, and voice. The results support biased representation in media discourse, in this case of Iran's nuclear program. Likewise, the findings confirm Bloor and Bloor's (2007) ideological circles of Self (i.e., the West) and Other (i.e., Iran), of US and THEM, in the media. The findings may be utilized to increase Critical Language Awareness (CLA) among EFL teachers/students, and they promise implications for ESP materials development and EAP courses for students of journalism.

  16. Learning analytics: Dataset for empirical evaluation of entry requirements into engineering undergraduate programs in a Nigerian university.

    Science.gov (United States)

    Odukoya, Jonathan A; Popoola, Segun I; Atayero, Aderemi A; Omole, David O; Badejo, Joke A; John, Temitope M; Olowo, Olalekan O

    2018-04-01

    In Nigerian universities, enrolment into any engineering undergraduate program requires that the minimum entry criteria established by the National Universities Commission (NUC) be satisfied. Candidates seeking admission to study an engineering discipline must have reached a predetermined entry age and met the cut-off marks set for the Senior School Certificate Examination (SSCE), the Unified Tertiary Matriculation Examination (UTME), and the post-UTME screening. However, limited effort has been made to show that these entry requirements eventually guarantee successful academic performance in engineering programs, because the data required for such validation are not readily available. In this data article, a comprehensive dataset for empirical evaluation of entry requirements into engineering undergraduate programs in a Nigerian university is presented and carefully analyzed. A total sample of 1445 undergraduates that were admitted between 2005 and 2009 to study Chemical Engineering (CHE), Civil Engineering (CVE), Computer Engineering (CEN), Electrical and Electronics Engineering (EEE), Information and Communication Engineering (ICE), Mechanical Engineering (MEE), and Petroleum Engineering (PET) at Covenant University, Nigeria were randomly selected. Entry age, SSCE aggregate, UTME score, Covenant University Scholastic Aptitude Screening (CUSAS) score, and the Cumulative Grade Point Average (CGPA) of the undergraduates were obtained from the Student Records and Academic Affairs unit. In order to facilitate evidence-based evaluation, the robust dataset is made publicly available in a Microsoft Excel spreadsheet file. On a yearly basis, first-order descriptive statistics of the dataset are presented in tables. Box plot representations, frequency distribution plots, and scatter plots of the dataset are provided to enrich its value. Furthermore, correlation and linear regression analyses are performed to understand the relationship between the entry requirements and the
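
    A sketch of the kind of correlation and regression analysis the article describes, regressing CGPA on the entry scores (Python; the filename and column names are assumptions about the spreadsheet layout):

      # Correlate entry requirements with CGPA and fit a linear model.
      import pandas as pd
      import statsmodels.api as sm

      df = pd.read_excel("entry_requirements_dataset.xlsx")   # hypothetical name
      predictors = ["entry_age", "ssce_aggregate", "utme_score", "cusas_score"]
      X = sm.add_constant(df[predictors])
      model = sm.OLS(df["cgpa"], X, missing="drop").fit()
      print(df[predictors + ["cgpa"]].corr())                 # correlation matrix
      print(model.summary())                                  # regression output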

  17. Solving multi-objective facility location problem using the fuzzy analytical hierarchy process and goal programming: a case study on infectious waste disposal centers

    Directory of Open Access Journals (Sweden)

    Narong Wichapa

    The selection of a suitable location for infectious waste disposal is one of the major problems in waste management. Determining the location of infectious waste disposal centers is a difficult and complex process, because it requires combining social and environmental factors that are hard to interpret with cost factors that require the allocation of resources; additionally, it depends on several regulations. Based on the actual conditions of a case study (forty hospitals and three candidate municipalities in the sub-Northeast region of Thailand), we considered multiple factors such as infrastructure, geological, and social & environmental factors, calculating global priority weights using the fuzzy analytical hierarchy process (FAHP). After that, a new multi-objective facility location model combining FAHP and goal programming (GP), namely the FAHP-GP model, was tested. The proposed model can lead to the selection of new suitable locations for infectious waste disposal by considering both total-cost and final-priority-weight objectives. The novelty of the proposed model is the simultaneous combination of relevant factors that are difficult to interpret and cost factors that require the allocation of resources. Keywords: Multi-objective facility location problem, Fuzzy analytic hierarchy process, Infectious waste disposal centers
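
    The paper's exact FAHP variant is not spelled out here; one common way to obtain such global priority weights is Buckley's geometric-mean method over triangular fuzzy pairwise judgments, sketched below with invented numbers:

      # FAHP weights by the geometric-mean (Buckley) method; judgments are
      # triangular fuzzy numbers (l, m, u).  3 criteria, hypothetical values.
      import numpy as np

      M = np.array([
          [[1, 1, 1],       [2, 3, 4],     [4, 5, 6]],
          [[1/4, 1/3, 1/2], [1, 1, 1],     [1, 2, 3]],
          [[1/6, 1/5, 1/4], [1/3, 1/2, 1], [1, 1, 1]],
      ])

      geo = np.prod(M, axis=1) ** (1.0 / M.shape[0])  # fuzzy geometric mean per row
      total = geo.sum(axis=0)                         # (sum_l, sum_m, sum_u)
      fuzzy_w = geo / total[::-1]                     # divide by (sum_u, sum_m, sum_l)
      crisp = fuzzy_w.mean(axis=1)                    # defuzzify by averaging l, m, u
      print(crisp / crisp.sum())                      # normalized priority weights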

  18. Users manual for SPLPACK-1: a program package for plotting and editing of experimental and analytical data of various transient systems

    International Nuclear Information System (INIS)

    Muramatsu, Ken; Kosaka, Atsuo; Abe, Kiyoharu; Araya, Fumimasa; Kanazawa, Masayuki; Tanabe, Shuichi; Maniwa, Masaki.

    1983-11-01

    In the field of nuclear safety research, a number of computer codes are being developed for predicting the transient behavior of various nuclear facilities, and they are verified by comparing calculational results with experimental data. In the development and verification of these codes, data plotting is one of the indispensable but labor-consuming processes. SPLPACK-1, a package of computer programs written mostly in FORTRAN-IV, has been developed for data editing and plotting. The SPLPACK-1 package consists of two parts, SPLEDIT and SPLPLOT-1, and plotting is performed in two steps: (1) data are stored by SPLEDIT in a file in a standardized format (the SPL format), and (2) the data in the file are plotted by SPLPLOT-1 or other optional programs. The standardization of the data format makes it possible to plot outputs from various sources with a single plotter program. This not only reduces the cost of plotter program development but also makes it easy to compare many analytical and experimental data in one figure. SPLPLOT-1 draws two-dimensional graphs of the time-dependent history of variables or graphs of the relationship between two variables. For the user's convenience, it provides functions for auto-scaling, automatic unit conversion, data processing by user-supplied subroutines, and so on. Several additional programs are available for other types of plotting, such as bird's-eye view graphs. SPLPACK-1 has been used effectively at JAERI in the development and use of safety codes (THYDE-B1, THYDE-P1, RELAP4, RELAP5, MARCH, CORRAL, etc.) and experiments (ROSA-III and LOFT). (author)
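
    A conceptual Python sketch of the two-step SPLPACK-1 flow, with CSV standing in for the SPL format (whose layout is not described in this record):

      # Step 1 (SPLEDIT analogue): store a transient in one standardized format.
      # Step 2 (SPLPLOT analogue): plot any such file with one generic program.
      import csv
      import matplotlib.pyplot as plt

      def store_spl(path, name, times, values):
          with open(path, "w", newline="") as f:
              w = csv.writer(f)
              w.writerow(["time_s", name])
              w.writerows(zip(times, values))

      def plot_spl(path):
          with open(path) as f:
              rows = list(csv.reader(f))
          header, data = rows[0], [(float(t), float(v)) for t, v in rows[1:]]
          plt.plot(*zip(*data))
          plt.xlabel(header[0]); plt.ylabel(header[1])
          plt.show()

      store_spl("run1.spl.csv", "pressure_MPa", [0, 1, 2, 3], [7.0, 6.2, 4.9, 3.1])
      plot_spl("run1.spl.csv")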

  19. Analytic geometry

    CERN Document Server

    Burdette, A C

    1971-01-01

    Analytic Geometry covers several fundamental aspects of analytic geometry needed for advanced subjects, including calculus.This book is composed of 12 chapters that review the principles, concepts, and analytic proofs of geometric theorems, families of lines, the normal equation of the line, and related matters. Other chapters highlight the application of graphing, foci, directrices, eccentricity, and conic-related topics. The remaining chapters deal with the concept polar and rectangular coordinates, surfaces and curves, and planes.This book will prove useful to undergraduate trigonometric st

  20. Follow-up and control of analytical results from environmental monitoring program of the Radioactive Waste Disposal Facility - Abadia de Goias

    International Nuclear Information System (INIS)

    Peixoto, Claudia Marques; Jacomino, Vanusa Maria Feliciano

    2000-01-01

    The analytical results for the 12-month period (August 1997 to July 1998) of the operational phase of the Environmental Monitoring Program of the radioactive waste disposal facility 'Abadia de Goias' (DIGOI), located in the District of Goiania, are summarized in this report. A statistical treatment of the data using control graphs is also presented. The use of these graphs allows the data to be arranged in a way that facilitates process control and the visualization of data trends and periodicity, organized according to temporal variation. These results are compared with those obtained during the pre-operational phase. Moreover, the effective equivalent dose received by individual members of the public via different critical pathways is estimated. (author)
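
    A minimal control-graph sketch of the kind described, with a center line and 3-sigma limits derived from pre-operational data (Python; all numbers are invented placeholders):

      # Monthly results plotted against control limits from the baseline phase.
      import statistics
      import matplotlib.pyplot as plt

      baseline = [0.42, 0.38, 0.45, 0.40, 0.44, 0.39]       # pre-operational data
      monthly = [0.41, 0.43, 0.47, 0.40, 0.52, 0.44,
                 0.39, 0.45, 0.42, 0.46, 0.41, 0.43]        # Aug/97..Jul/98

      center = statistics.mean(baseline)
      sigma = statistics.stdev(baseline)
      plt.plot(range(1, 13), monthly, marker="o")
      for y, style in ((center, "-"),
                       (center + 3 * sigma, "--"),
                       (center - 3 * sigma, "--")):
          plt.axhline(y, linestyle=style, color="gray")
      plt.xlabel("month"); plt.ylabel("activity concentration (arb. units)")
      plt.show()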

  1. Segmented trapped vortex cavity

    Science.gov (United States)

    Grammel, Jr., Leonard Paul (Inventor); Pennekamp, David Lance (Inventor); Winslow, Jr., Ralph Henry (Inventor)

    2010-01-01

    An annular trapped vortex cavity assembly segment includes a cavity forward wall, a cavity aft wall, and a cavity radially outer wall therebetween defining a cavity segment therein. A cavity opening extends between the forward and aft walls at a radially inner end of the assembly segment. Radially spaced apart pluralities of air injection first and second holes extend through the forward and aft walls respectively. The segment may include first and second expansion joint features at distal first and second ends respectively of the segment. The segment may include a forward subcomponent including the cavity forward wall attached to an aft subcomponent including the cavity aft wall. The forward and aft subcomponents include forward and aft portions of the cavity radially outer wall respectively. A ring of the segments may be circumferentially disposed about an axis to form an annular segmented vortex cavity assembly.

  2. Pavement management segment consolidation

    Science.gov (United States)

    1998-01-01

    Dividing roads into "homogeneous" segments has been a major problem for all areas of highway engineering. SDDOT uses Deighton Associates Limited software, dTIMS, to analyze life-cycle costs for various rehabilitation strategies on each segment of roa...

  3. Speaker segmentation and clustering

    OpenAIRE

    Kotti, M; Moschou, V; Kotropoulos, C

    2008-01-01

    This survey focuses on two challenging speech processing topics, namely speaker segmentation and speaker clustering. Speaker segmentation aims at finding speaker change points in an audio stream, whereas speaker clustering aims at grouping speech segments based on speaker characteristics. Model-based, metric-based, and hybrid speaker segmentation algorithms are reviewed. Concerning speaker...

  4. Spinal segmental dysgenesis

    Directory of Open Access Journals (Sweden)

    N Mahomed

    2009-06-01

    Spinal segmental dysgenesis is a rare congenital spinal abnormality, seen in neonates and infants, in which a segment of the spine and spinal cord fails to develop normally. The condition is segmental, with normal vertebrae above and below the malformation. This condition is commonly associated with various abnormalities that affect the heart, genitourinary, gastrointestinal tract and skeletal system. We report two cases of spinal segmental dysgenesis and the associated abnormalities.

  5. Computer automation of continuous-flow analyzers for trace constituents in water. Volume 4. Description of program segments. Part 3. TAASTART

    International Nuclear Information System (INIS)

    Crawford, R.W.

    1979-01-01

    This report describes TAASTART, the third program in the series of programs necessary in automating the Technicon AutoAnalyzer. Included is a flow chart that illustrates the program logic and a description of each section and subroutine. In addition, all arrays, variables and strings are listed and defined, and a sample program listing with a complete list of symbols and references is provided

  6. Analytical chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Chae, Myeong Hu; Lee, Hu Jun; Kim, Ha Seok

    1989-02-15

    This book gives explanations on analytical chemistry in ten chapters, which deal with the development of analytical chemistry; the theory of error, with definitions and classification; samples and their treatment; gravimetry, covering the general process of gravimetry in aqueous and non-aqueous solution; precipitation titration, covering precipitation reactions and their types; complexometry, with a summary and complex compounds; oxidation-reduction equilibrium, covering electrode potential and potentiometric titration; solvent extraction; chromatography; and experiments, with basic operations for chemical experiments.

  7. Analytical chemistry

    International Nuclear Information System (INIS)

    Chae, Myeong Hu; Lee, Hu Jun; Kim, Ha Seok

    1989-02-01

    This book gives explanations on analytical chemistry in ten chapters, which deal with the development of analytical chemistry; the theory of error, with definitions and classification; samples and their treatment; gravimetry, covering the general process of gravimetry in aqueous and non-aqueous solution; precipitation titration, covering precipitation reactions and their types; complexometry, with a summary and complex compounds; oxidation-reduction equilibrium, covering electrode potential and potentiometric titration; solvent extraction; chromatography; and experiments, with basic operations for chemical experiments.

  8. Automatic Melody Segmentation

    NARCIS (Netherlands)

    Rodríguez López, Marcelo

    2016-01-01

    The work presented in this dissertation investigates music segmentation. In the field of Musicology, segmentation refers to a score analysis technique, whereby notated pieces or passages of these pieces are divided into “units” referred to as sections, periods, phrases, and so on. Segmentation

  9. A three-dimensional image processing program for accurate, rapid, and semi-automated segmentation of neuronal somata with dense neurite outgrowth

    Science.gov (United States)

    Ross, James D.; Cullen, D. Kacy; Harris, James P.; LaPlaca, Michelle C.; DeWeerth, Stephen P.

    2015-01-01

    Three-dimensional (3-D) image analysis techniques provide a powerful means to rapidly and accurately assess complex morphological and functional interactions between neural cells. Current software-based identification methods of neural cells generally fall into two applications: (1) segmentation of cell nuclei in high-density constructs or (2) tracing of cell neurites in single-cell investigations. We have developed novel methodologies to permit the systematic identification of populations of neuronal somata possessing rich morphological detail and dense neurite arborization throughout thick tissue or 3-D in vitro constructs. The image analysis incorporates several novel automated features for the discrimination of neurites and somata by initially classifying features in 2-D and merging these classifications into 3-D objects; the 3-D reconstructions automatically identify and adjust for over- and under-segmentation errors. Additionally, the platform provides for software-assisted error corrections to further minimize error. These features attain very accurate cell boundary identifications to handle a wide range of morphological complexities. We validated these tools using confocal z-stacks from thick 3-D neural constructs where neuronal somata had varying degrees of neurite arborization and complexity, achieving an accuracy of ≥95%. We demonstrated the robustness of these algorithms in a more complex arena through the automated segmentation of neural cells in ex vivo brain slices. These novel methods surpass previous techniques, improving robustness and accuracy through: (1) the ability to process both neurites and somata, (2) bidirectional segmentation correction, and (3) validation via software-assisted user input. This 3-D image analysis platform provides valuable tools for the unbiased analysis of neural tissue or tissue surrogates within a 3-D context, appropriate for the study of multi-dimensional cell-cell and cell-extracellular matrix interactions.

  10. Development of Land Segmentation, Stream-Reach Network, and Watersheds in Support of Hydrological Simulation Program-Fortran (HSPF) Modeling, Chesapeake Bay Watershed, and Adjacent Parts of Maryland, Delaware, and Virginia

    Science.gov (United States)

    Martucci, Sarah K.; Krstolic, Jennifer L.; Raffensperger, Jeff P.; Hopkins, Katherine J.

    2006-01-01

    The U.S. Geological Survey, U.S. Environmental Protection Agency Chesapeake Bay Program Office, Interstate Commission on the Potomac River Basin, Maryland Department of the Environment, Virginia Department of Conservation and Recreation, Virginia Department of Environmental Quality, and the University of Maryland Center for Environmental Science are collaborating on the Chesapeake Bay Regional Watershed Model, using Hydrological Simulation Program - FORTRAN to simulate streamflow and concentrations and loads of nutrients and sediment to Chesapeake Bay. The model will be used to provide information for resource managers. In order to establish a framework for model simulation, digital spatial datasets were created defining the discretization of the model region (including the Chesapeake Bay watershed, as well as the adjacent parts of Maryland, Delaware, and Virginia outside the watershed) into land segments, a stream-reach network, and associated watersheds. Land segmentation was based on county boundaries represented by a 1:100,000-scale digital dataset. Fifty of the 254 counties and incorporated cities in the model region were divided on the basis of physiography and topography, producing a total of 309 land segments. The stream-reach network for the Chesapeake Bay watershed part of the model region was based on the U.S. Geological Survey Chesapeake Bay SPARROW (SPAtially Referenced Regressions On Watershed attributes) model stream-reach network. Because that network was created only for the Chesapeake Bay watershed, the rest of the model region uses a 1:500,000-scale stream-reach network. Streams with mean annual streamflow of less than 100 cubic feet per second were excluded based on attributes from the dataset. Additional changes were made to enhance the data and to allow for inclusion of stream reaches with monitoring data that were not part of the original network. Thirty-meter-resolution Digital Elevation Model data were used to delineate watersheds for each

  11. Probabilistic Segmentation of Folk Music Recordings

    Directory of Open Access Journals (Sweden)

    Ciril Bohak

    2016-01-01

    The paper presents a novel method for automatic segmentation of folk music field recordings. The method is based on a distance measure that uses dynamic time warping to cope with tempo variations and a dynamic programming approach to handle pitch drifting when finding similarities and estimating the length of the repeating segment. A probabilistic framework based on an HMM is used to find segment boundaries, searching for the optimal match between the expected segment length, between-segment similarities, and likely locations of segment beginnings. Several current state-of-the-art approaches for the segmentation of commercial music are evaluated, and their weaknesses when dealing with folk music, such as intolerance to pitch drift and variable tempo, are exposed. The proposed method is evaluated and its performance analyzed on a collection of 206 folk songs of different ensemble types: solo, two- and three-voiced, choir, instrumental, and instrumental with singing. It outperforms current commercial music segmentation methods for non-instrumental music and is on a par with the best for instrumental recordings. The method is also comparable to a more specialized method for the segmentation of solo singing folk music recordings.
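
    The tempo-tolerant distance at the core of the method is dynamic time warping; a textbook Python implementation (simplified, without the paper's pitch-drift handling):

      # Classic DTW distance between two pitch contours of unequal length.
      import numpy as np

      def dtw_distance(a, b):
          n, m = len(a), len(b)
          D = np.full((n + 1, m + 1), np.inf)
          D[0, 0] = 0.0
          for i in range(1, n + 1):
              for j in range(1, m + 1):
                  cost = abs(a[i - 1] - b[j - 1])           # local pitch distance
                  D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
          return D[n, m]

      phrase1 = [60, 62, 64, 62, 60]                 # MIDI pitches (hypothetical)
      phrase2 = [60, 62, 62, 64, 64, 62, 60]         # same melody, slower tempo
      print(dtw_distance(phrase1, phrase2))          # small despite length mismatch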

  12. Segmentation-less Digital Rock Physics

    Science.gov (United States)

    Tisato, N.; Ikeda, K.; Goldfarb, E. J.; Spikes, K. T.

    2017-12-01

    In the last decade, Digital Rock Physics (DRP) has become an avenue to investigate the physical and mechanical properties of geomaterials. DRP offers the advantage of simulating laboratory experiments on numerical samples that are obtained from analytical methods. Potentially, DRP could spare part of the time and resources that are allocated to performing complicated laboratory tests. Like classic laboratory tests, the goal of DRP is to estimate accurately the physical properties of rocks, such as hydraulic permeability or elastic moduli. Nevertheless, the physical properties of samples imaged using micro-computed tomography (μCT) are typically estimated through segmentation of the μCT dataset. Segmentation proves to be a challenging and arbitrary procedure that typically leads to inaccurate estimates of physical properties. Here we present a novel technique to extract physical properties from a μCT dataset without the use of segmentation. We show examples in which we use the segmentation-less method to simulate elastic wave propagation and pressure wave diffusion to estimate elastic properties and permeability, respectively. The proposed method takes advantage of effective medium theories and uses the density and the porosity that are measured in the laboratory to constrain the results. We discuss the results and highlight that segmentation-less DRP is more accurate than segmentation-based DRP approaches and theoretical modeling for the studied rock. In conclusion, the segmentation-less approach presented here seems to be a promising method to improve accuracy and to ease the overall workflow of DRP.
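
    A toy illustration of the segmentation-less idea, assuming a simple linear grayscale-to-porosity calibration and a Voigt effective-medium average (Python; the calibration endpoints and moduli are invented placeholders, not the authors' workflow):

      # Map each uCT voxel's grayscale to porosity, then to a modulus,
      # without ever thresholding the volume into discrete phases.
      import numpy as np

      rng = np.random.default_rng(4)
      ct = rng.uniform(0.0, 1.0, (64, 64, 64))       # normalized uCT grayscale

      phi_min, phi_max = 0.0, 0.4                    # lab-constrained porosity range
      phi = phi_max - ct * (phi_max - phi_min)       # brighter voxel = denser
      K_mineral, K_fluid = 37.0, 2.2                 # GPa (nominal quartz/water)
      K_voigt = (1 - phi) * K_mineral + phi * K_fluid  # per-voxel Voigt bound

      print(f"mean porosity {phi.mean():.3f}, mean K {K_voigt.mean():.1f} GPa")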

  13. Use of GSR particle analysis program on an analytical SEM to identify sources of emission of airborne particles

    International Nuclear Information System (INIS)

    Chan, Y.C.; Trumper, J.; Bostrom, T.

    2002-01-01

    High concentrations of airborne particles, in particular PM10 (particulate matter <10 μm), … but the technique has been little used in Australia for airborne particulates. Two sets of 15 mm PM10 samples were collected in March and April 2000 from two sites in Brisbane, one within a suburb and one next to an arterial road. The particles were collected directly onto double-sided carbon tapes with a cascade impactor attached to a high-volume PM10 sampler. The carbon tapes were analysed in a JEOL 840 SEM equipped with a Be-window energy-dispersive X-ray detector and a Moran Scientific microanalysis system. An automated Gun Shot Residue (GSR) program was used together with backscattered electron imaging to characterise and analyse individual particulates. About 6,000 particles in total were analysed for each set of impactor samples. Due to limitations of useful pixel size, only particles larger than about 0.5 μm could be analysed. The size, shape and estimated elemental composition (from Na to Pb) of the particles were subjected to non-hierarchical cluster analysis, and the characteristics of the clusters were related to their possible sources of emission. Both samples resulted in similar particle clusters. The particles could be classified into three main categories: non-spherical (58% of the total number of analysed particles, shape factor >1.1), spherical (15%) and 'carbonaceous' (27%, i.e. with unexplained % of elemental mass >75%). Non-spherical particles were mainly sea salt and soil particles, with small amounts of iron, lead and mineral dust. The spherical particles were mainly sea salt particles and flyash, with small amounts of iron, lead and secondary sulphate dust. The carbonaceous particles included carbon material mixed with secondary aerosols, roadside dust, sea salt or industrial dust. The arterial road sample also contained more roadside dust and fewer secondary aerosols than the suburb sample. Current limitations with this method are the minimum particle size
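
    A sketch of the non-hierarchical clustering step, grouping particles by size, shape and composition features (Python; the feature matrix and the number of clusters are invented, since the study's data are not published here):

      # k-means clustering of per-particle features (diameter, shape factor,
      # elemental mass fractions); cluster profiles hint at emission sources.
      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(1)
      particles = rng.random((6000, 6))   # columns: size, shape, %Na, %Cl, %Si, %Fe
      km = KMeans(n_clusters=8, n_init=10, random_state=0).fit(particles)
      for c in range(8):
          members = particles[km.labels_ == c]
          print(f"cluster {c}: {len(members)} particles, "
                f"mean size feature {members[:, 0].mean():.2f}")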

  14. Computer automation of continuous-flow analyzers for trace constituents in water. Volume 4. Description of program segments. Part 2. TAAINRE

    International Nuclear Information System (INIS)

    Crawford, R.W.

    1979-01-01

    TAAINRE, the second program in the series of programs necessary in automating the Technicon AutoAnalyzer, is presented. A flow chart and sequence list that describes and illustrates the function of each logical group of coding, and a description of the contents and function of each section and subroutine in the program is included. In addition, all arrays, strings, and variables are listed and defined, and a sample program listing with a complete list of symbols and references provided

  15. Computer automation of continuous-flow analyzers for trace constituents in water. Volume 4. Description of program segments. Part 1. TAAIN

    International Nuclear Information System (INIS)

    Crawford, R.W.

    1979-01-01

    This report describes TAAIN, the first program in the series of programs necessary in automating the Technicon AutoAnalyzer. A flow chart and sequence list that describes and illustrates each logical group of coding, and a description of the contents and functions of each section and subroutine in the program is included. In addition, all arrays, strings, and variables are listed and defined, and a sample program listing with a complete list of symbols and references is provided

  16. Analytical mechanics

    CERN Document Server

    Lemos, Nivaldo A

    2018-01-01

    Analytical mechanics is the foundation of many areas of theoretical physics including quantum theory and statistical mechanics, and has wide-ranging applications in engineering and celestial mechanics. This introduction to the basic principles and methods of analytical mechanics covers Lagrangian and Hamiltonian dynamics, rigid bodies, small oscillations, canonical transformations and Hamilton–Jacobi theory. This fully up-to-date textbook includes detailed mathematical appendices and addresses a number of advanced topics, some of them of a geometric or topological character. These include Bertrand's theorem, proof that action is least, spontaneous symmetry breakdown, constrained Hamiltonian systems, non-integrability criteria, KAM theory, classical field theory, Lyapunov functions, geometric phases and Poisson manifolds. Providing worked examples, end-of-chapter problems, and discussion of ongoing research in the field, it is suitable for advanced undergraduate students and graduate students studying analyt...

  17. Analytical quadrics

    CERN Document Server

    Spain, Barry; Ulam, S; Stark, M

    1960-01-01

    Analytical Quadrics focuses on the analytical geometry of three dimensions. The book first discusses the theory of the plane, sphere, cone, cylinder, straight line, and central quadrics in their standard forms. The idea of the plane at infinity is introduced through the homogenous Cartesian coordinates and applied to the nature of the intersection of three planes and to the circular sections of quadrics. The text also focuses on paraboloid, including polar properties, center of a section, axes of plane section, and generators of hyperbolic paraboloid. The book also touches on homogenous coordi

  18. SRL online Analytical Development

    International Nuclear Information System (INIS)

    Jenkins, C.W.

    1991-01-01

    The Savannah River Site is operated by the Westinghouse Savannah River Co. for the Department of Energy to produce special nuclear materials for defense. R&D support for site programs is provided by the Savannah River Laboratory, which I represent. The site is known primarily for its nuclear reactors, but actually three fourths of the efforts at the site are devoted to fuel/target fabrication, fuel/target reprocessing, and waste management. All of these operations rely heavily on chemical processes. The site is therefore a large chemical plant. There are then many potential applications for process analytical chemistry at SRS. The Savannah River Laboratory (SRL) has an Analytical Development Section of roughly 65 personnel that perform analyses for R&D efforts at the lab, act as backup to the site Analytical Laboratories Department and develop analytical methods and instruments. I manage a subgroup of the Analytical Development Section called the Process Control & Analyzer Development Group. The prime mission of this group is to develop online/at-line analytical systems for site applications.

  19. An analytic thomism?

    Directory of Open Access Journals (Sweden)

    Daniel Alejandro Pérez Chamorro.

    2012-12-01

    For 50 years the philosophers of the Anglo-Saxon analytic tradition (E. Anscombe, P. Geach, A. Kenny, P. Foot) have tried to follow the school of Thomas Aquinas, which they use as a source to surpass Cartesian epistemology and to develop virtue ethics. Recently, J. Haldane has inaugurated a program of "analytical thomism" whose main result to date has been his "theory of identity mind/world". Nevertheless, none of Thomas's admirers has yet found the means of assimilating his metaphysics of being.

  20. Supercritical fluid analytical methods

    International Nuclear Information System (INIS)

    Smith, R.D.; Kalinoski, H.T.; Wright, B.W.; Udseth, H.R.

    1988-01-01

    Supercritical fluids are providing the basis for new and improved methods across a range of analytical technologies. New methods are being developed to allow the detection and measurement of compounds that are incompatible with conventional analytical methodologies. Characterization of process and effluent streams for synfuel plants requires instruments capable of detecting and measuring high-molecular-weight compounds, polar compounds, or other materials that are generally difficult to analyze. The purpose of this program is to develop and apply new supercritical fluid techniques for extraction, separation, and analysis. These new technologies will be applied to previously intractable synfuel process materials and to complex mixtures resulting from their interaction with environmental and biological systems

  1. Segmentation of the Indian photovoltaic market

    International Nuclear Information System (INIS)

    Srinivasan, S.

    2005-01-01

    This paper provides an analytical framework for studying the actors, networks and institutions involved, and examines the evolution of the Indian solar photovoltaic (PV) market. Different market segments, defined along the lines of demand and supply of PV equipment, i.e. on the basis of geography, end-use application, subsidy policy and other financing mechanisms, are detailed. The objective of this effort is to identify segments that require special attention from policy makers, donors and the Ministry of Non-Conventional Energy Sources. The paper also discusses the evolution of the commercial PV market in certain parts of the country and trends in the maturity of the market. (author)

  2. Segmenting patients and physicians using preferences from discrete choice experiments.

    Science.gov (United States)

    Deal, Ken

    2014-01-01

    People often form groups or segments that have similar interests and needs and seek similar benefits from health providers. Health organizations need to understand whether the same health treatments, prevention programs, services, and products should be applied to everyone in the relevant population or whether different treatments need to be provided to each of several segments that are relatively homogeneous internally but heterogeneous among segments. Our objective was to explain the purposes, benefits, and methods of segmentation for health organizations, and to illustrate the process of segmenting health populations based on preference coefficients from a discrete choice conjoint experiment (DCE) using an example study of prevention of cyberbullying among university students. We followed a two-level procedure for investigating segmentation incorporating several methods for forming segments in Level 1 using DCE preference coefficients and testing their quality, reproducibility, and usability by health decision makers. Covariates (demographic, behavioral, lifestyle, and health state variables) were included in Level 2 to further evaluate quality and to support the scoring of large databases and developing typing tools for assigning those in the relevant population, but not in the sample, to the segments. Several segmentation solution candidates were found during the Level 1 analysis, and the relationship of the preference coefficients to the segments was investigated using predictive methods. Those segmentations were tested for their quality and reproducibility and three were found to be very close in quality. While one seemed better than others in the Level 1 analysis, another was very similar in quality and proved ultimately better in predicting segment membership using covariates in Level 2. The two segments in the final solution were profiled for attributes that would support the development and acceptance of cyberbullying prevention programs among university

  3. Unsupervised Segmentation Methods of TV Contents

    Directory of Open Access Journals (Sweden)

    Elie El-Khoury

    2010-01-01

    We present a generic algorithm to address various temporal segmentation topics for audiovisual content, such as speaker diarization, shot segmentation, or program segmentation. Based on a GLR approach involving the ΔBIC criterion, this algorithm requires the values of only a few parameters to produce segmentation results at a desired scale on most typical low-level features used in the field of content-based indexing. Results obtained on various corpora are of the same quality level as those obtained by other dedicated, state-of-the-art methods.
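
    A minimal sketch of the ΔBIC test underlying such GLR-based segmentation: model a window with one Gaussian versus two (one per half); a positive ΔBIC suggests a change point (Python, 1-D features, invented data):

      # Delta-BIC change detection between two halves of a feature window.
      import numpy as np

      def delta_bic(x, y, lam=1.0):
          z = np.concatenate([x, y])
          n, d = len(z), 1                            # dimensionality d = 1 here
          ll = lambda v: -0.5 * len(v) * np.log(np.var(v) + 1e-12)
          penalty = lam * 0.5 * (d + 0.5 * d * (d + 1)) * np.log(n)
          return ll(x) + ll(y) - ll(z) - penalty

      rng = np.random.default_rng(2)
      same = delta_bic(rng.normal(0, 1, 200), rng.normal(0, 1, 200))
      diff = delta_bic(rng.normal(0, 1, 200), rng.normal(3, 1, 200))
      print(f"no change: {same:.1f}, change: {diff:.1f}")   # expect diff >> same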

  4. Selecting university undergraduate student activities via compromised-analytical hierarchy process and 0-1 integer programming to maximize SETARA points

    Science.gov (United States)

    Nazri, Engku Muhammad; Yusof, Nur Ai'Syah; Ahmad, Norazura; Shariffuddin, Mohd Dino Khairri; Khan, Shazida Jan Mohd

    2017-11-01

    Prioritizing and making decisions on which student activities to select and conduct to fulfill the aspiration of a university, as translated in its strategic plan, must be executed with transparency and accountability. This is becoming even more crucial, particularly for universities in Malaysia, with the recent budget cut imposed by the Malaysian government. In this paper, we illustrate how a 0-1 integer programming (0-1 IP) model was implemented to select which of the forty activities proposed by the student body of Universiti Utara Malaysia (UUM) should be implemented for the 2017/2018 academic year. Two different models were constructed. The first model determines the minimum total budget that the UUM management should give the student body to conduct all the activities needed to fulfill the minimum targeted number of activities stated in its strategic plan. The second model determines which activities to select based on a total budget already allocated beforehand by the UUM management, while fulfilling the requirements set in its strategic plan. The selection of activities for the second model was also based on the preferences of the members of the student body, whereby the preference value for each activity was determined using the Compromised-Analytical Hierarchy Process. The outputs from both models were compared and discussed. The technique used in this study will be useful and suitable for organizations with key performance indicator-oriented programs and limited budget allocations.
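
    An illustrative 0-1 IP in the spirit of the second model: choose activities to maximize total preference weight within a fixed budget (Python/SciPy; the five toy activities, weights, and costs are invented stand-ins for the forty real ones):

      # Binary knapsack-style selection solved with scipy.optimize.milp.
      import numpy as np
      from scipy.optimize import milp, LinearConstraint, Bounds

      preference = np.array([0.30, 0.25, 0.20, 0.15, 0.10])  # hypothetical weights
      cost = np.array([12.0, 8.0, 5.0, 7.0, 3.0])            # budget units
      budget = 20.0

      res = milp(
          c=-preference,                                  # maximize => minimize -p
          constraints=LinearConstraint(cost, ub=budget),  # total cost <= budget
          integrality=np.ones(5),                         # integer variables...
          bounds=Bounds(0, 1),                            # ...restricted to {0, 1}
      )
      chosen = np.flatnonzero(res.x.round() == 1)
      print(f"select activities {chosen}, cost {cost[chosen].sum():.1f}")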

  5. Segmenting the MBA Market: An Australian Strategy.

    Science.gov (United States)

    Everett, James E.; Armstrong, Robert W.

    1990-01-01

    A University of Western Australia market segmentation study for the masters program in business administration examined the relationship between Graduate Management Admission Test scores, work experience, faculty of undergraduate degree, gender, and academic success in the program. Implications of the results for establishing admission criteria…

  6. Application of the analytic hierarchy process in the performance measurement of colorectal cancer care for the design of a pay-for-performance program in Taiwan.

    Science.gov (United States)

    Chung, Kuo-Piao; Chen, Li-Ju; Chang, Yao-Jen; Chang, Yun-Jau; Lai, Mei-Shu

    2013-02-01

    To prioritize performance measures for colorectal cancer care to facilitate the implementation of a pay-for-performance (PFP) system. Questionnaire survey. Medical hospitals in Taiwan. Sixty-six medical doctors, from 5 November 2009 to 10 December 2009. Analytic hierarchy process (AHP) technique. Main outcome measure(s): performance measures (two pre-treatment, six treatment-related and three monitoring-related) were used. Forty-eight doctors responded and returned questionnaires (response rate 72.7%), with surgeons and physicians contributing equally. The most important measure was the proportion of colorectal patients who had pre-operative examinations that included chest X-ray and abdominal ultrasound, computed tomography or MRI (global priority: 0.144), followed by the proportion of stage I-III colorectal cancer patients who had undergone a wide surgical resection documented as 'negative margin' (global priority: 0.133) and the proportion of colorectal cancer patients who had undergone surgery with a pathology report that included information on tumor size and node differentiation (global priority: 0.116). Most participants considered that the best interval for renewing the indicators was 3-5 years (43.75%), followed by 5-10 years (27.08%). To design a PFP program, the AHP method is a useful technique for prioritizing performance measures, especially in a highly specialized domain such as colorectal cancer care.
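
    A sketch of the AHP priority-weight computation behind such rankings: the normalized principal eigenvector of a pairwise comparison matrix (Python; the 3x3 judgments are an invented example, not the study's data):

      # AHP weights and consistency index from a pairwise comparison matrix.
      import numpy as np

      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)                     # principal eigenvalue
      w = np.abs(eigvecs[:, k].real)
      w /= w.sum()                                    # priority weights
      ci = (eigvals[k].real - len(A)) / (len(A) - 1)  # consistency index
      print(f"weights: {w.round(3)}, CI = {ci:.3f}")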

  7. Application of the Analytic Network Process (ANP) Method for Decision Support in Final-Project Theme Selection (Case Study: S1 Informatics Study Program, ST3 Telkom)

    Directory of Open Access Journals (Sweden)

    Dila Nurlaila

    2017-07-01

    Based on a survey of 30 Informatics students about to take the final-project course, more than 80% answered that they did not yet have a concept for their final project; this raises the concern that many students still do not know which final-project theme, suited to their interests and competencies, they will eventually choose. This study therefore investigates the application of the Analytic Network Process (ANP) method to decision support for final-project theme selection. ANP is a decision-making method that takes the relationships among criteria into account. The study aims to test how successfully the ANP method addresses the problem of students who do not yet know the concept of their final project. First, the criteria that determine the selection of a final-project theme in the S1 Informatics study program were established. These criteria were built into an ANP network model using the Super Decisions software, and pairwise comparisons were performed for every criterion to obtain weights for each criterion and sub-criterion. Expert judgment for this decision was provided by the heads of the ICM and DESTI specialization tracks of the study program. Testing that compared manually made choices with choices based on the ANP calculation showed that 46.6% of the students' final-project themes were matched accurately; the 53.4% loss of accuracy was due to inconsistencies in the students' answers when assigning interest scores.

  8. Analytical and Mathematical Modeling and Optimization of Fiber Metal Laminates (FMLs subjected to low-velocity impact via combined response surface regression and zero-One programming

    Directory of Open Access Journals (Sweden)

    Faramarz Ashenai Ghasemi

    This paper presents analytical and mathematical modeling and optimization of the dynamic behavior of fiber metal laminates (FMLs) subjected to low-velocity impact. The deflection-to-thickness (w/h) ratio is identified through the governing equations of the plate, which are solved using first-order shear deformation theory and the Fourier series method. The interaction between the impactor and the plate is modeled with the help of a two-degrees-of-freedom spring-mass system and Choi's linearized Hertzian contact model. Thirty-one experiments were conducted on samples with different layer sequences and volume fractions of Al plies in the composite structures. A reliable fitness function in the form of a strict linear mathematical function was constructed. Response regression coefficients were estimated using an ordinary least squares method, and a zero-one programming technique is proposed to optimize the FML plate behavior subject to any technological or cost restrictions. The results indicated that FML plate behavior is highly affected by the layer sequences and volume fractions of Al plies. The results also showed that embedding Al plies in the outer layers of the structure yields a significantly better response under low-velocity impact than embedding them in the middle, or in the middle and outer layers, of the structure.

  9. Analytical chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jae Seong

    1993-02-15

    This book comprises nineteen chapters, which describe an introduction to analytical chemistry; experimental error and statistics; chemical equilibrium and solubility; gravimetric analysis, with the mechanism of precipitation; the range and calculation of results; volume analysis and its general principles; sedimentation methods, with types and titration curves; acid-base balance; acid-base titration curves; complexation and firing reactions; an introduction to electroanalytical chemistry; acid-base titration curves; electrodes and potentiometry; electrolysis and conductometry; voltammetry and polarographic spectrophotometry; atomic spectrometry; solvent extraction; and chromatography, with experiments.

  10. Analytical chemistry

    International Nuclear Information System (INIS)

    Choi, Jae Seong

    1993-02-01

    This book comprises nineteen chapters, which describe an introduction to analytical chemistry; experimental error and statistics; chemical equilibrium and solubility; gravimetric analysis, with the mechanism of precipitation; the range and calculation of results; volume analysis and its general principles; sedimentation methods, with types and titration curves; acid-base balance; acid-base titration curves; complexation and firing reactions; an introduction to electroanalytical chemistry; acid-base titration curves; electrodes and potentiometry; electrolysis and conductometry; voltammetry and polarographic spectrophotometry; atomic spectrometry; solvent extraction; and chromatography, with experiments.

  11. Analytical chemistry

    International Nuclear Information System (INIS)

    Anon.

    1985-01-01

    The Division for Analytical Chemistry continued to try to develop an accurate method for the separation of trace amounts from mixtures which contain various other elements. Ion exchange chromatography is of special importance in this regard. New separation techniques were tried on certain trace amounts in South African standard rock materials and special ceramics. Methods were also tested for the separation of carrier-free radioisotopes from irradiated cyclotron discs.

  12. Segmentation, advertising and prices

    NARCIS (Netherlands)

    Galeotti, Andrea; Moraga González, José

    This paper explores the implications of market segmentation on firm competitiveness. In contrast to earlier work, here market segmentation is minimal in the sense that it is based on consumer attributes that are completely unrelated to tastes. We show that when the market is comprised by two

  13. Sipunculans and segmentation

    DEFF Research Database (Denmark)

    Wanninger, Andreas; Kristof, Alen; Brinkmann, Nora

    2009-01-01

    mechanisms may act on the level of gene expression, cell proliferation, tissue differentiation and organ system formation in individual segments. Accordingly, in some polychaete annelids the first three pairs of segmental peripheral neurons arise synchronously, while the metameric commissures of the ventral...

  14. The Regional MBA: Distinct Segments, Wants, and Needs

    Science.gov (United States)

    Passyn, Kirsten; Diriker, Memo

    2011-01-01

    MBA programs in top-tier schools differ greatly from those in regional schools. A survey aimed at assessing segmentation, pedagogy, and satisfaction in regional MBA programs was developed and administered in three universities of the Mid Atlantic, Midwest, and Southern regions. The results show four clearly distinguished segments that…

  15. Pancreas and cyst segmentation

    Science.gov (United States)

    Dmitriev, Konstantin; Gutenko, Ievgeniia; Nadeem, Saad; Kaufman, Arie

    2016-03-01

    Accurate segmentation of abdominal organs from medical images is an essential part of surgical planning and computer-aided disease diagnosis. Many existing algorithms are specialized for the segmentation of healthy organs. Cystic pancreas segmentation is especially challenging due to its low contrast boundaries, variability in shape, location and the stage of the pancreatic cancer. We present a semi-automatic segmentation algorithm for pancreata with cysts. In contrast to existing automatic segmentation approaches for healthy pancreas segmentation which are amenable to atlas/statistical shape approaches, a pancreas with cysts can have even higher variability with respect to the shape of the pancreas due to the size and shape of the cyst(s). Hence, fine results are better attained with semi-automatic steerable approaches. We use a novel combination of random walker and region growing approaches to delineate the boundaries of the pancreas and cysts with respective best Dice coefficients of 85.1% and 86.7%, and respective best volumetric overlap errors of 26.0% and 23.5%. Results show that the proposed algorithm for pancreas and pancreatic cyst segmentation is accurate and stable.
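
    For reference, the two overlap measures quoted above can be computed from binary masks as in this sketch; the toy 2-D masks are hypothetical stand-ins for 3-D pancreas or cyst segmentations.

        # Dice coefficient and volumetric overlap error (VOE) on binary masks.
        import numpy as np

        def dice(pred, truth):
            # Dice: 2|A ∩ B| / (|A| + |B|)
            inter = np.logical_and(pred, truth).sum()
            return 2.0 * inter / (pred.sum() + truth.sum())

        def voe(pred, truth):
            # VOE: 1 - |A ∩ B| / |A ∪ B|, reported as a percentage.
            inter = np.logical_and(pred, truth).sum()
            union = np.logical_or(pred, truth).sum()
            return 100.0 * (1.0 - inter / union)

        pred = np.zeros((8, 8), bool); pred[2:6, 2:6] = True
        truth = np.zeros((8, 8), bool); truth[3:7, 2:6] = True
        print(dice(pred, truth), voe(pred, truth))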

  16. Improved model reduction and tuning of fractional-order PI(λ)D(μ) controllers for analytical rule extraction with genetic programming.

    Science.gov (United States)

    Das, Saptarshi; Pan, Indranil; Das, Shantanu; Gupta, Amitava

    2012-03-01

    Genetic algorithm (GA) has been used in this study for a new approach of suboptimal model reduction in the Nyquist plane and optimal time domain tuning of proportional-integral-derivative (PID) and fractional-order (FO) PI(λ)D(μ) controllers. Simulation studies show that the new Nyquist-based model reduction technique outperforms the conventional H(2)-norm-based reduced parameter modeling technique. With the tuned controller parameters and reduced-order model parameter dataset, optimum tuning rules have been developed with a test-bench of higher-order processes via genetic programming (GP). The GP performs a symbolic regression on the reduced process parameters to evolve a tuning rule which provides the best analytical expression to map the data. The tuning rules are developed for a minimum time domain integral performance index described by a weighted sum of error index and controller effort. From the reported Pareto optimal front of the GP-based optimal rule extraction technique, a trade-off can be made between the complexity of the tuning formulae and the control performance. The efficacy of the single-gene and multi-gene GP-based tuning rules has been compared with the original GA-based control performance for the PID and PI(λ)D(μ) controllers, handling four different classes of representative higher-order processes. These rules are very useful for process control engineers, as they inherit the power of the GA-based tuning methodology, but can be easily calculated without the requirement for running the computationally intensive GA every time. Three-dimensional plots of the required variation in PID/fractional-order PID (FOPID) controller parameters with reduced process parameters have been shown as a guideline for the operator. Parametric robustness of the reported GP-based tuning rules has also been shown with credible simulation examples. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.

  17. Segmentation of consumer's markets and evaluation of market's segments

    OpenAIRE

    ŠVECOVÁ, Iveta

    2013-01-01

    The goal of this bachelor thesis was to explain a possible segmentation of consumer markets for a chosen company and to present a suitable offer of goods matched to the needs of the selected segments. The work is divided into a theoretical and a practical part. The first part describes marketing, segmentation, segmentation of consumer markets, the consumer market, market segments and other terms. The second part describes the evaluation of a questionnaire survey and the identification of market segment...

  18. Efficient Algorithms for Segmentation of Item-Set Time Series

    Science.gov (United States)

    Chundi, Parvathi; Rosenkrantz, Daniel J.

    We propose a special type of time series, which we call an item-set time series, to facilitate the temporal analysis of software version histories, email logs, stock market data, etc. In an item-set time series, each observed data value is a set of discrete items. We formalize the concept of an item-set time series and present efficient algorithms for segmenting a given item-set time series. Segmentation of a time series partitions the time series into a sequence of segments where each segment is constructed by combining consecutive time points of the time series. Each segment is associated with an item set that is computed from the item sets of the time points in that segment, using a function which we call a measure function. We then define a concept called the segment difference, which measures the difference between the item set of a segment and the item sets of the time points in that segment. The segment difference values are required to construct an optimal segmentation of the time series. We describe novel and efficient algorithms to compute segment difference values for each of the measure functions described in the paper. We outline a dynamic programming based scheme to construct an optimal segmentation of the given item-set time series. We use the item-set time series segmentation techniques to analyze the temporal content of three different data sets—Enron email, stock market data, and a synthetic data set. The experimental results show that an optimal segmentation of item-set time series data captures much more temporal content than a segmentation constructed based on the number of time points in each segment, without examining the item set data at the time points, and can be used to analyze different types of temporal data.
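
    To make the dynamic-programming scheme concrete, here is a minimal sketch that splits an item-set time series into k segments minimizing the total segment difference. The measure function (set union) and the difference (symmetric-difference size) are illustrative choices, not necessarily those analyzed in the paper.

        # Optimal k-segmentation of an item-set time series by dynamic programming.
        def seg_cost(series, i, j):
            # Segment item set = union of its time points (the measure function);
            # cost = summed symmetric difference against each time point.
            union = set().union(*series[i:j])
            return sum(len(union ^ s) for s in series[i:j])

        def optimal_segmentation(series, k):
            n, INF = len(series), float("inf")
            # dp[m][j]: best cost of splitting the first j points into m segments.
            dp = [[INF] * (n + 1) for _ in range(k + 1)]
            cut = [[0] * (n + 1) for _ in range(k + 1)]
            dp[0][0] = 0.0
            for m in range(1, k + 1):
                for j in range(m, n + 1):
                    for i in range(m - 1, j):
                        c = dp[m - 1][i] + seg_cost(series, i, j)
                        if c < dp[m][j]:
                            dp[m][j], cut[m][j] = c, i
            bounds, j = [], n                    # walk the cut table backwards
            for m in range(k, 0, -1):
                bounds.append((cut[m][j], j)); j = cut[m][j]
            return dp[k][n], bounds[::-1]

        series = [{"a"}, {"a", "b"}, {"b"}, {"c"}, {"c", "d"}]
        print(optimal_segmentation(series, 2))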

  19. THE PROPOSAL MODEL OF RATIONAL WORKFORCE ASSIGNMENT IN DOKUZ EYLUL UNIVERSITY BY ANALYTIC HIERARCHY PROCESS BASED 0-1 INTEGER PROGRAMMING

    Directory of Open Access Journals (Sweden)

    Yılmaz GÖKŞEN

    2016-11-01

    Full Text Available In traditional organization management, the production factors are labour, capital, nature and technology; nowadays creativity, innovation and ability can be added to these. Competition is intense in both the private and public sectors. The workforce can be said to be the main resource that enriches an organization and makes it complex. What makes an organization powerful is that talented personnel can turn new ideas into productive results by using their creativity. From this point of view, workforce productivity is an important parameter of organizational efficiency. In organizations with many employees, workforce productivity is primarily a matter of employing each person in the right position. Assigning employees to work in accordance with their capabilities increases efficiency, strengthens the structure of the workforce and makes the organization capable of competing with its rivals. The assignment model, a type of linear programming model, is a mathematical method used to match the right person to the right job. The coefficients of the variables in the objective function of the assignment model are the potential contributions of the employees. The factors that contribute to different types of jobs differ, and expert opinion is needed to evaluate these factors. In this study, more than 1000 people who work for Dokuz Eylul University as drivers, food handlers, technicians, office workers, security staff and servants are taken into consideration. Gender, level of education, distance from the workplace, marital status, number of children and tenure of these employees have been included in the analysis. The analytic hierarchy process, a multi-criteria decision-making method, was chosen as the method. Specific criteria were determined for each occupational group, and weighted averages for the criteria in question were then found. With these values belonging
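
    As an illustrative sketch of how AHP-derived weights can feed a zero-one assignment model: the contribution coefficients below are criterion-weighted suitability scores, and the optimal 0-1 assignment is found with SciPy's Hungarian-algorithm solver. All weights and scores are hypothetical, not the study's data.

        # 0-1 assignment with AHP-weighted objective coefficients.
        import numpy as np
        from scipy.optimize import linear_sum_assignment

        criteria_weights = np.array([0.5, 0.3, 0.2])   # e.g. education, tenure, distance
        # scores[w, j, c]: score of worker w for job j under criterion c.
        rng = np.random.default_rng(1)
        scores = rng.random((4, 4, 3))

        # Objective coefficients: weighted contribution of each worker-job pair.
        contribution = scores @ criteria_weights       # shape (workers, jobs)

        # linear_sum_assignment minimizes, so negate to maximize total contribution.
        rows, cols = linear_sum_assignment(-contribution)
        print(list(zip(rows, cols)), contribution[rows, cols].sum())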

  20. Comprehensive electrocardiogram-to-device time for primary percutaneous coronary intervention in ST-segment elevation myocardial infarction: A report from the American Heart Association mission: Lifeline program.

    Science.gov (United States)

    Shavadia, Jay S; French, William; Hellkamp, Anne S; Thomas, Laine; Bates, Eric R; Manoukian, Steven V; Kontos, Michael C; Suter, Robert; Henry, Timothy D; Dauerman, Harold L; Roe, Matthew T

    2018-03-01

    Assessing hospital-related network-level primary percutaneous coronary intervention (PCI) performance for ST-segment elevation myocardial infarction (STEMI) is challenging due to differential time-to-treatment metrics based on location of diagnostic electrocardiogram (ECG) for STEMI. STEMI patients undergoing primary PCI at 588 PCI-capable hospitals in AHA Mission: Lifeline (2008-2013) were categorized by initial STEMI identification location: PCI-capable hospitals (Group 1); pre-hospital setting (Group 2); and non-PCI-capable hospitals (Group 3). Patient-specific time-to-treatment categories were converted to minutes ahead of or behind their group-specific mean; average time-to-treatment difference for all patients at a given hospital was termed comprehensive ECG-to-device time. Hospitals were then stratified into tertiles based on their comprehensive ECG-to-device times with negative values below the mean representing shorter (faster) time intervals. Of 117,857 patients, the proportion in Groups 1, 2, and 3 were 42%, 33%, and 25%, respectively. Lower rates of heart failure and cardiac arrest at presentation are noted within patients presenting to high-performing hospitals. Median comprehensive ECG-to-device time was shortest at -9 minutes (25th, 75th percentiles: -13, -6) for the high-performing hospital tertile, 1 minute (-1, 3) for middle-performing, and 11 minutes (7, 16) for low-performing. Unadjusted rates of in-hospital mortality were 2.3%, 2.6%, and 2.7%, respectively, but the adjusted risk of in-hospital mortality was similar across tertiles. Comprehensive ECG-to-device time provides an integrated hospital-related network-level assessment of reperfusion timing metrics for primary PCI, regardless of the location for STEMI identification; further validation will delineate how this metric can be used to facilitate STEMI care improvements. Copyright © 2017 Elsevier Inc. All rights reserved.
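
    A minimal sketch of the metric's construction, on hypothetical data: each patient's ECG-to-device time is expressed as minutes ahead of or behind the mean of their STEMI-identification group, the deviations are averaged per hospital, and hospitals are cut into performance tertiles.

        # Comprehensive ECG-to-device time per hospital, then tertiles.
        import pandas as pd

        df = pd.DataFrame({
            "hospital": ["A", "A", "B", "B", "C", "C"],
            "group":    [1, 2, 1, 3, 2, 3],   # where the diagnostic ECG was obtained
            "ecg_to_device_min": [75, 40, 95, 120, 55, 100],
        })

        # Minutes ahead of (negative) or behind (positive) the group-specific mean.
        df["delta"] = df["ecg_to_device_min"] - \
            df.groupby("group")["ecg_to_device_min"].transform("mean")

        hospital = df.groupby("hospital")["delta"].mean()   # comprehensive metric
        tertile = pd.qcut(hospital, 3, labels=["high", "middle", "low"])
        print(pd.concat([hospital, tertile.rename("tertile")], axis=1))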

  1. Visual Analytics Applied to Image Analysis : From Segmentation to Classification

    NARCIS (Netherlands)

    Rauber, Paulo

    2017-01-01

    Image analysis is the field of study concerned with extracting information from images. This field is immensely important for commercial and scientific applications, from identifying people in photographs to recognizing diseases in medical images. The goal behind the work presented in this thesis is

  2. Analytical mechanics

    CERN Document Server

    Helrich, Carl S

    2017-01-01

    This advanced undergraduate textbook begins with the Lagrangian formulation of Analytical Mechanics and then passes directly to the Hamiltonian formulation and the canonical equations, with constraints incorporated through Lagrange multipliers. Hamilton's Principle and the canonical equations remain the basis of the remainder of the text. Topics considered for applications include small oscillations, motion in electric and magnetic fields, and rigid body dynamics. The Hamilton-Jacobi approach is developed with special attention to the canonical transformation in order to provide a smooth and logical transition into the study of complex and chaotic systems. Finally the text has a careful treatment of relativistic mechanics and the requirement of Lorentz invariance. The text is enriched with an outline of the history of mechanics, which particularly outlines the importance of the work of Euler, Lagrange, Hamilton and Jacobi. Numerous exercises with solutions support the exceptionally clear and concise treatment...

  3. Segmental tuberculosis verrucosa cutis

    Directory of Open Access Journals (Sweden)

    Hanumanthappa H

    1994-01-01

    Full Text Available A case of segmental tuberculosis verrucosa cutis is reported in a 10-year-old boy. The condition resembled the ascending lymphangitic type of sporotrichosis. The lesions cleared on treatment with INH 150 mg daily for 6 months.

  4. Chromosome condensation and segmentation

    International Nuclear Information System (INIS)

    Viegas-Pequignot, E.M.

    1981-01-01

    Some aspects of chromosome condensation in mammals, especially humans, were studied by means of cytogenetic techniques of chromosome banding. Two approaches were adopted: a study of normal condensation as early as prophase, and an analysis of chromosome segmentation induced by physical (temperature and γ-rays) or chemical agents (base analogues, antibiotics, ...) in order to identify the factors liable to affect condensation. Here 'segmentation' means an abnormal chromosome condensation appearing systematically and being reproducible. The study of normal condensation was made possible by the development of a technique based on cell synchronization by thymidine, which yields prophasic and prometaphasic cells. Besides, the possibility of inducing R-banding segmentations on these cells with BrdU (5-bromodeoxyuridine) allowed a much finer analysis of karyotypes. Another technique was developed using 5-ACR (5-azacytidine); it allowed a segmentation similar to the one obtained using BrdU to be induced and heterochromatic areas rich in G-C base pairs to be identified. [fr]

  5. International EUREKA: Initialization Segment

    International Nuclear Information System (INIS)

    1982-02-01

    The Initialization Segment creates the starting description of the uranium market. The starting description includes the international boundaries of trade, the geologic provinces, resources, reserves, production, uranium demand forecasts, and existing market transactions. The Initialization Segment is designed to accept information of various degrees of detail, depending on what is known about each region. It must transform this information into a specific data structure required by the Market Segment of the model, filling in gaps in the information through a predetermined sequence of defaults and built in assumptions. A principal function of the Initialization Segment is to create diagnostic messages indicating any inconsistencies in data and explaining which assumptions were used to organize the data base. This permits the user to manipulate the data base until such time the user is satisfied that all the assumptions used are reasonable and that any inconsistencies are resolved in a satisfactory manner

  6. Fluence map segmentation

    International Nuclear Information System (INIS)

    Rosenwald, J.-C.

    2008-01-01

    The lecture addressed the following topics: 'Interpreting' the fluence map; The sequencer; Reasons for difference between desired and actual fluence map; Principle of 'Step and Shoot' segmentation; Large number of solutions for given fluence map; Optimizing 'step and shoot' segmentation; The interdigitation constraint; Main algorithms; Conclusions on segmentation algorithms (static mode); Optimizing intensity levels and monitor units; Sliding window sequencing; Synchronization to avoid the tongue-and-groove effect; Accounting for physical characteristics of MLC; Importance of corrections for leaf transmission and offset; Accounting for MLC mechanical constraints; The 'complexity' factor; Incorporating the sequencing into optimization algorithm; Data transfer to the treatment machine; Interface between R and V and accelerator; and Conclusions on fluence map segmentation (segmentation is part of the overall inverse planning procedure; 'Step and Shoot' and 'Dynamic' options are available for most TPS, depending on accelerator model; the segmentation phase tends to come into the optimization loop; the physical characteristics of the MLC have a large influence on the final dose distribution; the IMRT plans (MU and relative dose distribution) must be carefully validated). (P.A.)

  7. Gamifying Video Object Segmentation.

    Science.gov (United States)

    Spampinato, Concetto; Palazzo, Simone; Giordano, Daniela

    2017-10-01

    Video object segmentation can be considered as one of the most challenging computer vision problems. Indeed, so far, no existing solution is able to effectively deal with the peculiarities of real-world videos, especially in cases of articulated motion and object occlusions; limitations that appear more evident when we compare the performance of automated methods with the human one. However, manually segmenting objects in videos is largely impractical as it requires a lot of time and concentration. To address this problem, in this paper we propose an interactive video object segmentation method, which exploits, on one hand, the capability of humans to identify correctly objects in visual scenes, and on the other hand, the collective human brainpower to solve challenging and large-scale tasks. In particular, our method relies on a game with a purpose to collect human inputs on object locations, followed by an accurate segmentation phase achieved by optimizing an energy function encoding spatial and temporal constraints between object regions as well as human-provided location priors. Performance analysis carried out on complex video benchmarks, and exploiting data provided by over 60 users, demonstrated that our method shows a better trade-off between annotation times and segmentation accuracy than interactive video annotation and automated video object segmentation approaches.

  8. Strategic market segmentation

    Directory of Open Access Journals (Sweden)

    Maričić Branko R.

    2015-01-01

    Full Text Available Strategic planning of marketing activities is the basis of business success in the modern business environment. Customers are not homogeneous in their preferences and expectations. Formulating an adequate marketing strategy, focused on realization of the company's strategic objectives, requires a segmented approach to the market that appreciates differences in the expectations and preferences of customers. One of the significant activities in strategic planning of marketing activities is market segmentation. Strategic planning imposes a need to plan marketing activities according to strategically important segments on a long-term basis. At the same time, there is a need to revise and adapt marketing activities on a short-term basis. There are a number of criteria on the basis of which market segmentation is performed. The paper considers the effectiveness and efficiency of different market segmentation criteria based on empirical research into customer expectations and preferences. The analysis includes traditional criteria and criteria based on a behavioral model. The research implications are analyzed from the perspective of selecting the most adequate market segmentation criteria in strategic planning of marketing activities.

  9. Remote sensing image segmentation based on Hadoop cloud platform

    Science.gov (United States)

    Li, Jie; Zhu, Lingling; Cao, Fubin

    2018-01-01

    To address the slow speed and poor real-time performance of remote sensing image segmentation, this paper studies a segmentation method based on the Hadoop platform. After analyzing the structural characteristics of the Hadoop cloud platform and its MapReduce programming component, a method of image segmentation combining OpenCV with the Hadoop cloud platform is proposed. First, the MapReduce image-processing model for the Hadoop cloud platform is designed: the image input and output are customized and the segmentation method for the data file is rewritten. The Mean Shift image segmentation algorithm is then implemented. Finally, a segmentation experiment is performed on a remote sensing image and compared with the same experiment run as a MATLAB implementation of the Mean Shift algorithm. The results show that, while maintaining good segmentation quality, the Hadoop-based approach segments remote sensing images much faster than the single-machine MATLAB implementation.
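
    A sketch of the per-tile Mean Shift step only, using OpenCV's built-in filter; the MapReduce plumbing (a mapper receiving one image tile and emitting the filtered result) is omitted, and the file names and parameter values are hypothetical.

        # Mean Shift filtering of one remote-sensing tile with OpenCV.
        import cv2

        img = cv2.imread("tile.png")    # one tile of the remote-sensing image
        # sp: spatial window radius; sr: color window radius (tuning parameters).
        segmented = cv2.pyrMeanShiftFiltering(img, sp=21, sr=35)
        cv2.imwrite("tile_segmented.png", segmented)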

  10. Segmented block copolymers with monodisperse aramide end-segments

    NARCIS (Netherlands)

    Araichimani, A.; Gaymans, R.J.

    2008-01-01

    Segmented block copolymers were synthesized using monodisperse diaramide (TT) as hard segments and PTMO with a molecular weight of 2 900 g · mol-1 as soft segments. The aramide: PTMO segment ratio was increased from 1:1 to 2:1 thereby changing the structure from a high molecular weight multi-block

  11. Rediscovering market segmentation.

    Science.gov (United States)

    Yankelovich, Daniel; Meer, David

    2006-02-01

    In 1964, Daniel Yankelovich introduced in the pages of HBR the concept of nondemographic segmentation, by which he meant the classification of consumers according to criteria other than age, residence, income, and such. The predictive power of marketing studies based on demographics was no longer strong enough to serve as a basis for marketing strategy, he argued. Buying patterns had become far better guides to consumers' future purchases. In addition, properly constructed nondemographic segmentations could help companies determine which products to develop, which distribution channels to sell them in, how much to charge for them, and how to advertise them. But more than 40 years later, nondemographic segmentation has become just as unenlightening as demographic segmentation had been. Today, the technique is used almost exclusively to fulfill the needs of advertising, which it serves mainly by populating commercials with characters that viewers can identify with. It is true that psychographic types like "High-Tech Harry" and "Joe Six-Pack" may capture some truth about real people's lifestyles, attitudes, self-image, and aspirations. But they are no better than demographics at predicting purchase behavior. Thus they give corporate decision makers very little idea of how to keep customers or capture new ones. Now, Daniel Yankelovich returns to these pages, with consultant David Meer, to argue the case for a broad view of nondemographic segmentation. They describe the elements of a smart segmentation strategy, explaining how segmentations meant to strengthen brand identity differ from those capable of telling a company which markets it should enter and what goods to make. And they introduce their "gravity of decision spectrum", a tool that focuses on the form of consumer behavior that should be of the greatest interest to marketers--the importance that consumers place on a product or product category.

  12. Predictive analytics can support the ACO model.

    Science.gov (United States)

    Bradley, Paul

    2012-04-01

    Predictive analytics can be used to rapidly spot hard-to-identify opportunities to better manage care--a key tool in accountable care. When considering analytics models, healthcare providers should: Make value-based care a priority and act on information from analytics models. Create a road map that includes achievable steps, rather than major endeavors. Set long-term expectations and recognize that the effectiveness of an analytics program takes time, unlike revenue cycle initiatives that may show a quick return.

  13. Scorpion image segmentation system

    Science.gov (United States)

    Joseph, E.; Aibinu, A. M.; Sadiq, B. A.; Bello Salau, H.; Salami, M. J. E.

    2013-12-01

    Death as a result of scorpion sting has been a major public health problem in developing countries. Despite the high rate of death from scorpion stings, few reports exist in the literature on intelligent devices and systems for automatic detection of scorpions. This paper proposes a digital image processing approach, based on the fluorescence of scorpions under ultraviolet (UV) light, for automatic detection and identification of scorpions. The acquired UV-based images undergo pre-processing to equalize uneven illumination, followed by colour space channel separation. The extracted channels are then segmented into two non-overlapping classes. It has been observed that simple thresholding of the green channel of the acquired RGB UV-based image is sufficient for segmenting the scorpion from the other background components in the image. Two approaches to image segmentation are proposed in this work, namely a simple average segmentation technique and K-means image segmentation. The proposed algorithm has been tested on over 40 UV scorpion images obtained from different parts of the world, and the results show an average accuracy of 97.7% in correctly classifying pixels into two non-overlapping clusters. The proposed system will eliminate the problems associated with some of the existing manual approaches presently in use for scorpion detection.
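
    The simple-threshold variant described above can be sketched in a few lines: split the channels of the UV image and threshold the green one. The file names are hypothetical, and Otsu's method is used here as one plausible way of picking the threshold automatically.

        # Green-channel thresholding of a UV scorpion image.
        import cv2

        img = cv2.imread("scorpion_uv.png")      # OpenCV loads images as BGR
        b, g, r = cv2.split(img)
        # Otsu's method derives the threshold from the channel histogram.
        _, mask = cv2.threshold(g, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        cv2.imwrite("scorpion_mask.png", mask)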

  14. Segmental Refinement: A Multigrid Technique for Data Locality

    KAUST Repository

    Adams, Mark F.; Brown, Jed; Knepley, Matt; Samtaney, Ravi

    2016-01-01

    We investigate a domain decomposed multigrid technique, termed segmental refinement, for solving general nonlinear elliptic boundary value problems. We extend the method first proposed in 1994 by analytically and experimentally investigating its complexity. We confirm that communication of traditional parallel multigrid is eliminated on fine grids, with modest amounts of extra work and storage, while maintaining the asymptotic exactness of full multigrid. We observe an accuracy dependence on the segmental refinement subdomain size, which was not considered in the original analysis. We present a communication complexity analysis that quantifies the communication costs ameliorated by segmental refinement and report performance results with up to 64K cores on a Cray XC30.

  15. Development of a segmented grating mount system for FIREX-1

    International Nuclear Information System (INIS)

    Ezaki, Y; Tabata, M; Kihara, M; Horiuchi, Y; Endo, M; Jitsuno, T

    2008-01-01

    A mount system for segmented meter-sized gratings has been developed, which has a high precision grating support mechanism and drive mechanism to minimize both deformation of the optical surfaces and misalignments in setting a segmented grating for obtaining sufficient performance of the pulse compressor. From analytical calculations, deformation of the grating surface is less than 1/20 lambda RMS and the estimated drive resolution for piston and tilt drive of the segmented grating is 1/20 lambda, which are both compliant with the requirements for the rear-end subsystem of FIREX-1

  16. Segmental Refinement: A Multigrid Technique for Data Locality

    KAUST Repository

    Adams, Mark F.

    2016-08-04

    We investigate a domain decomposed multigrid technique, termed segmental refinement, for solving general nonlinear elliptic boundary value problems. We extend the method first proposed in 1994 by analytically and experimentally investigating its complexity. We confirm that communication of traditional parallel multigrid is eliminated on fine grids, with modest amounts of extra work and storage, while maintaining the asymptotic exactness of full multigrid. We observe an accuracy dependence on the segmental refinement subdomain size, which was not considered in the original analysis. We present a communication complexity analysis that quantifies the communication costs ameliorated by segmental refinement and report performance results with up to 64K cores on a Cray XC30.

  17. Segmentation of complex document

    Directory of Open Access Journals (Sweden)

    Souad Oudjemia

    2014-06-01

    Full Text Available In this paper we present a method for segmentation of document images with complex structure. The technique, based on the grey level co-occurrence matrix (GLCM), segments this type of document into three regions, namely 'graphics', 'background' and 'text'. Very briefly, the method divides the document image into blocks of a size chosen after a series of tests, and then applies the co-occurrence matrix to each block in order to extract five textural parameters: energy, entropy, sum entropy, difference entropy and standard deviation. These parameters are then used to classify the image into three regions using the k-means algorithm; the final step of segmentation is obtained by grouping connected pixels. Performance is measured for both the graphics and text zones; we obtained a classification rate of 98.3% and a misclassification rate of 1.79%.
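
    A sketch of the per-block feature step, under stated assumptions: scikit-image's graycomatrix/graycoprops (so named since version 0.19) give the energy directly, while the entropy is computed from the normalized matrix; the block size, quantization and random test image are illustrative only.

        # GLCM energy and entropy per 32x32 block (the other three parameters
        # in the paper follow the same pattern).
        import numpy as np
        from skimage.feature import graycomatrix, graycoprops

        def block_features(block):
            glcm = graycomatrix(block, distances=[1], angles=[0], levels=256,
                                symmetric=True, normed=True)
            p = glcm[:, :, 0, 0]
            energy = graycoprops(glcm, "energy")[0, 0]
            entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))
            return energy, entropy

        img = (np.random.default_rng(2).random((128, 128)) * 255).astype(np.uint8)
        feats = [block_features(img[i:i + 32, j:j + 32])
                 for i in range(0, 128, 32) for j in range(0, 128, 32)]
        print(feats[:2])   # these vectors would then feed the k-means classifier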

  18. Modeling of market segmentation for new IT product development

    Science.gov (United States)

    Nasiopoulos, Dimitrios K.; Sakas, Damianos P.; Vlachos, D. S.; Mavrogianni, Amanda

    2015-02-01

    Businesses from all information technology sectors use market segmentation[1] in their product development[2] and strategic planning[3]. Many studies have concluded that market segmentation is the norm of modern marketing. With the rapid development of technology, customer needs are becoming increasingly diverse. These needs can no longer be satisfied by a one-rule, mass-marketing approach. IT businesses can cope with this diversity by pooling customers[4] with similar requirements, buying behavior and strength into segments. The best choices about which segments are the most appropriate to serve can then be made, thus making the most of finite resources. Despite the attention segmentation receives and the resources invested in it, growing evidence suggests that businesses have problems operationalizing segmentation[5]. These problems take various forms. There may be an assumption that the segmentation process necessarily results in homogeneous groups of customers for whom appropriate marketing programs, and procedures for dealing with them, can be developed; the segmentation process that a company follows can then fail. This increases concern about what causes segmentation failure and how it might be overcome. To prevent such failure, we created a dynamic simulation model of market segmentation[6] based on the basic factors leading to this segmentation.

  19. Connecting textual segments

    DEFF Research Database (Denmark)

    Brügger, Niels

    2017-01-01

    In “Connecting textual segments: A brief history of the web hyperlink” Niels Brügger investigates the history of one of the most fundamental features of the web: the hyperlink. Based on the argument that the web hyperlink is best understood if it is seen as another step in a much longer and broader history than just the years of the emergence of the web, the chapter traces the history of how segments of text have deliberately been connected to each other by the use of specific textual and media features, from clay tablets, manuscripts on parchment, and print, among others, to hyperlinks on stand…

  20. The Health Extension Program and Its Association with Change in Utilization of Selected Maternal Health Services in Tigray Region, Ethiopia: A Segmented Linear Regression Analysis

    Science.gov (United States)

    Gebrehiwot, Tesfay Gebregzabher; San Sebastian, Miguel; Edin, Kerstin; Goicolea, Isabel

    2015-01-01

    Background In 2003, the Ethiopian Ministry of Health established the Health Extension Program (HEP), with the goal of improving access to health care and health promotion activities in rural areas of the country. This paper aims to assess the association of the HEP with improved utilization of maternal health services in Northern Ethiopia using institution-based retrospective data. Methods Average quarterly total attendances for antenatal care (ANC), delivery care (DC) and post-natal care (PNC) at health posts and health care centres were studied from 2002 to 2012. Regression analysis was applied to two models to assess whether trends were statistically significant. One model was used to estimate the level and trend changes associated with the immediate period of intervention, while changes related to the post-intervention period were estimated by the other. Results The total number of consultations for ANC, DC and PNC increased constantly, particularly after the late-intervention period. Increases were higher for ANC and PNC at health post level and for DC at health centres. A positive statistically significant upward trend was found for DC and PNC in all facilities (p<0.01). The positive trend was also present in ANC at health centres (p = 0.04), but not at health posts. Conclusion Our findings revealed an increase in the use of antenatal, delivery and post-natal care after the introduction of the HEP. We are aware that other factors, that we could not control for, might be explaining that increase. The figures for DC and PNC are however low and more needs to be done in order to increase the access to the health care system as well as the demand for these services by the population. Strengthening of the health information system in the region needs also to be prioritized. PMID:26218074
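
    A minimal sketch of the segmented (interrupted time-series) regression used in such analyses, on simulated quarterly counts: the model carries a baseline trend plus level-change and trend-change terms at the intervention quarter. All numbers are hypothetical, not the study's data.

        # Segmented linear regression: level and trend change at an intervention.
        import numpy as np
        import statsmodels.api as sm

        quarters = np.arange(40)                   # e.g. 2002-2012, quarterly
        t0 = 8                                     # intervention quarter
        post = (quarters >= t0).astype(float)      # level-change indicator
        post_trend = np.maximum(0, quarters - t0)  # trend change after t0

        y = 50 + 1.0 * quarters + 15 * post + 2.0 * post_trend \
            + np.random.default_rng(3).normal(0, 5, 40)

        X = sm.add_constant(np.column_stack([quarters, post, post_trend]))
        fit = sm.OLS(y, X).fit()
        print(fit.params)   # [baseline, pre-trend, level change, trend change]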

  1. The Health Extension Program and Its Association with Change in Utilization of Selected Maternal Health Services in Tigray Region, Ethiopia: A Segmented Linear Regression Analysis.

    Science.gov (United States)

    Gebrehiwot, Tesfay Gebregzabher; San Sebastian, Miguel; Edin, Kerstin; Goicolea, Isabel

    2015-01-01

    In 2003, the Ethiopian Ministry of Health established the Health Extension Program (HEP), with the goal of improving access to health care and health promotion activities in rural areas of the country. This paper aims to assess the association of the HEP with improved utilization of maternal health services in Northern Ethiopia using institution-based retrospective data. Average quarterly total attendances for antenatal care (ANC), delivery care (DC) and post-natal care (PNC) at health posts and health care centres were studied from 2002 to 2012. Regression analysis was applied to two models to assess whether trends were statistically significant. One model was used to estimate the level and trend changes associated with the immediate period of intervention, while changes related to the post-intervention period were estimated by the other. The total number of consultations for ANC, DC and PNC increased constantly, particularly after the late-intervention period. Increases were higher for ANC and PNC at health post level and for DC at health centres. A positive statistically significant upward trend was found for DC and PNC in all facilities (p<0.01); the positive trend was also present in ANC at health centres (p = 0.04), but not at health posts. Our findings revealed an increase in the use of antenatal, delivery and post-natal care after the introduction of the HEP. We are aware that other factors, that we could not control for, might be explaining that increase. The figures for DC and PNC are however low and more needs to be done in order to increase the access to the health care system as well as the demand for these services by the population. Strengthening of the health information system in the region needs also to be prioritized.

  2. Marketing Education Through Benefit Segmentation. AIR Forum 1981 Paper.

    Science.gov (United States)

    Goodnow, Wilma Elizabeth

    The applicability of the "benefit segmentation" marketing technique to education was tested at the College of DuPage in 1979. Benefit segmentation identified target markets homogeneous in benefits expected from a program offering and may be useful in combatting declining enrollments. The 487 randomly selected students completed the 223…

  3. Biosensors: Future Analytical Tools

    Directory of Open Access Journals (Sweden)

    Vikas

    2007-02-01

    Full Text Available Biosensors offer considerable promise for obtaining analytical information in a faster, simpler and cheaper manner than conventional assays. Biosensing approaches are advancing rapidly, and applications ranging from detection of metabolites, biological/chemical warfare agents, food pathogens and adulterants to genetic screening and programmed drug delivery have been demonstrated. Innovative efforts coupling micromachining and nanofabrication may lead to even more powerful devices that would accelerate the realization of large-scale and routine screening. With the gradual increase in commercialization, a wide range of new biosensors is expected to reach the market in the coming years.

  4. Evaluation Model for Applying an E-Learning System in a Course: An Analytic Hierarchy Process-Multi-Choice Goal Programming Approach

    Science.gov (United States)

    Lin, Teng-Chiao; Ho, Hui-Ping; Chang, Ching-Ter

    2014-01-01

    With the widespread use of the Internet, adopting e-learning systems in courses has gradually become more and more important in universities in Taiwan. However, because of limitations of teachers' time, selecting suitable online IT tools has become very important. This study proposes an analytic hierarchy process (AHP)-multi-choice goal…
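
    As a sketch of the AHP side of such a model: the criteria weights are derived as the principal eigenvector of a pairwise-comparison matrix, with a consistency check. The comparison values here are hypothetical, and the multi-choice goal programming stage is not shown.

        # AHP priority vector and consistency ratio for three criteria.
        import numpy as np

        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])       # pairwise comparisons

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                          # normalized priority vector

        n = A.shape[0]
        ci = (eigvals.real[k] - n) / (n - 1)  # consistency index
        cr = ci / 0.58                        # Saaty's random index RI = 0.58 for n = 3
        print("weights:", w, "consistency ratio:", cr)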

  5. A fuzzy analytic hierarchy/data envelopment analysis approach for measuring the relative efficiency of hydrogen R and D programs in the sector of developing hydrogen energy technologies

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seongkon; Kim, Jongwook [Korea Institute of Energy Research (Korea, Republic of). Energy Policy Research Center; Mogi, Gento [Tokyo Univ. (Japan). Graduate School of Engineering; Hui, K.S. [Hong Kong City Univ. (China). Manufacturing Engineering and Engineering Management

    2010-07-01

    Korea is the world's tenth-largest energy-consuming nation, using 222 million tonnes of oil equivalent per year; imports accounted for 96% of its energy resources in 2008, against a self-sufficiency ratio of 5.6%. Interest in energy technology development has grown because of this poor energy environment. In particular, fluctuations in oil prices readily affect Korea's energy environment and economy. Given these circumstances, energy technology development is one of the best solutions to Korea's energy situation and energy security, and a path to low-carbon green growth and sustainable development. Moreover, energy and environment issues are key factors in securing a future sustainable competitive advantage and green growth for one nation over others. Many advanced nations have been developing energy technologies, establishing strategic energy R and D programs to create and maintain a competitive advantage and to lead the global energy market. In 2005, we established a strategic hydrogen energy technology roadmap for the decade from 2006 to 2015 in the sector of developing hydrogen energy technologies. Hydrogen energy technologies are environmentally sound and friendly compared with conventional energy technologies; they can play a key role in coping with the UNFCCC and the hydrogen economy, and are among the best alternatives now attracting attention. The hydrogen energy technology roadmap provides meaningful guidelines for realizing a low-carbon green-growth society. We analyzed the world energy outlook to build the hydrogen ETRM and provide energy policy directions in 2005, focusing on hydrogen energy technology development in light of Korea's energy circumstances. We make a

  6. Segmentation in cinema perception.

    Science.gov (United States)

    Carroll, J M; Bever, T G

    1976-03-12

    Viewers perceptually segment moving picture sequences into their cinematically defined units: excerpts that follow short film sequences are recognized faster when the excerpt originally came after a structural cinematic break (a cut or change in the action) than when it originally came before the break.

  7. Dictionary Based Image Segmentation

    DEFF Research Database (Denmark)

    Dahl, Anders Bjorholm; Dahl, Vedrana Andersen

    2015-01-01

    We propose a method for weakly supervised segmentation of natural images, which may contain both textured or non-textured regions. Our texture representation is based on a dictionary of image patches. To divide an image into separated regions with similar texture we use an implicit level sets...

  8. Unsupervised Image Segmentation

    Czech Academy of Sciences Publication Activity Database

    Haindl, Michal; Mikeš, Stanislav

    2014-01-01

    Roč. 36, č. 4 (2014), s. 23-23 R&D Projects: GA ČR(CZ) GA14-10911S Institutional support: RVO:67985556 Keywords : unsupervised image segmentation Subject RIV: BD - Theory of Information http://library.utia.cas.cz/separaty/2014/RO/haindl-0434412.pdf

  9. Metrics for image segmentation

    Science.gov (United States)

    Rees, Gareth; Greenway, Phil; Morray, Denise

    1998-07-01

    An important challenge in mapping image-processing techniques onto applications is the lack of quantitative performance measures. From a systems engineering perspective these are essential if system level requirements are to be decomposed into sub-system requirements which can be understood in terms of algorithm selection and performance optimization. Nowhere in computer vision is this more evident than in the area of image segmentation. This is a vigorous and innovative research activity, but even after nearly two decades of progress, it remains almost impossible to answer the question 'what would the performance of this segmentation algorithm be under these new conditions?' To begin to address this shortcoming, we have devised a well-principled metric for assessing the relative performance of two segmentation algorithms. This allows meaningful objective comparisons to be made between their outputs. It also estimates the absolute performance of an algorithm given ground truth. Our approach is an information theoretic one. In this paper, we describe the theory and motivation of our method, and present practical results obtained from a range of state of the art segmentation methods. We demonstrate that it is possible to measure the objective performance of these algorithms, and to use the information so gained to provide clues about how their performance might be improved.
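
    As one concrete example of such an information-theoretic comparison, the sketch below computes the variation of information, VI = H(A) + H(B) - 2I(A;B), between two labelings; it illustrates the flavour of the approach and is not necessarily the paper's exact measure.

        # Variation of information between two integer-labeled segmentations.
        import numpy as np

        def variation_of_information(a, b):
            a, b = a.ravel(), b.ravel()
            n = a.size
            joint = np.zeros((a.max() + 1, b.max() + 1))
            np.add.at(joint, (a, b), 1.0 / n)        # joint label distribution
            pa, pb = joint.sum(1), joint.sum(0)
            h = lambda p: -np.sum(p[p > 0] * np.log2(p[p > 0]))
            mi = h(pa) + h(pb) - h(joint)            # mutual information
            return h(pa) + h(pb) - 2 * mi

        seg1 = np.array([[0, 0, 1], [0, 1, 1]])
        seg2 = np.array([[0, 0, 0], [1, 1, 1]])
        print(variation_of_information(seg1, seg2))  # 0 iff the labelings agree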

  10. Loading effects of anterior cervical spine fusion on adjacent segments

    Directory of Open Access Journals (Sweden)

    Chien-Shiung Wang

    2012-11-01

    Full Text Available Adjacent segment degeneration typically follows anterior cervical spine fusion. However, the primary cause of adjacent segment degeneration remains unknown. Therefore, in order to identify the loading effects that cause adjacent segment degeneration, this study examined the loading effects on superior segments adjacent to fused bone following anterior cervical spine fusion. The C3–C6 cervical spine segments of 12 sheep were examined. Specimens were divided into the following groups: intact spine (group 1), and C5–C6 segments fused via cage-instrumented plate fixation (group 2). Specimens were cycled between 20° flexion and 15° extension with a displacement control of 1°/second. The tested parameters included the range of motion (ROM) of each segment; the torque and strain on both the body and the inferior articular process of the superior segments (C3–C4) adjacent to the fused bone; and the position of the neutral axis of stress at 20° flexion and 15° extension. Under flexion, in group 2 the torque, ROM, and strain on both the bodies and the facets of the superior segments adjacent to the fused bone were higher than in group 1. Under extension, in group 2 the ROM of the fused segment was less than in group 1, while the torque, ROM, and stress on both the bodies and the facets of the superior adjacent segments were higher than in group 1. These analytical results indicate that, following anterior cervical spine fusion, the muscles and ligaments require greater force to achieve cervical motion than in the intact spine. In addition, the ROM and the stress on the bodies and facets of the joint segments adjacent to the fused bone were significantly increased. Under flexion, the neutral axis of the stress on the adjacent segment moved backward, and the stress on the bodies of the segments adjacent to the fused bone increased. These comparative results indicate that the increased stress on the adjacent segments is caused by stress-shielding effects

  11. Strategies for regular segmented reductions on GPU

    DEFF Research Database (Denmark)

    Larsen, Rasmus Wriedt; Henriksen, Troels

    2017-01-01

    We present and evaluate an implementation technique for regular segmented reductions on GPUs. Existing techniques tend to be either consistent in performance but relatively inefficient in absolute terms, or optimised for specific workloads and thereby exhibiting bad performance for certain inputs. … While the implementation is in the context of the Futhark compiler, the implementation technique is applicable to any library or language that has a need for segmented reductions. We evaluate the technique on four microbenchmarks, two of which we also compare to implementations in the CUB library for GPU programming, as well as on two…
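
    For readers unfamiliar with the operation itself, this sketch shows only the semantics of a regular segmented sum over a flat array (via NumPy's reduceat); the paper's contribution concerns how such segments are mapped onto GPU threads and blocks, which is not modeled here.

        # Regular segmented sum: every segment has the same length.
        import numpy as np

        data = np.arange(12, dtype=np.float32)
        segment_size = 4                         # "regular": equal-sized segments
        offsets = np.arange(0, data.size, segment_size)
        print(np.add.reduceat(data, offsets))    # -> [ 6. 22. 38.]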

  12. SIDES - Segment Interconnect Diagnostic Expert System

    International Nuclear Information System (INIS)

    Booth, A.W.; Forster, R.; Gustafsson, L.; Ho, N.

    1989-01-01

    It is well known that the FASTBUS Segment Interconnect (SI) provides a communication path between two otherwise independent, asynchronous bus segments. The SI is probably the most important module in any FASTBUS data acquisition network, since its failure to function can cause whole segments of the network to be inaccessible and sometimes inoperable. This paper describes SIDES, an intelligent program designed to diagnose SIs both in situ, as they operate in a data acquisition network, and in the laboratory in an acceptance/repair environment. The paper discusses important issues such as knowledge acquisition: extracting knowledge from human experts and other knowledge sources. SIDES can benefit high energy physics experiments, where SI problems can be diagnosed and solved more quickly. Equipment pool technicians can also benefit from SIDES, first because it decreases the number of SIs erroneously turned in for repair, and second because SIDES acts as an intelligent assistant to the technician in the diagnosis and repair process

  13. Analytical model for computing transient pressures and forces in the safety/relief valve discharge line. Mark I Containment Program, task number 7.1.2

    International Nuclear Information System (INIS)

    Wheeler, A.J.

    1978-02-01

    An analytical model is described that computes the transient pressures, velocities and forces in the safety/relief valve discharge line immediately after safety/relief valve opening. Equations of motion are defined for the gas-flow and water-flow models. Results are not only verified by comparing them with an earlier version of the model, but also with Quad Cities and Monticello plant data. The model shows reasonable agreement with the earlier model and the plant data

  14. Status of the segment interconnect, cable segment ancillary logic, and the cable segment hybrid driver projects

    International Nuclear Information System (INIS)

    Swoboda, C.; Barsotti, E.; Chappa, S.; Downing, R.; Goeransson, G.; Lensy, D.; Moore, G.; Rotolo, C.; Urish, J.

    1985-01-01

    The FASTBUS Segment Interconnect (SI) provides a communication path between two otherwise independent, asynchronous bus segments. In particular, the Segment Interconnect links a backplane crate segment to a cable segment. All standard FASTBUS address and data transactions can be passed through the SI or any number of SIs and segments in a path. Thus systems of arbitrary connection complexity can be formed, allowing simultaneous independent processing, yet still permitting devices associated with one segment to be accessed from others. The model S1 Segment Interconnect and the Cable Segment Ancillary Logic covered in this report comply with all the mandatory features stated in the FASTBUS specification document DOE/ER-0189. A block diagram of the SI is shown

  15. Contour tracing for segmentation of mammographic masses

    International Nuclear Information System (INIS)

    Elter, Matthias; Held, Christian; Wittenberg, Thomas

    2010-01-01

    CADx systems have the potential to support radiologists in the difficult task of discriminating benign and malignant mammographic lesions. The segmentation of mammographic masses from the background tissue is an important module of CADx systems designed for the characterization of mass lesions. In this work, a novel approach to this task is presented. The segmentation is performed by automatically tracing the mass' contour in-between manually provided landmark points defined on the mass' margin. The performance of the proposed approach is compared to the performance of implementations of three state-of-the-art approaches based on region growing and dynamic programming. For an unbiased comparison of the different segmentation approaches, optimal parameters are selected for each approach by means of tenfold cross-validation and a genetic algorithm. Furthermore, segmentation performance is evaluated on a dataset of ROI and ground-truth pairs. The proposed method outperforms the three state-of-the-art methods. The benchmark dataset will be made available with publication of this paper and will be the first publicly available benchmark dataset for mass segmentation.

  16. Market segmentation: Venezuelan ADRs

    Directory of Open Access Journals (Sweden)

    Urbi Garay

    2012-12-01

    Full Text Available The control on foreign exchange imposed by Venezuela in 2003 constitutes a natural experiment that allows researchers to observe the effects of exchange controls on stock market segmentation. This paper provides empirical evidence that although the Venezuelan capital market as a whole was highly segmented before the controls were imposed, the shares of the firm CANTV were, through their American Depositary Receipts (ADRs), partially integrated with the global market. Following the imposition of the exchange controls this integration was lost. The paper also documents the spectacular and apparently contradictory rise experienced by the Caracas Stock Exchange during the serious economic crisis of 2003. It is argued that, as happened in Argentina in 2002, the rise in share prices occurred because the depreciation of the Bolívar in the parallel currency market increased the local price of the stocks that had associated ADRs, which were negotiated in dollars.

  17. Scintillation counter, segmented shield

    International Nuclear Information System (INIS)

    Olson, R.E.; Thumim, A.D.

    1975-01-01

    A scintillation counter, particularly for counting gamma ray photons, includes a massive lead radiation shield surrounding a sample-receiving zone. The shield is disassembleable into a plurality of segments to allow facile installation and removal of a photomultiplier tube assembly, the segments being so constructed as to prevent straight-line access of external radiation through the shield into radiation-responsive areas. Provisions are made for accurately aligning the photomultiplier tube with respect to one or more sample-transmitting bores extending through the shield to the sample receiving zone. A sample elevator, used in transporting samples into the zone, is designed to provide a maximum gamma-receiving aspect to maximize the gamma detecting efficiency. (U.S.)

  18. Head segmentation in vertebrates

    OpenAIRE

    Kuratani, Shigeru; Schilling, Thomas

    2008-01-01

    Classic theories of vertebrate head segmentation clearly exemplify the idealistic nature of comparative embryology prior to the 20th century. Comparative embryology aimed at recognizing the basic, primary structure that is shared by all vertebrates, either as an archetype or an ancestral developmental pattern. Modern evolutionary developmental (Evo-Devo) studies are also based on comparison, and therefore have a tendency to reduce complex embryonic anatomy into overly simplified patterns. Her...

  19. Video segmentation using keywords

    Science.gov (United States)

    Ton-That, Vinh; Vong, Chi-Tai; Nguyen-Dao, Xuan-Truong; Tran, Minh-Triet

    2018-04-01

    At the DAVIS-2016 Challenge, many state-of-the-art video segmentation methods achieved promising results, but they still depend heavily on annotated frames to distinguish between background and foreground, and creating these frames accurately takes a lot of time and effort. In this paper, we introduce a method to segment objects from video based on keywords given by the user. First, we use a real-time object detection system, YOLOv2, to identify regions containing objects whose labels match the given keywords in the first frame. Then, for each region identified in the previous step, we use the Pyramid Scene Parsing Network to assign each pixel as foreground or background. These frames can be used as input frames for the Object Flow algorithm to perform segmentation on the entire video. We conduct experiments on a subset of the DAVIS-2016 dataset at half its original size, which show that our method can handle many popular classes in the PASCAL VOC 2012 dataset with acceptable accuracy, about 75.03%. We suggest wider testing, combining other methods to improve this result in the future.

  20. Market segmentation in behavioral perspective.

    OpenAIRE

    Wells, V.K.; Chang, S.W.; Oliveira-Castro, J.M.; Pallister, J.

    2010-01-01

    A segmentation approach is presented using both traditional demographic segmentation bases (age, social class/occupation, and working status) and a segmentation by benefits sought. The benefits sought in this case are utilitarian and informational reinforcement, variables developed from the Behavioral Perspective Model (BPM). Using data from 1,847 consumers and from a total of 76,682 individual purchases, brand choice and price and reinforcement responsiveness were assessed for each segment a...

  1. Guide to Savannah River Laboratory Analytical Services Group

    International Nuclear Information System (INIS)

    1990-04-01

    The mission of the Analytical Services Group (ASG) is to provide analytical support for Savannah River Laboratory Research and Development Programs using onsite and offsite analytical labs as resources. A second mission is to provide Savannah River Site (SRS) operations with analytical support for nonroutine material characterization or special chemical analyses. The ASG provides backup support for the SRS process control labs as necessary

  2. Guide to Savannah River Laboratory Analytical Services Group

    Energy Technology Data Exchange (ETDEWEB)

    1990-04-01

    The mission of the Analytical Services Group (ASG) is to provide analytical support for Savannah River Laboratory Research and Development Programs using onsite and offsite analytical labs as resources. A second mission is to provide Savannah River Site (SRS) operations with analytical support for nonroutine material characterization or special chemical analyses. The ASG provides backup support for the SRS process control labs as necessary.

  3. Market Segmentation for Information Services.

    Science.gov (United States)

    Halperin, Michael

    1981-01-01

    Discusses the advantages and limitations of market segmentation as strategy for the marketing of information services made available by nonprofit organizations, particularly libraries. Market segmentation is defined, a market grid for libraries is described, and the segmentation of information services is outlined. A 16-item reference list is…

  4. Production of linear polarization by segmentation of helical undulator

    International Nuclear Information System (INIS)

    Tanaka, T.; Kitamura, H.

    2002-01-01

    A simple scheme to obtain linearly polarized radiation (LPR) with a segmented undulator is proposed. The undulator is composed of several segments, each of which forms a helical undulator and has helicity opposite to those of the adjacent segments. Due to the coherent sum of radiation, the circularly polarized component is canceled out, resulting in production of LPR without any higher harmonics. The radiation from the proposed device is investigated analytically, showing that a high degree of linear polarization is obtained in spite of a finite beam emittance and the angular acceptance of optics, if a sufficiently large number of segments and an adequate photon energy are chosen. Results of calculations investigating the practical performance of the proposed device are presented.
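
    A toy numerical check of the cancellation argument, ignoring inter-segment phase slippage, emittance and energy spread (so a caricature of the analytical treatment, not the paper's calculation):

        import numpy as np

        def stokes(Ex, Ey):
            """Normalized Stokes parameters of a transverse field (Ex, Ey)."""
            s0 = abs(Ex)**2 + abs(Ey)**2
            s1 = abs(Ex)**2 - abs(Ey)**2            # linear (x vs y)
            s2 = 2 * (Ex.conjugate() * Ey).real     # linear (45 deg)
            s3 = 2 * (Ex.conjugate() * Ey).imag     # circular
            return s1 / s0, s2 / s0, s3 / s0

        n_segments = 6
        helicity = [+1 if k % 2 == 0 else -1 for k in range(n_segments)]
        Ex = sum(1 / np.sqrt(2) for _ in helicity)        # same for all segments
        Ey = sum(h * 1j / np.sqrt(2) for h in helicity)   # sign flips with helicity
        print(stokes(Ex, Ey))   # -> (1.0, 0.0, 0.0): fully linear, no circular part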

  5. Deliberate and Crisis Action Planning and Execution Segments Increment 2A (DCAPES Inc 2A)

    Science.gov (United States)

    2016-03-01

    2016 Major Automated Information System Annual Report. Program Name: Deliberate and Crisis Action Planning and Execution Segments Increment 2A (DCAPES Inc 2A). DoD Component: Air Force. Current Acquisition Program Baseline (APB) dated March 9, 2015. Program Description: Deliberate and Crisis Action Planning and Execution Segments

  6. Albedo estimation for scene segmentation

    Energy Technology Data Exchange (ETDEWEB)

    Lee, C H; Rosenfeld, A

    1983-03-01

    Standard methods of image segmentation do not take into account the three-dimensional nature of the underlying scene. For example, histogram-based segmentation tacitly assumes that the image intensity is piecewise constant, and this is not true when the scene contains curved surfaces. This paper introduces a method of taking 3d information into account in the segmentation process. The image intensities are adjusted to compensate for the effects of estimated surface orientation; the adjusted intensities can be regarded as reflectivity estimates. When histogram-based segmentation is applied to these new values, the image is segmented into parts corresponding to surfaces of constant reflectivity in the scene. 7 references.
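
    A minimal sketch of the reflectivity adjustment, assuming Lambertian shading with a known light direction and per-pixel surface normals (the paper estimates surface orientation from the image; here it is taken as given):

        import numpy as np

        def reflectivity(intensity, normals, light):
            """intensity: (H,W); normals: (H,W,3) unit vectors; light: (3,) unit."""
            shading = np.clip(normals @ light, 1e-3, None)   # n . l per pixel
            return intensity / shading                       # adjusted "albedo"

        # Toy scene: constant reflectivity under oblique light; after the
        # adjustment the values are flat, so a histogram threshold works.
        H, W = 64, 64
        normals = np.zeros((H, W, 3)); normals[..., 2] = 1.0
        light = np.array([0.3, 0.0, 0.954])                  # unit light vector
        intensity = 0.6 * np.clip(normals @ light, 0, None)
        albedo = reflectivity(intensity, normals, light)
        print(float(albedo.min()), float(albedo.max()))      # ~0.6 everywhere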

  7. FRAMEWORK FOR COMPARING SEGMENTATION ALGORITHMS

    Directory of Open Access Journals (Sweden)

    G. Sithole

    2015-05-01

    The notion of a ‘Best’ segmentation does not exist. A segmentation algorithm is chosen based on the features it yields, the properties of the segments (point sets) it generates, and the complexity of its algorithm. The segmentation is then assessed based on a variety of metrics such as homogeneity, heterogeneity, fragmentation, etc. Even after an algorithm is chosen its performance is still uncertain, because the landscape/scenarios represented in a point cloud have a strong influence on the eventual segmentation. Thus selecting an appropriate segmentation algorithm is a process of trial and error. Automating the selection of segmentation algorithms and their parameters first requires methods to evaluate segmentations. Three common approaches for evaluating segmentation algorithms are ‘goodness methods’, ‘discrepancy methods’ and ‘benchmarks’. Benchmarks are considered the most comprehensive method of evaluation. In this paper, shortcomings in current benchmark methods are identified and a framework is proposed that permits both a visual and a numerical evaluation of segmentations for different algorithms, algorithm parameters and evaluation metrics. The concept of the framework is demonstrated on a real point cloud. Current results are promising and suggest that it can be used to predict the performance of segmentation algorithms.

  8. AISLE: an automatic volumetric segmentation method for the study of lung allometry.

    Science.gov (United States)

    Ren, Hongliang; Kazanzides, Peter

    2011-01-01

    We developed a fully automatic segmentation method for volumetric CT (computer tomography) datasets to support construction of a statistical atlas for the study of allometric laws of the lung. The proposed segmentation method, AISLE (Automated ITK-Snap based on Level-set), is based on the level-set implementation from an existing semi-automatic segmentation program, ITK-Snap. AISLE can segment the lung field without human interaction and provide intermediate graphical results as desired. The preliminary experimental results show that the proposed method can achieve accurate segmentation, in terms of volumetric overlap metric, by comparing with the ground-truth segmentation performed by a radiologist.
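
    As a generic stand-in for the level-set evolution that AISLE automates (not the ITK-Snap code itself), scikit-image's morphological Chan-Vese can run without user interaction from an automatic initialization:

        import numpy as np
        from skimage.segmentation import morphological_chan_vese

        # Toy "CT slice": dark background with a brighter elliptical region.
        yy, xx = np.mgrid[0:128, 0:128]
        image = ((xx - 64)**2 / 40**2 + (yy - 64)**2 / 25**2 < 1).astype(float)
        image += 0.1 * np.random.rand(128, 128)

        # Evolve the level set from a checkerboard initialization (no user seeds).
        mask = morphological_chan_vese(image, 50, init_level_set='checkerboard',
                                       smoothing=2)
        print(mask.sum(), 'pixels segmented as foreground')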

  9. Let's Talk... Analytics

    Science.gov (United States)

    Oblinger, Diana G.

    2012-01-01

    Talk about analytics seems to be everywhere. Everyone is talking about analytics. Yet even with all the talk, many in higher education have questions about--and objections to--using analytics in colleges and universities. In this article, the author explores the use of analytics in, and all around, higher education. (Contains 1 note.)

  10. Analytics for Education

    Science.gov (United States)

    MacNeill, Sheila; Campbell, Lorna M.; Hawksey, Martin

    2014-01-01

    This article presents an overview of the development and use of analytics in the context of education. Using Buckingham Shum's three levels of analytics, the authors present a critical analysis of current developments in the domain of learning analytics, and contrast the potential value of analytics research and development with real world…

  11. Analytical chemistry instrumentation

    International Nuclear Information System (INIS)

    Laing, W.R.

    1986-01-01

    In nine sections, 48 chapters cover: (1) analytical chemistry and the environment; (2) environmental radiochemistry; (3) automated instrumentation; (4) advances in analytical mass spectrometry; (5) Fourier transform spectroscopy; (6) analytical chemistry of plutonium; (7) nuclear analytical chemistry; (8) chemometrics; and (9) nuclear fuel technology.

  12. Analytical chemistry: Principles and techniques

    International Nuclear Information System (INIS)

    Hargis, L.G.

    1988-01-01

    Although this text seems to have been intended for use in a one-semester course in undergraduate analytical chemistry, it includes the range of topics usually encountered in a two-semester introductory course in chemical analysis. The material is arranged logically for use in a two-semester course: the first 12 chapters contain the subjects most often covered in the first term, and the next 10 chapters pertain to the second (instrumental) term. Overall breadth and level of treatment are standard for an undergraduate text of this sort, and the only major omission is that of kinetic methods (which is a common omission in analytical texts). In the first 12 chapters coverage of the basic material is quite good. The emphasis on the underlying principles of the techniques rather than on specifics and design of instrumentation is welcomed. This text may be more useful for the instrumental portion of an analytical chemistry course than for the solution chemistry segment. The instrumental analysis portion is appropriate for an introductory textbook.

  13. Segmentation of ribs in digital chest radiographs

    Science.gov (United States)

    Cong, Lin; Guo, Wei; Li, Qiang

    2016-03-01

    Ribs and clavicles in posterior-anterior (PA) digital chest radiographs often overlap with lung abnormalities such as nodules and can cause these abnormalities to be missed; it is therefore necessary to remove or reduce the ribs in chest radiographs. The purpose of this study was to develop a fully automated algorithm to segment ribs within the lung area in digital radiography (DR) for removal of the ribs. The rib segmentation algorithm consists of three steps. First, a radiograph was pre-processed for contrast adjustment and noise removal; second, a generalized Hough transform was employed to localize the lower boundary of the ribs; third, a novel bilateral dynamic programming algorithm was used to accurately segment the upper and lower boundaries of the ribs simultaneously. The width of the ribs and the smoothness of the rib boundaries were incorporated in the cost function of the bilateral dynamic programming to obtain consistent results for the upper and lower boundaries. Our database consisted of 93 DR images: 23 acquired with a DR system from Shanghai United-Imaging Healthcare Co. and 70 from GE Healthcare Co. The rib localization algorithm achieved a sensitivity of 98.2% with 0.1 false positives per image. The accuracy of the detected ribs was further evaluated subjectively on 3 levels: "1", good; "2", acceptable; "3", poor. The percentages of good, acceptable, and poor segmentation results were 91.1%, 7.2%, and 1.7%, respectively. Our algorithm can obtain good segmentation results for ribs in chest radiography and would be useful for rib reduction in our future study.
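
    The dynamic-programming step can be sketched for a single boundary as follows; the paper's bilateral version couples the upper and lower boundaries plus a rib-width term in one cost function, which this simplified sketch omits (numpy only):

        import numpy as np

        def dp_boundary(cost, max_jump=1, smooth=0.5):
            """One row index per column, minimizing edge cost plus smoothness."""
            H, W = cost.shape
            acc = cost.copy()
            back = np.zeros((H, W), dtype=int)
            for c in range(1, W):
                for r in range(H):
                    lo, hi = max(0, r - max_jump), min(H, r + max_jump + 1)
                    prev = acc[lo:hi, c - 1] + smooth * abs(np.arange(lo, hi) - r)
                    k = int(np.argmin(prev))
                    acc[r, c] += prev[k]
                    back[r, c] = lo + k                  # best predecessor row
            path = [int(np.argmin(acc[:, -1]))]
            for c in range(W - 1, 0, -1):                # backtrack column by column
                path.append(back[path[-1], c])
            return path[::-1]

        # Toy cost map with a cheap band around row 20 (a "boundary").
        cost = np.ones((50, 80)); cost[19:22, :] = 0.1
        print(dp_boundary(cost)[:5])                     # stays near row 20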

  14. Analytical Chemistry Division's sample transaction system

    International Nuclear Information System (INIS)

    Stanton, J.S.; Tilson, P.A.

    1980-10-01

    The Analytical Chemistry Division uses the DECsystem-10 computer for a wide range of tasks: sample management, timekeeping, quality assurance, and data calculation. This document describes the features and operating characteristics of many of the computer programs used by the Division. The descriptions are divided into chapters which cover all of the information about one aspect of the Analytical Chemistry Division's computer processing

  15. Demonstrating Success: Web Analytics and Continuous Improvement

    Science.gov (United States)

    Loftus, Wayne

    2012-01-01

    As free and low-cost Web analytics tools become more sophisticated, libraries' approach to user analysis can become more nuanced and precise. Tracking appropriate metrics with a well-formulated analytics program can inform design decisions, demonstrate the degree to which those decisions have succeeded, and thereby inform the next iteration in the…

  16. Segmentation of the Infant Food Market

    OpenAIRE

    Hrůzová, Daniela

    2015-01-01

    The theoretical part covers general market segmentation, namely the marketing importance of differences among consumers, the essence of market segmentation, its main conditions and the process of segmentation, which consists of four consecutive phases - defining the market, determining important criteria, uncovering segments and developing segment profiles. The segmentation criteria, segmentation approaches, methods and techniques for the process of market segmentation are also described in t...

  17. Analytic models for the evolution of semilocal string networks

    International Nuclear Information System (INIS)

    Nunes, A. S.; Martins, C. J. A. P.; Avgoustidis, A.; Urrestilla, J.

    2011-01-01

    We revisit previously developed analytic models for defect evolution and adapt them appropriately for the study of semilocal string networks. We thus confirm the expectation (based on numerical simulations) that linear scaling evolution is the attractor solution for a broad range of model parameters. We discuss in detail the evolution of individual semilocal segments, focusing on the phenomenology of segment growth, and also provide a preliminary comparison with existing numerical simulations.

  18. Making Decisions by Analytical Chemistry

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov

    It has been long recognized that results of analytical chemistry are not flawless, owing to the fact that professional laboratories and research laboratories analysing the same type of samples by the same type of instruments are likely to obtain significantly different results. The European… These discrepancies are very unfortunate because erroneous conclusions may arise from an otherwise meticulous and dedicated effort of research staff. This may eventually lead to unreliable conclusions, thus jeopardizing investigations of environmental monitoring, climate changes, food safety, clinical chemistry, forensics and other fields of science where analytical chemistry is the key instrument of decision making. In order to elucidate the potential origin of the statistical variations found among laboratories, a major program was undertaken including several analytical technologies, where the purpose…

  19. Benefits of a comprehensive quality program for cryopreserved PBMC covering 28 clinical trials sites utilizing an integrated, analytical web-based portal.

    Science.gov (United States)

    Ducar, Constance; Smith, Donna; Pinzon, Cris; Stirewalt, Michael; Cooper, Cristine; McElrath, M Juliana; Hural, John

    2014-07-01

    The HIV Vaccine Trials Network (HVTN) is a global network of 28 clinical trial sites dedicated to identifying an effective HIV vaccine. Cryopreservation of high-quality peripheral blood mononuclear cells (PBMC) is critical for the assessment of vaccine-induced cellular immune functions. The HVTN PBMC Quality Management Program is designed to ensure that viable PBMC are processed, stored and shipped for clinical trial assays from all HVTN clinical trial sites. The program has evolved by developing and incorporating best practices for laboratory and specimen quality and implementing automated, web-based tools. These tools allow the site-affiliated processing laboratories and the central Laboratory Operations Unit to rapidly collect, analyze and report PBMC quality data. The HVTN PBMC Quality Management Program includes five key components: 1) Laboratory Assessment, 2) PBMC Training and Certification, 3) Internal Quality Control, 4) External Quality Control (EQC), and 5) Assay Specimen Quality Control. Fresh PBMC processing data is uploaded from each clinical site processing laboratory to a central HVTN Statistical and Data Management Center database for access and analysis on a web portal. Samples are thawed at a central laboratory for assay or specimen quality control and sample quality data is uploaded directly to the database by the central laboratory. Four-year cumulative data covering 23,477 blood draws reveals an average fresh PBMC yield of (1.45 ± 0.48)×10⁶ cells per milliliter of usable whole blood. 95% of samples were within the acceptable range for fresh cell yield of 0.8-3.2×10⁶ cells/ml of usable blood. Prior to full implementation of the HVTN PBMC Quality Management Program, the 2007 EQC evaluations from 10 international sites showed a mean day-2 thawed viability of 83.1% and a recovery of 67.5%. Since then, four-year cumulative data covering 3338 specimens used in immunologic assays shows that 99.88% had acceptable viabilities (>66%) for use in

  20. Phasing multi-segment undulators

    International Nuclear Information System (INIS)

    Chavanne, J.; Elleaume, P.; Vaerenbergh, P. Van

    1996-01-01

    An important issue in the manufacture of multi-segment undulators as a source of synchrotron radiation or as a free-electron laser (FEL) is the phasing between successive segments. The state of the art is briefly reviewed, after which a novel pure permanent magnet phasing section that is passive and does not require any current is presented. The phasing section allows the introduction of a 6 mm longitudinal gap between each segment, resulting in complete mechanical independence and reduced magnetic interaction between segments. The tolerance of the longitudinal positioning of one segment with respect to the next is found to be 2.8 times lower than that of conventional phasing. The spectrum at all gaps and useful harmonics is almost unchanged when compared with a single-segment undulator of the same total length. (au) 3 refs

  1. Bias from two analytical laboratories involved in a long-term air monitoring program measuring organic pollutants in the Arctic: a quality assurance/quality control assessment.

    Science.gov (United States)

    Su, Yushan; Hung, Hayley; Stern, Gary; Sverko, Ed; Lao, Randy; Barresi, Enzo; Rosenberg, Bruno; Fellin, Phil; Li, Henrik; Xiao, Hang

    2011-11-01

    Initiated in 1992, air monitoring of organic pollutants in the Canadian Arctic provided spatial and temporal trends in support of Canada's participation in the Stockholm Convention on Persistent Organic Pollutants. The specific analytical laboratory charged with this task was changed in 2002, while field sampling protocols remained unchanged. Three rounds of intensive comparison studies were conducted in 2004, 2005, and 2008 to assess data comparability between the two laboratories. Analyses were compared for organochlorine pesticides (OCPs), polychlorinated biphenyls (PCBs) and polycyclic aromatic hydrocarbons (PAHs) in standards, blind samples of mixed standards, and extracts of real air samples. Good measurement accuracy was achieved by both laboratories when standards were analyzed. Variation of measurement accuracy over time was found for some OCPs and PCBs in standards in a random, non-systematic manner. Relatively low accuracy in analyzing blind samples was likely related to the process of sample purification. Inter-laboratory measurement differences for standards (<30%) and samples (<70%) were generally less than or comparable to those reported in a previous inter-laboratory study with 21 participating laboratories. Regression analysis showed inconsistent data comparability between the two laboratories during the initial stages of the study. These inter-laboratory differences can complicate efforts to discern long-term trends of pollutants at a given sampling site. It is advisable to maintain long-term measurements with minimal changes in sample analysis.
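
    The regression comparison mentioned above can be illustrated on hypothetical paired data; the numbers and the log-log form are assumptions of this sketch, not the study's protocol. A slope near 1 and an intercept near 0 would indicate comparable laboratories:

        import numpy as np
        from scipy.stats import linregress

        rng = np.random.default_rng(7)
        lab_a = rng.lognormal(mean=1.0, sigma=0.8, size=30)    # concentrations, lab A
        lab_b = 0.85 * lab_a * rng.normal(1.0, 0.15, size=30)  # lab B: bias + scatter

        fit = linregress(np.log(lab_a), np.log(lab_b))
        print(f'slope={fit.slope:.2f} intercept={fit.intercept:.2f} r={fit.rvalue:.2f}')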

  2. The LOFT Ground Segment

    DEFF Research Database (Denmark)

    Bozzo, E.; Antonelli, A.; Argan, A.

    2014-01-01

    …targets per orbit (~90 minutes), providing roughly ~80 GB of proprietary data per day (the proprietary period will be 12 months). The WFM continuously monitors about 1/3 of the sky at a time and provides data for about ~100 sources a day, resulting in a total of ~20 GB of additional telemetry. The LOFT Burst Alert System additionally identifies on-board bright impulsive events (e.g., Gamma-ray Bursts, GRBs) and broadcasts the corresponding position and trigger time to the ground using a dedicated system of ~15 VHF receivers. All WFM data are planned to be made public immediately. In this contribution we summarize the planned organization of the LOFT ground segment (GS), as established in the mission Yellow Book. We describe the expected GS contributions from ESA and the LOFT consortium. A review is provided of the planned LOFT data products and the details of the data flow, archiving…

  3. Segmented heat exchanger

    Science.gov (United States)

    Baldwin, Darryl Dean; Willi, Martin Leo; Fiveland, Scott Byron; Timmons, Kristine Ann

    2010-12-14

    A segmented heat exchanger system for transferring heat energy from an exhaust fluid to a working fluid. The heat exchanger system may include a first heat exchanger for receiving incoming working fluid and the exhaust fluid. The working fluid and exhaust fluid may travel through at least a portion of the first heat exchanger in a parallel flow configuration. In addition, the heat exchanger system may include a second heat exchanger for receiving working fluid from the first heat exchanger and exhaust fluid from a third heat exchanger. The working fluid and exhaust fluid may travel through at least a portion of the second heat exchanger in a counter flow configuration. Furthermore, the heat exchanger system may include a third heat exchanger for receiving working fluid from the second heat exchanger and exhaust fluid from the first heat exchanger. The working fluid and exhaust fluid may travel through at least a portion of the third heat exchanger in a parallel flow configuration.

  4. International EUREKA: Market Segment

    International Nuclear Information System (INIS)

    1982-03-01

    The purpose of the Market Segment of the EUREKA model is to simultaneously project uranium market prices, uranium supply and purchasing activities. The regional demands are extrinsic. However, annual forward contracting activities to meet these demands as well as inventory requirements are calculated. The annual price forecast is based on relatively short term, forward balances between available supply and desired purchases. The forecasted prices and extrapolated price trends determine decisions related to exploration and development, new production operations, and the operation of existing capacity. Purchasing and inventory requirements are also adjusted based on anticipated prices. The calculation proceeds one year at a time. Conditions calculated at the end of one year become the starting conditions for the calculation in the subsequent year
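
    A toy year-at-a-time loop in the spirit of the description above; the dynamics and constants are illustrative assumptions, not EUREKA's actual supply, contracting, or pricing equations:

        price, capacity = 30.0, 100.0           # toy starting price and supply
        for year in range(1982, 1990):
            demand = 110.0                      # extrinsic regional demand
            # short-term forward balance between desired purchases and supply
            price *= 1.0 + 0.2 * (demand - capacity) / demand
            # anticipated prices drive exploration/production decisions
            if price > 35.0:
                capacity += 5.0
            print(year, round(price, 1), capacity)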

  5. Probabilistic retinal vessel segmentation

    Science.gov (United States)

    Wu, Chang-Hua; Agam, Gady

    2007-03-01

    Optic fundus assessment is widely used for diagnosing vascular and non-vascular pathology. Inspection of the retinal vasculature may reveal hypertension, diabetes, arteriosclerosis, cardiovascular disease and stroke. Due to various imaging conditions retinal images may be degraded. Consequently, the enhancement of such images and vessels in them is an important task with direct clinical applications. We propose a novel technique for vessel enhancement in retinal images that is capable of enhancing vessel junctions in addition to linear vessel segments. This is an extension of vessel filters we have previously developed for vessel enhancement in thoracic CT scans. The proposed approach is based on probabilistic models which can discern vessels and junctions. Evaluation shows the proposed filter is better than several known techniques and is comparable to the state of the art when evaluated on a standard dataset. A ridge-based vessel tracking process is applied on the enhanced image to demonstrate the effectiveness of the enhancement filter.
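
    For illustration, a standard Hessian-based vesselness filter can stand in for the proposed probabilistic vessel/junction models in the enhance-then-track pipeline (scikit-image's Frangi filter is used here; it is not the authors' filter):

        import numpy as np
        from skimage.filters import frangi

        image = np.zeros((128, 128))
        image[60:63, :] = 1.0                    # a synthetic horizontal "vessel"
        image += 0.05 * np.random.rand(128, 128)

        enhanced = frangi(image, black_ridges=False)   # bright ridges respond
        seeds = enhanced > enhanced.mean() + 3 * enhanced.std()
        print(seeds.sum(), 'candidate vessel pixels for ridge tracking')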

  6. Reactor Section standard analytical methods. Part 1

    Energy Technology Data Exchange (ETDEWEB)

    Sowden, D.

    1954-07-01

    The Standard Analytical Methods manual was prepared for the purpose of consolidating and standardizing all current analytical methods and procedures used in the Reactor Section for routine chemical analyses. All procedures are established in accordance with accepted practice and the general analytical methods specified by the Engineering Department. These procedures are specifically adapted to the requirements of the water treatment process and related operations. The methods included in this manual are organized alphabetically within the following five sections, which correspond to the various phases of the analytical control program in which these analyses are to be used: water analyses, essential material analyses, cotton plug analyses, boiler water analyses, and miscellaneous control analyses.

  7. Segmented rail linear induction motor

    Science.gov (United States)

    Cowan, Jr., Maynard; Marder, Barry M.

    1996-01-01

    A segmented rail linear induction motor has a segmented rail consisting of a plurality of nonferrous electrically conductive segments aligned along a guideway. The motor further includes a carriage including at least one pair of opposed coils fastened to the carriage for moving the carriage. A power source applies an electric current to the coils to induce currents in the conductive surfaces to repel the coils from adjacent edges of the conductive surfaces.

  8. A model-based Bayesian framework for ECG beat segmentation

    International Nuclear Information System (INIS)

    Sayadi, O; Shamsollahi, M B

    2009-01-01

    The study of electrocardiogram (ECG) waveform amplitudes, timings and patterns has been the subject of intense research, for it provides a deep insight into the diagnostic features of the heart's functionality. In some recent works, a Bayesian filtering paradigm has been proposed for denoising and compression of ECG signals. In this paper, it is shown that this framework may be effectively used for ECG beat segmentation and extraction of fiducial points. Analytic expressions for the determination of points and intervals are derived and evaluated on various real ECG signals. Simulation results show that the method can contribute to and enhance the clinical ECG beat segmentation performance
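
    As a rough illustration (an assumption of this sketch, not necessarily the paper's exact model): if each ECG wave is represented by a Gaussian kernel, fiducial points and intervals follow analytically from the kernel centres and widths:

        import numpy as np

        waves = {  # centre (deg), amplitude (mV), width (deg), all illustrative
            'P': (-70, 0.12, 12), 'Q': (-15, -0.10, 4), 'R': (0, 1.0, 5),
            'S': (15, -0.15, 4), 'T': (100, 0.30, 20),
        }
        phase = np.linspace(-180, 180, 1000)
        beat = sum(a * np.exp(-0.5 * ((phase - c) / w)**2)
                   for c, a, w in waves.values())        # synthetic beat

        for name, (c, a, w) in waves.items():
            onset, offset = c - 3 * w, c + 3 * w         # analytic wave boundaries
            print(f'{name}: onset {onset:5.0f} deg, peak {c:4.0f} deg, '
                  f'offset {offset:5.0f} deg')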

  9. Segmentation-Driven Tomographic Reconstruction

    DEFF Research Database (Denmark)

    Kongskov, Rasmus Dalgas

    The tomographic reconstruction problem is concerned with creating a model of the interior of an object from some measured data, typically projections of the object. After reconstructing an object it is often desired to segment it, either automatically or manually. For computed tomography (CT)… such that the segmentation subsequently can be carried out by use of a simple segmentation method, for instance just a thresholding method. We tested the advantages of going from a two-stage reconstruction method to a one-stage segmentation-driven reconstruction method for the phase-contrast tomography reconstruction…

  10. Automated medical image segmentation techniques

    Directory of Open Access Journals (Sweden)

    Sharma Neeraj

    2010-01-01

    Accurate segmentation of medical images is a key step in contouring during radiotherapy planning. Computed tomography (CT) and magnetic resonance (MR) imaging are the most widely used radiographic techniques in diagnosis, clinical studies and treatment planning. This review provides details of automated segmentation methods, specifically discussed in the context of CT and MR images. The motive is to discuss the problems encountered in segmentation of CT and MR images, and the relative merits and limitations of methods currently available for segmentation of medical images.

  11. Proposal of a segmentation procedure for skid resistance data

    International Nuclear Information System (INIS)

    Tejeda, S. V.; Tampier, Hernan de Solominihac; Navarro, T.E.

    2008-01-01

    Skid resistance of pavements presents high spatial variability along a road. This pavement characteristic is directly related to wet-weather accidents; therefore, it is important to identify and characterize the skid resistance of homogeneous segments along a road in order to implement proper road safety management. Several data segmentation methods have been applied to other pavement characteristics (e.g. roughness); however, no application to skid resistance data was found during the literature review for this study. Typical segmentation methods are either too general or too specific to ensure a detailed segmentation of skid resistance data that can be used for managing pavement performance. The main objective of this paper is to propose a procedure for segmenting skid resistance data, based on existing data segmentation methods. The procedure needs to be efficient and to fulfill road management requirements. The proposed procedure considers the Leverage method to identify outlier data, the CUSUM method to accomplish initial data segmentation, and a statistical method to group consecutive segments that are statistically similar. The statistical method applies Student's t-test for equality of means, along with analysis of variance and the Tukey test for the multiple comparison of means. The proposed procedure was applied to a sample of skid resistance data measured with SCRIM (Sideway-force Coefficient Routine Investigation Machine) on a 4.2 km section of Chilean road and was compared to conventional segmentation methods. Results showed that the proposed procedure is more efficient than the conventional segmentation procedures, achieving the minimum weighted sum of squared errors (SSEp) with all the identified segments statistically different. Owing to its mathematical basis, the proposed procedure can be easily adapted and programmed for use in road safety management. (author)
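
    A minimal sketch of the CUSUM stage only; the Leverage outlier screen and the statistical grouping stage are omitted, and the threshold and minimum segment length are illustrative:

        import numpy as np

        def cusum_segments(x, lo=0, min_len=10, thresh=20.0, out=None):
            """Recursively split x where the CUSUM excursion is large.

            Returns the start index of each homogeneous segment."""
            if out is None:
                out = []
            s = np.cumsum(x - x.mean())              # CUSUM of deviations
            k = int(np.argmax(np.abs(s)))            # candidate change point
            if (len(x) >= 2 * min_len and abs(s[k]) > thresh
                    and min_len <= k <= len(x) - min_len):
                cusum_segments(x[:k], lo, min_len, thresh, out)
                cusum_segments(x[k:], lo + k, min_len, thresh, out)
            else:
                out.append(lo)
            return out

        # Toy skid-resistance profile: two homogeneous stretches plus noise.
        rng = np.random.default_rng(0)
        x = np.concatenate([np.full(40, 55.0), np.full(60, 48.0)])
        x += rng.normal(0.0, 1.0, x.size)
        print(cusum_segments(x))                     # e.g. [0, 40]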

  12. Analytical Sociology: A Bungean Appreciation

    Science.gov (United States)

    Wan, Poe Yu-ze

    2012-10-01

    Analytical sociology, an intellectual project that has garnered considerable attention across a variety of disciplines in recent years, aims to explain complex social processes by dissecting them, accentuating their most important constituent parts, and constructing appropriate models to understand the emergence of what is observed. To achieve this goal, analytical sociologists demonstrate an unequivocal focus on the mechanism-based explanation grounded in action theory. In this article I attempt a critical appreciation of analytical sociology from the perspective of Mario Bunge's philosophical system, which I characterize as emergentist systemism. I submit that while the principles of analytical sociology and those of Bunge's approach share a lot in common, the latter brings to the fore the ontological status and explanatory importance of supra-individual actors (as concrete systems endowed with emergent causal powers) and macro-social mechanisms (as processes unfolding in and among social systems), and therefore it does not stipulate that every causal explanation of social facts has to include explicit references to individual-level actors and mechanisms. In this sense, Bunge's approach provides a reasonable middle course between the Scylla of sociological reification and the Charybdis of ontological individualism, and thus serves as an antidote to the untenable "strong program of microfoundations" to which some analytical sociologists are committed.

  13. The SRS analytical laboratories strategic plan

    International Nuclear Information System (INIS)

    Hiland, D.E.

    1993-01-01

    There is an acute shortage of Savannah River Site (SRS) analytical laboratory capacity to support key Department of Energy (DOE) environmental restoration and waste management (EM) programs while making the transition from traditional defense program (DP) missions as a result of the cessation of the Cold War. This motivated Westinghouse Savannah River Company (WSRC) to develop an "Analytical Laboratories Strategic Plan" (ALSP) in order to provide appropriate input to SRS operating plans and justification for proposed analytical laboratory projects. The methodology used to develop this plan is applicable to all types of strategic planning.

  14. Market segmentation and positioning: matching creativity with fiscal responsibility.

    Science.gov (United States)

    Kiener, M E

    1989-01-01

    This paper describes an approach to continuing professional education (CPE) program development in nursing within a university environment that utilizes the concepts of market segmentation and positioning. Use of these strategies enables the academic CPE enterprise to move beyond traditional needs assessment practices to create more successful and better-managed CPE programs.

  15. Characterizing and Reaching High-Risk Drinkers Using Audience Segmentation

    Science.gov (United States)

    Moss, Howard B.; Kirby, Susan D.; Donodeo, Fred

    2010-01-01

    Background: Market or audience segmentation is widely used in social marketing efforts to help planners identify segments of a population to target for tailored program interventions. Market-based segments are typically defined by behaviors, attitudes, knowledge, opinions, or lifestyles. They are more helpful to health communication and marketing planning than epidemiologically defined groups because market-based segments are similar in respect to how they behave or might react to marketing and communication efforts. However, market segmentation has rarely been used in alcohol research. As an illustration of its utility, we employed commercial data that describe the sociodemographic characteristics of high-risk drinkers as an audience segment: where they tend to live, lifestyles, interests, consumer behaviors, alcohol consumption behaviors, other health-related behaviors, and cultural values. Such information can be extremely valuable in targeting and planning public health campaigns, targeted mailings, prevention interventions and research efforts. Methods: We describe the results of a segmentation analysis of those individuals who self-report consuming five or more drinks per drinking episode at least twice in the last 30 days. The study used the proprietary PRIZM™ audience segmentation database merged with the Centers for Disease Control and Prevention's (CDC) Behavioral Risk Factor Surveillance System (BRFSS) database. The top ten of the 66 PRIZM™ audience segments for this risky drinking pattern are described. For five of these segments we provide additional in-depth details about consumer behavior and estimates of the market areas where these risky drinkers reside. Results: The top ten audience segments (PRIZM clusters) most likely to engage in high-risk drinking are described. The cluster with the highest concentration of binge drinking behavior is referred to as the “Cyber Millenials.” This cluster is characterized as “the nation's tech-savvy singles

  16. Characterizing and reaching high-risk drinkers using audience segmentation.

    Science.gov (United States)

    Moss, Howard B; Kirby, Susan D; Donodeo, Fred

    2009-08-01

    Market or audience segmentation is widely used in social marketing efforts to help planners identify segments of a population to target for tailored program interventions. Market-based segments are typically defined by behaviors, attitudes, knowledge, opinions, or lifestyles. They are more helpful to health communication and marketing planning than epidemiologically defined groups because market-based segments are similar in respect to how they behave or might react to marketing and communication efforts. However, market segmentation has rarely been used in alcohol research. As an illustration of its utility, we employed commercial data that describes the sociodemographic characteristics of high-risk drinkers as an audience segment, including where they tend to live, lifestyles, interests, consumer behaviors, alcohol consumption behaviors, other health-related behaviors, and cultural values. Such information can be extremely valuable in targeting and planning public health campaigns, targeted mailings, prevention interventions, and research efforts. We described the results of a segmentation analysis of those individuals who self-reported consuming 5 or more drinks per drinking episode at least twice in the last 30 days. The study used the proprietary PRIZM (Claritas, Inc., San Diego, CA) audience segmentation database merged with the Centers for Disease Control and Prevention's (CDC) Behavioral Risk Factor Surveillance System (BRFSS) database. The top 10 of the 66 PRIZM audience segments for this risky drinking pattern are described. For five of these segments we provided additional in-depth details about consumer behavior and the estimates of the market areas where these risky drinkers resided. The top 10 audience segments (PRIZM clusters) most likely to engage in high-risk drinking are described. The cluster with the highest concentration of binge-drinking behavior is referred to as the "Cyber Millenials." This cluster is characterized as "the nation's tech

  17. Deliberate and Crisis Action Planning and Execution Segments Increment 2B (DCAPES Inc 2B)

    Science.gov (United States)

    2016-03-01

    2016 Major Automated Information System Annual Report. Program Name: Deliberate and Crisis Action Planning and Execution Segments Increment 2B (DCAPES Inc 2B). DoD Component: Air Force. Program Description: Deliberate and Crisis Action Planning and Execution Segments (DCAPES) is

  18. Analyticity without Differentiability

    Science.gov (United States)

    Kirillova, Evgenia; Spindler, Karlheinz

    2008-01-01

    In this article we derive all salient properties of analytic functions, including the analytic version of the inverse function theorem, using only the most elementary convergence properties of series. Not even the notion of differentiability is required to do so. Instead, analytical arguments are replaced by combinatorial arguments exhibiting…

  19. Understanding Business Analytics

    Science.gov (United States)

    2015-01-05

    Analytics have been used in organizations for a variety of reasons for quite some time, ranging from the simple (generating and understanding business analytics) … process. How well these two components are orchestrated will determine the level of success an organization has in

  20. Geochemical evolution processes and water-quality observations based on results of the National Water-Quality Assessment Program in the San Antonio segment of the Edwards aquifer, 1996-2006

    Science.gov (United States)

    Musgrove, MaryLynn; Fahlquist, Lynne; Houston, Natalie A.; Lindgren, Richard J.; Ging, Patricia B.

    2010-01-01

    As part of the National Water-Quality Assessment Program, the U.S. Geological Survey collected and analyzed groundwater samples during 1996-2006 from the San Antonio segment of the Edwards aquifer of central Texas, a productive karst aquifer developed in Cretaceous-age carbonate rocks. These National Water-Quality Assessment Program studies provide an extensive dataset of groundwater geochemistry and water quality, consisting of 249 groundwater samples collected from 136 sites (wells and springs), including (1) wells completed in the shallow, unconfined, and urbanized part of the aquifer in the vicinity of San Antonio (shallow/urban unconfined category), (2) wells completed in the unconfined (outcrop area) part of the regional aquifer (unconfined category), and (3) wells completed in and springs discharging from the confined part of the regional aquifer (confined category). This report evaluates these data to assess geochemical evolution processes, including local- and regional-scale processes controlling groundwater geochemistry, and to make water-quality observations pertaining to sources and distribution of natural constituents and anthropogenic contaminants, the relation between geochemistry and hydrologic conditions, and groundwater age tracers and travel time. Implications for monitoring water-quality trends in karst are also discussed. Geochemical and isotopic data are useful tracers of recharge, groundwater flow, fluid mixing, and water-rock interaction processes that affect water quality. Sources of dissolved constituents to Edwards aquifer groundwater include dissolution of and geochemical interaction with overlying soils and calcite and dolomite minerals that compose the aquifer. Geochemical tracers such as magnesium to calcium and strontium to calcium ratios and strontium isotope compositions are used to evaluate and constrain progressive fluid-evolution processes. Molar ratios of magnesium to calcium and strontium to calcium in groundwater typically
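
    As a worked example of the ratio tracers described above, mass concentrations reported in mg/L convert to molar Mg/Ca and Sr/Ca ratios as follows (the concentrations here are hypothetical, not values from the study):

        MG, CA, SR = 24.305, 40.078, 87.62          # molar masses, g/mol

        def molar_ratio(conc_a_mg_l, mass_a, conc_b_mg_l, mass_b):
            """Ratio of molar concentrations from mass concentrations."""
            return (conc_a_mg_l / mass_a) / (conc_b_mg_l / mass_b)

        mg, ca, sr = 15.0, 80.0, 0.35               # hypothetical groundwater, mg/L
        print('Mg/Ca =', round(molar_ratio(mg, MG, ca, CA), 3))   # ~0.309
        print('Sr/Ca =', round(molar_ratio(sr, SR, ca, CA), 5))   # ~0.002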

  1. Analytical results and effective dose estimation of the operational Environmental Monitoring Program for the radioactive waste repository in Abadia de Goias from 1998 to 2008

    Energy Technology Data Exchange (ETDEWEB)

    Ribeiro, Edison, E-mail: edison@cnen.gov.b [Centro Regional de Ciencias Nucleares do Centro-Oeste, Comissao Nacional de Energia Nuclear- Br 060 km 174, 5-Abadia de Goias- Goias, CEP 75345-000 (Brazil); Tauhata, Luiz, E-mail: tauhata@ird.gov.b [Instituto de Radioprotecao e Dosimetria, Comissao Nacional de Energia Nuclear, Recreio dos Bandeirantes, Rio de Janeiro, RJ, CEP 22780-160 (Brazil); Eugenia dos Santos, Eliane, E-mail: esantos@cnen.gov.b [Centro Regional de Ciencias Nucleares do Centro-Oeste, Comissao Nacional de Energia Nuclear- Br 060 km 174, 5-Abadia de Goias- Goias, CEP 75345-000 (Brazil); Silveira Correa, Rosangela da, E-mail: rcorrea@cnen.gov.b [Centro Regional de Ciencias Nucleares do Centro-Oeste, Comissao Nacional de Energia Nuclear- Br 060 km 174, 5-Abadia de Goias- Goias, CEP 75345-000 (Brazil)

    2011-02-15

    This paper presents the results of the Environmental Monitoring Program for the radioactive waste repository of Abadia de Goias, which originated from the accident of Goiania, conducted by the Regional Center of Nuclear Sciences (CRCN-CO) of the National Commission on Nuclear Energy (CNEN) from 1998 to 2008. The results are related to the determination of ¹³⁷Cs activity per unit of mass or volume of samples from surface water, ground water, depth sediments of the river, soil and vegetation, and also the air-kerma rate estimation for gamma exposure in the monitored site. In the operational phase of the Environmental Monitoring Program, the values of the geometric mean and standard deviation obtained for ¹³⁷Cs activity per unit of mass or volume in the analyzed samples were (0.08 ± 1.16) Bq·L⁻¹ for surface and underground water, (0.22 ± 2.79) Bq·kg⁻¹ for soil, (0.19 ± 2.72) Bq·kg⁻¹ for sediment, and (0.19 ± 2.30) Bq·kg⁻¹ for vegetation. These results were similar to the values of the pre-operational Environmental Monitoring Program. With these data, estimations of effective dose were evaluated for public individuals in the neighborhood of the waste repository, considering the main possible ways of exposure of this population group. The annual effective doses obtained from the analysis of these results were lower than 0.3 mSv·y⁻¹, which is the limit established by CNEN for environmental impact on public individuals, indicating that the facility is operating safely, without any radiological impact to the surrounding environment. Research highlights: A stolen capsule of cesium-137 was opened in the city of Goiania, generating some 6000 tons of debris, which were stored in the repository area built for this purpose. The activity of cesium-137 was monitored in surface water, underground water, depth sediments of the river, soil, vegetation, and air, inside and around the repository area.

  2. Region segmentation along image sequence

    International Nuclear Information System (INIS)

    Monchal, L.; Aubry, P.

    1995-01-01

    A method to extract regions in a sequence of images is proposed. Regions are not matched from one image to the following one; instead, the result of a region segmentation is used as an initialization to segment the following image and thus track the region along the sequence. The image sequence is exploited as a spatio-temporal event. (authors). 12 refs., 8 figs.

  3. Market segmentation using perceived constraints

    Science.gov (United States)

    Jinhee Jun; Gerard Kyle; Andrew Mowen

    2008-01-01

    We examined the practical utility of segmenting potential visitors to Cleveland Metroparks using their constraint profiles. Our analysis identified three segments based on their scores on the dimensions of constraints: Other priorities--visitors who scored the highest on 'other priorities' dimension; Highly Constrained--visitors who scored relatively high on...

  4. Market Segmentation: An Instructional Module.

    Science.gov (United States)

    Wright, Peter H.

    A concept-based introduction to market segmentation is provided in this instructional module for undergraduate and graduate transportation-related courses. The material can be used in many disciplines including engineering, business, marketing, and technology. The concept of market segmentation is primarily a transportation planning technique by…

  5. IFRS 8 – OPERATING SEGMENTS

    Directory of Open Access Journals (Sweden)

    BOCHIS LEONICA

    2009-05-01

    Segment reporting in accordance with IFRS 8 will be mandatory for annual financial statements covering periods beginning on or after 1 January 2009. The standard replaces IAS 14, Segment Reporting, from that date. The objective of IFRS 8 is to require

  6. Reduplication Facilitates Early Word Segmentation

    Science.gov (United States)

    Ota, Mitsuhiko; Skarabela, Barbora

    2018-01-01

    This study explores the possibility that early word segmentation is aided by infants' tendency to segment words with repeated syllables ("reduplication"). Twenty-four nine-month-olds were familiarized with passages containing one novel reduplicated word and one novel non-reduplicated word. Their central fixation times in response to…

  7. The Importance of Marketing Segmentation

    Science.gov (United States)

    Martin, Gillian

    2011-01-01

    The rationale behind marketing segmentation is to allow businesses to focus on their consumers' behaviors and purchasing patterns. If done effectively, marketing segmentation allows an organization to achieve its highest return on investment (ROI) in turn for its marketing and sales expenses. If an organization markets its products or services to…

  8. Essays in international market segmentation

    NARCIS (Netherlands)

    Hofstede, ter F.

    1999-01-01

    The primary objective of this thesis is to develop and validate new methodologies to improve the effectiveness of international segmentation strategies. The current status of international market segmentation research is reviewed in an introductory chapter, which provided a number of

  9. A Guide to the Use of Market Segmentation for the Dissemination of Educational Innovations. Final Report of a Project to Study the Effectiveness of Marketing Programming for Educational Change.

    Science.gov (United States)

    Wrausmann, Gale L.; And Others

    Markets can be defined as groups of people or organizations that have resources that could be exchanged for distinct benefits. Market segmentation is one strategy for market management and involves describing the market in terms of the subgroups that compose it so that exchanges with those subgroups can be more effectively promoted or facilitated.…

  10. Load curve modelling of the residential segment electric power consumption applying a demand side energy management program; Modelagem da curva de carga das faixas de consumo de energia eletrica residencial a partir da aplicacao de um programa de gerenciamento de energia pelo lado da demanda

    Energy Technology Data Exchange (ETDEWEB)

    Rahde, Sergio Barbosa [Pontificia Univ. Catolica do Rio Grande do Sul, Porto Alegre (Brazil). Dept. de Engenharia Mecanica e Mecatronica]. E-mail: sergio@em.pucrs.br; Kaehler, Jose Wagner [Pontificia Univ. Catolica do Rio Grande do Sul, Porto Alegre (Brazil). Faculdade de Engenharia]. E-mail: kaehlerjw@pucrs.br

    2000-07-01

    The dissertation aims to offer a current vision of the use of electrical energy inside CEEE's newly defined area of operation. It also intends to propose different alternatives to set up a Demand Side Management (DSM) project to be carried out on the same market segment, through a Residential Load Management program. Starting from studies developed by DNAEE (the Brazilian federal government's agency for electrical energy) to establish the load curve characteristics, as well as from research on electrical equipment ownership and electricity consumption habits, along with the contribution supplied by other utilities, especially in the US, an evaluation is offered concerning several approaches to residential energy management, setting up conditions that simulate the residential segment's scenarios and their influence on the general system's load. (author)

  11. Waste minimization in analytical methods

    International Nuclear Information System (INIS)

    Green, D.W.; Smith, L.L.; Crain, J.S.; Boparai, A.S.; Kiely, J.T.; Yaeger, J.S.; Schilling, J.B.

    1995-01-01

    The US Department of Energy (DOE) will require a large number of waste characterizations over a multi-year period to accomplish the Department's goals in environmental restoration and waste management. Estimates vary, but two million analyses annually are expected. The waste generated by the analytical procedures used for characterizations is a significant source of new DOE waste. Success in reducing the volume of secondary waste and the costs of handling this waste would significantly decrease the overall cost of this DOE program. Selection of appropriate analytical methods depends on the intended use of the resultant data. It is not always necessary to use a high-powered analytical method, typically at higher cost, to obtain data needed to make decisions about waste management. Indeed, for samples taken from some heterogeneous systems, the meaning of high accuracy becomes clouded if the data generated are intended to measure a property of this system. Among the factors to be considered in selecting the analytical method are the lower limit of detection, accuracy, turnaround time, cost, reproducibility (precision), interferences, and simplicity. Occasionally, there must be tradeoffs among these factors to achieve the multiple goals of a characterization program. The purpose of the work described here is to add waste minimization to the list of characteristics to be considered. In this paper the authors present results of modifying analytical methods for waste characterization to reduce both the cost of analysis and volume of secondary wastes. Although tradeoffs may be required to minimize waste while still generating data of acceptable quality for the decision-making process, they have data demonstrating that wastes can be reduced in some cases without sacrificing accuracy or precision

  12. Segmental vitiligo with segmental morphea: An autoimmune link?

    Directory of Open Access Journals (Sweden)

    Pravesh Yadav

    2014-01-01

    An 18-year-old girl with segmental vitiligo involving the left side of the trunk and left upper limb, and segmental morphea involving the right side of the trunk and right upper limb without any deeper involvement, is illustrated. There was no history of preceding drug intake, vaccination, trauma, radiation therapy, infection, or hormonal therapy. A family history of stable vitiligo in her brother and a history of type II diabetes mellitus in the father were elicited. Screening for autoimmune diseases and antithyroid antibody was negative. An autoimmune link explaining the co-occurrence has been proposed. Cutaneous mosaicism could explain the presence of both pathologies in a segmental distribution.

  13. Japanese migration in contemporary Japan: economic segmentation and interprefectural migration.

    Science.gov (United States)

    Fukurai, H

    1991-01-01

    This paper examines the economic segmentation model in explaining 1985-86 Japanese interregional migration. The analysis takes advantage of statistical graphic techniques to illustrate the following substantive issues of interregional migration: (1) to examine whether economic segmentation significantly influences Japanese regional migration and (2) to explain socioeconomic characteristics of prefectures for both in- and out-migration. Analytic techniques include a latent structural equation (LISREL) methodology and statistical residual mapping. The residual dispersion patterns, for instance, suggest the extent to which socioeconomic and geopolitical variables explain migration differences by showing unique clusters of unexplained residuals. The analysis further points out that extraneous factors such as high residential land values, significant commuting populations, and regional-specific cultures and traditions need to be incorporated in the economic segmentation model in order to assess the extent of the model's reliability in explaining the pattern of interprefectural migration.

  14. Programming

    International Nuclear Information System (INIS)

    Jackson, M.A.

    1982-01-01

    The programmer's task is often taken to be the construction of algorithms, expressed in hierarchical structures of procedures: this view underlies the majority of traditional programming languages, such as Fortran. A different view is appropriate to a wide class of problem, perhaps including some problems in High Energy Physics. The programmer's task is regarded as having three main stages: first, an explicit model is constructed of the reality with which the program is concerned; second, this model is elaborated to produce the required program outputs; third, the resulting program is transformed to run efficiently in the execution environment. The first two stages deal in network structures of sequential processes; only the third is concerned with procedure hierarchies. (orig.)

  15. Programming

    OpenAIRE

    Jackson, M A

    1982-01-01

    The programmer's task is often taken to be the construction of algorithms, expressed in hierarchical structures of procedures: this view underlies the majority of traditional programming languages, such as Fortran. A different view is appropriate to a wide class of problem, perhaps including some problems in High Energy Physics. The programmer's task is regarded as having three main stages: first, an explicit model is constructed of the reality with which the program is concerned; second, thi...

  16. Using Predictability for Lexical Segmentation.

    Science.gov (United States)

    Çöltekin, Çağrı

    2017-09-01

    This study investigates a strategy based on predictability of consecutive sub-lexical units in learning to segment a continuous speech stream into lexical units using computational modeling and simulations. Lexical segmentation is one of the early challenges during language acquisition, and it has been studied extensively through psycholinguistic experiments as well as computational methods. However, despite strong empirical evidence, the explicit use of predictability of basic sub-lexical units in models of segmentation is underexplored. This paper presents an incremental computational model of lexical segmentation for exploring the usefulness of predictability for lexical segmentation. We show that the predictability cue is a strong cue for segmentation. Contrary to earlier reports in the literature, the strategy yields state-of-the-art segmentation performance with an incremental computational model that uses only this particular cue in a cognitively plausible setting. The paper also reports an in-depth analysis of the model, investigating the conditions affecting the usefulness of the strategy. Copyright © 2016 Cognitive Science Society, Inc.
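
    A minimal sketch of the predictability cue, assuming syllables as the sub-lexical units and a word boundary wherever the transitional probability (TP) is a local minimum; this is a simplification of the paper's incremental model:

        import random
        from collections import Counter

        def transitional_probs(stream):
            """TP estimates P(next | prev) for each attested syllable bigram."""
            bigrams = Counter(zip(stream, stream[1:]))
            firsts = Counter(stream[:-1])
            return {(a, b): n / firsts[a] for (a, b), n in bigrams.items()}

        def tp_segment(stream, tp):
            probs = [tp[(a, b)] for a, b in zip(stream, stream[1:])]
            words, cur = [], [stream[0]]
            for i in range(1, len(stream)):
                left = probs[i - 2] if i >= 2 else 1.0
                right = probs[i] if i < len(probs) else 1.0
                if probs[i - 1] < left and probs[i - 1] < right:  # local TP minimum
                    words.append(''.join(cur))
                    cur = []
                cur.append(stream[i])
            words.append(''.join(cur))
            return words

        # Artificial speech stream built from two "words", boundaries unmarked.
        random.seed(1)
        lexicon = [['ba', 'bi'], ['go', 'la', 'tu']]
        stream = [syl for _ in range(300) for syl in random.choice(lexicon)]
        tp = transitional_probs(stream)
        print(tp_segment(stream[:12], tp))   # e.g. ['babi', 'golatu', ...]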

  17. Tank 241-BY-109, cores 201 and 203, analytical results for the final report

    International Nuclear Information System (INIS)

    Esch, R.A.

    1997-01-01

    This document is the final laboratory report for tank 241-BY-109 push mode core segments collected between June 6, 1997 and June 17, 1997. The segments were subsampled and analyzed in accordance with the Tank Push Mode Core Sampling and Analysis Plan (Bell, 1997) and the Tank Safety Screening Data Quality Objective (Dukelow et al., 1995). The analytical results are included.

  18. Hanford transuranic analytical capability

    International Nuclear Information System (INIS)

    McVey, C.B.

    1995-01-01

    With the current DOE focus on ER/WM programs, an increase in the quantity of waste samples that require detailed analysis is forecast. One of the prime areas of growth is the demand for DOE environmental protocol analyses of TRU waste samples. Currently there is no laboratory capacity to support analysis of TRU waste samples in excess of 200 nCi/g. This study recommends that an interim solution be undertaken to provide these services. By adding two glove boxes in room 11A of the 222-S facility, the interim waste analytical needs can be met for a period of four to five years, or until a front-end facility is erected at or near 222-S. The yearly average is projected to be approximately 600 samples; this figure has changed significantly due to budget changes and has been downgraded from 10,000 samples to the 600 level. Until these budget and sample projections become firmer, a long-term option is not recommended at this time. A revision to this document is recommended by March 1996 to review the long-term option and sample projections.

  19. The Hierarchy of Segment Reports

    Directory of Open Access Journals (Sweden)

    Danilo Dorović

    2015-05-01

    The article presents an attempt to find the connection between reports created for managers responsible for different business segments. For this purpose, a hierarchy of business reporting segments is proposed. This can lead to a better understanding of expenses under the common responsibility of more than one manager, since these expenses should appear in more than one report. A cost structure defined along the business-segment hierarchy can thus be established, yielding a new, unusual but relevant cost structure for management. Both could bring new information benefits for management in the context of profit reporting.

  20. Segmental dilatation of the ileum

    Directory of Open Access Journals (Sweden)

    Tune-Yie Shih

    2017-01-01

    A 2-year-old boy was sent to the emergency department with the chief complaint of abdominal pain for 1 day. He had just been discharged from the pediatric ward with the diagnosis of mycoplasmal pneumonia and paralytic ileus. After initial examinations and radiographic investigations, midgut volvulus was suspected. An emergency laparotomy was performed. Segmental dilatation of the ileum with volvulus was found. The operative procedure was resection of the dilated ileal segment with anastomosis. The postoperative recovery was uneventful. This unique abnormality of the gastrointestinal tract, segmental dilatation of the ileum, is described in detail and the literature is reviewed.

  1. Accounting for segment correlations in segmented gamma-ray scans

    International Nuclear Information System (INIS)

    Sheppard, G.A.; Prettyman, T.H.; Piquette, E.C.

    1994-01-01

    In a typical segmented gamma-ray scanner (SGS), the detector's field of view is collimated so that a complete horizontal slice or segment of the desired thickness is visible. Ordinarily, the collimator is not deep enough to exclude gamma rays emitted from sample volumes above and below the segment aligned with the collimator. This can lead to assay biases, particularly for certain radioactive-material distributions. Another consequence of the collimator's low aspect ratio is that segment assays at the top and bottom of the sample are biased low because the detector's field of view is not filled. This effect is ordinarily countered by placing the sample on a low-Z pedestal and scanning one or more segment thicknesses below and above the sample. This takes extra time, however. We have investigated a number of techniques that both account for correlated segments and correct for end effects in SGS assays. Also, we have developed an algorithm that facilitates estimates of assay precision. Six calculation methods have been compared by evaluating the results of thousands of simulated assays for three types of gamma-ray source distribution and ten masses. We report on these computational studies and their experimental verification.

  2. Public Interest Energy Research (PIER) Program Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Tengfang [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Flapper, Joris [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ke, Jing [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Kramer, Klaas [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sathaye, Jayant [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-02-01

    water usage in individual dairy plants, augment benchmarking activities in the marketplace, and facilitate implementation of efficiency measures and strategies to save energy and water in the dairy industry. Industrial adoption of this emerging tool and technology is expected to benefit dairy plants, which are important customers of California utilities. Further demonstration of this benchmarking tool is recommended to facilitate its commercialization and the expansion of its functions. Wider use of the BEST-Dairy tool and its continued expansion in functionality will help to reduce the actual consumption of energy and water in the dairy industry sector. The outcomes comply well with the goals set by AB 1250 for the PIER program.

  3. Analytic nuclear scattering theories

    International Nuclear Information System (INIS)

    Di Marzio, F.; University of Melbourne, Parkville, VIC

    1999-01-01

    A wide range of nuclear reactions are examined in an analytical version of the usual distorted-wave Born approximation. This new approach provides either semi-analytic or fully analytic descriptions of the nuclear scattering processes. The resulting computational simplifications, when used within the limits of validity, allow very detailed tests of both nuclear interaction models and large-basis models of nuclear structure to be performed.

  4. Analytical Electron Microscope

    Data.gov (United States)

    Federal Laboratory Consortium — The Titan 80-300 is a transmission electron microscope (TEM) equipped with spectroscopic detectors to allow chemical, elemental, and other analytical measurements to...

  5. LDR segmented mirror technology assessment study

    Science.gov (United States)

    Krim, M.; Russo, J.

    1983-01-01

    In the mid-1990s, NASA plans to orbit a giant telescope, whose aperture may be as great as 30 meters, for infrared and sub-millimeter astronomy. Its primary mirror will be deployed or assembled in orbit from a mosaic of possibly hundreds of mirror segments. Each segment must be shaped to precise curvature tolerances so that diffraction-limited performance will be achieved at 30 microns (the nominal operating wavelength). All panels must lie within 1 micron of a theoretical surface described by the optical prescription of the telescope's primary mirror. To attain diffraction-limited performance, the issues of alignment and/or position sensing, position control to micron tolerances, and structural, thermal, and mechanical considerations for stowing, deploying, and erecting the reflector must be resolved. Radius-of-curvature precision influences panel size, shape, material, and type of construction. Two superior material choices emerged: fused quartz (sufficiently homogeneous with respect to thermal expansivity to permit a thin shell substrate to be drape molded between graphite dies to a precise enough off-axis asphere for optical finishing of the as-received segment) and Pyrex or Duran (less expensive than quartz and formable at lower temperatures). The optimal reflector panel size is between 1-1/2 and 2 meters. Making one two-meter mirror every two weeks requires new approaches to manufacturing off-axis parabolic or aspheric segments (drape molding on precision dies and subsequent finishing on a non-rotationally-symmetric machine). Proof-of-concept development programs were identified to prove the feasibility of the materials and manufacturing ideas.

  6. CLG for Automatic Image Segmentation

    OpenAIRE

    Christo Ananth; S.Santhana Priya; S.Manisha; T.Ezhil Jothi; M.S.Ramasubhaeswari

    2017-01-01

    This paper proposes an automatic segmentation method which effectively combines the Active Contour Model, the Live Wire method, and the Graph Cut approach (CLG). The aim of the Live Wire method is to give the user control over the segmentation process during execution. The Active Contour Model provides a statistical model of object shape and appearance, built during a training phase, that is fitted to a new image. In the graph cut technique, each pixel is represented as a node and the distance between those nodes is rep...
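
    The graph-cut component can be sketched as follows (a generic min-cut formulation with an assumed capacity weighting, not necessarily the authors' exact construction): pixels become graph nodes, neighbours are connected with capacities that decay with intensity difference, and user seeds are tied to the two terminals, so the minimum cut separates foreground from background:

        import networkx as nx
        import numpy as np

        def graph_cut_segment(image, fg_seeds, bg_seeds, sigma=0.1):
            h, w = image.shape
            G = nx.Graph()
            for y in range(h):
                for x in range(w):
                    for dy, dx in ((0, 1), (1, 0)):          # 4-connected neighbours
                        yy, xx = y + dy, x + dx
                        if yy < h and xx < w:
                            diff = float(image[y, x] - image[yy, xx])
                            G.add_edge((y, x), (yy, xx),
                                       capacity=np.exp(-diff**2 / (2 * sigma**2)))
            for p in fg_seeds:
                G.add_edge("S", p)                           # no capacity attribute = infinite
            for p in bg_seeds:
                G.add_edge(p, "T")
            _, (fg, _) = nx.minimum_cut(G, "S", "T")
            return sorted(fg - {"S"})                        # pixels on the foreground side

        img = np.array([[0.1, 0.1, 0.9],
                        [0.1, 0.2, 0.9],
                        [0.1, 0.9, 0.9]])
        print(graph_cut_segment(img, fg_seeds=[(0, 0)], bg_seeds=[(2, 2)]))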

  7. Market segmentation, targeting and positioning

    OpenAIRE

    Camilleri, Mark Anthony

    2017-01-01

    Businesses may not be in a position to satisfy all of their customers, every time. It may prove difficult to meet the exact requirements of each individual customer. People do not have identical preferences, so rarely does one product completely satisfy everyone. Many companies may usually adopt a strategy that is known as target marketing. This strategy involves dividing the market into segments and developing products or services to these segments. A target marketing strategy is focused on ...

  8. Recognition Using Classification and Segmentation Scoring

    National Research Council Canada - National Science Library

    Kimball, Owen; Ostendorf, Mari; Rohlicek, Robin

    1992-01-01

    .... We describe an approach to connected word recognition that allows the use of segmental information through an explicit decomposition of the recognition criterion into classification and segmentation scoring...

  9. Computing the zeros of analytic functions

    CERN Document Server

    Kravanja, Peter

    2000-01-01

    Computing all the zeros of an analytic function and their respective multiplicities, locating clusters of zeros of analytic functions, computing zeros and poles of meromorphic functions, and solving systems of analytic equations are problems in computational complex analysis that lead to a rich blend of mathematics and numerical analysis. This book treats these four problems in a unified way. It contains not only theoretical results (based on formal orthogonal polynomials or rational interpolation) but also numerical analysis and algorithmic aspects, implementation heuristics, and polished software (the package ZEAL) that is available via the CPC Program Library. Graduate students and researchers in numerical mathematics will find this book very readable.
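
    The starting point of such methods, the argument principle, is easy to illustrate numerically (a minimal sketch, independent of the ZEAL package): the integral of f'(z)/f(z) around a contour, divided by 2*pi*i, counts the enclosed zeros with multiplicity.

        import numpy as np

        def count_zeros(f, df, center=0.0, radius=1.0, n=4096):
            """Number of zeros of f inside the circle, counted with multiplicity."""
            t = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
            z = center + radius * np.exp(1j * t)   # points on the contour
            dz = 1j * radius * np.exp(1j * t)      # dz/dt along the contour
            integral = np.sum(df(z) / f(z) * dz) * (2.0 * np.pi / n)
            return round((integral / (2j * np.pi)).real)

        # z^3 - 1 has its three zeros on the unit circle, so radius 1.5 encloses all.
        print(count_zeros(lambda z: z**3 - 1, lambda z: 3 * z**2, radius=1.5))  # -> 3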

  10. The Analytical Hierarchy Process

    DEFF Research Database (Denmark)

    Barfod, Michael Bruhn

    2007-01-01

    The technical note gathers the theory behind the Analytical Hierarchy Process (AHP) and presents its advantages and disadvantages in practical use.
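
    For reference, the core computation of the AHP in its standard eigenvector form (the note's own presentation may differ) reduces to a few lines: derive weights from the principal eigenvector of the pairwise comparison matrix and check a consistency ratio.

        import numpy as np

        def ahp_weights(M):
            """Weights from the principal eigenvector; consistency ratio as a sanity check."""
            n = M.shape[0]
            eigvals, eigvecs = np.linalg.eig(M)
            k = int(np.argmax(eigvals.real))
            w = np.abs(eigvecs[:, k].real)
            w /= w.sum()
            ci = (eigvals[k].real - n) / (n - 1)          # consistency index
            ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)  # Saaty's random index
            return w, ci / ri

        # Pairwise judgements: criterion 1 vs 2 (3x), 1 vs 3 (5x), 2 vs 3 (3x).
        M = np.array([[1, 3, 5], [1/3, 1, 3], [1/5, 1/3, 1]])
        w, cr = ahp_weights(M)
        print(w, cr)   # a consistency ratio below ~0.1 is conventionally acceptable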

  11. Signals: Applying Academic Analytics

    Science.gov (United States)

    Arnold, Kimberly E.

    2010-01-01

    Academic analytics helps address the public's desire for institutional accountability with regard to student success, given the widespread concern over the cost of higher education and the difficult economic and budgetary conditions prevailing worldwide. Purdue University's Signals project applies the principles of analytics widely used in…

  12. Analytic Moufang-transformations

    International Nuclear Information System (INIS)

    Paal, Eh.N.

    1988-01-01

    The paper aims to be an introduction to the concept of an analytic birepresentation of an analytic Moufang loop. To describe the deviation of (S,T) from associativity, the associators of (S,T) are defined, and certain constraints for them, called the minimality conditions of (S,T), are established.

  13. Quine's "Strictly Vegetarian" Analyticity

    NARCIS (Netherlands)

    Decock, L.B.

    2017-01-01

    I analyze Quine’s later writings on analyticity from a linguistic point of view. In Word and Object Quine made room for a “strictly vegetarian” notion of analyticity. In later years, he developed this notion into two more precise notions, which I have coined “stimulus analyticity” and “behaviorist

  14. Learning analytics dashboard applications

    NARCIS (Netherlands)

    Verbert, K.; Duval, E.; Klerkx, J.; Govaerts, S.; Santos, J.L.

    2013-01-01

    This article introduces learning analytics dashboards that visualize learning traces for learners and teachers. We present a conceptual framework that helps to analyze learning analytics applications for these kinds of users. We then present our own work in this area and compare with 15 related

  15. Learning Analytics Considered Harmful

    Science.gov (United States)

    Dringus, Laurie P.

    2012-01-01

    This essay is written to present a prospective stance on how learning analytics, as a core evaluative approach, must help instructors uncover the important trends and evidence of quality learner data in the online course. A critique is presented of strategic and tactical issues of learning analytics. The approach to the critique is taken through…

  16. Analytical mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    1990-01-01

    This 43rd Annual Summer Symposium on Analytical Chemistry was held July 24--27, 1990 at Oak Ridge, TN and contained sessions on the following topics: Fundamentals of Analytical Mass Spectrometry (MS), MS in the National Laboratories, Lasers and Fourier Transform Methods, Future of MS, New Ionization and LC/MS Methods, and an extra session. (WET)

  18. Learning Analytics for Online Discussions: Embedded and Extracted Approaches

    Science.gov (United States)

    Wise, Alyssa Friend; Zhao, Yuting; Hausknecht, Simone Nicole

    2014-01-01

    This paper describes an application of learning analytics that builds on an existing research program investigating how students contribute and attend to the messages of others in asynchronous online discussions. We first overview the E-Listening research program and then explain how this work was translated into analytics that students and…

  19. Quo vadis, analytical chemistry?

    Science.gov (United States)

    Valcárcel, Miguel

    2016-01-01

    This paper presents an open, personal, fresh approach to the future of Analytical Chemistry in the context of the deep changes Science and Technology are anticipated to experience. Its main aim is to challenge young analytical chemists, because the future of our scientific discipline is in their hands. A description of not completely accurate overall conceptions of our discipline, both past and present, to be avoided is followed by a flexible, integral definition of Analytical Chemistry and its cornerstones (viz., aims and objectives, quality trade-offs, the third basic analytical reference, the information hierarchy, social responsibility, independent research, transfer of knowledge and technology, interfaces to other scientific-technical disciplines, and well-oriented education). Obsolete paradigms, and the more accurate general and specific ones that can be expected to provide the framework for our discipline in the coming years, are described. Finally, the three possible responses of analytical chemists to the proposed changes in our discipline are discussed.

  20. The influence of interactions between market segmentation strategy and competition on organizational performance. A simulation study.

    OpenAIRE

    Dolnicar, Sara; Freitag, Roman

    2003-01-01

    A computer simulation study is conducted to explore the interaction of alternative segmentation strategies and the competitiveness of the market environment, a goal that can neither be tackled by purely analytic approaches nor addressed empirically, as sufficient and undistorted real market data are not available. The fundamental idea of the simulation is to increase competition in the artificial marketplace and to study the influence of segmentation strategy and varying market con...

  1. Analysis of a Segmented Annular Coplanar Capacitive Tilt Sensor with Increased Sensitivity

    OpenAIRE

    Jiahao Guo; Pengcheng Hu; Jiubin Tan

    2016-01-01

    An investigation of a segmented annular coplanar capacitor is presented. We focus on its theoretical model, and a mathematical expression for the capacitance is derived by solving the Laplace equation with the Hankel transform. The finite element method is employed to verify the analytical result. Different control parameters are discussed, and the contribution of each to the capacitance of the capacitor is obtained. On this basis, we analyze and optimize the structure parameters of a segmented...

  2. Big data and analytics strategic and organizational impacts

    CERN Document Server

    Morabito, Vincenzo

    2015-01-01

    This book presents and discusses the main strategic and organizational challenges posed by Big Data and analytics in a manner relevant to both practitioners and scholars. The first part of the book analyzes strategic issues relating to the growing relevance of Big Data and analytics for competitive advantage, which is also attributable to empowerment of activities such as consumer profiling, market segmentation, and development of new products or services. Detailed consideration is also given to the strategic impact of Big Data and analytics on innovation in domains such as government and education and to Big Data-driven business models. The second part of the book addresses the impact of Big Data and analytics on management and organizations, focusing on challenges for governance, evaluation, and change management, while the concluding part reviews real examples of Big Data and analytics innovation at the global level. The text is supported by informative illustrations and case studies, so that practitioners...

  3. Methods of evaluating segmentation characteristics and segmentation of major faults

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Kie Hwa; Chang, Tae Woo; Kyung, Jai Bok [Seoul National Univ., Seoul (Korea, Republic of)] (and others)

    2000-03-15

    Seismological, geological, and geophysical studies were made for reasonable segmentation of the Ulsan fault, and the results are as follows. One- and two-dimensional electrical surveys clearly revealed that the fault fracture zone enlarges systematically northward and southward from the vicinity of Mohwa-ri, indicating that Mohwa-ri is at the seismic segment boundary. Field geological surveys and microscope observation of fault gouge indicate that the Quaternary faults in the area are reactivated products of preexisting faults. A trench survey of the Chonbuk fault at Galgok-ri revealed thrust faults and cumulative vertical displacement due to faulting during the late Quaternary, with about 1.1-1.9 m displacement per event; the latest event occurred 14,000 to 25,000 yrs BP. The seismic survey showed the basement surface is cut by numerous reverse faults and indicated the possibility that the boundary between Kyeongsangbuk-do and Kyeongsangnam-do may be a segment boundary.

  4. Segmentation by Large Scale Hypothesis Testing - Segmentation as Outlier Detection

    DEFF Research Database (Denmark)

    Darkner, Sune; Dahl, Anders Lindbjerg; Larsen, Rasmus

    2010-01-01

    We propose a novel and efficient way of performing local image segmentation. For many applications a threshold of pixel intensities is sufficient, but determining the appropriate threshold value can be difficult. In cases with large global intensity variation the threshold value has to be adapted locally. We propose a method based on large scale hypothesis testing with a consistent method for selecting an appropriate threshold for the given data. By estimating the background distribution we characterize the segment of interest as a set of outliers with a certain probability based on the estimated distribution. We demonstrate the method on particles imaged with a microscope and show how it can handle transparent particles with significant glare points. The method generalizes to other problems; this is illustrated by applying the method to camera calibration images and MRI of the midsagittal plane for gray and white matter separation and segmentation.
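
    A minimal sketch of the idea, with our own simplifications (a plain Gaussian background model and a fixed significance level rather than the paper's consistent threshold selection): estimate the background distribution, then flag pixels whose upper-tail probability falls below the level.

        import numpy as np
        from scipy import stats

        def segment_as_outliers(image, alpha=1e-4):
            mu, sigma = np.median(image), image.std()             # crude background fit
            p_values = stats.norm.sf(image, loc=mu, scale=sigma)  # upper-tail probability
            return p_values < alpha                               # True where "outlier"

        rng = np.random.default_rng(1)
        img = rng.normal(0.2, 0.05, (64, 64))
        img[20:30, 20:30] += 0.5                                  # a bright object
        mask = segment_as_outliers(img)
        print(mask.sum(), "pixels flagged")                       # roughly the 100 object pixels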

  6. Towards Secure and Trustworthy Cyberspace: Social Media Analytics on Hacker Communities

    Science.gov (United States)

    Li, Weifeng

    2017-01-01

    Social media analytics is a critical research area spawned by the increasing availability of rich and abundant online user-generated content. So far, social media analytics has had a profound impact on organizational decision making in many aspects, including product and service design, market segmentation, customer relationship management, and…

  8. Integrating Linear Programming and Analytical Hierarchical ...

    African Journals Online (AJOL)

    The study area is about 28,000 ha of the Keleibar-Chai Watershed, located in eastern Azerbaijan, Iran. Socio-economic information was collected through a two-stage survey of 19 villages, including 300 samples. Thematic maps also summarize the ecological factors, including physical and economic data. A comprehensive Linear ...
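
    As a toy illustration of how the LP core of such a land-use optimization might be posed (all names and coefficients below are invented, not taken from the study): allocate watershed area among land uses to maximize net return, subject to a total-area budget and a soil-loss ceiling. AHP-derived weights could enter a similar objective.

        from scipy.optimize import linprog

        returns = [120.0, 80.0, 30.0]        # net return per ha: crops, orchard, range
        erosion = [9.0, 4.0, 1.0]            # soil loss (t/ha/yr) per land use
        total_area, max_erosion = 28000.0, 120000.0

        # linprog minimizes, so the returns are negated to maximize them.
        res = linprog(
            c=[-r for r in returns],
            A_ub=[[1, 1, 1], erosion],
            b_ub=[total_area, max_erosion],
            bounds=[(0, None)] * 3,
        )
        print(res.x)                         # optimal hectares per land use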

  9. A study of symbol segmentation method for handwritten mathematical formula recognition using mathematical structure information

    OpenAIRE

    Toyozumi, Kenichi; Yamada, Naoya; Kitasaka, Takayuki; Mori, Kensaku; Suenaga, Yasuhito; Mase, Kenji; Takahashi, Tomoichi

    2004-01-01

    Symbol segmentation is very important in handwritten mathematical formula recognition, since it is the very first portion of the recognition process. This paper proposes a new symbol segmentation method using mathematical structure information. The base technique of symbol segmentation employed in the existing methods is dynamic programming, which optimizes the overall results of individual symbol recognition. The new method we propose here...
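
    The dynamic-programming baseline that the paper builds on can be sketched as follows (the scoring function is a stand-in for a symbol recognizer): choose cut points over the stroke sequence so that the summed recognition scores of the resulting symbols are maximal.

        def dp_segment(strokes, score, max_len=4):
            """score(segment) -> recognition confidence; returns the best cut points."""
            n = len(strokes)
            best = [float("-inf")] * (n + 1)
            best[0], back = 0.0, [0] * (n + 1)
            for j in range(1, n + 1):
                for i in range(max(0, j - max_len), j):
                    s = best[i] + score(strokes[i:j])
                    if s > best[j]:
                        best[j], back[j] = s, i
            cuts, j = [], n
            while j > 0:                      # backtrack the optimal segmentation
                cuts.append((back[j], j))
                j = back[j]
            return list(reversed(cuts))

        # Toy recognizer: pretend two-stroke groups are well-recognized symbols.
        toy_score = lambda seg: 1.0 if len(seg) == 2 else 0.3
        print(dp_segment(list("abcdef"), toy_score))  # -> [(0, 2), (2, 4), (4, 6)]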

  10. Automatic segmentation of psoriasis lesions

    Science.gov (United States)

    Ning, Yang; Shi, Chenbo; Wang, Li; Shu, Chang

    2014-10-01

    The automatic segmentation of psoriatic lesions has been widely researched in recent years. It is an important step in computer-aided methods of calculating PASI for the estimation of lesions. Current algorithms can only handle single erythema or only deal with scaling segmentation, while in practice scaling and erythema are often mixed together. In order to segment the lesion area, this paper proposes an algorithm based on random forests with color and texture features. The algorithm has three steps. In the first step, polarized light is applied, exploiting the skin's Tyndall effect in the imaging to eliminate reflections, and the Lab color space is used to fit human perception. In the second step, a sliding window and its sub-windows are used to obtain textural and color features; in this step, a feature of image roughness is defined so that scaling can be easily separated from normal skin. In the end, random forests are used to ensure the generalization ability of the algorithm. The algorithm gives reliable segmentation results even when images have different lighting conditions and skin types. On the dataset offered by Union Hospital, more than 90% of images can be segmented accurately.
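
    A simplified stand-in for the classification step (the features and training data here are invented; the paper's roughness feature and polarized-light preprocessing are only mimicked): per-pixel L*a*b* values plus a local-roughness measure feed a random forest that labels pixels as skin, erythema, or scaling.

        import numpy as np
        from scipy.ndimage import generic_filter
        from sklearn.ensemble import RandomForestClassifier

        def pixel_features(lab_image, win=3):
            """Per-pixel L*a*b* values plus local standard deviation as a roughness cue."""
            rough = generic_filter(lab_image[..., 0], np.std, size=win)
            return np.dstack([lab_image, rough[..., None]]).reshape(-1, 4)

        rng = np.random.default_rng(0)
        X_train = rng.random((500, 4))                  # stand-in for annotated pixels
        y_train = rng.integers(0, 3, 500)               # 0 skin, 1 erythema, 2 scaling
        clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

        lab = rng.random((32, 32, 3))                   # stand-in for an L*a*b* image
        labels = clf.predict(pixel_features(lab)).reshape(32, 32)
        print(np.bincount(labels.ravel(), minlength=3)) # pixel counts per class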

  11. Deriving Earth Science Data Analytics Requirements

    Science.gov (United States)

    Kempler, Steven J.

    2015-01-01

    Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, goals which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide an early analysis of requirements for data analytics tools and techniques that would support specific ESDA goal types. Representative existing data analytics tools and techniques relevant to ESDA will also be addressed.

  12. Skip segment Hirschsprung disease and Waardenburg syndrome

    Directory of Open Access Journals (Sweden)

    Erica R. Gross

    2015-04-01

    Skip segment Hirschsprung disease describes a segment of ganglionated bowel between two segments of aganglionated bowel. It is a rare phenomenon that is difficult to diagnose. We describe a recent case of skip segment Hirschsprung disease in a neonate with a family history of Waardenburg syndrome and the genetic profile that was identified.

  13. U.S. Army Custom Segmentation System

    Science.gov (United States)

    2007-06-01

    segmentation is individual or intergroup differences in response to marketing-mix variables. Presumptions about segments: they have different demands in a...product or service category, and they respond differently to changes in the marketing mix. Criteria for segments: the segments must exist in the environment

  15. Google analytics integrations

    CERN Document Server

    Waisberg, Daniel

    2015-01-01

    A roadmap for turning Google Analytics into a centralized marketing analysis platform With Google Analytics Integrations, expert author Daniel Waisberg shows you how to gain a more meaningful, complete view of customers that can drive growth opportunities. This in-depth guide shows not only how to use Google Analytics, but also how to turn this powerful data collection and analysis tool into a central marketing analysis platform for your company. Taking a hands-on approach, this resource explores the integration and analysis of a host of common data sources, including Google AdWords, AdSens

  16. A Novel Iris Segmentation Scheme

    Directory of Open Access Journals (Sweden)

    Chen-Chung Liu

    2014-01-01

    One of the key steps in an iris recognition system is the accurate segmentation of the iris from its surrounding noise, including the pupil, sclera, eyelashes, and eyebrows, in a captured eye image. This paper presents a novel iris segmentation scheme which utilizes the orientation matching transform to outline the outer and inner iris boundaries initially. It then employs Delogne-Kåsa circle fitting (instead of the traditional Hough transform) to further eliminate outlier points and extract a more precise iris area from the eye image. Within the extracted iris region, the proposed scheme further utilizes differences in the intensity and positional characteristics of the iris, eyelid, and eyelashes to detect and remove these noises. The scheme is then applied to the iris image database UBIRIS.v1. The experimental results show that the presented scheme provides more effective and efficient iris segmentation than other conventional methods.
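
    The Delogne-Kåsa fit itself is a linear least-squares problem, which is what makes it attractive next to the Hough transform. A minimal sketch (our simplification of the boundary-fitting step): fit x^2 + y^2 = 2ax + 2by + c by least squares, which gives the centre (a, b) and radius sqrt(c + a^2 + b^2) without iterative search.

        import numpy as np

        def kasa_circle_fit(x, y):
            A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
            rhs = x**2 + y**2
            (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
            return (a, b), np.sqrt(c + a**2 + b**2)

        # Noisy points on a circle of radius 2 centred at (3, 1).
        t = np.linspace(0, 2 * np.pi, 50)
        x = 3 + 2 * np.cos(t) + 0.01 * np.random.randn(50)
        y = 1 + 2 * np.sin(t)
        print(kasa_circle_fit(x, y))  # centre near (3, 1), radius near 2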

  17. Analytical torque calculation and experimental verification of synchronous permanent magnet couplings with Halbach arrays

    Science.gov (United States)

    Seo, Sung-Won; Kim, Young-Hyun; Lee, Jung-Ho; Choi, Jang-Young

    2018-05-01

    This paper presents analytical torque calculation and experimental verification of synchronous permanent magnet couplings (SPMCs) with Halbach arrays. A Halbach array is composed of various numbers of segments per pole; we calculate and compare the magnetic torques for 2, 3, and 4 segments. Firstly, based on the magnetic vector potential, and using a 2D polar coordinate system, we obtain analytical solutions for the magnetic field. Next, through a series of processes, we perform magnetic torque calculations using the derived solutions and a Maxwell stress tensor. Finally, the analytical results are verified by comparison with the results of 2D and 3D finite element analysis and the results of an experiment.

  18. 3-D discrete analytical ridgelet transform.

    Science.gov (United States)

    Helbert, David; Carré, Philippe; Andres, Eric

    2006-12-01

    In this paper, we propose an implementation of the 3-D Ridgelet transform: the 3-D discrete analytical Ridgelet transform (3-D DART). This transform uses the Fourier strategy for the computation of the associated 3-D discrete Radon transform. The innovative step is the definition of a discrete 3-D transform within discrete analytical geometry theory, by the construction of 3-D discrete analytical lines in the Fourier domain. We propose two types of 3-D discrete lines: 3-D discrete radial lines going through the origin, defined from their orthogonal projections, and 3-D planes covered with 2-D discrete line segments. These discrete analytical lines have a parameter called arithmetical thickness, allowing us to define a 3-D DART adapted to a specific application. Indeed, the 3-D DART representation is not orthogonal; it is associated with a flexible redundancy factor. The 3-D DART has a very simple forward/inverse algorithm that provides an exact reconstruction without any iterative method. In order to illustrate the potential of this new discrete transform, we apply the 3-D DART and its extension, the Local-DART (with smooth windowing), to the denoising of 3-D images and color video. These experimental results show that simple thresholding of the 3-D DART coefficients is efficient.

  19. Document segmentation via oblique cuts

    Science.gov (United States)

    Svendsen, Jeremy; Branzan-Albu, Alexandra

    2013-01-01

    This paper presents a novel solution for the layout segmentation of graphical elements in Business Intelligence documents. We propose a generalization of the recursive X-Y cut algorithm, which allows for cutting along arbitrary oblique directions. An intermediate processing step consisting of line and solid region removal is also necessary due to the presence of decorative elements. The output of the proposed segmentation is a hierarchical structure which allows for the identification of primitives in pie and bar charts. The algorithm was tested on a database composed of charts from business documents. Results are very promising.
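
    For orientation, the classical axis-aligned recursive X-Y cut that the paper generalizes can be sketched as follows (a bare-bones version; the oblique cuts and line/solid-region removal are not shown): project the ink onto each axis, split at the widest empty valley, and recurse on the resulting blocks.

        import numpy as np

        def xy_cut(ink, top=0, left=0, min_gap=2):
            h, w = ink.shape
            for axis in (0, 1):                       # try horizontal, then vertical cuts
                profile = ink.sum(axis=1 - axis)      # ink projected onto this axis
                empty = np.flatnonzero(profile == 0)
                runs = np.split(empty, np.flatnonzero(np.diff(empty) > 1) + 1)
                runs = [r for r in runs
                        if len(r) >= min_gap and 0 < r[0] and r[-1] < len(profile) - 1]
                if runs:
                    gap = max(runs, key=len)          # widest interior valley
                    cut = (gap[0], gap[-1] + 1)
                    a, b = ((ink[:cut[0]], ink[cut[1]:]) if axis == 0
                            else (ink[:, :cut[0]], ink[:, cut[1]:]))
                    off_b = (top + cut[1], left) if axis == 0 else (top, left + cut[1])
                    return xy_cut(a, top, left, min_gap) + xy_cut(b, *off_b, min_gap)
            return [(top, left, h, w)]                # leaf block: no valley found

        page = np.zeros((20, 20))
        page[2:8, 2:18] = 1
        page[12:18, 2:8] = 1
        print(xy_cut(page))  # two blocks as (top, left, height, width)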

  20. Optimally segmented permanent magnet structures

    DEFF Research Database (Denmark)

    Insinga, Andrea Roberto; Bjørk, Rasmus; Smith, Anders

    2016-01-01

    We present an optimization approach which can be employed to calculate the globally optimal segmentation of a two-dimensional magnetic system into uniformly magnetized pieces. For each segment the algorithm calculates the optimal shape and the optimal direction of the remanent flux density vector, with respect to a linear objective functional. We illustrate the approach with results for magnet design problems from different areas, such as a permanent magnet electric motor, a beam focusing quadrupole magnet for particle accelerators and a rotary device for magnetic refrigeration.

  1. Analytical chemistry department. Annual report, 1977

    International Nuclear Information System (INIS)

    Knox, E.M.

    1978-09-01

    The annual report describes the analytical methods, analyses and equipment developed or adopted for use by the Analytical Chemistry Department during 1977. The individual articles range from several-page descriptions of development and study programs to brief one-paragraph descriptions of methods adopted for use with or without some modification. This year, we have included a list of the methods incorporated into our Analytical Chemistry Methods Manual. This report is organized into laboratory sections within the Department as well as major programs within General Atomic Company; minor programs and studies are included under Miscellaneous. The analytical and technical support activities for GAC include gamma-ray spectroscopy, radiochemistry, activation analysis, gas chromatography, atomic absorption, spectrophotometry, emission spectroscopy, x-ray diffractometry, electron microprobe, titrimetry, gravimetry, and quality control. Services are provided to all organizations throughout General Atomic Company. The major effort, however, is in support of the research and development programs within HTGR Generic Technology Programs, ranging from new fuel concepts, end-of-life studies, and irradiated capsules to fuel recycle studies.

  2. Radionuclides in analytical chemistry

    International Nuclear Information System (INIS)

    Tousset, J.

    1984-01-01

    Applications of radionuclides in analytical chemistry are reviewed in this article: tracers, radioactive sources and activation analysis. Examples are given in all these fields and it is concluded that these methods should be used more widely [fr

  3. Mobility Data Analytics Center.

    Science.gov (United States)

    2016-01-01

    Mobility Data Analytics Center aims at building a centralized data engine to efficiently manipulate : large-scale data for smart decision making. Integrating and learning the massive data are the key to : the data engine. The ultimate goal of underst...

  4. Analytical strategies for phosphoproteomics

    DEFF Research Database (Denmark)

    Thingholm, Tine E; Jensen, Ole N; Larsen, Martin R

    2009-01-01

    This review presents an overview of different analytical strategies for the characterization of phosphoproteins, with emphasis on sensitive and specific strategies. Today, most phosphoproteomic studies are conducted by mass spectrometric strategies in combination with phospho-specific enrichment methods.

  5. Intercalary bone segment transport in treatment of segmental tibial defects

    International Nuclear Information System (INIS)

    Iqbal, A.; Amin, M.S.

    2002-01-01

    Objective: To evaluate the results and complications of intercalary bone segment transport in the treatment of segmental tibial defects. Design: This is a retrospective analysis of patients with segmental tibial defects who were treated with the intercalary bone segment transport method. Place and Duration of Study: The study was carried out at Combined Military Hospital, Rawalpindi, from September 1997 to April 2001. Subjects and Methods: Thirteen patients were included in the study who had developed tibial defects either due to open fractures with bone loss or subsequent to bone debridement of infected non-unions. The mean bone defect was 6.4 cm and there were eight associated soft tissue defects. A locally made unilateral 'Naseer-Awais' (NA) fixator was used for bone segment transport. Distraction was done at the rate of 1 mm/day after 7-10 days of osteotomy. The patients were followed up fortnightly during distraction and monthly thereafter. The mean follow-up duration was 18 months. Results: The mean time in external fixation was 9.4 months. The mean 'healing index' was 1.47 months/cm. Satisfactory union was achieved in all cases. Six cases (46.2%) required bone grafting at the target site, and in one of them grafting was required at the level of regeneration as well. All the wounds healed well with no residual infection. There was no residual leg length discrepancy of more than 20 mm and one angular deformity of more than 5 degrees. The commonest complication encountered was pin-track infection, seen in 38% of the Schanz screws applied. Loosening occurred in 6.8% of Schanz screws, requiring re-adjustment. Ankle joint contracture with equinus deformity and peroneal nerve paresis occurred in one case each. The functional results were graded as 'good' in seven, 'fair' in four, and 'poor' in two patients. Overall, thirteen patients had 31 (minor/major) complications, a ratio of 2.38 complications per patient. To treat the bone defects and associated complications, a mean of

  6. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics.

    Science.gov (United States)

    Hoyt, Robert Eugene; Snider, Dallas; Thompson, Carla; Mantravadi, Sarita

    2016-10-11

    We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software, which delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. The objective of this study was to report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix.

  7. Encyclopedia of analytical surfaces

    CERN Document Server

    Krivoshapko, S N

    2015-01-01

    This encyclopedia presents an all-embracing collection of analytical surface classes. It provides concise definitions and descriptions for more than 500 surfaces and categorizes them in 38 classes of analytical surfaces. All classes are cross-referenced to the original literature in an excellent bibliography. The encyclopedia is of particular interest to structural and civil engineers and serves as a valuable reference for mathematicians.

  8. Intermediate algebra & analytic geometry

    CERN Document Server

    Gondin, William R

    1967-01-01

    Intermediate Algebra & Analytic Geometry Made Simple focuses on the principles, processes, calculations, and methodologies involved in intermediate algebra and analytic geometry. The publication first offers information on linear equations in two unknowns and variables, functions, and graphs. Discussions focus on graphic interpretations, explicit and implicit functions, first quadrant graphs, variables and functions, determinate and indeterminate systems, independent and dependent equations, and defective and redundant systems. The text then examines quadratic equations in one variable, system

  9. Hydrophilic segmented block copolymers based on poly(ethylene oxide) and monodisperse amide segments

    NARCIS (Netherlands)

    Husken, D.; Feijen, Jan; Gaymans, R.J.

    2007-01-01

    Segmented block copolymers based on poly(ethylene oxide) (PEO) flexible segments and monodisperse crystallizable bisester tetra-amide segments were made via a polycondensation reaction. The molecular weight of the PEO segments varied from 600 to 4600 g/mol and a bisester tetra-amide segment (T6T6T)

  10. Increasing Enrollment by Better Serving Your Institution's Target Audiences through Benefit Segmentation.

    Science.gov (United States)

    Goodnow, Betsy

    The marketing technique of benefit segmentation may be effective in increasing enrollment in adult educational programs, according to a study at College of DuPage, Glen Ellyn, Illinois. The study was conducted to test applicability of benefit segmentation to enrollment generation. The measuring instrument used in this study--the course improvement…

  11. The implement of Talmud property allocation algorithm based on graphic point-segment way

    Science.gov (United States)

    Cen, Haifeng

    2017-04-01

    Guided by the theory of the Talmud allocation scheme, this paper analyzes the algorithm's implementation process from the perspective of a graphical point-segment representation and designs a point-segment Talmud property allocation algorithm. The core of the allocation algorithm is then implemented in Java, and Android programming is used to build a visual interface.
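
    The underlying allocation rule, in its standard Aumann-Maschler (contested garment) formulation, can be sketched in a few lines; this is our own Python rendering for illustration, not the paper's point-segment Java/Android implementation.

        def cea(amount, caps):
            """Constrained equal awards: an equal share per claimant, capped, summing to amount."""
            lo, hi = 0.0, max(caps)
            for _ in range(100):                      # bisect on the common award level
                mid = (lo + hi) / 2
                if sum(min(mid, c) for c in caps) < amount:
                    lo = mid
                else:
                    hi = mid
            return [min(hi, c) for c in caps]

        def talmud_rule(estate, claims):
            half = [c / 2 for c in claims]
            if estate <= sum(half):                   # small estate: CEA on half-claims
                return cea(estate, half)
            # large estate: everyone keeps half; losses beyond that are shared equally
            losses = cea(sum(claims) - estate, half)
            return [c - l for c, l in zip(claims, losses)]

        # The Talmud's own example: claims 100, 200, 300 against an estate of 200.
        print(talmud_rule(200, [100, 200, 300]))      # -> [50.0, 75.0, 75.0]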

  12. Identifying uniformly mutated segments within repeats.

    Science.gov (United States)

    Sahinalp, S Cenk; Eichler, Evan; Goldberg, Paul; Berenbrink, Petra; Friedetzky, Tom; Ergun, Funda

    2004-12-01

    Given a long string of characters from a constant-size alphabet we present an algorithm to determine whether its characters have been generated by a single i.i.d. random source. More specifically, consider all possible n-coin models for generating a binary string S, where each bit of S is generated via an independent toss of one of the n coins in the model. The choice of which coin to toss is decided by a random walk on the set of coins where the probability of a coin change is much lower than the probability of using the same coin repeatedly. We present a procedure to evaluate the likelihood of an n-coin model for given S, subject to a uniform prior distribution over the parameters of the model (which represent mutation rates and probabilities of copying events). In the absence of detailed prior knowledge of these parameters, the algorithm can be used to determine whether the a posteriori probability for n=1 is higher than for any other n>1. Our algorithm runs in time O(l^4 log l), where l is the length of S, through a dynamic programming approach which exploits the assumed convexity of the a posteriori probability for n. Our test can be used in the analysis of long alignments between pairs of genomic sequences in a number of ways. For example, functional regions in genome sequences exhibit much lower mutation rates than non-functional regions. Because our test provides means for determining variations in the mutation rate, it may be used to distinguish functional regions from non-functional ones. Another application is in determining whether two highly similar, thus evolutionarily related, genome segments are the result of a single copy event or of a complex series of copy events. This is particularly an issue in evolutionary studies of genome regions rich with repeat segments (especially tandemly repeated segments).

  13. Dictionary Based Segmentation in Volumes

    DEFF Research Database (Denmark)

    Emerson, Monica Jane; Jespersen, Kristine Munk; Jørgensen, Peter Stanley

    Method for supervised segmentation of volumetric data. The method is trained from manual annotations, and these annotations make the method very flexible, which we demonstrate in our experiments. Our method infers label information locally by matching the pattern in a neighborhood around a voxel to a dictionary, and hereby accounts for the volume texture.

  14. Multiple Segmentation of Image Stacks

    DEFF Research Database (Denmark)

    Smets, Jonathan; Jaeger, Manfred

    2014-01-01

    We propose a method for the simultaneous construction of multiple image segmentations by combining a recently proposed “convolution of mixtures of Gaussians” model with a multi-layer hidden Markov random field structure. The resulting method constructs for a single image several alternative...

  15. Segmenting Trajectories by Movement States

    NARCIS (Netherlands)

    Buchin, M.; Kruckenberg, H.; Kölzsch, A.; Timpf, S.; Laube, P.

    2013-01-01

    Dividing movement trajectories according to different movement states of animals has become a challenge in movement ecology, as well as in algorithm development. In this study, we revisit and extend a framework for trajectory segmentation based on spatio-temporal criteria for this purpose. We adapt

  16. Segmental Colitis Complicating Diverticular Disease

    Directory of Open Access Journals (Sweden)

    Guido Ma Van Rosendaal

    1996-01-01

    Two cases of idiopathic colitis affecting the sigmoid colon in elderly patients with underlying diverticulosis are presented. Segmental resection has permitted close review of the histopathology in this syndrome, which demonstrates considerable similarity to changes seen in idiopathic ulcerative colitis. The reported experience with this syndrome and its clinical features are reviewed.

  17. Leaf segmentation in plant phenotyping

    NARCIS (Netherlands)

    Scharr, Hanno; Minervini, Massimo; French, Andrew P.; Klukas, Christian; Kramer, David M.; Liu, Xiaoming; Luengo, Imanol; Pape, Jean Michel; Polder, Gerrit; Vukadinovic, Danijela; Yin, Xi; Tsaftaris, Sotirios A.

    2016-01-01

    Image-based plant phenotyping is a growing application area of computer vision in agriculture. A key task is the segmentation of all individual leaves in images. Here we focus on the most common rosette model plants, Arabidopsis and young tobacco. Although leaves do share appearance and shape

  18. The Enhanced Segment Interconnect for FASTBUS data communications

    International Nuclear Information System (INIS)

    Machen, D.R.; Downing, R.W.; Kirsten, F.A.; Nelson, R.O.

    1987-01-01

    The Enhanced Segment Interconnect concept (ESI) for improved FASTBUS data communications is a development supported by the U.S. Department of Energy under the Small Business Innovation Research (SBIR) program. The ESI will contain both the Segment Interconnect (SI) Type S-1 and an optional buffered interconnect for store-and-forward data communications; fiber-optic-coupled serial ports will provide optional data paths. The ESI can be applied in large FASTBUS-implemented physics experiments whose data-set or data-transmission distance requirements dictate alternate approaches to data communications. This paper describes the functions of the ESI and the status of its development, now 25% complete.

  19. Croatian Analytical Terminology

    Directory of Open Access Journals (Sweden)

    Kastelan-Macan; M.

    2008-04-01

    Results of analytical research are necessary in all human activities. They are inevitable in making decisions in environmental chemistry, agriculture, forestry, veterinary medicine, the pharmaceutical industry, and biochemistry. Without analytical measurements the quality of materials and products cannot be assessed, so that analytical chemistry is an essential part of technical sciences and disciplines. The language of Croatian science, and analytical chemistry within it, was one of the goals of our predecessors. Due to the political situation, they did not succeed entirely, but for the scientists in independent Croatia this is a duty, because language is one of the most important features of the Croatian identity. The awareness of the need to introduce Croatian terminology was systematically developed in the second half of the 19th century, along with the founding of scientific societies and the wish of scientists to write their scientific works in Croatian, so that the results of their research might be applied in the economy. Many authors of textbooks from the 19th and the first half of the 20th century contributed to Croatian analytical terminology (F. Rački, B. Šulek, P. Žulić, G. Pexidr, J. Domac, G. Janeček, F. Bubanović, V. Njegovan and others). M. Deželić published the first systematic chemical terminology in 1940, adjusted to the IUPAC recommendations. In the second half of the 20th century, textbooks in classic analytical chemistry were written by V. Marjanović-Krajovan, M. Gyiketta-Ogrizek, S. Žilić and others. I. Filipović wrote the General and Inorganic Chemistry textbook and the Laboratory Handbook (in collaboration with P. Sabioncello) and contributed greatly to establishing the terminology in instrumental analytical methods. The sources of Croatian nomenclature in modern analytical chemistry today are translated textbooks by Skoog, West and Holler, as well as by Günzler and Gremlich, and original textbooks by S. Turina, Z.

  20. Analytic manifolds in uniform algebras

    International Nuclear Information System (INIS)

    Tonev, T.V.

    1988-12-01

    Here we extend Bear and Hile's result concerning the version of Bishop's famous theorem for one-dimensional analytic structures in two directions: to n-dimensional complex analytic manifolds, n>1, and to generalized analytic manifolds. 14 refs.

  1. [Segmentation of whole body bone SPECT image based on BP neural network].

    Science.gov (United States)

    Zhu, Chunmei; Tian, Lianfang; Chen, Ping; He, Yuanlie; Wang, Lifei; Ye, Guangchun; Mao, Zongyuan

    2007-10-01

    In this paper, a BP neural network is used to segment whole-body bone SPECT images so that lesion areas can be recognized automatically. Because of the uncertain characteristics of SPECT images, it is hard to achieve a good segmentation result if only the BP neural network is employed. Therefore, the segmentation process is divided into three steps: first, the optimal gray-threshold segmentation method is employed for preprocessing; then a BP neural network is used to roughly identify the lesions; and finally a template-matching method and a symmetry-removal program are adopted to delete the wrongly recognized areas.
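
    A loose sketch of the three-step pipeline (thresholding, BP network, symmetry check), with toy data and our own stand-ins for the features, training labels, and template matching:

        import numpy as np
        from skimage.filters import threshold_otsu
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(0)
        img = rng.random((64, 64)) * 0.5
        img[10:14, 40:44] += 1.0                                       # synthetic hot spot

        # Step 1: optimal gray-threshold segmentation keeps only hot candidate pixels.
        cand = np.argwhere(img > threshold_otsu(img))
        cand = cand[(cand.min(axis=1) > 0) & (cand.max(axis=1) < 63)]  # stay off borders

        # Step 2: a BP (multilayer perceptron) network classifies the 3x3
        # neighbourhood of each candidate (trained here on toy labels).
        X_toy = rng.random((200, 9))
        y_toy = (X_toy.mean(axis=1) > 0.5).astype(int)
        bp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
        bp.fit(X_toy, y_toy)
        patches = np.array([img[y-1:y+2, x-1:x+2].ravel() for y, x in cand])
        lesion = cand[bp.predict(patches) == 1]

        # Step 3 (simplified symmetry check): discard hits whose mirror pixel is also
        # hot, since the skeleton is roughly left-right symmetric but lesions are not.
        thr = threshold_otsu(img)
        lesion = [(y, x) for y, x in lesion if img[y, 63 - x] < thr]
        print(len(lesion), "candidate lesion pixels retained")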

  2. Conflation of Short Identity-by-Descent Segments Bias Their Inferred Length Distribution

    Directory of Open Access Journals (Sweden)

    Charleston W. K. Chiang

    2016-05-01

    Identity-by-descent (IBD) is a fundamental concept in genetics with many applications. In a common definition, two haplotypes are said to share an IBD segment if that segment is inherited from a recent shared common ancestor without intervening recombination. Segments several cM long can be efficiently detected by a number of algorithms using high-density SNP array data from a population sample, and there are currently efforts to detect shorter segments from sequencing. Here, we study a problem of identifiability: because existing approaches detect IBD based on contiguous segments of identity-by-state, inferred long segments of IBD may arise from the conflation of smaller, nearby IBD segments. We quantified this effect using coalescent simulations, finding that significant proportions of inferred segments 1-2 cM long are results of conflations of two or more shorter segments, each at least 0.2 cM or longer, under demographic scenarios typical for modern humans, for all programs tested. The impact of such conflation is much smaller for longer (> 2 cM) segments. This biases the inferred IBD segment length distribution, and so can affect downstream inferences that depend on the assumption that each segment of IBD derives from a single common ancestor. As an example, we present and analyze an estimator of the de novo mutation rate using IBD segments, and demonstrate that unmodeled conflation leads to underestimates of the ages of the common ancestors on these segments, and hence a significant overestimate of the mutation rate. Understanding the conflation effect in detail will make its correction in future methods more tractable.

  3. Analytical Thinking, Analytical Action: Using Prelab Video Demonstrations and e-Quizzes to Improve Undergraduate Preparedness for Analytical Chemistry Practical Classes

    Science.gov (United States)

    Jolley, Dianne F.; Wilson, Stephen R.; Kelso, Celine; O'Brien, Glennys; Mason, Claire E.

    2016-01-01

    This project utilizes visual and critical thinking approaches to develop a higher-education synergistic prelab training program for a large second-year undergraduate analytical chemistry class, directing more of the cognitive learning to the prelab phase. This enabled students to engage in more analytical thinking prior to engaging in the…

  4. Digital collection of photographic surveys of beach profiles and animals taken as part of the Beach Watch program at Pinnacle Gulch (segment 1-07), California from 2000-09-30 to 2001-06-25 (NCEI Accession 0071540)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — NOAA's Gulf of the Farallones National Marine Sanctuary (GFNMS) Beach Watch Program, administered by the Farallones Marine Sanctuary Association (FMSA), is a...

  5. APPLICATION OF SEGMENT ANALYSIS FOR THE DEVELOPMENT STRATEGY OF AN EDUCATIONAL INSTITUTION

    Directory of Open Access Journals (Sweden)

    G. V. Alekseev

    2015-01-01

    Methods currently used for shaping the development strategy of an educational institution do not always objectively take into account the mutual influence and succession of its separate structural and organizational blocks in the educational process, in particular work with applicants. The article discusses the possibility of using segment analysis to develop a strategy for an educational institution, with the aim of increasing the supply of trained specialists to the labour market of the real sector of the economy. It describes how the choice of marketing methods can be formalized within the framework of stochastic programming and fuzzy logic, a generalization of classical set theory and classical formal logic. The main reason for using this approach is the presence of ill-defined and approximate notions in descriptions of applicants' preferences, of the quality of education, and consequently of the mission of the educational institution. The fuzzy approach to modeling complex systems, which has gained recognition worldwide, contributes significantly to solving these problems; on the basis of a balanced marketing approach, segment analysis, and baseline expert estimation, a corresponding computer program implementing these approaches is developed.

  6. Testing program for burning plasma experiment vacuum vessel bolted joint

    International Nuclear Information System (INIS)

    Hsueh, P.K.; Khan, M.Z.; Swanson, J.; Feng, T.; Dinkevich, S.; Warren, J.

    1992-01-01

    As presently designed, the Burning Plasma Experiment vacuum vessel will be segmentally fabricated and assembled by bolted joints in the field. Due to geometry constraints, most of the bolted joints have significant eccentricity, which causes the joint behavior to be sensitive to joint clamping forces. Experience indicates that, as a result of this eccentricity, the joint will tend to open at the side closest to the applied load, with the extent of the opening being dependent on the initial preload. In this paper, analytical models coupled with a confirmatory testing program are developed to investigate and predict the non-linear behavior of the vacuum vessel bolted joint

  7. Nyheder i SAS Analytics 14.2

    DEFF Research Database (Denmark)

    Milhøj, Anders

    2017-01-01

    In November 2016, the updated version 14.2 of the Analytical Products suite was released. This update covers the analytical program packages for statistics, econometrics, operations research, etc. These updates are now decoupled from simultaneous updates of the overall SAS program...

  8. Analytical aids in land management planning

    Science.gov (United States)

    David R. Betters

    1978-01-01

    Quantitative techniques may be applied to aid in completing various phases of land management planning. Analytical procedures which have been used include a procedure for public involvement, PUBLIC; a matrix information generator, MAGE5; an allocation procedure, linear programming (LP); and an input-output economic analysis (EA). These techniques have proven useful in...

  9. Cultivating Institutional Capacities for Learning Analytics

    Science.gov (United States)

    Lonn, Steven; McKay, Timothy A.; Teasley, Stephanie D.

    2017-01-01

    This chapter details the process the University of Michigan developed to build institutional capacity for learning analytics. A symposium series, faculty task force, fellows program, research grants, and other initiatives are discussed, with lessons learned for future efforts and how other institutions might adapt such efforts to spur cultural…

  10. Information theory in analytical chemistry

    National Research Council Canada - National Science Library

    Eckschlager, Karel; Danzer, Klaus

    1994-01-01

    Contents: The aim of analytical chemistry - Basic concepts of information theory - Identification of components - Qualitative analysis - Quantitative analysis - Multicomponent analysis - Optimum analytical...

  11. Competing on talent analytics.

    Science.gov (United States)

    Davenport, Thomas H; Harris, Jeanne; Shapiro, Jeremy

    2010-10-01

    Do investments in your employees actually affect workforce performance? Who are your top performers? How can you empower and motivate other employees to excel? Leading-edge companies such as Google, Best Buy, Procter & Gamble, and Sysco use sophisticated data-collection technology and analysis to answer these questions, leveraging a range of analytics to improve the way they attract and retain talent, connect their employee data to business performance, differentiate themselves from competitors, and more. The authors present the six key ways in which companies track, analyze, and use data about their people, ranging from a simple baseline of metrics to monitor the organization's overall health to custom modeling for predicting future head count depending on various "what if" scenarios. They go on to show that companies competing on talent analytics manage data and technology at an enterprise level, support what analytical leaders do, choose realistic targets for analysis, and hire analysts with strong interpersonal skills as well as broad expertise.

  12. Advanced business analytics

    CERN Document Server

    Lev, Benjamin

    2015-01-01

    The book describes advanced business analytics and shows how to apply them to many different professional areas of engineering and management. Each chapter of the book is contributed by a different author and covers a different area of business analytics. The book connects the analytic principles with business practice and provides an interface between the main disciplines of engineering/technology and the organizational, administrative and planning abilities of management. It also refers to other disciplines such as economics, finance, marketing, behavioral economics and risk analysis. This book is of special interest to engineers, economists and researchers who are developing new advances in engineering management but also to practitioners working on this subject.

  13. Analytic number theory

    CERN Document Server

    Iwaniec, Henryk

    2004-01-01

    Analytic Number Theory distinguishes itself by the variety of tools it uses to establish results, many of which belong to the mainstream of arithmetic. One of the main attractions of analytic number theory is the vast diversity of concepts and methods it includes. The main goal of the book is to show the scope of the theory, both in classical and modern directions, and to exhibit its wealth and prospects, its beautiful theorems and powerful techniques. The book is written with graduate students in mind, and the authors tried to balance between clarity, completeness, and generality. The exercis

  14. Social network data analytics

    CERN Document Server

    Aggarwal, Charu C

    2011-01-01

    Social network analysis applications have experienced tremendous advances within the last few years due in part to increasing trends towards users interacting with each other on the internet. Social networks are organized as graphs, and the data on social networks takes on the form of massive streams, which are mined for a variety of purposes. Social Network Data Analytics covers an important niche in the social network analytics field. This edited volume, contributed by prominent researchers in this field, presents a wide selection of topics on social network data mining such as Structural Pr

  15. News for analytical chemists

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov; Karlberg, Bo

    2009-01-01

    welfare. In conjunction with the meeting of the steering committee in Tallinn, Estonia, in April, Mihkel Kaljurand and Mihkel Koel of Tallinn University of Technology organised a successful symposium attended by 51 participants. The symposium illustrated the scientific work of the steering committee, directed to various topics of analytical chemistry. Although affected by the global financial crisis, the Euroanalysis Conference will be held on 6 to 10 September in Innsbruck, Austria. For next year, the programme for the analytical section of the 3rd European Chemistry Congress is in preparation...

  16. Foundations of predictive analytics

    CERN Document Server

    Wu, James

    2012-01-01

    Drawing on the authors' two decades of experience in applied modeling and data mining, Foundations of Predictive Analytics presents the fundamental background required for analyzing data and building models for many practical applications, such as consumer behavior modeling, risk and marketing analytics, and other areas. It also discusses a variety of practical topics that are frequently missing from similar texts. The book begins with the statistical and linear algebra/matrix foundation of modeling methods, from distributions to cumulant and copula functions to Cornish-Fisher expansion and o

  17. Analytical Chemistry Laboratory: Progress report for FY 1988

    International Nuclear Information System (INIS)

    Green, D.W.; Heinrich, R.R.; Graczyk, D.G.; Lindahl, P.C.; Erickson, M.D.

    1988-12-01

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for fiscal year 1988 (October 1987 through September 1988). The Analytical Chemistry Laboratory is a full-cost recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. In addition, the ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques

  18. Analytical Chemistry Laboratory: Progress report for FY 1988

    Energy Technology Data Exchange (ETDEWEB)

    Green, D.W.; Heinrich, R.R.; Graczyk, D.G.; Lindahl, P.C.; Erickson, M.D.

    1988-12-01

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for fiscal year 1988 (October 1987 through September 1988). The Analytical Chemistry Laboratory is a full-cost recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. In addition, the ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques.

  19. Analytical Chemistry Laboratory progress report for FY 1991

    Energy Technology Data Exchange (ETDEWEB)

    Green, D.W.; Heinrich, R.R.; Graczyk, D.G.; Lindahl, P.C.; Boparai, A.S.

    1991-12-01

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year 1991 (October 1990 through September 1991). This is the eighth annual report for the ACL. The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. In addition, the ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques.

  20. Analytical Chemistry Laboratory progress report for FY 1989

    International Nuclear Information System (INIS)

    Green, D.W.; Heinrich, R.R.; Graczyk, D.G.; Lindahl, P.C.; Erickson, M.D.

    1989-12-01

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year 1989 (October 1988 through September 1989). The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. In addition, the ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques

  1. Leaf sequencing algorithms for segmented multileaf collimation

    International Nuclear Information System (INIS)

    Kamath, Srijit; Sahni, Sartaj; Li, Jonathan; Palta, Jatinder; Ranka, Sanjay

    2003-01-01

    The delivery of intensity-modulated radiation therapy (IMRT) with a multileaf collimator (MLC) requires the conversion of a radiation fluence map into a leaf sequence file that controls the movement of the MLC during radiation delivery. It is imperative that the fluence map delivered using the leaf sequence file is as close as possible to the fluence map generated by the dose optimization algorithm, while satisfying hardware constraints of the delivery system. Optimization of the leaf sequencing algorithm has been the subject of several recent investigations. In this work, we present a systematic study of the optimization of leaf sequencing algorithms for segmental multileaf collimator beam delivery and provide rigorous mathematical proofs of optimized leaf sequence settings in terms of monitor unit (MU) efficiency under most common leaf movement constraints that include minimum leaf separation constraint and leaf interdigitation constraint. Our analysis shows that leaf sequencing based on unidirectional movement of the MLC leaves is as MU efficient as bidirectional movement of the MLC leaves
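    As an illustration of the unidirectional ("sweep") delivery discussed above, the sketch below (a standard textbook construction, not necessarily the authors' algorithm) sequences one leaf pair for a 1D integer fluence profile; the monitor-unit count of such a sweep equals the sum of the profile's positive gradients.

        # Unidirectional sweep for one MLC leaf pair (illustrative names).
        # Point x is exposed during the MU window [open_, close), so the
        # delivered fluence at x is close - open_ == fluence[x].
        def sweep_sequence(fluence):
            f = [0] + list(fluence) + [0]
            inc = [max(f[i] - f[i - 1], 0) for i in range(1, len(f))]
            dec = [max(f[i - 1] - f[i], 0) for i in range(1, len(f))]
            windows, close, open_ = [], 0, 0
            for x in range(len(fluence)):
                close += inc[x]   # trailing leaf covers x at MU `close`
                open_ += dec[x]   # leading leaf uncovers x at MU `open_`
                windows.append((open_, close))
            return sum(inc), windows  # total MU = sum of positive gradients

        print(sweep_sequence([1, 3, 2, 4]))
        # -> (5, [(0, 1), (0, 3), (1, 3), (1, 5)])

    Both leaf trajectories are nondecreasing, so the leaves move in one direction only; for the sample profile the sweep needs 5 MU against a peak fluence of 4, and the abstract's result is that bidirectional schedules are no more MU-efficient.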

  2. Leaf sequencing algorithms for segmented multileaf collimation

    Energy Technology Data Exchange (ETDEWEB)

    Kamath, Srijit [Department of Computer and Information Science and Engineering, University of Florida, Gainesville, FL (United States); Sahni, Sartaj [Department of Computer and Information Science and Engineering, University of Florida, Gainesville, FL (United States); Li, Jonathan [Department of Radiation Oncology, University of Florida, Gainesville, FL (United States); Palta, Jatinder [Department of Radiation Oncology, University of Florida, Gainesville, FL (United States); Ranka, Sanjay [Department of Computer and Information Science and Engineering, University of Florida, Gainesville, FL (United States)

    2003-02-07

    The delivery of intensity-modulated radiation therapy (IMRT) with a multileaf collimator (MLC) requires the conversion of a radiation fluence map into a leaf sequence file that controls the movement of the MLC during radiation delivery. It is imperative that the fluence map delivered using the leaf sequence file is as close as possible to the fluence map generated by the dose optimization algorithm, while satisfying hardware constraints of the delivery system. Optimization of the leaf sequencing algorithm has been the subject of several recent investigations. In this work, we present a systematic study of the optimization of leaf sequencing algorithms for segmental multileaf collimator beam delivery and provide rigorous mathematical proofs of optimized leaf sequence settings in terms of monitor unit (MU) efficiency under most common leaf movement constraints that include minimum leaf separation constraint and leaf interdigitation constraint. Our analysis shows that leaf sequencing based on unidirectional movement of the MLC leaves is as MU efficient as bidirectional movement of the MLC leaves.

  3. NPOESS Interface Data Processing Segment Product Generation

    Science.gov (United States)

    Grant, K. D.

    2009-12-01

    The National Oceanic and Atmospheric Administration (NOAA), Department of Defense (DoD), and National Aeronautics and Space Administration (NASA) are jointly acquiring the next-generation weather and environmental satellite system; the National Polar-orbiting Operational Environmental Satellite System (NPOESS). NPOESS replaces the current Polar-orbiting Operational Environmental Satellites (POES) managed by NOAA and the Defense Meteorological Satellite Program (DMSP) managed by the DoD. The NPOESS satellites carry a suite of sensors that collect meteorological, oceanographic, climatological, and solar-geophysical observations of the earth, atmosphere, and space. The NPOESS design allows centralized mission management and delivers high quality environmental products to military, civil and scientific users. The ground data processing segment for NPOESS is the Interface Data Processing Segment (IDPS), developed by Raytheon Intelligence and Information Systems. The IDPS processes NPOESS satellite data to provide environmental data products to NOAA and DoD processing centers operated by the United States government. The IDPS will process environmental data products beginning with the NPOESS Preparatory Project (NPP) and continuing through the lifetime of the NPOESS system. Within the overall NPOESS processing environment, the IDPS must process a data volume nearly 1000 times the size of current systems -- in one-quarter of the time. Further, it must support the calibration, validation, and data quality improvement initiatives of the NPOESS program to ensure the production of atmospheric and environmental products that meet strict requirements for accuracy and precision. This paper will describe the architecture approach that is necessary to meet these challenging, and seemingly exclusive, NPOESS IDPS design requirements, with a focus on the processing relationships required to generate the NPP products.

  4. NPOESS Interface Data Processing Segment (IDPS) Hardware

    Science.gov (United States)

    Sullivan, W. J.; Grant, K. D.; Bergeron, C.

    2008-12-01

    The National Oceanic and Atmospheric Administration (NOAA), Department of Defense (DoD), and National Aeronautics and Space Administration (NASA) are jointly acquiring the next-generation weather and environmental satellite system; the National Polar-orbiting Operational Environmental Satellite System (NPOESS). NPOESS replaces the current Polar-orbiting Operational Environmental Satellites (POES) managed by NOAA and the Defense Meteorological Satellite Program (DMSP) managed by the DoD. The NPOESS satellites carry a suite of sensors that collect meteorological, oceanographic, climatological, and solar-geophysical observations of the earth, atmosphere, and space. The NPOESS design allows centralized mission management and delivers high quality environmental products to military, civil and scientific users. The ground data processing segment for NPOESS is the Interface Data Processing Segment (IDPS), developed by Raytheon Intelligence and Information Systems. IDPS processes NPOESS satellite data to provide environmental data products to NOAA and DoD processing centers operated by the United States government. IDPS will process environmental data products beginning with the NPOESS Preparatory Project (NPP) and continuing through the lifetime of the NPOESS system. Within the overall NPOESS processing environment, the IDPS must process a data volume several orders of magnitude larger than current systems -- in one-quarter of the time. Further, it must support the calibration, validation, and data quality improvement initiatives of the NPOESS program to ensure the production of atmospheric and environmental products that meet strict requirements for accuracy and precision. This poster will illustrate and describe the IDPS HW architecture that is necessary to meet these challenging design requirements. In addition, it will illustrate the expandability features of the architecture in support of future data processing and data distribution needs.

  5. Automated image segmentation using information theory

    International Nuclear Information System (INIS)

    Hibbard, L.S.

    2001-01-01

    Full text: Our development of automated contouring of CT images for RT planning is based on maximum a posteriori (MAP) analyses of region textures, edges, and prior shapes, and assumes stationary Gaussian distributions for voxel textures and contour shapes. Since models may not accurately represent image data, it would be advantageous to compute inferences without relying on models. The relative entropy (RE) from information theory can generate inferences based solely on the similarity of probability distributions. The entropy of the distribution of a random variable X is defined as -Σ_x p(x) log2 p(x), over all values x that X may assume. The RE (Kullback-Leibler divergence) of two distributions p(X), q(X) over X is Σ_x p(x) log2 [p(x)/q(x)]. The RE is a kind of 'distance' between p and q, equaling zero when p = q and increasing as p and q become more different. Minimum-error MAP and likelihood-ratio decision rules have RE equivalents: minimum-error decisions are obtained with functions of the differences between the REs of the compared distributions. One applied result is that the contour ideally separating two regions is the one that maximizes the relative entropy of the two regions' intensities. A program was developed that automatically contours the outlines of patients in stereotactic headframes, a situation that most often requires manual drawing. The relative entropy of intensities inside the contour (patient) versus outside (background) was maximized by conjugate gradient descent over the space of parameters of a deformable contour. The computed segmentation separates a patient from headframe backgrounds. This program is particularly useful for preparing images for multimodal image fusion. Relative entropy and allied measures of distribution similarity provide automated contouring criteria that do not depend on statistical models of image data. This approach should have wide utility in medical image segmentation applications. Copyright (2001) Australasian College of Physical Scientists and
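    A minimal sketch of the relative-entropy criterion (an assumption: it substitutes a simple threshold partition for the paper's deformable contour, which is optimized by conjugate gradients): pick the partition whose inside and outside intensity distributions are most dissimilar.

        import numpy as np

        def relative_entropy(p, q, eps=1e-12):
            # Kullback-Leibler divergence D(p || q), in bits.
            p, q = p + eps, q + eps
            return float(np.sum(p * np.log2(p / q)))

        def best_threshold(image, bins=64):
            # Brute-force the cut whose two regions differ the most.
            edges = np.linspace(image.min(), image.max(), bins + 1)
            best_t, best_re = None, -np.inf
            for t in edges[1:-1]:
                inside, outside = image[image <= t], image[image > t]
                if inside.size == 0 or outside.size == 0:
                    continue
                p, _ = np.histogram(inside, bins=edges)
                q, _ = np.histogram(outside, bins=edges)
                re = relative_entropy(p / p.sum(), q / q.sum())
                if re > best_re:
                    best_t, best_re = t, re
            return best_t

        rng = np.random.default_rng(0)
        img = np.concatenate([rng.normal(40, 5, 500), rng.normal(120, 10, 500)])
        print(best_threshold(img))  # lands between the two intensity modes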

  6. Comparing genomes with rearrangements and segmental duplications.

    Science.gov (United States)

    Shao, Mingfu; Moret, Bernard M E

    2015-06-15

    Large-scale evolutionary events such as genomic rearrangements and segmental duplications form an important part of the evolution of genomes and are widely studied from both biological and computational perspectives. A basic computational problem is to infer these events in the evolutionary history for given modern genomes, a task for which many algorithms have been proposed under various constraints. Algorithms that can handle both rearrangements and content-modifying events such as duplications and losses remain few and limited in their applicability. We study the comparison of two genomes under a model including general rearrangements (through double-cut-and-join) and segmental duplications. We formulate the comparison as an optimization problem and describe an exact algorithm to solve it by using an integer linear program. We also devise a sufficient condition and an efficient algorithm to identify optimal substructures, which can simplify the problem while preserving optimality. Using the optimal substructures with the integer linear program (ILP) formulation yields a practical and exact algorithm to solve the problem. We then apply our algorithm to assign in-paralogs and orthologs (a necessary step in handling duplications) and compare its performance with that of the state-of-the-art method MSOAR, using both simulations and real data. On simulated datasets, our method outperforms MSOAR by a significant margin, and on five well-annotated species, MSOAR achieves high accuracy, yet our method performs slightly better on each of the 10 pairwise comparisons. http://lcbb.epfl.ch/softwares/coser. © The Author 2015. Published by Oxford University Press.

  7. A Cautionary Analysis of STAPLE Using Direct Inference of Segmentation Truth

    DEFF Research Database (Denmark)

    Van Leemput, Koen; Sabuncu, Mert R.

    2014-01-01

    In this paper we analyze the properties of the well-known segmentation fusion algorithm STAPLE, using a novel inference technique that analytically marginalizes out all model parameters. We demonstrate both theoretically and empirically that when the number of raters is large, or when consensus r...

  8. Business Analytics in Practice and in Education: A Competency-Based Perspective

    Science.gov (United States)

    Mamonov, Stanislav; Misra, Ram; Jain, Rashmi

    2015-01-01

    Business analytics is a fast-growing area in practice. The rapid growth of business analytics in practice in the recent years is mirrored by a corresponding fast evolution of new educational programs. While more than 130 graduate and undergraduate degree programs in business analytics have been launched in the past 5 years, no commonly accepted…

  9. Human body segmentation via data-driven graph cut.

    Science.gov (United States)

    Li, Shifeng; Lu, Huchuan; Shao, Xingqing

    2014-11-01

    Human body segmentation is a challenging and important problem in computer vision. Existing methods usually entail a time-consuming training phase for prior knowledge learning, with complex shape matching for body segmentation. In this paper, we propose a data-driven method that integrates top-down body pose information and bottom-up low-level visual cues for segmenting humans in static images within the graph cut framework. The key idea of our approach is first to exploit human kinematics to search for body part candidates via dynamic programming, providing high-level evidence; body-part classifiers then supply bottom-up cues of human body distribution as low-level evidence. All the evidence collected from the top-down and bottom-up procedures is integrated in a graph cut framework for human body segmentation. Qualitative and quantitative experiment results demonstrate the merits of the proposed method in segmenting human bodies with arbitrary poses from cluttered backgrounds.
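    As a toy illustration of the fusion step, the sketch below (an assumption, not the authors' implementation) casts binary foreground/background labeling as an s-t minimum cut over a 1D pixel chain: unary costs stand in for the top-down and bottom-up evidence, and a pairwise term encourages smooth labels.

        import networkx as nx

        def graph_cut_segment(cost_fg, cost_bg, smoothness=1.0):
            # Pixels ending on the source side of the min cut are "fg".
            n = len(cost_fg)
            g = nx.DiGraph()
            for i in range(n):
                g.add_edge("s", i, capacity=cost_bg[i])  # paid iff i is bg
                g.add_edge(i, "t", capacity=cost_fg[i])  # paid iff i is fg
            for i in range(n - 1):  # smoothness between neighbours, both ways
                g.add_edge(i, i + 1, capacity=smoothness)
                g.add_edge(i + 1, i, capacity=smoothness)
            _, (source_side, _) = nx.minimum_cut(g, "s", "t")
            return ["fg" if i in source_side else "bg" for i in range(n)]

        print(graph_cut_segment([0.2, 0.1, 0.3, 0.9, 0.8, 0.9],
                                [0.9, 0.8, 0.7, 0.1, 0.2, 0.1]))
        # -> ['fg', 'fg', 'fg', 'bg', 'bg', 'bg']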

  10. Dictionary Based Segmentation in Volumes

    DEFF Research Database (Denmark)

    Emerson, Monica Jane; Jespersen, Kristine Munk; Jørgensen, Peter Stanley

    2015-01-01

    We present a method for supervised volumetric segmentation based on a dictionary of small cubes composed of pairs of intensity and label cubes. Intensity cubes are small image volumes where each voxel contains an image intensity. Label cubes are volumes with voxel-wise probabilities for a given label. The segmentation process is done by matching a cube from the volume, of the same size as the dictionary intensity cubes, to the most similar intensity dictionary cube; from the associated label cube we get voxel-wise label probabilities. Probabilities from overlapping cubes are averaged, and hereby we obtain a robust label probability encoding. The dictionary is computed from labeled volumetric image data based on weighted clustering. We experimentally demonstrate our method using two data sets from material science – a phantom data set of a solid oxide fuel cell simulation for detecting...
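    A small sketch of the matching-and-averaging step (shapes and names are assumptions, and 2D patches stand in for the paper's 3D cubes): match each patch to its nearest intensity atom, accumulate the paired label atom, and average over overlaps.

        import numpy as np

        def label_image(img, intensity_atoms, label_atoms, k=3):
            # For each kxk patch: find the nearest intensity atom and add
            # its paired label atom; average where patches overlap.
            h, w = img.shape
            prob = np.zeros((h, w))
            hits = np.zeros((h, w))
            flat = intensity_atoms.reshape(len(intensity_atoms), -1)
            for y in range(h - k + 1):
                for x in range(w - k + 1):
                    patch = img[y:y + k, x:x + k].ravel()
                    best = np.argmin(((flat - patch) ** 2).sum(axis=1))
                    prob[y:y + k, x:x + k] += label_atoms[best]
                    hits[y:y + k, x:x + k] += 1
            return prob / np.maximum(hits, 1)  # voxel-wise label probability

        rng = np.random.default_rng(0)
        atoms = rng.random((10, 3, 3))          # intensity "cubes"
        labels = (atoms > 0.5).astype(float)    # paired label "cubes"
        print(label_image(rng.random((8, 8)), atoms, labels).shape)  # (8, 8)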

  11. Compliance with Segment Disclosure Initiatives

    DEFF Research Database (Denmark)

    Arya, Anil; Frimor, Hans; Mittendorf, Brian

    2013-01-01

    Regulatory oversight of capital markets has intensified in recent years, with a particular emphasis on expanding financial transparency. A notable instance is efforts by the Financial Accounting Standards Board that push firms to identify and report performance of individual business units (segments). This paper seeks to address short-run and long-run consequences of stringent enforcement of and uniform compliance with these segment disclosure standards. To do so, we develop a parsimonious model wherein a regulatory agency promulgates disclosure standards and either permits voluntary ... by increasing transparency and leveling the playing field. However, our analysis also demonstrates that in the long run, if firms are unable to use discretion in reporting to maintain their competitive edge, they may seek more destructive alternatives. Accounting for such concerns, in the long run, voluntary...

  12. Segmental osteotomies of the maxilla.

    Science.gov (United States)

    Rosen, H M

    1989-10-01

    Multiple segment Le Fort I osteotomies provide the maxillofacial surgeon with the capabilities to treat complex dentofacial deformities existing in all three planes of space. Sagittal, vertical, and transverse maxillomandibular discrepancies as well as three-dimensional abnormalities within the maxillary arch can be corrected simultaneously. Accordingly, optimal aesthetic enhancement of the facial skeleton and a functional, healthy occlusion can be realized. What may be perceived as elaborate treatment plans are in reality conservative in terms of osseous stability and treatment time required. The close cooperation of an orthodontist well-versed in segmental orthodontics and orthognathic surgery is critical to the success of such surgery. With close attention to surgical detail, the complication rate inherent in such surgery can be minimized and the treatment goals achieved in a timely and predictable fashion.

  13. Analyzing Array Manipulating Programs by Program Transformation

    Science.gov (United States)

    Cornish, J. Robert M.; Gange, Graeme; Navas, Jorge A.; Schachte, Peter; Sondergaard, Harald; Stuckey, Peter J.

    2014-01-01

    We explore a transformational approach to the problem of verifying simple array-manipulating programs. Traditionally, verification of such programs requires intricate analysis machinery to reason with universally quantified statements about symbolic array segments, such as "every data item stored in the segment A[i] to A[j] is equal to the corresponding item stored in the segment B[i] to B[j]." We define a simple abstract machine which allows for set-valued variables and we show how to translate programs with array operations to array-free code for this machine. For the purpose of program analysis, the translated program remains faithful to the semantics of array manipulation. Based on our implementation in LLVM, we evaluate the approach with respect to its ability to extract useful invariants and the cost in terms of code size.

  14. Segmented fuel and moderator rod

    International Nuclear Information System (INIS)

    Doshi, P.K.

    1987-01-01

    This patent describes a continuous segmented fuel and moderator rod for use with a water cooled and moderated nuclear fuel assembly. The rod comprises: a lower fuel region containing a column of nuclear fuel; a moderator region, disposed axially above the fuel region. The moderator region has means for admitting and passing the water moderator therethrough for moderating an upper portion of the nuclear fuel assembly. The moderator region is separated from the fuel region by a watertight separator

  15. Segmentation of sows in farrowing pens

    DEFF Research Database (Denmark)

    Tu, Gang Jun; Karstoft, Henrik; Pedersen, Lene Juul

    2014-01-01

    The correct segmentation of a foreground object in video recordings is an important task for many surveillance systems. The development of an effective and practical algorithm to segment sows in grayscale video recordings captured under commercial production conditions is described...

  16. Tank 241-S-106, cores 183, 184 and 187 analytical results for the final report

    International Nuclear Information System (INIS)

    Esch, R.A.

    1997-01-01

    This document is the final laboratory report for tank 241-S-106 push mode core segments collected between February 12, 1997 and March 21, 1997. The segments were subsampled and analyzed in accordance with the Tank Push Mode Core Sampling and Analysis Plan (TSAP), the Tank Safety Screening Data Quality Objective (Safety DQO), the Historical Model Evaluation Data Requirements (Historical DQO) and the Data Quality Objective to Support Resolution of the Organic Complexant Safety Issue (Organic DQO). The analytical results are included in Table 1. Six of the twenty-four subsamples submitted for the differential scanning calorimetry (DSC) analysis exceeded the notification limit of 480 Joules/g stated in the DQO. Appropriate notifications were made. Total Organic Carbon (TOC) analyses were performed on all samples that produced exotherms during the DSC analysis. All results were less than the notification limit of three weight percent TOC. No cyanide analysis was performed, per agreement with the Tank Safety Program. None of the samples submitted for Total Alpha Activity exceeded notification limits as stated in the TSAP. Statistical evaluation of results by calculating the 95% upper confidence limit is not performed by the 222-S Laboratory and is not considered in this report. No core composites were created because there was insufficient solid material from any of the three core sampling events to generate a composite that would be representative of the tank contents
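    The screening logic described above can be summarized in a short, hypothetical helper; the 480 J/g and 3 wt% limits come from the text, while the field names and data values are invented for illustration.

        DSC_LIMIT_J_PER_G = 480.0  # safety DQO notification limit
        TOC_LIMIT_WT_PCT = 3.0     # total organic carbon notification limit

        def screen(samples):
            # samples: dicts with 'id', 'dsc_j_per_g', 'toc_wt_pct'.
            notifications = []
            for s in samples:
                if s["dsc_j_per_g"] > DSC_LIMIT_J_PER_G:
                    notifications.append((s["id"], "DSC exotherm > 480 J/g"))
                if s["dsc_j_per_g"] > 0 and s["toc_wt_pct"] > TOC_LIMIT_WT_PCT:
                    notifications.append((s["id"], "TOC > 3 wt%"))
            return notifications

        # Invented data: one subsample exceeding the DSC limit.
        print(screen([{"id": "S106-1", "dsc_j_per_g": 512.0, "toc_wt_pct": 1.2}]))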

  17. Segmentation in local hospital markets.

    Science.gov (United States)

    Dranove, D; White, W D; Wu, L

    1993-01-01

    This study examines evidence of market segmentation on the basis of patients' insurance status, demographic characteristics, and medical condition in selected local markets in California in the years 1983 and 1989. Substantial differences exist in the probability patients may be admitted to particular hospitals based on insurance coverage, particularly Medicaid, and race. Segmentation based on insurance and race is related to hospital characteristics, but not the characteristics of the hospital's community. Medicaid patients are more likely to go to hospitals with lower costs and fewer service offerings. Privately insured patients go to hospitals offering more services, although cost concerns are increasing. Hispanic patients also go to low-cost hospitals, ceteris paribus. Results indicate little evidence of segmentation based on medical condition in either 1983 or 1989, suggesting that "centers of excellence" have yet to play an important role in patient choice of hospital. The authors found that distance matters, and that patients prefer nearby hospitals, moreso for some medical conditions than others, in ways consistent with economic theories of consumer choice.

  18. Analytical system availability techniques

    NARCIS (Netherlands)

    Brouwers, J.J.H.; Verbeek, P.H.J.; Thomson, W.R.

    1987-01-01

    Analytical techniques are presented to assess the probability distributions and related statistical parameters of loss of production from equipment networks subject to random failures and repairs. The techniques are based on a theoretical model for system availability, which was further developed
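    A minimal sketch of the kind of analytic availability calculation involved (an assumption based on standard reliability theory, not the report's specific model): the steady-state availability of a repairable unit is MTBF/(MTBF + MTTR), and independent units combine multiplicatively in series and complementarily in parallel.

        def availability(mtbf, mttr):
            # Steady-state availability of one repairable unit.
            return mtbf / (mtbf + mttr)

        def series(avs):       # all units must be up
            out = 1.0
            for a in avs:
                out *= a
            return out

        def parallel(avs):     # at least one unit must be up
            out = 1.0
            for a in avs:
                out *= 1.0 - a
            return 1.0 - out

        pump = availability(mtbf=2000.0, mttr=50.0)        # hours
        compressor = availability(mtbf=800.0, mttr=120.0)
        # One pump in series with two redundant compressors:
        print(series([pump, parallel([compressor, compressor])]))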

  19. Explanatory analytics in OLAP

    NARCIS (Netherlands)

    Caron, E.A.M.; Daniëls, H.A.M.

    2013-01-01

    In this paper the authors describe a method to integrate explanatory business analytics in OLAP information systems. This method supports the discovery of exceptional values in OLAP data and the explanation of such values by giving their underlying causes. OLAP applications offer a support tool for

  20. Analytical procedures. Pt. 1

    International Nuclear Information System (INIS)

    Weber, G.

    1985-01-01

    In analytical (Boolean) procedures there is necessarily a close relationship between the safety assessment and the reliability assessment of technical facilities. The paper gives an overview of the organization of models, fault trees, the probabilistic evaluation of systems, evaluation with minimal cut sets or minimal paths with regard to statistically dependent components, and of systems subject to different kinds of outages. (orig.) [de

  1. Ada & the Analytical Engine.

    Science.gov (United States)

    Freeman, Elisabeth

    1996-01-01

    Presents a brief history of Ada Byron King, Countess of Lovelace, focusing on her primary role in the development of the Analytical Engine--the world's first computer. Describes the Ada Project (TAP), a centralized World Wide Web site that serves as a clearinghouse for information related to women in computing, and provides a Web address for…

  2. User Behavior Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Turcotte, Melissa [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Moore, Juston Shane [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-02-28

    User Behaviour Analytics is the tracking, collecting and assessing of user data and activities. The goal is to detect misuse of user credentials by developing models for the normal behaviour of user credentials within a computer network and detect outliers with respect to their baseline.
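    A toy sketch of this idea (an assumption, not the LANL system): keep a per-credential baseline of which hosts a user normally touches, and score new events by their surprisal under that baseline.

        import math
        from collections import defaultdict

        class CredentialBaseline:
            def __init__(self):
                self.counts = defaultdict(lambda: defaultdict(int))
                self.totals = defaultdict(int)

            def observe(self, user, host):
                self.counts[user][host] += 1
                self.totals[user] += 1

            def surprise(self, user, host):
                # Surprisal (bits) of this user touching this host;
                # high values flag candidate credential misuse.
                n, total = self.counts[user][host], self.totals[user]
                p = (n + 1) / (total + 2)  # additive smoothing
                return -math.log2(p)

        b = CredentialBaseline()
        for _ in range(50):
            b.observe("alice", "mailserver")
        print(b.surprise("alice", "mailserver"))   # low: routine behaviour
        print(b.surprise("alice", "domain-ctrl"))  # high: never seen before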

  3. Of the Analytical Engine

    Indian Academy of Sciences (India)

    cloth will be woven all of one colour; but there will be a damask pattern upon it ... mathematical view of the Analytical Engine, and illustrate by example some of its ... be to verify the number of the card given it by subtracting its number from 2 3 ...

  4. Limitless Analytic Elements

    Science.gov (United States)

    Strack, O. D. L.

    2018-02-01

    We present equations for new limitless analytic line elements. These elements possess a virtually unlimited number of degrees of freedom. We apply these new limitless analytic elements to head-specified boundaries and to problems with inhomogeneities in hydraulic conductivity. Applications of these new analytic elements to practical problems involving head-specified boundaries require the solution of a very large number of equations. To make the new elements useful in practice, an efficient iterative scheme is required. We present an improved version of the scheme presented by Bandilla et al. (2007), based on the application of Cauchy integrals. The limitless analytic elements are useful when modeling strings of elements, rivers for example, where local conditions are difficult to model, e.g., when a well is close to a river. The solution of such problems is facilitated by increasing the order of the elements to obtain a good solution. This makes it unnecessary to resort to dividing the element in question into many smaller elements to obtain a satisfactory solution.

  5. Social Learning Analytics

    Science.gov (United States)

    Buckingham Shum, Simon; Ferguson, Rebecca

    2012-01-01

    We propose that the design and implementation of effective "Social Learning Analytics (SLA)" present significant challenges and opportunities for both research and enterprise, in three important respects. The first is that the learning landscape is extraordinarily turbulent at present, in no small part due to technological drivers.…

  6. History of analytic geometry

    CERN Document Server

    Boyer, Carl B

    2012-01-01

    Designed as an integrated survey of the development of analytic geometry, this study presents the concepts and contributions from before the Alexandrian Age through the eras of the great French mathematicians Fermat and Descartes, and on through Newton and Euler to the "Golden Age," from 1789 to 1850.

  7. Analytics for Customer Engagement

    NARCIS (Netherlands)

    Bijmolt, Tammo H. A.; Leeflang, Peter S. H.; Block, Frank; Eisenbeiss, Maik; Hardie, Bruce G. S.; Lemmens, Aurelie; Saffert, Peter

    In this article, we discuss the state of the art of models for customer engagement and the problems that are inherent to calibrating and implementing these models. The authors first provide an overview of the data available for customer analytics and discuss recent developments. Next, the authors

  8. European Analytical Column

    DEFF Research Database (Denmark)

    Karlberg, B.; Grasserbauer, M.; Andersen, Jens Enevold Thaulov

    2009-01-01

    for European analytical chemistry. During the period 2002–07, Professor Grasserbauer was Director of the Institute for Environment and Sustainability, Joint Research Centre of the European Commission (EC), Ispra, Italy. There is no doubt that many challenges exist at the present time for all of us representing...

  9. Analytical Chemistry Laboratory

    Science.gov (United States)

    Anderson, Mark

    2013-01-01

    The Analytical Chemistry and Material Development Group maintains a capability in chemical analysis, materials R&D failure analysis and contamination control. The uniquely qualified staff and facility support the needs of flight projects, science instrument development and various technical tasks, as well as Cal Tech.

  10. Roentgenological diagnosis of central segmental lung cancer

    International Nuclear Information System (INIS)

    Gurevich, L.A.; Fedchenko, G.G.

    1984-01-01

    Based on an analysis of the results of clinico-roentgenological examination of 268 patients, the roentgenological semiotics of segmental lung cancer are presented. Certain peculiarities of the X-ray picture of cancer of different lung segments were revealed, depending on tumor site and growth type. For the syndrome of segmental darkening, a comprehensive X-ray examination, with tomography of the segmental bronchi as the chief method, is proposed.

  11. Review of segmentation process in consumer markets

    OpenAIRE

    Veronika Jadczaková

    2013-01-01

    Although there has been considerable debate on market segmentation over five decades, attention has mostly been devoted to single stages of the segmentation process. Stages such as segmentation base selection or segment profiling have been heavily covered in the extant literature, whereas stages such as implementation of the marketing strategy or market definition have received comparably less interest. To address this shortcoming, this paper strives to close the gap and provide each step...

  12. Multispectral analytical image fusion

    International Nuclear Information System (INIS)

    Stubbings, T.C.

    2000-04-01

    With new and advanced analytical imaging methods emerging, the limits of physical analysis capabilities, and of the quantities of data acquired, are constantly being pushed, placing high demands on the field of scientific data processing and visualisation. Physical analysis methods like Secondary Ion Mass Spectrometry (SIMS) or Auger Electron Spectroscopy (AES) can deliver high-resolution multispectral two-dimensional and three-dimensional image data; usually this multispectral data is available in the form of n separate image files, each showing one element or another singular aspect of the sample. There is a strong need for digital image processing methods that enable the analytical scientist, routinely confronted with such amounts of data, to gain rapid insight into the composition of the sample examined, to filter the relevant data, and to integrate the information of numerous separate multispectral images to obtain the complete picture. Sophisticated image processing methods like classification and fusion provide possible approaches to this challenge. Classification is a treatment by multivariate statistical means intended to extract analytical information. Image fusion, on the other hand, denotes a process where images obtained from various sensors or at different moments in time are combined to provide a more complete picture of a scene or object under investigation. Both techniques are important for the task of information extraction and integration, and often one technique depends on the other. The overall aim of this thesis is therefore to evaluate the possibilities of both techniques for the task of analytical image processing, and to find solutions for the integration and condensation of multispectral analytical image data, in order to facilitate the interpretation of the enormous amounts of data routinely acquired by modern physical analysis instruments. (author)

  13. Market Segmentation from a Behavioral Perspective

    Science.gov (United States)

    Wells, Victoria K.; Chang, Shing Wan; Oliveira-Castro, Jorge; Pallister, John

    2010-01-01

    A segmentation approach is presented using both traditional demographic segmentation bases (age, social class/occupation, and working status) and a segmentation by benefits sought. The benefits sought in this case are utilitarian and informational reinforcement, variables developed from the Behavioral Perspective Model (BPM). Using data from 1,847…

  14. Parallel fuzzy connected image segmentation on GPU

    OpenAIRE

    Zhuge, Ying; Cao, Yong; Udupa, Jayaram K.; Miller, Robert W.

    2011-01-01

    Purpose: Image segmentation techniques using fuzzy connectedness (FC) principles have shown their effectiveness in segmenting a variety of objects in several large applications. However, one challenge in these algorithms has been their excessive computational requirements when processing large image datasets. Nowadays, commodity graphics hardware provides a highly parallel computing environment. In this paper, the authors present a parallel fuzzy connected image segmentation algorithm impleme...

  15. LIFE-STYLE SEGMENTATION WITH TAILORED INTERVIEWING

    NARCIS (Netherlands)

    KAMAKURA, WA; WEDEL, M

    The authors present a tailored interviewing procedure for life-style segmentation. The procedure assumes that a life-style measurement instrument has been designed. A classification of a sample of consumers into life-style segments is obtained using a latent-class model. With these segments, the

  16. The Process of Marketing Segmentation Strategy Selection

    OpenAIRE

    Ionel Dumitru

    2007-01-01

    The process of marketing segmentation strategy selection represents the essence of strategic marketing. We present hereinafter the main forms of marketing segmentation strategy: undifferentiated marketing, differentiated marketing, concentrated marketing and personalized marketing. In practice, companies use a mix of these marketing segmentation methods in order to maximize profit and satisfy consumers’ needs.

  17. Consumer energy - conservation policy: an analytical approach

    Energy Technology Data Exchange (ETDEWEB)

    McDougall, G.H.G.; Ritchie, J.R.B.

    1984-06-01

    To capture the potential energy savings available in the consumer sector, an analytical approach to conservation policy is proposed. A policy framework is described, and the key constructs, including a payoff matrix analysis and a consumer impact analysis, are discussed. Implications derived from the considerable amount of prior consumer research are provided to illustrate the effect on the design and implementation of future programs. The aims of this analytical approach to conservation policy (economic stability and economic security) are goals well worth pursuing. 13 references, 2 tables.

  18. Nodewise analytical calculation of the transfer function

    International Nuclear Information System (INIS)

    Makai, Mihaly

    1994-01-01

    The space dependence of neutron noise has so far been mostly investigated in homogeneous core models. Application of core diagnostic methods to locate a malfunction requires, however, that the transfer function be calculated for real, inhomogeneous cores. A code suitable for such a purpose must be able to handle complex arithmetic and delta-function sources. Further requirements are analytical dependence in one spatial variable and fast execution. The present work describes the TIDE program written to fulfil the above requirements. The core is subdivided into homogeneous, square assemblies. An analytical solution is given, which is a generalisation of the inhomogeneous response matrix method. (author)

  19. Holistic versus Analytic Evaluation of EFL Writing: A Case Study

    Science.gov (United States)

    Ghalib, Thikra K.; Al-Hattami, Abdulghani A.

    2015-01-01

    This paper investigates the performance of holistic and analytic scoring rubrics in the context of EFL writing. Specifically, the paper compares EFL students' scores on a writing task using holistic and analytic scoring rubrics. The data for the study was collected from 30 participants attending an English undergraduate program in a Yemeni…

  20. Gatlinburg conference: barometer of progress in analytical chemistry

    International Nuclear Information System (INIS)

    Shults, W.D.

    1981-01-01

    Much progress has been made in the field of analytical chemistry over the past twenty-five years. The AEC-ERDA-DOE family of laboratories contributed greatly to this progress. It is not surprising then to find a close correlation between program content of past Gatlinburg conferences and developments in analytical methodology. These conferences have proved to be a barometer of technical status

  1. Developments in analytical instrumentation

    Science.gov (United States)

    Petrie, G.

    The situation regarding photogrammetric instrumentation has changed quite dramatically over the last 2 or 3 years with the withdrawal of most analogue stereo-plotting machines from the market place and their replacement by analytically based instrumentation. While there have been few new developments in the field of comparators, there has been an explosive development in the area of small, relatively inexpensive analytical stereo-plotters based on the use of microcomputers. In particular, a number of new instruments have been introduced by manufacturers who mostly have not been associated previously with photogrammetry. Several innovative concepts have been introduced in these small but capable instruments, many of which are aimed at specialised applications, e.g. in close-range photogrammetry (using small-format cameras); for thematic mapping (by organisations engaged in environmental monitoring or resources exploitation); for map revision, etc. Another innovative and possibly significant development has been the production of conversion kits to convert suitable analogue stereo-plotting machines such as the Topocart, PG-2 and B-8 into fully fledged analytical plotters. The larger and more sophisticated analytical stereo-plotters are mostly being produced by the traditional mainstream photogrammetric systems suppliers with several new instruments and developments being introduced at the top end of the market. These include the use of enlarged photo stages to handle images up to 25 × 50 cm format; the complete integration of graphics workstations into the analytical plotter design; the introduction of graphics superimposition and stereo-superimposition; the addition of correlators for the automatic measurement of height, etc. The software associated with this new analytical instrumentation is now undergoing extensive re-development with the need to supply photogrammetric data as input to the more sophisticated G.I.S. systems now being installed by clients, instead

  2. Automatic segmentation of vertebrae from radiographs

    DEFF Research Database (Denmark)

    Mysling, Peter; Petersen, Peter Kersten; Nielsen, Mads

    2011-01-01

    Segmentation of vertebral contours is an essential task in the design of automatic tools for vertebral fracture assessment. In this paper, we propose a novel segmentation technique which does not require operator interaction. The proposed technique solves the segmentation problem in a hierarchical ... is constrained by a conditional shape model, based on the variability of the coarse spine location estimates. The technique is evaluated on a data set of manually annotated lumbar radiographs. The results compare favorably to the previous work in automatic vertebra segmentation, in terms of both segmentation...

  3. Color image Segmentation using automatic thresholding techniques

    International Nuclear Information System (INIS)

    Harrabi, R.; Ben Braiek, E.

    2011-01-01

    In this paper, entropy-based and between-class-variance-based thresholding methods for color image segmentation are studied. The maximization of the between-class variance (MVI) and of the entropy (ME) are used as criterion functions to determine an optimal threshold for segmenting images into nearly homogeneous regions. Segmentation results from the two methods are validated, the segmentation sensitivity for the available test data is evaluated, and a comparative study between these methods in different color spaces is presented. The experimental results demonstrate the superiority of the MVI method for color image segmentation.
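    The between-class-variance criterion is the classical Otsu method; a compact NumPy sketch for a single color channel follows (variable names are illustrative, and the paper applies such criteria per color space).

        import numpy as np

        def otsu_threshold(channel, bins=256):
            # Threshold maximizing the between-class variance
            # w0 * w1 * (mu0 - mu1)^2 over all histogram cuts.
            hist, edges = np.histogram(channel, bins=bins)
            p = hist / hist.sum()
            centers = (edges[:-1] + edges[1:]) / 2
            w0 = np.cumsum(p)
            w1 = 1.0 - w0
            cum_mean = np.cumsum(p * centers)
            mu0 = cum_mean / np.where(w0 > 0, w0, 1)
            mu1 = (cum_mean[-1] - cum_mean) / np.where(w1 > 0, w1, 1)
            between = w0 * w1 * (mu0 - mu1) ** 2
            return centers[np.argmax(between)]

        rng = np.random.default_rng(1)
        channel = np.concatenate([rng.normal(60, 8, 1000),
                                  rng.normal(170, 12, 1000)])
        print(otsu_threshold(channel))  # falls between the two modes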

  4. MOVING WINDOW SEGMENTATION FRAMEWORK FOR POINT CLOUDS

    Directory of Open Access Journals (Sweden)

    G. Sithole

    2012-07-01

    Full Text Available As lidar point clouds become larger, streamed processing becomes more attractive. This paper presents a framework for the streamed segmentation of point clouds, with the intention of segmenting unstructured point clouds in real time. The framework is composed of two main components. The first component segments points within a window shifting over the point cloud. The second component stitches the segments within the windows together. In this fashion a point cloud can be streamed through these two components in sequence, thus producing a segmentation. The algorithm has been tested on an airborne lidar point cloud, and some results on the performance of the framework are presented.
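    A schematic sketch of the two-component design (all interfaces and thresholds are assumptions): segment the points inside a window sliding along one axis, then stitch window-local segments that share points in the overlap region, here with a union-find over (window, label) pairs.

        def cluster_window(pts, idxs, gap=1.0):
            # Toy per-window segmentation: split the window's points
            # wherever consecutive x coordinates are farther apart than gap.
            labels, label, prev_x = {}, 0, None
            for i in sorted(idxs, key=lambda i: pts[i][0]):
                if prev_x is not None and pts[i][0] - prev_x > gap:
                    label += 1
                labels[i] = label
                prev_x = pts[i][0]
            return labels

        def stream_segments(pts, window=5.0, step=4.0, gap=1.0):
            parent = {}                              # union-find over (w, lbl)
            def find(a):
                while parent.setdefault(a, a) != a:
                    a = parent[a]
                return a
            def union(a, b):
                parent[find(a)] = find(b)

            x_max = max(p[0] for p in pts)
            point_seg, x0, w = {}, 0.0, 0
            while x0 <= x_max:
                idxs = [i for i, p in enumerate(pts) if x0 <= p[0] < x0 + window]
                for i, lbl in cluster_window(pts, idxs, gap).items():
                    if i in point_seg:               # overlap: stitch segments
                        union((w, lbl), point_seg[i])
                    point_seg[i] = (w, lbl)
                x0, w = x0 + step, w + 1
            return {i: find(s) for i, s in point_seg.items()}

        pts = [(0.2, 0, 0), (0.8, 0, 0), (4.5, 0, 0), (5.1, 0, 0), (9.0, 0, 0)]
        print(stream_segments(pts))  # points 0-1 share a segment, as do 2-3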

  5. Analytical Chemistry Laboratory, progress report for FY 1993

    Energy Technology Data Exchange (ETDEWEB)

    1993-12-01

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 1993 (October 1992 through September 1993). This annual report is the tenth for the ACL and describes continuing effort on projects, work on new projects, and contributions of the ACL staff to various programs at ANL. The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. The ACL also has research programs in analytical chemistry, conducts instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems. Some routine or standard analyses are done, but it is common for the Argonne programs to generate unique problems that require development or modification of methods and adaption of techniques to obtain useful analytical data. The ACL is administratively within the Chemical Technology Division (CMT), its principal ANL client, but provides technical support for many of the technical divisions and programs at ANL. The ACL has four technical groups--Chemical Analysis, Instrumental Analysis, Organic Analysis, and Environmental Analysis--which together include about 45 technical staff members. Talents and interests of staff members cross the group lines, as do many projects within the ACL.

  6. Multifunctional nanoparticles: Analytical prospects

    International Nuclear Information System (INIS)

    Dios, Alejandro Simon de; Diaz-Garcia, Marta Elena

    2010-01-01

    Multifunctional nanoparticles are among the most exciting nanomaterials with promising applications in analytical chemistry. These applications include (bio)sensing, (bio)assays, catalysis and separations. Although most of these applications are based on the magnetic, optical and electrochemical properties of multifunctional nanoparticles, other aspects such as the synergistic effect of the functional groups and the amplification effect associated with the nanoscale dimension have also been observed. Considering not only the nature of the raw material but also the shape, there is a huge variety of nanoparticles. In this review only magnetic, quantum dots, gold nanoparticles, carbon and inorganic nanotubes as well as silica, titania and gadolinium oxide nanoparticles are addressed. This review presents a narrative summary on the use of multifunctional nanoparticles for analytical applications, along with a discussion on some critical challenges existing in the field and possible solutions that have been or are being developed to overcome these challenges.

  7. Nuclear analytical chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Brune, D.; Forkman, B.; Persson, B.

    1984-01-01

    This book covers the general theories and techniques of nuclear chemical analysis, directed at applications in analytical chemistry, nuclear medicine, radiophysics, agriculture, environmental sciences, geological exploration, industrial process control, etc. The main principles of nuclear physics and nuclear detection on which the analysis is based are briefly outlined. An attempt is made to emphasise the fundamentals of activation analysis, detection and activation methods, as well as their applications. The book provides guidance in analytical chemistry, agriculture, environmental and biomedical sciences, etc. The contents include: the nuclear periodic system; nuclear decay; nuclear reactions; nuclear radiation sources; interaction of radiation with matter; principles of radiation detectors; nuclear electronics; statistical methods and spectral analysis; methods of radiation detection; neutron activation analysis; charged particle activation analysis; photon activation analysis; sample preparation and chemical separation; nuclear chemical analysis in biological and medical research; the use of nuclear chemical analysis in the field of criminology; nuclear chemical analysis in environmental sciences, geology and mineral exploration; and radiation protection.

  8. Analytical chemists and dinosaurs

    International Nuclear Information System (INIS)

    Brooks, R.R.

    1987-01-01

    The role of the analytical chemist in the development of the extraterrestrial impact theory for mass extinctions at the terminal Cretaceous Period is reviewed. High iridium concentrations in Cretaceous/Tertiary boundary clays have been linked to a terrestrial impact from an iridium-rich asteroid or large meteorite some 65 million years ago. Other evidence in favour of the occurrence of such an impact has been provided by the detection of shocked quartz grains originating from impact and of amorphous carbon particles similar to soot, derived presumably from worldwide wildfires at the terminal Cretaceous. Further evidence provided by the analytical chemist involves the determination of isotopic ratios such as ¹⁴⁴Nd/¹⁴³Nd, ¹⁸⁷Os/¹⁸⁶Os, and ⁸⁷Sr/⁸⁶Sr. Countervailing arguments put forward by the gradualist school (mainly palaeontological) as opposed to the catastrophists (mainly chemists and geochemists) are also presented and discussed.

  9. Nuclear analytical chemistry

    International Nuclear Information System (INIS)

    Brune, D.; Forkman, B.; Persson, B.

    1984-01-01

    This book covers the general theories and techniques of nuclear chemical analysis, directed at applications in analytical chemistry, nuclear medicine, radiophysics, agriculture, environmental sciences, geological exploration, industrial process control, etc. The main principles of nuclear physics and nuclear detection on which the analysis is based are briefly outlined. An attempt is made to emphasise the fundamentals of activation analysis, detection and activation methods, as well as their applications. The book provides guidance in analytical chemistry, agriculture, environmental and biomedical sciences, etc. The contents include: the nuclear periodic system; nuclear decay; nuclear reactions; nuclear radiation sources; interaction of radiation with matter; principles of radiation detectors; nuclear electronics; statistical methods and spectral analysis; methods of radiation detection; neutron activation analysis; charged particle activation analysis; photon activation analysis; sample preparation and chemical separation; nuclear chemical analysis in biological and medical research; the use of nuclear chemical analysis in the field of criminology; nuclear chemical analysis in environmental sciences, geology and mineral exploration; and radiation protection

  10. Hermeneutical and analytical jurisprudence

    Directory of Open Access Journals (Sweden)

    Spaić Bojan

    2014-01-01

    The article examines the main strands of development in jurisprudence in the last few decades from the standpoint of the metatheoretical differentiation between the analytical and hermeneutical perspectives in the study of law. The author claims that recent jurisprudential accounts can rarely be positioned within the traditional dichotomy of natural law theories versus legal positivism, and that this dichotomy cannot account for the differences between contemporary conceptions of law. As an alternative, the difference between the analytical and hermeneutical traditions in philosophy is explained, as these traditions have crucially influenced post-Hartian strands in Anglo-American philosophy of law and post-Kelsenian strands in continental philosophy of law. Finally, the influence of hermeneutical philosophy and legal theory is examined with regard to the development of a hermeneutical theory of law and of legal hermeneutics.

  11. Analytical chemists and dinosaurs

    Energy Technology Data Exchange (ETDEWEB)

    Brooks, R R

    1987-05-01

    The role of the analytical chemist in the development of the extraterrestrial impact theory for mass extinctions at the terminal Cretaceous Period is reviewed. High iridium concentrations in Cretaceous/Tertiary boundary clays have been linked to a terrestrial impact from an iridium-rich asteroid or large meteorite some 65 million years ago. Other evidence in favour of the occurrence of such an impact has been provided by the detection of shocked quartz grains originating from impact and of amorphous carbon particles similar to soot, derived presumably from worldwide wildfires at the terminal Cretaceous. Further evidence provided by the analytical chemist involves the determination of isotopic ratios such as ¹⁴⁴Nd/¹⁴³Nd, ¹⁸⁷Os/¹⁸⁶Os, and ⁸⁷Sr/⁸⁶Sr. Countervailing arguments put forward by the gradualist school (mainly palaeontological) as opposed to the catastrophists (mainly chemists and geochemists) are also presented and discussed.

  12. Analytical chemistry of actinides

    International Nuclear Information System (INIS)

    Chollet, H.; Marty, P.

    2001-01-01

    Different characterization methods specifically applied to the actinides are presented in this review, such as ICP/OES (inductively coupled plasma-optical emission spectrometry), ICP/MS (inductively coupled plasma-mass spectrometry), TIMS (thermal ionization-mass spectrometry) and GD/OES (glow discharge-optical emission spectrometry). Molecular absorption spectrometry and capillary electrophoresis are also available to complete the excellent range of analytical tools at our disposal. (authors)

  13. Communication Theoretic Data Analytics

    OpenAIRE

    Chen, Kwang-Cheng; Huang, Shao-Lun; Zheng, Lizhong; Poor, H. Vincent

    2015-01-01

    Widespread use of the Internet and social networks invokes the generation of big data, which is proving to be useful in a number of applications. To deal with explosively growing amounts of data, data analytics has emerged as a critical technology related to computing, signal processing, and information networking. In this paper, a formalism is considered in which data is modeled as a generalized social network and communication theory and information theory are thereby extended to data analy...

  14. Social Data Analytics Tool

    DEFF Research Database (Denmark)

    Hussain, Abid; Vatrapu, Ravi

    2014-01-01

    This paper presents the design, development and demonstrative case studies of the Social Data Analytics Tool, SODATO. Adopting the Action Design Framework [1], the objective of SODATO [2] is to collect, store, analyze, and report big social data emanating from the social media engagement of, and social media conversations about, organizations. We report and discuss results from two demonstrative case studies that were conducted using SODATO and conclude with implications and future work.

  15. Analytic chemistry of molybdenum

    International Nuclear Information System (INIS)

    Parker, G.A.

    1983-01-01

    Electrochemical, colorimetric, gravimetric, spectroscopic, and radiochemical methods for the determination of molybdenum are summarized in this book. Some laboratory procedures are described in detail while literature citations are given for others. The reader is also referred to older comprehensive reviews of the analytical chemistry of molybdenum. Contents, abridged: Gravimetric methods. Titrimetric methods. Colorimetric methods. X-ray fluorescence. Voltammetry. Catalytic methods. Molybdenum in non-ferrous alloys. Molybdenum compounds

  16. Competing on analytics.

    Science.gov (United States)

    Davenport, Thomas H

    2006-01-01

    We all know the power of the killer app. It's not just a support tool; it's a strategic weapon. Companies questing for killer apps generally focus all their firepower on the one area that promises to create the greatest competitive advantage. But a new breed of organization has upped the stakes: Amazon, Harrah's, Capital One, and the Boston Red Sox have all dominated their fields by deploying industrial-strength analytics across a wide variety of activities. At a time when firms in many industries offer similar products and use comparable technologies, business processes are among the few remaining points of differentiation--and analytics competitors wring every last drop of value from those processes. Employees hired for their expertise with numbers or trained to recognize their importance are armed with the best evidence and the best quantitative tools. As a result, they make the best decisions. In companies that compete on analytics, senior executives make it clear--from the top down--that analytics is central to strategy. Such organizations launch multiple initiatives involving complex data and statistical analysis, and quantitative activity is managed at the enterprise (not departmental) level. In this article, professor Thomas H. Davenport lays out the characteristics and practices of these statistical masters and describes some of the very substantial changes other companies must undergo in order to compete on quantitative turf. As one would expect, the transformation requires a significant investment in technology, the accumulation of massive stores of data, and the formulation of company-wide strategies for managing the data. But, at least as important, it also requires executives' vocal, unswerving commitment and willingness to change the way employees think, work, and are treated.

  17. Introduction to analytical mechanics

    CERN Document Server

    Gamalath, KAILW

    2011-01-01

    INTRODUCTION TO ANALYTICAL MECHANICS is an attempt to introduce the modern treatment of classical mechanics so that the transition to many fields in physics can be made with the least difficulty. This book deals with the formulation of Newtonian mechanics, Lagrangian dynamics, conservation laws relating to symmetries, Hamiltonian dynamics, Hamilton's principle, Poisson brackets, canonical transformations, which are invaluable in formulating quantum mechanics, and the Hamilton-Jacobi equation, which provides the transition to wave mechanics.

  18. Analytical and physical electrochemistry

    CERN Document Server

    Girault, Hubert H

    2004-01-01

    The study of electrochemistry is pertinent to a wide variety of fields, including bioenergetics, environmental sciences, and engineering sciences. In addition, electrochemistry plays a fundamental role in specific applications as diverse as the conversion and storage of energy and the sequencing of DNA.Intended both as a basic course for undergraduate students and as a reference work for graduates and researchers, Analytical and Physical Electrochemistry covers two fundamental aspects of electrochemistry: electrochemistry in solution and interfacial electrochemistry. By bringing these two subj

  19. Inorganic Analytical Chemistry

    DEFF Research Database (Denmark)

    Berg, Rolf W.

    The book is a treatise on inorganic analytical reactions in aqueous solution. It covers about half of the elements in the periodic table, i.e. the most important ones : H, Li, B, C, N, O, Na, Mg, Al, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Co, Ni, Cu, Zn, As, Se, Br, Sr, Mo, Ag, Cd, Sn, Sb, I, Ba, W,...

  20. Unsupervised Performance Evaluation of Image Segmentation

    Directory of Open Access Journals (Sweden)

    Chabrier Sebastien

    2006-01-01

    We present in this paper a study of unsupervised evaluation criteria that enable the quantification of the quality of an image segmentation result. These evaluation criteria compute some statistics for each region or class in a segmentation result. Such an evaluation criterion can be useful for different applications: the comparison of segmentation results, the automatic choice of the best fitted parameters of a segmentation method for a given image, or the definition of new segmentation methods by optimization. We first present the state of the art of unsupervised evaluation, and then we compare six unsupervised evaluation criteria. For this comparative study, we use a database composed of 8400 synthetic gray-level images segmented in four different ways. Vinet's measure (correct classification rate) is used as an objective criterion to compare the behavior of the different criteria. Finally, we present the experimental results on the segmentation evaluation of a few gray-level natural images.
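
    As a concrete illustration of what such a criterion computes, the following is a minimal sketch (not one of the six criteria compared in the paper) of a size-weighted intra-region variance score for a gray-level segmentation; lower values indicate more homogeneous regions:

        import numpy as np

        def intra_region_variance(image, labels):
            """Size-weighted mean intra-region gray-level variance (lower is better)."""
            image = image.astype(float)
            total = 0.0
            for lab in np.unique(labels):
                region = image[labels == lab]
                total += region.size * region.var()
            return total / image.size

    Scoring two candidate segmentations of the same image with such a criterion lets one pick parameters, or compare methods, without any ground truth.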

  1. Efficient graph-cut tattoo segmentation

    Science.gov (United States)

    Kim, Joonsoo; Parra, Albert; Li, He; Delp, Edward J.

    2015-03-01

    Law enforcement is interested in exploiting tattoos as an information source to identify, track and prevent gang-related crimes. Many tattoo image retrieval systems have been described. In a retrieval system tattoo segmentation is an important step for retrieval accuracy since segmentation removes background information in a tattoo image. Existing segmentation methods do not extract the tattoo very well when the background includes textures and color similar to skin tones. In this paper we describe a tattoo segmentation approach by determining skin pixels in regions near the tattoo. In these regions graph-cut segmentation using a skin color model and a visual saliency map is used to find skin pixels. After segmentation we determine which set of skin pixels are connected with each other that form a closed contour including a tattoo. The regions surrounded by the closed contours are considered tattoo regions. Our method segments tattoos well when the background includes textures and color similar to skin.
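
    The paper's specific pipeline (a skin-color model plus a saliency map feeding the graph cut) is not reproduced here; as a rough stand-in for the graph-cut step, the OpenCV GrabCut sketch below segments a foreground region given a bounding-box prior (the file name and rectangle are placeholders):

        import cv2
        import numpy as np

        img = cv2.imread("tattoo.jpg")                  # placeholder input image
        mask = np.zeros(img.shape[:2], np.uint8)        # per-pixel GC_* labels
        bgd = np.zeros((1, 65), np.float64)             # background GMM buffer
        fgd = np.zeros((1, 65), np.float64)             # foreground GMM buffer
        rect = (50, 50, 200, 200)                       # rough box around the tattoo

        # Iterative graph-cut with Gaussian-mixture color models
        cv2.grabCut(img, mask, rect, bgd, fgd, 5, cv2.GC_INIT_WITH_RECT)

        # Keep pixels labeled foreground or probably-foreground
        fg = ((mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD)).astype(np.uint8)
        tattoo = img * fg[:, :, None]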

  2. Metric Learning for Hyperspectral Image Segmentation

    Science.gov (United States)

    Bue, Brian D.; Thompson, David R.; Gilmore, Martha S.; Castano, Rebecca

    2011-01-01

    We present a metric learning approach to improve the performance of unsupervised hyperspectral image segmentation. Unsupervised spatial segmentation can assist both user visualization and automatic recognition of surface features. Analysts can use spatially-continuous segments to decrease noise levels and/or localize feature boundaries. However, existing segmentation methods use task-agnostic measures of similarity. Here we learn task-specific similarity measures from training data, improving segment fidelity to classes of interest. Multiclass Linear Discriminant Analysis produces a linear transform that optimally separates a labeled set of training classes. This transform defines a distance metric that generalizes to new scenes, enabling graph-based segmentation that emphasizes key spectral features. We describe tests based on data from the Compact Reconnaissance Imaging Spectrometer (CRISM) in which learned metrics improve segment homogeneity with respect to mineralogical classes.
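
    A minimal sketch of this idea with scikit-learn: fit multiclass LDA on labeled spectra, then use distances in the transformed space as the learned metric (the arrays below are random placeholders, not CRISM data):

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from scipy.spatial.distance import cdist

        X = np.random.rand(300, 50)               # placeholder spectra (pixels x bands)
        y = np.random.randint(0, 4, size=300)     # placeholder class labels

        # LDA learns the linear transform that best separates the classes
        lda = LinearDiscriminantAnalysis(n_components=3).fit(X, y)
        Z = lda.transform(X)

        # Euclidean distances in the transformed space act as the task-specific
        # metric, e.g. as edge weights for graph-based segmentation
        D = cdist(Z, Z)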

  3. Commercial Midstream Energy Efficiency Incentive Programs: Guidelines for Future Program Design, Implementation, and Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Milostan, Catharina [Argonne National Lab. (ANL), Argonne, IL (United States); Levin, Todd [Argonne National Lab. (ANL), Argonne, IL (United States); Muehleisen, Ralph T. [Argonne National Lab. (ANL), Argonne, IL (United States); Guzowski, Leah Bellah B. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-01-01

    Many electric utilities operate energy efficiency incentive programs that encourage increased dissemination and use of energy-efficient (EE) products in their service territories. The programs can be segmented into three broad categories—downstream incentive programs target product end users, midstream programs target product distributors, and upstream programs target product manufacturers. Traditional downstream programs have had difficulty engaging Small Business/Small Portfolio (SBSP) audiences, and an opportunity exists to expand Commercial Midstream Incentive Programs (CMIPs) to reach this market segment instead.

  4. Interferon Induced Focal Segmental Glomerulosclerosis

    Directory of Open Access Journals (Sweden)

    Yusuf Kayar

    2016-01-01

    Behçet’s disease is an inflammatory disease of unknown etiology which involves recurring oral and genital aphthous ulcers and ocular lesions, as well as articular, vascular, and nervous system involvement. Focal segmental glomerulosclerosis (FSGS) is usually seen in viral infections, immune deficiency syndrome, sickle cell anemia, and hyperfiltration, as well as secondary to interferon therapy. Here, we present a case of FSGS identified by kidney biopsy in a patient who had been diagnosed with Behçet’s disease, had received interferon-alpha treatment for uveitis, and presented with acute renal failure and nephrotic syndrome associated with interferon.

  5. A contrario line segment detection

    CERN Document Server

    von Gioi, Rafael Grompone

    2014-01-01

    The reliable detection of low-level image structures is an old and still challenging problem in computer vision. This book leads a detailed tour through the LSD algorithm, a line segment detector designed to be fully automatic. Based on the a contrario framework, the algorithm works efficiently without the need of any parameter tuning. The design criteria are thoroughly explained and the algorithm's good and bad results are illustrated on real and synthetic images. The issues involved, as well as the strategies used, are common to many geometrical structure detection problems and some possible

  6. Did Globalization Lead to Segmentation?

    DEFF Research Database (Denmark)

    Di Vaio, Gianfranco; Enflo, Kerstin Sofia

    Economic historians have stressed that income convergence was a key feature of the 'OECD-club' and that globalization was among the accelerating forces of this process in the long-run. This view has however been challenged, since it suffers from an ad hoc selection of countries. In the paper, a mixture model is applied to a sample of 64 countries to endogenously analyze the cross-country growth behavior over the period 1870-2003. Results show that growth patterns were segmented in two worldwide regimes, the first one being characterized by convergence, and the other one denoted by divergence...

  7. Big Data Analytics for Demand Response: Clustering Over Space and Time

    Energy Technology Data Exchange (ETDEWEB)

    Chelmis, Charalampos [Univ. of Southern California, Los Angeles, CA (United States); Kolte, Jahanvi [Nirma Univ., Gujarat (India); Prasanna, Viktor K. [Univ. of Southern California, Los Angeles, CA (United States)

    2015-10-29

    The pervasive deployment of advanced sensing infrastructure in Cyber-Physical systems, such as the Smart Grid, has resulted in an unprecedented data explosion. Such data exhibit both large volumes and high velocity characteristics, two of the three pillars of Big Data, and have a time-series notion, as datasets in this context typically consist of successive measurements made over a time interval. Time-series data can be valuable for data mining and analytics tasks such as identifying the “right” customers among a diverse population to target for Demand Response programs. However, time series are challenging to mine due to their high dimensionality. In this paper, we motivate this problem using a real application from the smart grid domain. We explore novel representations of time-series data for Big Data analytics, and propose a clustering technique for determining a natural segmentation of customers and identification of temporal consumption patterns. Our method is generalizable to large-scale, real-world scenarios, without making any assumptions about the data. We evaluate our technique using real datasets from smart meters, totaling ~18,200,000 data points, and show the efficacy of our technique in efficiently detecting the optimal number of clusters.
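
    The paper's exact representation and clustering technique are not detailed in this abstract; a generic sketch of the task with scikit-learn clusters daily load profiles with k-means and picks the number of clusters by silhouette score (the load matrix is a random placeholder):

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.metrics import silhouette_score

        loads = np.random.rand(500, 96)   # placeholder: 500 customers x 96 readings/day

        best_k, best_score = 2, -1.0
        for k in range(2, 10):
            km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(loads)
            score = silhouette_score(loads, km.labels_)
            if score > best_score:
                best_k, best_score = k, score
        print("optimal number of clusters:", best_k)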

  8. Automated statistical modeling of analytical measurement systems

    International Nuclear Information System (INIS)

    Jacobson, J.J.

    1992-01-01

    The statistical modeling of analytical measurement systems at the Idaho Chemical Processing Plant (ICPP) has been completely automated through computer software. The statistical modeling of analytical measurement systems is one part of a complete quality control program used by the Remote Analytical Laboratory (RAL) at the ICPP. The quality control program is an integration of automated data input, measurement system calibration, database management, and statistical process control. The quality control program and statistical modeling program meet the guidelines set forth by the American Society for Testing Materials and American National Standards Institute. A statistical model is a set of mathematical equations describing any systematic bias inherent in a measurement system and the precision of a measurement system. A statistical model is developed from data generated from the analysis of control standards. Control standards are samples which are made up at precise known levels by an independent laboratory and submitted to the RAL. The RAL analysts who process control standards do not know the values of those control standards. The object behind statistical modeling is to describe real process samples in terms of their bias and precision, and to verify that a measurement system is operating satisfactorily.
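
    A minimal sketch of the underlying idea, using made-up control-standard data: regress measured values on known values so that slope and intercept capture systematic bias, and estimate precision from the residual scatter:

        import numpy as np

        known    = np.array([1.0, 2.0, 5.0, 10.0, 20.0])     # certified values
        measured = np.array([1.05, 2.08, 5.12, 10.3, 20.5])  # instrument readings

        # Systematic bias modeled as a straight line: measured = a*known + b
        a, b = np.polyfit(known, measured, 1)

        # Precision estimated from the residual scatter about the bias model
        residuals = measured - (a * known + b)
        precision = residuals.std(ddof=2)   # two fitted parameters

        print(f"bias: measured = {a:.3f}*known {b:+.3f}; precision ~ {precision:.3f}")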

  9. Business analytics a practitioner's guide

    CERN Document Server

    Saxena, Rahul

    2013-01-01

    This book provides a guide to businesses on how to use analytics to help drive from ideas to execution. Analytics used in this way provides "full lifecycle support" for business and helps during all stages of management decision-making and execution.The framework presented in the book enables the effective interplay of business, analytics, and information technology (business intelligence) both to leverage analytics for competitive advantage and to embed the use of business analytics into the business culture. It lays out an approach for analytics, describes the processes used, and provides gu

  10. Analytical Chemistry Core Capability Assessment - Preliminary Report

    International Nuclear Information System (INIS)

    Barr, Mary E.; Farish, Thomas J.

    2012-01-01

    The concept of 'core capability' can be a nebulous one. Even at a fairly specific level, where core capability equals maintaining essential services, it is highly dependent upon the perspective of the requestor. Samples are submitted to analytical services because the requesters do not have the capability to conduct adequate analyses themselves. Some requests are for general chemical information in support of R and D, process control, or process improvement. Many analyses, however, are part of a product certification package and must comply with higher-level customer quality assurance requirements. So which services are essential to that customer - just those for product certification? Does the customer also (indirectly) need services that support process control and improvement? And what is the timeframe? Capability is often expressed in terms of the currently utilized procedures, and most programmatic customers can only plan a few years out, at best. But should core capability consider the long term where new technologies, aging facilities, and personnel replacements must be considered? These questions, and a multitude of others, explain why attempts to gain long-term consensus on the definition of core capability have consistently failed. This preliminary report will not try to define core capability for any specific program or set of programs. Instead, it will try to address the underlying concerns that drive the desire to determine core capability. Essentially, programmatic customers want to be able to call upon analytical chemistry services to provide all the assays they need, and they don't want to pay for analytical chemistry services they don't currently use (or use infrequently). This report will focus on explaining how the current analytical capabilities and methods evolved to serve a variety of needs with a focus on why some analytes have multiple analytical techniques, and what determines the infrastructure for these analyses. This information will be

  11. Pathogenesis of Focal Segmental Glomerulosclerosis

    Directory of Open Access Journals (Sweden)

    Beom Jin Lim

    2016-11-01

    Focal segmental glomerulosclerosis (FSGS) is characterized by focal and segmental obliteration of glomerular capillary tufts with increased matrix. FSGS is classified as collapsing, tip, cellular, perihilar and not otherwise specified variants according to the location and character of the sclerotic lesion. Primary or idiopathic FSGS is considered to be related to podocyte injury, and the pathogenesis of podocyte injury has been actively investigated. Several circulating factors affecting the podocyte permeability barrier have been proposed, but not proven to cause FSGS. FSGS may also be caused by genetic alterations. These genes are mainly those regulating slit diaphragm structure, the actin cytoskeleton of podocytes, and foot process structure. The mode of inheritance and age of onset are different according to the gene involved. Recently, the role of parietal epithelial cells (PECs) has been highlighted. Podocytes and PECs have common mesenchymal progenitors; therefore, PECs could be a source of podocyte repopulation after podocyte injury. Activated PECs migrate along adhesion to the glomerular tuft and may also contribute to the progression of sclerosis. Markers of activated PECs, including CD44, could be used to distinguish FSGS from minimal change disease. The pathogenesis of FSGS is very complex; however, understanding basic mechanisms of podocyte injury is important not only for basic research, but also for daily diagnostic pathology practice.

  12. Brain Tumor Image Segmentation in MRI Image

    Science.gov (United States)

    Peni Agustin Tjahyaningtijas, Hapsari

    2018-04-01

    Brain tumor segmentation plays an important role in medical image processing. Treatment of patients with brain tumors is highly dependent on early detection of these tumors. Early detection of brain tumors will improve the patient’s life chances. Diagnosis of brain tumors by experts usually uses manual segmentation, which is difficult and time consuming, making automatic segmentation necessary. Nowadays automatic segmentation is very popular and can be a solution to the problem of brain tumor segmentation with better performance. The purpose of this paper is to provide a review of MRI-based brain tumor segmentation methods. There are a number of existing review papers focusing on traditional methods for MRI-based brain tumor image segmentation. In this paper, we focus on the recent trend of automatic segmentation in this field. First, an introduction to brain tumors and methods for brain tumor segmentation is given. Then, the state-of-the-art algorithms with a focus on the recent trend of fully automatic segmentation are discussed. Finally, an assessment of the current state is presented and future developments to standardize MRI-based brain tumor segmentation methods into daily clinical routine are addressed.

  13. A new framework for interactive images segmentation

    International Nuclear Information System (INIS)

    Ashraf, M.; Sarim, M.; Shaikh, A.B.

    2017-01-01

    Image segmentation has become a widely studied research problem in image processing. There exist different graph-based solutions for interactive image segmentation, but the domain of image segmentation still needs persistent improvements. The segmentation quality of existing techniques generally depends on the manual input provided in the beginning; therefore, these algorithms may not produce quality segmentation with initial seed labels provided by a novice user. In this work we investigated the use of cellular automata in image segmentation and proposed a new algorithm that follows a cellular automaton in label propagation. It incorporates both the pixel's local and global information in the segmentation process. We introduced novel global constraints in the automata evolution rules; hence the proposed scheme of automata evolution is more effective than earlier automata-based evolution schemes. Global constraints are also effective in decreasing the sensitivity towards small changes made in the manual input; therefore the proposed approach is less dependent on label seed marks. It can produce quality segmentation with modest user effort. Segmentation results indicate that the proposed algorithm performs better than earlier segmentation techniques. (author)
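
    The paper's specific evolution rules and global constraints are not given in this abstract; the sketch below shows the classic GrowCut-style cellular automaton that such methods build on, propagating seed labels over a grayscale image (np.roll wraps at image borders, which is tolerated here for brevity):

        import numpy as np

        def grow_cut(image, seeds, iters=100):
            """Cellular-automaton label propagation from user seeds.
            seeds: int array, 0 = unlabeled, 1..K = user seed labels."""
            img = image.astype(float) / max(float(image.max()), 1.0)
            labels = seeds.copy()
            strength = (seeds > 0).astype(float)
            for _ in range(iters):
                for shift in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nl = np.roll(labels, shift, axis=(0, 1))    # neighbor labels
                    ns = np.roll(strength, shift, axis=(0, 1))  # neighbor strengths
                    ni = np.roll(img, shift, axis=(0, 1))       # neighbor intensities
                    # a neighbor's attack force decays with intensity difference
                    attack = (1.0 - np.abs(img - ni)) * ns
                    win = attack > strength
                    labels[win] = nl[win]
                    strength[win] = attack[win]
            return labels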

  14. Developing Healthcare Data Analytics APPs with Open Data Science Tools.

    Science.gov (United States)

    Hao, Bibo; Sun, Wen; Yu, Yiqin; Xie, Guotong

    2017-01-01

    Recent advances in big data analytics provide more flexible, efficient, and open tools for researchers to gain insight from healthcare data. While many tools require researchers to develop programs in languages such as Python or R, this is not a skill set grasped by many researchers in the healthcare data analytics area. To make data science more approachable, we explored existing tools and developed a practice that can help data scientists convert existing analytics pipelines to user-friendly analytics APPs with rich interactions and features of real-time analysis. With this practice, data scientists can develop customized analytics pipelines as APPs in Jupyter Notebook and disseminate them to other researchers easily, and researchers can benefit from the shared notebook to perform analysis tasks or reproduce research results much more easily.
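
    A minimal sketch of the practice in a Jupyter Notebook using ipywidgets (one plausible realization, not necessarily the authors' tooling; the CSV and column names are hypothetical): a pipeline step wrapped in interact re-runs on every widget change, turning it into a small interactive APP:

        # Run inside a Jupyter Notebook cell
        import pandas as pd
        from ipywidgets import interact

        df = pd.read_csv("encounters.csv")   # hypothetical cohort extract

        @interact(diagnosis=["diabetes", "copd", "chf"], min_age=(0, 90, 5))
        def summarize(diagnosis="diabetes", min_age=50):
            """Re-executed on every widget change."""
            cohort = df[(df["diagnosis"] == diagnosis) & (df["age"] >= min_age)]
            return cohort[["length_of_stay", "readmitted"]].describe()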

  15. Analytical Chemistry Laboratory progress report for FY 1985

    Energy Technology Data Exchange (ETDEWEB)

    Green, D.W.; Heinrich, R.R.; Jensen, K.J.

    1985-12-01

    The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of technical support services to the scientific and engineering programs at ANL. In addition, ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques. The purpose of this report is to summarize the technical and administrative activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year 1985 (October 1984 through September 1985). This is the second annual report for the ACL. 4 figs., 1 tab.

  16. Analytical Chemistry Laboratory progress report for FY 1985

    International Nuclear Information System (INIS)

    Green, D.W.; Heinrich, R.R.; Jensen, K.J.

    1985-12-01

    The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of technical support services to the scientific and engineering programs at ANL. In addition, ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques. The purpose of this report is to summarize the technical and administrative activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year 1985 (October 1984 through September 1985). This is the second annual report for the ACL. 4 figs., 1 tab

  17. Mars Analytical Microimager

    Science.gov (United States)

    Batory, Krzysztof J.; Govindjee; Andersen, Dale; Presley, John; Lucas, John M.; Sears, S. Kelly; Vali, Hojatollah

    Unambiguous detection of extraterrestrial nitrogenous hydrocarbon microbiology requires an instrument both to recognize potential biogenic specimens and to successfully discriminate them from geochemical settings. Such detection should ideally be in-situ and not jeopardize other experiments by altering samples. Taken individually most biomarkers are inconclusive. For example, since amino acids can be synthesized abiotically they are not always considered reliable biomarkers. An enantiomeric imbalance, which is characteristic of all terrestrial life, may be questioned because chirality can also be altered abiotically. However, current scientific understanding holds that aggregates of identical proteins or proteinaceous complexes, with their well-defined amino acid residue sequences, are indisputable biomarkers. Our paper describes the Mars Analytical Microimager, an instrument for the simultaneous imaging of generic autofluorescent biomarkers and overall morphology. Autofluorescence from ultraviolet to near-infrared is emitted by all known terrestrial biology, and often as consistent complex bands uncharacteristic of abiotic mineral luminescence. The MAM acquires morphology, and even sub-micron morphogenesis, at a 3-centimeter working distance with resolution approaching a laser scanning microscope. Luminescence is simultaneously collected via a 2.5-micron aperture, thereby permitting accurate correlation of multi-dimensional optical behavior with specimen morphology. A variable wavelength excitation source and photospectrometer serve to obtain steady-state and excitation spectra of biotic and luminescent abiotic sources. We believe this is the first time instrumentation for detecting hydrated or desiccated microbiology non-destructively in-situ has been demonstrated. We have obtained excellent preliminary detection of biota and inorganic matrix discrimination from terrestrial polar analogues, and perimetric morphology of individual magnetotactic bacteria. Proposed

  18. MERRA Analytic Services

    Science.gov (United States)

    Schnase, J. L.; Duffy, D. Q.; McInerney, M. A.; Tamkin, G. S.; Thompson, J. H.; Gill, R.; Grieg, C. M.

    2012-12-01

    MERRA Analytic Services (MERRA/AS) is a cyberinfrastructure resource for developing and evaluating a new generation of climate data analysis capabilities. MERRA/AS supports OBS4MIP activities by reducing the time spent in the preparation of Modern Era Retrospective-Analysis for Research and Applications (MERRA) data used in data-model intercomparison. It also provides a testbed for experimental development of high-performance analytics. MERRA/AS is a cloud-based service built around the Virtual Climate Data Server (vCDS) technology that is currently used by the NASA Center for Climate Simulation (NCCS) to deliver Intergovernmental Panel on Climate Change (IPCC) data to the Earth System Grid Federation (ESGF). Crucial to its effectiveness, MERRA/AS's servers will use a workflow-generated realizable object capability to perform analyses over the MERRA data using the MapReduce approach to parallel storage-based computation. The results produced by these operations will be stored by the vCDS, which will also be able to host code sets for those who wish to explore the use of MapReduce for more advanced analytics. While the work described here will focus on the MERRA collection, these technologies can be used to publish other reanalysis, observational, and ancillary OBS4MIP data to ESGF and, importantly, offer an architectural approach to climate data services that can be generalized to applications and customers beyond the traditional climate research community. In this presentation, we describe our approach, experiences, lessons learned, and plans for the future. [Figure: (A) MERRA/AS software stack. (B) Example MERRA/AS interfaces.]

  19. Documented Safety Analysis for the B695 Segment

    Energy Technology Data Exchange (ETDEWEB)

    Laycak, D

    2008-09-11

    This Documented Safety Analysis (DSA) was prepared for the Lawrence Livermore National Laboratory (LLNL) Building 695 (B695) Segment of the Decontamination and Waste Treatment Facility (DWTF). The report provides comprehensive information on design and operations, including safety programs and safety structures, systems and components to address the potential process-related hazards, natural phenomena, and external hazards that can affect the public, facility workers, and the environment. Consideration is given to all modes of operation, including the potential for both equipment failure and human error. The facilities known collectively as the DWTF are used by LLNL's Radioactive and Hazardous Waste Management (RHWM) Division to store and treat regulated wastes generated at LLNL. RHWM generally processes low-level radioactive waste with no, or extremely low, concentrations of transuranics (e.g., much less than 100 nCi/g). Wastes processed often contain only depleted uranium and beta- and gamma-emitting nuclides, e.g., ⁹⁰Sr, ¹³⁷Cs, or ³H. The mission of the B695 Segment centers on container storage, lab-packing, repacking, overpacking, bulking, sampling, waste transfer, and waste treatment. The B695 Segment is used for storage of radioactive waste (including transuranic and low-level), hazardous, nonhazardous, mixed, and other waste. Storage of hazardous and mixed waste in B695 Segment facilities is in compliance with the Resource Conservation and Recovery Act (RCRA). LLNL is operated by the Lawrence Livermore National Security, LLC, for the Department of Energy (DOE). The B695 Segment is operated by the RHWM Division of LLNL. Many operations in the B695 Segment are performed under a Resource Conservation and Recovery Act (RCRA) operation plan, similar to commercial treatment operations with best demonstrated available technologies. The buildings of the B695 Segment were designed and built considering such operations, using proven building

  20. Documented Safety Analysis for the B695 Segment

    International Nuclear Information System (INIS)

    Laycak, D.

    2008-01-01

    This Documented Safety Analysis (DSA) was prepared for the Lawrence Livermore National Laboratory (LLNL) Building 695 (B695) Segment of the Decontamination and Waste Treatment Facility (DWTF). The report provides comprehensive information on design and operations, including safety programs and safety structures, systems and components to address the potential process-related hazards, natural phenomena, and external hazards that can affect the public, facility workers, and the environment. Consideration is given to all modes of operation, including the potential for both equipment failure and human error. The facilities known collectively as the DWTF are used by LLNL's Radioactive and Hazardous Waste Management (RHWM) Division to store and treat regulated wastes generated at LLNL. RHWM generally processes low-level radioactive waste with no, or extremely low, concentrations of transuranics (e.g., much less than 100 nCi/g). Wastes processed often contain only depleted uranium and beta- and gamma-emitting nuclides, e.g., ⁹⁰Sr, ¹³⁷Cs, or ³H. The mission of the B695 Segment centers on container storage, lab-packing, repacking, overpacking, bulking, sampling, waste transfer, and waste treatment. The B695 Segment is used for storage of radioactive waste (including transuranic and low-level), hazardous, nonhazardous, mixed, and other waste. Storage of hazardous and mixed waste in B695 Segment facilities is in compliance with the Resource Conservation and Recovery Act (RCRA). LLNL is operated by the Lawrence Livermore National Security, LLC, for the Department of Energy (DOE). The B695 Segment is operated by the RHWM Division of LLNL. Many operations in the B695 Segment are performed under a Resource Conservation and Recovery Act (RCRA) operation plan, similar to commercial treatment operations with best demonstrated available technologies. The buildings of the B695 Segment were designed and built considering such operations, using proven building systems, and keeping

  1. Analytical elements of mechanics

    CERN Document Server

    Kane, Thomas R

    2013-01-01

    Analytical Elements of Mechanics, Volume 1, is the first of two volumes intended for use in courses in classical mechanics. The books aim to provide students and teachers with a text consistent in content and format with the author's ideas regarding the subject matter and teaching of mechanics, and to disseminate these ideas. The book opens with a detailed exposition of vector algebra, and no prior knowledge of this subject is required. This is followed by a chapter on the topic of mass centers, which is presented as a logical extension of concepts introduced in connection with centroids. A

  2. Analytical chemistry in space

    CERN Document Server

    Wainerdi, Richard E

    1970-01-01

    Analytical Chemistry in Space presents an analysis of the chemical constitution of space, particularly the particles in the solar wind, of the planetary atmospheres, and the surfaces of the moon and planets. Topics range from space engineering considerations to solar system atmospheres and recovered extraterrestrial materials. Mass spectroscopy in space exploration is also discussed, along with lunar and planetary surface analysis using neutron inelastic scattering. This book is comprised of seven chapters and opens with a discussion on the possibilities for exploration of the solar system by

  3. Analysis and analytical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Batuecas Rodriguez, T [Department of Chemistry and Isotopes, Junta de Energia Nuclear, Madrid (Spain)

    1967-01-01

    The technology associated with the use of organic coolants in nuclear reactors depends to a large extent on the determination and control of their physical and chemical properties, and particularly on the viability, speed, sensitivity, precision and accuracy (depending on the intended usage) of the methods employed in detection and analytical determination. This has led to the study and development of numerous techniques, some specially designed for the extreme conditions involved in working with the types of product in question and others adapted from existing techniques. In the specific case of polyphenyl and hydropolyphenyl mixtures, which have been the principal subjects of study to date and offer greatest promise, the analytical problems are broadly as follows: Composition of initial product or virgin coolant composition of macro components and amounts of organic and inorganic impurities; Coolant during and after operation. Determination of gases and organic compounds produced by pyrolysis and radiolysis (degradation and polymerization products); Control of systems for purifying and regenerating the coolant after use. Dissolved pressurization gases; Detection of intermediate products during decomposition; these are generally very unstable (free radicals); Degree of fouling and film formation. Tests to determine potential formation of films; Corrosion of structural elements and canning materials; Health and safety. Toxicity, inflammability and impurities that can be activated. Although some of the above problems are closely interrelated and entail similar techniques, they vary as to degree of difficulty. Another question is the difficulty of distinguishing clearly between techniques for determining physical and physico-chemical properties, on one hand, and analytical techniques on the other. Any classification is therefore somewhat arbitrary (for example, in the case of dosimetry and techniques for determining mean molecular weights or electrical conductivity

  4. Analytical chemistry experiment

    International Nuclear Information System (INIS)

    Park, Seung Jo; Paeng, Seong Gwan; Jang, Cheol Hyeon

    1992-08-01

    This book deals with analytical chemistry experiments in eight chapters. It covers general precautions required during experiments; the handling, storage, and classification of reagents; the handling of glassware; general operations during experiments such as heating, cooling, filtering, distillation, extraction, evaporation, and drying; glasswork, including the purpose of the craft and how to cut and bend glass tubing; volumetric analysis, covering neutralization titration and precipitation titration; gravimetric analysis, covering the solubility product, filtering, and washing; and microorganism experiments with the necessary tools, sterilization, disinfection, and incubation, plus appendixes.

  5. Analytic aspects of convexity

    CERN Document Server

    Colesanti, Andrea; Gronchi, Paolo

    2018-01-01

    This book presents the proceedings of the international conference Analytic Aspects in Convexity, which was held in Rome in October 2016. It offers a collection of selected articles, written by some of the world’s leading experts in the field of Convex Geometry, on recent developments in this area: theory of valuations; geometric inequalities; affine geometry; and curvature measures. The book will be of interest to a broad readership, from those involved in Convex Geometry, to those focusing on Functional Analysis, Harmonic Analysis, Differential Geometry, or PDEs. The book is addressed to PhD students and researchers interested in Convex Geometry and its links to analysis.

  6. Local analytic geometry

    CERN Document Server

    Abhyankar, Shreeram Shankar

    1964-01-01

    This book provides, for use in a graduate course or for self-study by graduate students, a well-motivated treatment of several topics, especially the following: (1) algebraic treatment of several complex variables; (2) geometric approach to algebraic geometry via analytic sets; (3) survey of local algebra; (4) survey of sheaf theory. The book has been written in the spirit of Weierstrass. Power series play the dominant role. The treatment, being algebraic, is not restricted to complex numbers, but remains valid over any complete-valued field. This makes it applicable to situations arising from

  7. Advanced analytical techniques

    International Nuclear Information System (INIS)

    Mrochek, J.E.; Shumate, S.E.; Genung, R.K.; Bahner, C.T.; Lee, N.E.; Dinsmore, S.R.

    1976-01-01

    The development of several new analytical techniques for use in clinical diagnosis and biomedical research is reported. These include: high-resolution liquid chromatographic systems for the early detection of pathological molecular constituents in physiologic body fluids; gradient elution chromatography for the analysis of protein-bound carbohydrates in blood serum samples, with emphasis on changes in sera from breast cancer patients; electrophoretic separation techniques coupled with staining of specific proteins in cellular isoenzymes for the monitoring of genetic mutations and abnormal molecular constituents in blood samples; and the development of a centrifugal elution chromatographic technique for the assay of specific proteins and immunoglobulins in human blood serum samples

  8. Framework for pedagogical learning analytics

    OpenAIRE

    Heilala, Ville

    2018-01-01

    Learning analytics is an emergent technological practice and a multidisciplinary scientific discipline whose goal is to facilitate effective learning and knowledge of learning. In this design science research, I combine the knowledge discovery process, a concept of pedagogical knowledge, the ethics of learning analytics, and microservice architecture. The result is a framework for pedagogical learning analytics. The framework is applied and evaluated in the context of agency analytics. The framework ...

  9. Who puts the most energy into energy conservation? A segmentation of energy consumers based on energy-related behavioral characteristics

    International Nuclear Information System (INIS)

    Sütterlin, Bernadette; Brunner, Thomas A.; Siegrist, Michael

    2011-01-01

    The present paper aims to identify and describe different types of energy consumers in a more comprehensive way than previous segmentation studies using cluster analysis. Energy consumers were segmented based on their energy-related behavioral characteristics. In addition to purchase- and curtailment-related energy-saving behavior, consumer classification was also based on acceptance of policy measures and energy-related psychosocial factors, so the behavioral segmentation base used was more comprehensive than in other studies. Furthermore, differentiation between the energy-saving purchase of daily products, such as food, and of energy efficient appliances allowed a more differentiated characterization of the energy consumer segments. The cluster analysis revealed six energy consumer segments: the idealistic, the selfless inconsequent, the thrifty, the materialistic, the convenience-oriented indifferent, and the problem-aware well-being-oriented energy consumer. Findings emphasize that using a broader and more distinct behavioral base is crucial for an adequate and differentiated description of energy consumer types. The paper concludes by highlighting the most promising energy consumer segments and discussing possible segment-specific marketing and policy strategies. - Highlights: ► By applying a cluster-analytic approach, new energy consumer segments are identified. ► A comprehensive, differentiated description of the different energy consumer types is provided. ► A distinction between purchase of daily products and energy efficient appliances is essential. ► Behavioral variables are a more suitable base for segmentation than general characteristics.

  10. Multi-product dynamic advertisement planning in a segmented market

    Directory of Open Access Journals (Sweden)

    Aggarwal Sugandha

    2017-01-01

    In this paper, a dynamic multi-objective linear integer programming model is proposed to optimally distribute a firm’s advertising budget among multiple products and media in a segmented market. To make the media plan responsive to changes in the market, the distribution is carried out dynamically by dividing the planning horizon into smaller periods. The model incorporates the effect of the previous period's advertising reach on the current period (captured through a retention factor), and it also considers the cross-product effect of simultaneously advertising different products. An application of the model is presented for an insurance firm that markets five different products, using a goal programming approach.
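
    The full model is dynamic, multi-objective, and integer-valued; as a toy illustration of the flavor of problem, the sketch below solves a single-period LP relaxation with scipy, allocating insertions across two products and three media under one budget (all numbers are made up):

        import numpy as np
        from scipy.optimize import linprog

        reach = np.array([[30.0, 20.0, 12.0],   # expected reach per insertion;
                          [25.0, 18.0, 10.0]])  # rows = products, cols = media
        cost  = np.array([[5.0, 3.0, 1.5],
                          [4.0, 2.5, 1.2]])     # cost per insertion
        budget = 60.0

        # Maximize total reach (linprog minimizes, so negate) under the budget,
        # with at most 10 insertions per product-medium pair
        res = linprog(c=-reach.ravel(),
                      A_ub=cost.ravel()[None, :], b_ub=[budget],
                      bounds=[(0, 10)] * reach.size, method="highs")
        plan = res.x.reshape(reach.shape)   # fractional plan; round for integers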

  11. Technical Safety Requirements for the B695 Segment

    Energy Technology Data Exchange (ETDEWEB)

    Laycak, D

    2008-09-11

    This document contains Technical Safety Requirements (TSRs) for the Radioactive and Hazardous Waste Management (RHWM) Division's B695 Segment of the Decontamination and Waste Treatment Facility (DWTF) at Lawrence Livermore National Laboratory (LLNL). The TSRs constitute requirements regarding the safe operation of the B695 Segment. The TSRs are derived from the Documented Safety Analysis (DSA) for the B695 Segment (LLNL 2007). The analysis presented there determined that the B695 Segment is a low-chemical hazard, Hazard Category 3, nonreactor nuclear facility. The TSRs consist primarily of inventory limits as well as controls to preserve the underlying assumptions in the hazard analyses. Furthermore, appropriate commitments to safety programs are presented in the administrative controls section of the TSRs. The B695 Segment (B695 and the west portion of B696) is a waste treatment and storage facility located in the northeast quadrant of the LLNL main site. The approximate area and boundary of the B695 Segment are shown in the B695 Segment DSA. Activities typically conducted in the B695 Segment include container storage, lab-packing, repacking, overpacking, bulking, sampling, waste transfer, and waste treatment. B695 is used to store and treat radioactive, mixed, and hazardous waste, and it also contains equipment used in conjunction with waste processing operations to treat various liquid and solid wastes. The portion of the building called Building 696 Solid Waste Processing Area (SWPA), also referred to as B696S in this report, is used primarily to manage solid radioactive, mixed, and hazardous waste. Operations specific to the SWPA include sorting and segregating waste, lab-packing, sampling, and crushing empty drums that previously contained waste. Furthermore, a Waste Packaging Unit will be permitted to treat hazardous and mixed waste. RHWM generally processes LLW with no, or extremely low, concentrations of transuranics (i.e., much less than 100 n

  12. Technical Safety Requirements for the B695 Segment

    International Nuclear Information System (INIS)

    Laycak, D.

    2008-01-01

    This document contains Technical Safety Requirements (TSRs) for the Radioactive and Hazardous Waste Management (RHWM) Division's B695 Segment of the Decontamination and Waste Treatment Facility (DWTF) at Lawrence Livermore National Laboratory (LLNL). The TSRs constitute requirements regarding the safe operation of the B695 Segment. The TSRs are derived from the Documented Safety Analysis (DSA) for the B695 Segment (LLNL 2007). The analysis presented there determined that the B695 Segment is a low-chemical hazard, Hazard Category 3, nonreactor nuclear facility. The TSRs consist primarily of inventory limits as well as controls to preserve the underlying assumptions in the hazard analyses. Furthermore, appropriate commitments to safety programs are presented in the administrative controls section of the TSRs. The B695 Segment (B695 and the west portion of B696) is a waste treatment and storage facility located in the northeast quadrant of the LLNL main site. The approximate area and boundary of the B695 Segment are shown in the B695 Segment DSA. Activities typically conducted in the B695 Segment include container storage, lab-packing, repacking, overpacking, bulking, sampling, waste transfer, and waste treatment. B695 is used to store and treat radioactive, mixed, and hazardous waste, and it also contains equipment used in conjunction with waste processing operations to treat various liquid and solid wastes. The portion of the building called Building 696 Solid Waste Processing Area (SWPA), also referred to as B696S in this report, is used primarily to manage solid radioactive, mixed, and hazardous waste. Operations specific to the SWPA include sorting and segregating waste, lab-packing, sampling, and crushing empty drums that previously contained waste. Furthermore, a Waste Packaging Unit will be permitted to treat hazardous and mixed waste. RHWM generally processes LLW with no, or extremely low, concentrations of transuranics (i.e., much less than 100 n

  13. Division of Analytical Chemistry, 1998

    DEFF Research Database (Denmark)

    Hansen, Elo Harald

    1999-01-01

    The article recounts the 1998 activities of the Division of Analytical Chemistry (DAC, formerly the Working Party on Analytical Chemistry, WPAC), which is a division of the Federation of European Chemical Societies (FECS). Elo Harald Hansen is the Danish delegate, representing The Danish Chemical Society/The Society for Analytical Chemistry.

  14. Learning Analytics: Readiness and Rewards

    Science.gov (United States)

    Friesen, Norm

    2013-01-01

    This position paper introduces the relatively new field of learning analytics, first by considering the relevant meanings of both "learning" and "analytics," and then by looking at two main levels at which learning analytics can be or has been implemented in educational organizations. Although integrated turnkey systems or…

  15. Quality Indicators for Learning Analytics

    Science.gov (United States)

    Scheffel, Maren; Drachsler, Hendrik; Stoyanov, Slavi; Specht, Marcus

    2014-01-01

    This article proposes a framework of quality indicators for learning analytics that aims to standardise the evaluation of learning analytics tools and to provide a mean to capture evidence for the impact of learning analytics on educational practices in a standardised manner. The criteria of the framework and its quality indicators are based on…

  16. Hierarchical image segmentation for learning object priors

    Energy Technology Data Exchange (ETDEWEB)

    Prasad, Lakshman [Los Alamos National Laboratory; Yang, Xingwei [TEMPLE UNIV.; Latecki, Longin J [TEMPLE UNIV.; Li, Nan [TEMPLE UNIV.

    2010-11-10

    The proposed segmentation approach naturally combines experience-based and image-based information. The experience-based information is obtained by training a classifier for each object class. For a given test image, the result of each classifier is represented as a probability map. The final segmentation is obtained with a hierarchical image segmentation algorithm that considers both the probability maps and the image features such as color and edge strength. We also utilize the image region hierarchy to obtain not only local but also semi-global features as input to the classifiers. Moreover, to get robust probability maps, we take into account the region context information by averaging the probability maps over different levels of the hierarchical segmentation algorithm. The obtained segmentation results are superior to the state-of-the-art supervised image segmentation algorithms.
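
    One ingredient, sketched under stated assumptions: average a classifier's per-pixel probability map over regions at several levels of a hierarchy. Here skimage's felzenszwalb at increasing scales stands in for the hierarchy, and prob_map is assumed to come from a trained per-class classifier:

        import numpy as np
        from skimage.segmentation import felzenszwalb

        def region_averaged_probability(image, prob_map, scales=(50, 100, 200)):
            """Average prob_map over regions at several hierarchy levels,
            suppressing pixel-level noise in the classifier output."""
            acc = np.zeros_like(prob_map, dtype=float)
            for scale in scales:
                segments = felzenszwalb(image, scale=scale, sigma=0.8, min_size=20)
                level = np.zeros_like(acc)
                for lab in np.unique(segments):
                    m = segments == lab
                    level[m] = prob_map[m].mean()
                acc += level
            return acc / len(scales)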

  17. Image Segmentation Using Minimum Spanning Tree

    Science.gov (United States)

    Dewi, M. P.; Armiati, A.; Alvini, S.

    2018-04-01

    This research aims to segment digital images. Segmentation separates an object from its background so that the main object can be processed for other purposes. With the growth of digital image processing applications, the segmentation step has become increasingly necessary, and the segmented image must be accurate because subsequent processing interprets the information it contains. This article discusses the application of the minimum spanning tree of a graph to the segmentation of digital images. The method separates an object from the background, converting the image into a binary image in which the object of interest is set to white and the background to black, or vice versa.
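
    As an illustration of the general technique (a generic MST construction, not necessarily this paper's exact algorithm), the sketch below builds a 4-connected grid graph weighted by intensity differences, computes its minimum spanning tree with SciPy, cuts edges heavier than an assumed threshold, and labels the resulting components as segments:

```python
import numpy as np
from scipy.sparse import coo_matrix
from scipy.sparse.csgraph import connected_components, minimum_spanning_tree

def mst_segment(img, thresh=0.1):
    """img: 2-D float array; thresh is an assumed intensity-difference cutoff."""
    h, w = img.shape
    idx = np.arange(h * w).reshape(h, w)
    rows, cols, weights = [], [], []
    # 4-connectivity: horizontal and vertical neighbor pairs.
    for a, b in [(idx[:, :-1], idx[:, 1:]), (idx[:-1, :], idx[1:, :])]:
        rows.append(a.ravel())
        cols.append(b.ravel())
        weights.append(np.abs(img.ravel()[a.ravel()] - img.ravel()[b.ravel()]))
    # A small epsilon keeps zero-weight edges from being dropped by the
    # sparse MST routine, which treats zeros as missing edges.
    g = coo_matrix((np.concatenate(weights) + 1e-9,
                    (np.concatenate(rows), np.concatenate(cols))),
                   shape=(h * w, h * w))
    mst = minimum_spanning_tree(g).tocoo()
    keep = mst.data < thresh                  # cut heavy (boundary) edges
    pruned = coo_matrix((mst.data[keep], (mst.row[keep], mst.col[keep])),
                        shape=mst.shape)
    _, labels = connected_components(pruned, directed=False)
    return labels.reshape(h, w)               # one integer label per segment
```

    Cutting the heaviest MST edges is what separates object from background: within-region edges carry small intensity differences, while edges crossing a boundary are heavy and are the first to be removed.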

  18. Image Segmentation via Fractal Dimension

    Science.gov (United States)

    1987-12-01


  19. The analytic renormalization group

    Directory of Open Access Journals (Sweden)

    Frank Ferrari

    2016-08-01

    Finite temperature Euclidean two-point functions in quantum mechanics or quantum field theory are characterized by a discrete set of Fourier coefficients $G_k$, $k \in \mathbb{Z}$, associated with the Matsubara frequencies $\nu_k = 2\pi k/\beta$. We show that analyticity implies that the coefficients $G_k$ must satisfy an infinite number of model-independent linear equations that we write down explicitly. In particular, we construct “Analytic Renormalization Group” linear maps $A_\mu$ which, for any choice of cut-off $\mu$, allow one to express the low energy Fourier coefficients for $|\nu_k| < \mu$ (with the possible exception of the zero mode $G_0$), together with the real-time correlators and spectral functions, in terms of the high energy Fourier coefficients for $|\nu_k| \geq \mu$. Operating a simple numerical algorithm, we show that the exact universal linear constraints on $G_k$ can be used to systematically improve any random approximate data set obtained, for example, from Monte-Carlo simulations. Our results are illustrated on several explicit examples.

  20. Smart city analytics

    DEFF Research Database (Denmark)

    Hansen, Casper; Hansen, Christian; Alstrup, Stephen

    2017-01-01

    We present an ensemble learning method that predicts large increases in the hours of home care received by citizens. The method is supervised, and uses different ensembles of either linear (logistic regression) or non-linear (random forests) classifiers. Experiments with data available from 2013 to 2017 for every citizen in Copenhagen receiving home care (27,775 citizens) show that prediction can achieve state of the art performance as reported in similar health related domains (AUC=0.715). We further find that competitive results can be obtained by using limited information for training, which is very useful when full records are not accessible or available. Smart city analytics does not necessarily require full city records. To our knowledge this preliminary study is the first to predict large increases in home care for smart city analytics.
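
    A hedged sketch of the kind of ensembles described (bagged logistic regressions versus a random forest) is shown below; the synthetic data and all parameter choices are placeholders, not the Copenhagen home-care records or the study's models:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Placeholder data: an imbalanced binary task standing in for
# "large increase in home care hours: yes/no".
X, y = make_classification(n_samples=2000, n_features=20, weights=[0.9],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "bagged logistic regression": BaggingClassifier(
        LogisticRegression(max_iter=1000), n_estimators=25, random_state=0),
    "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name}: AUC = {auc:.3f}")
```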

  1. Toxic Anterior Segment Syndrome (TASS

    Directory of Open Access Journals (Sweden)

    Özlem Öner

    2011-12-01

    Toxic anterior segment syndrome (TASS) is a sterile intraocular inflammation caused by noninfectious substances, resulting in extensive toxic damage to the intraocular tissues. Possible etiologic factors of TASS include surgical trauma, bacterial endotoxin, intraocular solutions with inappropriate pH and osmolality, preservatives, denatured ophthalmic viscosurgical devices (OVD), inadequate sterilization, cleaning and rinsing of surgical devices, intraocular lenses, and the polishing and sterilizing compounds used on intraocular lenses. The characteristic signs and symptoms, such as blurred vision, corneal edema, hypopyon, and a nonreactive pupil, usually occur 24 hours after cataract surgery. Differentiating TASS from infectious endophthalmitis is important. The main treatment for TASS is prevention. TASS is a cataract surgery complication that is seen more commonly nowadays. In this article, the possible underlying causes as well as treatment and prevention methods of TASS are summarized. (Turk J Ophthalmol 2011; 41: 407-13)

  2. Communication with market segments - travel agencies' perspective

    OpenAIRE

    Lorena Bašan; Jasmina Dlačić; Željko Trezner

    2013-01-01

    Purpose – The purpose of this paper is to research the travel agencies’ communication with market segments. Communication with market segments takes into account marketing communication means as well as the implementation of different business orientations. Design – Special emphasis is placed on the use of different marketing communication means and their efficiency. Research also explores business orientation adaptation when approaching different market segments. Methodology – In explo...

  3. Distance measures for image segmentation evaluation

    OpenAIRE

    Monteiro, Fernando C.; Campilho, Aurélio

    2012-01-01

    In this paper we present a study of evaluation measures that enable the quantification of the quality of an image segmentation result. Despite significant advances in image segmentation techniques, evaluation of these techniques thus far has been largely subjective. Typically, the effectiveness of a new algorithm is demonstrated only by the presentation of a few segmented images and is otherwise left to subjective evaluation by the reader. Such an evaluation criterion can be useful for differ...
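
    For concreteness, two standard region-overlap measures often used in such quantitative evaluations are the Jaccard index and the Dice coefficient; the sketch below is generic and not necessarily one of the measures studied in this paper:

```python
import numpy as np

def jaccard(a, b):
    """Intersection over union of two boolean masks."""
    a, b = a.astype(bool), b.astype(bool)
    union = np.logical_or(a, b).sum()
    return np.logical_and(a, b).sum() / union if union else 1.0

def dice(a, b):
    """Dice coefficient: 2|A∩B| / (|A| + |B|)."""
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2 * np.logical_and(a, b).sum() / denom if denom else 1.0
```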

  4. IFRS 8 Operating Segments - A Closer Look

    OpenAIRE

    Muthupandian, K S

    2008-01-01

    The International Accounting Standards Board issued International Financial Reporting Standard 8, Operating Segments. Segment information is one of the most vital aspects of financial reporting for investors and other users. IFRS 8 requires an entity to adopt the 'management approach' to reporting on the financial performance of its operating segments. This article presents a closer look at the standard (its objective, scope, and disclosures).

  5. MRI Brain Tumor Segmentation Methods- A Review

    OpenAIRE

    Gursangeet, Kaur; Jyoti, Rani

    2016-01-01

    Medical image processing and segmentation is an active and interesting area for researchers, and it has assumed a prominent place in diagnosing tumors since the advent of CT and MRI. MRI is a useful tool for detecting brain tumors, and segmentation is performed to extract the relevant portion of an image. The purpose of this paper is to provide an overview of different image segmentation methods such as the watershed algorithm, morphological operations, neutrosophic sets, thresholding, K-...
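
    As a small illustration of two of the reviewed ingredients (thresholding and morphological operations), the sketch below applies an Otsu threshold followed by morphological cleanup using scikit-image; it is a generic pipeline, not any specific paper's method, and the structuring-element and minimum-size parameters are assumptions:

```python
from skimage.filters import threshold_otsu
from skimage.morphology import binary_opening, disk, remove_small_objects

def rough_mask(slice_2d, min_size=64):
    """slice_2d: a 2-D image slice as a float array."""
    mask = slice_2d > threshold_otsu(slice_2d)    # global Otsu threshold
    mask = binary_opening(mask, disk(2))          # erase thin bridges/noise
    return remove_small_objects(mask, min_size)   # drop tiny components
```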

  6. Speaker Segmentation and Clustering Using Gender Information

    Science.gov (United States)

    2006-02-01

    AFRL-HE-WP-TP-2006-0026, Air Force Research Laboratory, February 2006. Author: Brian M. Ore (General Dynamics). The recoverable fragments of the abstract indicate that gender information is used in the first stages of segmentation and in the clustering of speaker files for speaker diarization of news broadcasts.

  7. Benchmarking of Remote Sensing Segmentation Methods

    Czech Academy of Sciences Publication Activity Database

    Mikeš, Stanislav; Haindl, Michal; Scarpa, G.; Gaetano, R.

    2015-01-01

    Vol. 8, No. 5 (2015), pp. 2240-2248. ISSN 1939-1404. R&D Projects: GA ČR (CZ) GA14-10911S. Institutional support: RVO:67985556. Keywords: benchmark; remote sensing segmentation; unsupervised segmentation; supervised segmentation. Subject RIV: BD - Theory of Information. Impact factor: 2.145, year: 2015. http://library.utia.cas.cz/separaty/2015/RO/haindl-0445995.pdf

  8. Track segment synthesis method for NTA film

    International Nuclear Information System (INIS)

    Kumazawa, Shigeru

    1980-03-01

    A method is presented for synthesizing track segments extracted from a gray-level digital picture of NTA film in an automatic counting system. To detect each track in an arbitrary direction, even one with gaps, as a set of track segments, the method successively links extracted segments along the track, according to whether each extracted segment has a direction similar to that of the track and whether it is connected with the already linked segments. For a large digital picture, the method is applied to each subpicture, a strip of the picture, and then concatenates the subsets of track segments linked in each subpicture into a set of track segments belonging to one track. The method was applied to detecting tracks in various directions over eight 364 x 40-pixel subpictures with a gray scale of 127 per pixel (picture element) from a microphotograph of NTA film. It proved able to synthesize track segments correctly for every track in the picture. (author)
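
    The linking rule lends itself to a short sketch. The following is a loose illustration under stated assumptions (segments given as endpoint pairs ordered along the scan direction; the max_angle and max_gap thresholds are hypothetical), not the paper's algorithm:

```python
import numpy as np

def direction(seg):
    (x0, y0), (x1, y1) = seg
    return np.arctan2(y1 - y0, x1 - x0)

def link_segments(segments, max_angle=0.2, max_gap=5.0):
    """segments: list of ((x0, y0), (x1, y1)) endpoint pairs, ordered along
    the scan direction. A segment is appended to a track when its direction
    matches that of the track's last segment and its start point lies within
    max_gap of the track's current end (so small gaps are bridged)."""
    tracks = []
    for seg in segments:
        for track in tracks:
            last = track[-1]
            gap = np.hypot(seg[0][0] - last[1][0], seg[0][1] - last[1][1])
            if gap <= max_gap and abs(direction(seg) - direction(last)) <= max_angle:
                track.append(seg)
                break
        else:
            tracks.append([seg])   # no match: start a new candidate track
    return tracks
```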

  9. Segmenting hospitals for improved management strategy.

    Science.gov (United States)

    Malhotra, N K

    1989-09-01

    The author presents a conceptual framework for the a priori and clustering-based approaches to segmentation and evaluates them in the context of segmenting institutional health care markets. An empirical study is reported in which the hospital market is segmented on three state-of-being variables. The segmentation approach also takes into account important organizational decision-making variables. The sophisticated Thurstone Case V procedure is employed. Several marketing implications for hospitals, other health care organizations, hospital suppliers, and donor publics are identified.

  10. Review of segmentation process in consumer markets

    Directory of Open Access Journals (Sweden)

    Veronika Jadczaková

    2013-01-01

    Although there has been considerable debate on market segmentation over five decades, attention has mostly been devoted to single stages of the segmentation process. Stages such as segmentation base selection or segment profiling have been heavily covered in the extant literature, whereas stages such as implementation of the marketing strategy or market definition have received comparably less interest. Addressing this shortcoming, this paper strives to close the gap and give each step of the segmentation process equal treatment. Hence, the objective of this paper is two-fold. First, a snapshot of the segmentation process is provided in a step-by-step fashion. Second, each step (where possible) is evaluated on chosen criteria by means of description, comparison, analysis, and synthesis of 32 academic papers and 13 commercial typology systems. Ultimately, the segmentation stages are discussed in light of empirical findings prevalent in the segmentation studies, and suggestions calling for further investigation are presented. This seven-step framework may assist segmentation in practice, allowing for more confident targeting, which in turn might prepare the ground for creating a differential advantage.
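
    As a small illustration of the clustering-based stages (base selection, clustering, and segment profiling), the sketch below standardizes a set of placeholder attitude ratings, clusters them with k-means, and prints a profile per segment; all data and parameters are assumptions, not any study's typology:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
survey = rng.normal(size=(500, 6))           # placeholder attitude ratings

X = StandardScaler().fit_transform(survey)   # standardize the base variables
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)

for k in range(4):                           # profile each segment
    members = X[km.labels_ == k]
    print(f"segment {k}: n={len(members)}, "
          f"mean profile={members.mean(axis=0).round(2)}")
```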

  11. Interactive segmentation techniques algorithms and performance evaluation

    CERN Document Server

    He, Jia; Kuo, C-C Jay

    2013-01-01

    This book focuses on interactive segmentation techniques, which have been extensively studied in recent decades. Interactive segmentation emphasizes the clear extraction of objects of interest, whose locations are roughly indicated by human interaction based on high-level perception. The book first introduces classic graph-cut segmentation algorithms and then discusses state-of-the-art techniques, including graph matching methods, region merging and label propagation, clustering methods, and segmentation methods based on edge detection. A comparative analysis of these methods is provided.

  12. Using alternative segmentation techniques to examine residential customer`s energy needs, wants, and preferences

    Energy Technology Data Exchange (ETDEWEB)

    Hollander, C.; Kidwell, S. [Union Electric Co., St. Louis, MO (United States); Banks, J.; Taylor, E. [Cambridge Reports/Research International, MA (United States)

    1994-11-01

    The primary objective of this study was to examine residential customers` attitudes toward energy usage, conservation, and efficiency, and to examine the implications of these attitudes for how the utility should design and communicate about programs and services in these areas. This study combined focus groups and customer surveys, and utilized several customer segmentation schemes -- grouping customers by geodemographics, as well as customers` energy and environmental values, beliefs, and opinions -- to distinguish different segments of customers.

  13. Normality in Analytical Psychology

    Science.gov (United States)

    Myers, Steve

    2013-01-01

    Although C.G. Jung’s interest in normality wavered throughout his career, it was one of the areas he identified in later life as worthy of further research. He began his career using a definition of normality which would have been the target of Foucault’s criticism, had Foucault chosen to review Jung’s work. However, Jung then evolved his thinking to a standpoint that was more aligned to Foucault’s own. Thereafter, the post Jungian concept of normality has remained relatively undeveloped by comparison with psychoanalysis and mainstream psychology. Jung’s disjecta membra on the subject suggest that, in contemporary analytical psychology, too much focus is placed on the process of individuation to the neglect of applications that consider collective processes. Also, there is potential for useful research and development into the nature of conflict between individuals and societies, and how normal people typically develop in relation to the spectrum between individuation and collectivity. PMID:25379262

  14. Analytic Summability Theory

    KAUST Repository

    Alabdulmohsin, Ibrahim M.

    2018-01-01

    The theory of summability of divergent series is a major branch of mathematical analysis that has found important applications in engineering and science. It addresses methods of assigning natural values to divergent sums, whose prototypical examples include the Abel summation method, the Cesaro means, and the Borel summability method. As will be established in subsequent chapters, the theory of summability of divergent series is intimately connected to the theory of fractional finite sums. In this chapter, we introduce a generalized definition of series as well as a new summability method for computing the value of series according to such a definition. We show that the proposed summability method is both regular and linear, and that it arises quite naturally in the study of local polynomial approximations of analytic functions. The materials presented in this chapter will be foundational to all subsequent chapters.
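
    As a standard illustration of one of the methods named above (a textbook example, not taken from the chapter), Abel summation assigns a value to Grandi's divergent series:

```latex
% Grandi's series under Abel summation.
\[
  \sum_{k=0}^{\infty} (-1)^k \;=\; \lim_{x \to 1^-} \sum_{k=0}^{\infty} (-1)^k x^k
  \;=\; \lim_{x \to 1^-} \frac{1}{1+x} \;=\; \frac{1}{2}.
\]
```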

  16. Generalized analytic continuation

    CERN Document Server

    Ross, William T

    2002-01-01

    The theory of generalized analytic continuation studies continuations of meromorphic functions in situations where traditional theory says there is a natural boundary. This broader theory touches on a remarkable array of topics in classical analysis, as described in the book. This book addresses the following questions: (1) When can we say, in some reasonable way, that component functions of a meromorphic function on a disconnected domain, are "continuations" of each other? (2) What role do such "continuations" play in certain aspects of approximation theory and operator theory? The authors use the strong analogy with the summability of divergent series to motivate the subject. In this vein, for instance, theorems can be described as being "Abelian" or "Tauberian". The introductory overview carefully explains the history and context of the theory. The authors begin with a review of the works of Poincaré, Borel, Wolff, Walsh, and Gončar, on continuation properties of "Borel series" and other meromorphic func...

  17. Analytical applications of spectroscopy

    International Nuclear Information System (INIS)

    Creaser, C.S.

    1988-01-01

    This book provides an up-to-date overview of recent developments in analytical spectroscopy, with particular emphasis on the common themes of chromatography-spectroscopy combinations, Fourier transform methods, and data handling techniques, which have played an increasingly important part in the development of all spectroscopic techniques. The book contains papers originally presented at a conference entitled 'Spectroscopy Across The Spectrum', held jointly with the first 'International Near Infrared Spectroscopy Conference' at the University of East Anglia, Norwich, UK, in July 1987, which have been edited and rearranged with some additional material. Each section includes reviews of key areas of current research as well as short reports of new developments. The fields covered are: Near Infrared Spectroscopy; Infrared Spectroscopy; Mass Spectroscopy; NMR Spectroscopy; Atomic and UV/Visible Spectroscopy; Chemometrics and Data Analysis. (author)

  18. Analytic device including nanostructures

    KAUST Repository

    Di Fabrizio, Enzo M.; Fratalocchi, Andrea; Totero Gongora, Juan Sebastian; Coluccio, Maria Laura; Candeloro, Patrizio; Cuda, Gianni

    2015-01-01

    A device for detecting an analyte in a sample comprising: an array including a plurality of pixels, each pixel including a nanochain comprising: a first nanostructure, a second nanostructure, and a third nanostructure, wherein size of the first nanostructure is larger than that of the second nanostructure, and size of the second nanostructure is larger than that of the third nanostructure, and wherein the first nanostructure, the second nanostructure, and the third nanostructure are positioned on a substrate such that when the nanochain is excited by an energy, an optical field between the second nanostructure and the third nanostructure is stronger than an optical field between the first nanostructure and the second nanostructure, wherein the array is configured to receive a sample; and a detector arranged to collect spectral data from a plurality of pixels of the array.

  19. Improved steamflood analytical model

    Energy Technology Data Exchange (ETDEWEB)

    Chandra, S.; Mamora, D.D. [Society of Petroleum Engineers, Richardson, TX (United States)]|[Texas A and M Univ., TX (United States)

    2005-11-01

    Predicting the performance of steam flooding can help in the proper execution of enhanced oil recovery (EOR) processes. The Jones model is often used for analytical steam flooding performance prediction, but it does not accurately predict oil production peaks. In this study, an improved steam flood model was developed by modifying 2 of the 3 components of the capture factor in the Jones model. The modifications were based on simulation results from a Society of Petroleum Engineers (SPE) comparative project case model. The production performance of a 5-spot steamflood pattern unit was simulated and compared with results obtained from the Jones model. Three reservoir types were simulated through the use of 3-D Cartesian black oil models. In order to correlate the simulation and the Jones analytical model results for the start and height of the production peak, the dimensionless steam zone size was modified to account for a decrease in oil viscosity during steam flooding and its dependence on the steam injection rate. In addition, the dimensionless volume of displaced oil produced was modified from its square-root format to an exponential form. The modified model improved results for production performance by up to 20 years of simulated steam flooding, compared to the Jones model. Results agreed with simulation results for 13 different cases, including 3 different sets of reservoir and fluid properties. Reservoir engineers will benefit from the improved accuracy of the model. Oil displacement calculations were based on methods proposed in earlier research, in which the oil displacement rate is a function of cumulative oil steam ratio. The cumulative oil steam ratio is a function of overall thermal efficiency. Capture factor component formulae were presented, as well as charts of oil production rates and cumulative oil-steam ratios for various reservoirs. 13 refs., 4 tabs., 29 figs.
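
    The change from a square-root to an exponential form can be sketched schematically. The functions below only illustrate the shape of such a modification; the variable names, constants, and exact functional forms are assumptions, not the paper's calibrated correlation:

```python
import numpy as np

def vpd_sqrt(vd):
    """Square-root form (schematic stand-in for the original Jones term)."""
    return np.sqrt(np.clip(vd, 0.0, 1.0))

def vpd_exponential(vd, a=3.0):
    """Exponential form rising smoothly toward 1 (assumed shape, with a
    hypothetical rate constant a)."""
    return 1.0 - np.exp(-a * np.clip(vd, 0.0, None))

vd = np.linspace(0.0, 1.0, 6)
print(vpd_sqrt(vd).round(3))
print(vpd_exponential(vd).round(3))
```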

  20. Segmentation of liver tumors on CT images

    International Nuclear Information System (INIS)

    Pescia, D.

    2011-01-01

    This thesis is dedicated to 3D segmentation of liver tumors in CT images, a task of great clinical interest since it gives physicians reproducible and reliable methods for segmenting such lesions. Accurate segmentation would help them evaluate the lesions, choose a treatment, and plan it. Such a complex segmentation task must cope with three main scientific challenges: (i) the highly variable shape of the structures being sought, (ii) their similarity of appearance to the surrounding medium, and (iii) the low signal-to-noise ratio of these images. The problem is addressed in a clinical context through a two-step approach: segmentation of the entire liver envelope, followed by segmentation of the tumors present within the envelope. We begin by proposing an atlas-based approach for computing pathological liver envelopes. Images are first pre-processed to compute envelopes that wrap around binary masks, in an attempt to obtain liver envelopes from estimated segmentations of healthy liver parenchyma. A new statistical atlas is then introduced and used for segmentation through its diffeomorphic registration to the new image. This segmentation is achieved by combining image-matching costs with spatial and appearance priors, using a multi-scale approach with MRFs. The second step of our approach is dedicated to segmenting the lesions contained within the envelopes, using a combination of machine learning techniques and graph-based methods. First, an appropriate feature space is considered, involving texture descriptors computed by filtering at various scales and orientations. Then, state-of-the-art machine learning techniques are used to determine the most relevant features, as well as the hyperplane that separates the feature space of tumoral voxels from that of healthy tissues. Segmentation is then
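
    The feature/classification stage described lends itself to a short sketch. The following is a hedged illustration (Gabor-filter texture descriptors at a few assumed scales and orientations, fed to a linear SVM), not the thesis's actual pipeline:

```python
import numpy as np
from skimage.filters import gabor
from sklearn.svm import LinearSVC

def texture_features(img, freqs=(0.1, 0.2), thetas=(0.0, np.pi / 4, np.pi / 2)):
    """Stack the real parts of Gabor responses into an (H, W, n_filters)
    feature volume; the frequencies and orientations here are assumptions."""
    feats = [gabor(img, frequency=f, theta=t)[0] for f in freqs for t in thetas]
    return np.stack(feats, axis=-1)

def train_voxel_classifier(img, labels):
    """labels: (H, W) array with 1 = tumor, 0 = healthy, -1 = unlabeled."""
    feats = texture_features(img)
    X = feats.reshape(-1, feats.shape[-1])
    keep = labels.ravel() >= 0                 # use labeled voxels only
    return LinearSVC().fit(X[keep], labels.ravel()[keep])
```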