WorldWideScience

Sample records for program analytical segmentation

  1. Creating Web Area Segments with Google Analytics

    Science.gov (United States)

    Segments allow you to quickly access data for a predefined set of Sessions or Users, such as government or education users, or sessions in a particular state. You can then apply this segment to any report within the Google Analytics (GA) interface.

  2. Joint shape segmentation with linear programming

    KAUST Repository

    Huang, Qixing

    2011-01-01

    We present an approach to segmenting shapes in a heterogeneous shape database. Our approach segments the shapes jointly, utilizing features from multiple shapes to improve the segmentation of each. The approach is entirely unsupervised and is based on an integer quadratic programming formulation of the joint segmentation problem. The program optimizes over possible segmentations of individual shapes as well as over possible correspondences between segments from multiple shapes. The integer quadratic program is solved via a linear programming relaxation, using a block coordinate descent procedure that makes the optimization feasible for large databases. We evaluate the presented approach on the Princeton segmentation benchmark and show that joint shape segmentation significantly outperforms single-shape segmentation techniques. © 2011 ACM.
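
    The integer-program-plus-LP-relaxation idea can be illustrated with a deliberately small sketch (not the authors' formulation): binary assignment variables are relaxed to the interval [0, 1], the relaxed problem is solved as a linear program with SciPy, and the fractional solution is rounded back to an assignment. The sizes and affinity scores below are made up for illustration.

```python
# Toy illustration (not the paper's model): relax a binary assignment
# problem to a linear program, solve it, then round the solution.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n_faces, n_parts = 6, 3                  # hypothetical sizes
score = rng.random((n_faces, n_parts))   # affinity of face i to part k

# Maximize total affinity  <=>  minimize -score . x, with x relaxed to [0, 1].
c = -score.ravel()

# Each face must be assigned to exactly one part (sum_k x[i, k] = 1).
A_eq = np.zeros((n_faces, n_faces * n_parts))
for i in range(n_faces):
    A_eq[i, i * n_parts:(i + 1) * n_parts] = 1.0
b_eq = np.ones(n_faces)

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=(0.0, 1.0))
x = res.x.reshape(n_faces, n_parts)
labels = x.argmax(axis=1)                # round the relaxed solution
print(labels)
```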

  3. Writing analytic element programs in Python.

    Science.gov (United States)

    Bakker, Mark; Kelson, Victor A

    2009-01-01

    The analytic element method is a mesh-free approach for modeling ground water flow at both the local and the regional scale. With the advent of the Python object-oriented programming language, it has become relatively easy to write analytic element programs. In this article, an introduction is given of the basic principles of the analytic element method and of the Python programming language. A simple, yet flexible, object-oriented design is presented for analytic element codes using multiple inheritance. New types of analytic elements may be added without the need for any changes in the existing part of the code. The presented code may be used to model flow to wells (with either a specified discharge or drawdown) and streams (with a specified head). The code may be extended by any hydrogeologist with a healthy appetite for writing computer code to solve more complicated ground water flow problems. Copyright © 2009 The Author(s). Journal Compilation © 2009 National Ground Water Association.
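
    A minimal, independent sketch of the object-oriented superposition idea described above, assuming a single confined aquifer with transmissivity T: each element contributes a discharge potential and the model sums the contributions. Signs, reference levels, and the solve step for unknown element strengths are simplified away; this is not the authors' code.

```python
import numpy as np

class Element:
    """Base class: every analytic element contributes a discharge potential."""
    def potential(self, x, y):
        raise NotImplementedError

class Well(Element):
    def __init__(self, xw, yw, Q):
        self.xw, self.yw, self.Q = xw, yw, Q
    def potential(self, x, y):
        r = np.hypot(x - self.xw, y - self.yw)
        return self.Q / (2 * np.pi) * np.log(r)   # sign convention simplified

class UniformFlow(Element):
    def __init__(self, qx):
        self.qx = qx
    def potential(self, x, y):
        return -self.qx * x

class Model:
    """Superposition: the total potential is the sum over all elements."""
    def __init__(self, T):
        self.T = T                # transmissivity of the (single) aquifer
        self.elements = []
    def add(self, element):
        self.elements.append(element)
    def head(self, x, y):
        phi = sum(e.potential(x, y) for e in self.elements)
        return phi / self.T

ml = Model(T=100.0)               # hypothetical aquifer and element values
ml.add(Well(0.0, 0.0, Q=500.0))
ml.add(UniformFlow(qx=2.0))
print(ml.head(50.0, 0.0))
```

    A new element type is added by subclassing Element and implementing potential, which mirrors the extensibility argument made in the article.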

  4. Integrating Linear Programming and Analytical Hierarchical ...

    African Journals Online (AJOL)

    A comprehensive Linear Programming model was established, including 106 variables and 43 ecological-socio-economic constraints. Land capability and suitability evaluation was accomplished using ecological factors and Comparative Advantages of the uses and the factors, respectively. Analytical Hierarchical Process followed ...

  5. 5 keys to business analytics program success

    CERN Document Server

    Boyer, John; Green, Brian; Harris, Tracy; Van De Vanter, Kay

    2012-01-01

    With business analytics becoming increasingly strategic to all types of organizations, and with many companies struggling to create a meaningful impact with this emerging technology, this work, based on the combined experience of 10 organizations that display excellence and expertise on the subject, shares the best practices, discusses the management aspects and sociology that drive success, and uncovers the five key aspects behind the success of some of the top business analytics programs in the industry. Readers will learn about numerous topics, including how to create and manage a changing

  6. Analytical program: 1975 Bikini radiological survey

    Energy Technology Data Exchange (ETDEWEB)

    Mount, M.E.; Robison, W.L.; Thompson, S.E.; Hamby, K.O.; Prindle, A.L.; Levy, H.B.

    1976-11-11

    The analytical program for samples of soil, vegetation, and animal tissue collected during the June 1975 field survey of Bikini and Eneu islands is described. The phases of this program are discussed in chronological order: initial processing of samples, gamma spectrometry, and wet chemistry. Included are discussions of quality control programs, reproducibility of measurements, and comparisons of gamma spectrometry with wet chemistry determinations of ²⁴¹Am. Wet chemistry results are used to examine differences in Pu:Am ratios and Pu-isotope ratios as a function of the type of sample and the location where samples were collected.

  7. Differential segmentation responses to an alcohol social marketing program.

    Science.gov (United States)

    Dietrich, Timo; Rundle-Thiele, Sharyn; Schuster, Lisa; Drennan, Judy; Russell-Bennett, Rebekah; Leo, Cheryl; Gullo, Matthew J; Connor, Jason P

    2015-10-01

    This study seeks to establish whether meaningful subgroups exist within a 14-16 year old adolescent population and if these segments respond differently to the Game On: Know Alcohol (GOKA) intervention, a school-based alcohol social marketing program. This study is part of a larger cluster randomized controlled evaluation of the GOKA program implemented in 14 schools in 2013/2014. TwoStep cluster analysis was conducted to segment 2,114 high school adolescents (14-16 years old) on the basis of 22 demographic, behavioral, and psychographic variables. Program effects on knowledge, attitudes, behavioral intentions, social norms, alcohol expectancies, and drinking refusal self-efficacy of identified segments were subsequently examined. Three segments were identified: (1) Abstainers, (2) Bingers, and (3) Moderate Drinkers. Program effects varied significantly across segments. The strongest positive change effects post-participation were observed for Bingers, while mixed effects were evident for Moderate Drinkers and Abstainers. These findings provide preliminary empirical evidence supporting the application of social marketing segmentation in alcohol education programs. Development of targeted programs that meet the unique needs of each of the three identified segments will extend the social marketing footprint in alcohol education. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. Instructional format and segment length in interactive video programs

    NARCIS (Netherlands)

    Verhagen, Pleunes Willem; Breman, Jeroen; Breman, Jeroen; Simonson, Michael R.; Anderson, Mary Lagomarcino

    1995-01-01

    The purpose of this study was to gather further insight into a previous investigation of the relationship between self-chosen and program-controlled segment length of an interactive videodisk program, and performance on post- and retention tests. The initial study by Verhagen, which questioned what

  9. Segmentation of myocardium from cardiac MR images using a novel dynamic programming based segmentation method.

    Science.gov (United States)

    Qian, Xiaohua; Lin, Yuan; Zhao, Yue; Wang, Jing; Liu, Jing; Zhuang, Xiahai

    2015-03-01

    Myocardium segmentation in cardiac magnetic resonance (MR) images plays a vital role in clinical diagnosis of the cardiovascular diseases. Because of the low contrast and large variation in intensity and shapes, myocardium segmentation has been a challenging task. A dynamic programming (DP) based segmentation method, incorporating the likelihood and shape information of the myocardium, is developed for segmenting myocardium in cardiac MR images. The endocardium, i.e., the left ventricle blood cavity, is segmented for initialization, and then the optimal epicardium contour is determined using the polar-transformed image and DP scheme. In the DP segmentation scheme, three techniques are proposed to improve the segmentation performance: (1) the likelihood image of the myocardium is constructed to define the external cost in the DP, thus the cost function incorporates prior probability estimation; (2) the adaptive search range is introduced to determine the polar-transformed image, thereby excluding irrelevant tissues; (3) the connectivity constrained DP algorithm is developed to obtain an optimal closed contour. Four metrics, including the Dice metric (Dice), root mean squared error (RMSE), reliability, and correlation coefficient, are used to assess the segmentation accuracy. The authors evaluated the performance of the proposed method on a private dataset and the MICCAI 2009 challenge dataset. The authors also explored the effects of the three new techniques of the DP scheme in the proposed method. For the qualitative evaluation, the segmentation results of the proposed method were clinically acceptable. For the quantitative evaluation, the mean (Dice) for the endocardium and epicardium was 0.892 and 0.927, respectively; the mean RMSE was 2.30 mm for the endocardium and 2.39 mm for the epicardium. In addition, the three new techniques in the proposed DP scheme, i.e., the likelihood image of the myocardium, the adaptive search range, and the connectivity constrained
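
    A minimal sketch of the dynamic-programming step described above, assuming a polar-transformed cost image is already available: one radius index is chosen per angle so that the accumulated cost is minimal, with a bounded step between neighbouring angles. The likelihood-based cost, adaptive search range, and the connectivity constraint that closes the contour in the paper are not reproduced here.

```python
import numpy as np

def dp_boundary(cost, max_step=1):
    """Trace a minimum-cost radial boundary through a polar cost image.

    cost: 2-D array, rows = angle samples, columns = radius samples.
    Returns one radius index per angle row (open path; contour closure
    as used in the paper is omitted for brevity).
    """
    n_ang, n_rad = cost.shape
    acc = np.full((n_ang, n_rad), np.inf)
    back = np.zeros((n_ang, n_rad), dtype=int)
    acc[0] = cost[0]
    for a in range(1, n_ang):
        for r in range(n_rad):
            lo, hi = max(0, r - max_step), min(n_rad, r + max_step + 1)
            prev = acc[a - 1, lo:hi]
            j = int(np.argmin(prev))
            acc[a, r] = cost[a, r] + prev[j]
            back[a, r] = lo + j
    # Backtrack from the cheapest end point.
    path = np.zeros(n_ang, dtype=int)
    path[-1] = int(np.argmin(acc[-1]))
    for a in range(n_ang - 1, 0, -1):
        path[a - 1] = back[a, path[a]]
    return path

rng = np.random.default_rng(1)
print(dp_boundary(rng.random((36, 20))))   # random placeholder cost image
```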

  10. Dynamic Programming Based Segmentation in Biomedical Imaging.

    Science.gov (United States)

    Ungru, Kathrin; Jiang, Xiaoyi

    2017-01-01

    Many applications in biomedical imaging call for the automatic detection of lines, contours, or boundaries of bones, organs, vessels, and cells. The aim is to support expert decisions in interactive applications or to include such detection as part of a processing pipeline for automatic image analysis. Biomedical images often suffer from noisy data and fuzzy edges. Therefore, there is a need for robust methods for contour and line detection. Dynamic programming is a popular technique that satisfies these requirements in many ways. This work gives a brief overview of approaches and applications that utilize dynamic programming to solve problems in the challenging field of biomedical imaging.

  11. New digital demodulator with matched filters and curve segmentation techniques for BFSK demodulation: Analytical description

    Directory of Open Access Journals (Sweden)

    Jorge Torres Gómez

    2015-09-01

    The present article relates in general to digital demodulation of Binary Frequency Shift Keying (BFSK). The objective of the present research is to obtain a new processing method for demodulating BFSK signals in order to reduce hardware complexity in comparison with other reported methods. The solution proposed here makes use of matched filter theory and curve segmentation algorithms. This paper describes the integration and configuration of a Sampler Correlator and curve segmentation blocks in order to obtain a digital receiver that properly demodulates the received signal. The proposed solution is shown to strongly reduce hardware complexity. This part presents the analytical expressions of the proposed solution and covers in detail the elements needed to configure the system properly. A second part presents the implementation of the system on FPGA technology and the simulation results that validate the overall performance.
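
    A toy illustration of the correlator idea (not the authors' architecture): each bit interval of the received signal is correlated with the two candidate tones, and the larger correlation magnitude decides the bit. All parameters below are hypothetical.

```python
import numpy as np

def bfsk_demodulate(samples, f0, f1, fs, bit_len):
    """Decide each bit by correlating against the two candidate tones
    (a greatly simplified matched-filter / sampler-correlator sketch)."""
    n = np.arange(bit_len)
    ref0 = np.exp(-2j * np.pi * f0 / fs * n)   # complex references avoid
    ref1 = np.exp(-2j * np.pi * f1 / fs * n)   # phase sensitivity
    bits = []
    for start in range(0, len(samples) - bit_len + 1, bit_len):
        seg = samples[start:start + bit_len]
        e0 = abs(np.dot(seg, ref0))
        e1 = abs(np.dot(seg, ref1))
        bits.append(1 if e1 > e0 else 0)
    return bits

# Hypothetical parameters: 1 kHz / 2 kHz tones sampled at 16 kHz, 160 samples per bit.
fs, f0, f1, bit_len = 16000, 1000, 2000, 160
tx_bits = [1, 0, 1, 1, 0]
t = np.arange(bit_len) / fs
signal = np.concatenate([np.cos(2 * np.pi * (f1 if b else f0) * t) for b in tx_bits])
print(bfsk_demodulate(signal, f0, f1, fs, bit_len))   # recovers tx_bits
```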

  12. A Big Data Analytics Methodology Program in the Health Sector

    Science.gov (United States)

    Lawler, James; Joseph, Anthony; Howell-Barber, H.

    2016-01-01

    The benefits of Big Data Analytics are cited frequently in the literature. However, the difficulties of implementing Big Data Analytics can limit the number of organizational projects. In this study, the authors evaluate business, procedural and technical factors in the implementation of Big Data Analytics, applying a methodology program. Focusing…

  13. Analytic central path, sensitivity analysis and parametric linear programming

    NARCIS (Netherlands)

    A.G. Holder; J.F. Sturm; S. Zhang (Shuzhong)

    1998-01-01

    In this paper we consider properties of the central path and the analytic center of the optimal face in the context of parametric linear programming. We first show that if the right-hand side vector of a standard linear program is perturbed, then the analytic center of the optimal face

  14. Pre-analytical and analytical variation of drug determination in segmented hair using ultra-performance liquid chromatography-tandem mass spectrometry

    DEFF Research Database (Denmark)

    Nielsen, Marie Katrine Klose; Johansen, Sys Stybe; Linnet, Kristian

    2014-01-01

    variation was estimated to be less than 15% for almost all compounds. The method was successfully applied to 25 segmented hair specimens from deceased drug addicts showing a broad pattern of poly-drug use. The pre-analytical sampling variation was estimated from the genuine duplicate measurements of two...

  15. Validating an image segmentation program devised for staging lymphoma.

    Science.gov (United States)

    Slattery, Anthony

    2017-10-02

    Hybrid positron emission tomography-computed tomography (PET-CT) imaging systems are an important tool for assessing the progression of lymphoma. PET-CT systems offer the ability to quantitatively assess lymphocytic bone involvement throughout the body. There is no standard methodology for staging lymphoma patients using PET-CT images. Automatic image segmentation algorithms could offer medical specialists a means to evaluate bone involvement from PET-CT images in a consistent manner. To devise and validate an image segmentation program that may assist in staging lymphoma by determining the degree of bone involvement based on PET-CT studies. A custom-made program was developed to segment regions-of-interest from images by utilising an enhanced fuzzy clustering technique that incorporates spatial information. The program was subsequently tested on digital and physical phantoms using four different performance metrics before being employed to extract the bony regions of clinical PET-CT images acquired from 248 patients staged for lymphoma. The algorithm was satisfactorily able to delineate regions-of-interest within all phantoms. When applied to the clinical PET-CT images, the algorithm was capable of accurately segmenting bony regions in less than half of the subjects (n = 103). The performance of the algorithm was adversely affected by the presence of oral contrast, metal implants and the poor image quality afforded by low dose CT images in general. Significant changes are necessary before the algorithm can be employed clinically in an unsupervised fashion. However, with further work performed, the algorithm could potentially prove useful for medical specialists staging lymphoma in the future.

  16. NASA's mobile satellite communications program; ground and space segment technologies

    Science.gov (United States)

    Naderi, F.; Weber, W. J.; Knouse, G. H.

    1984-01-01

    This paper describes the Mobile Satellite Communications Program of the United States National Aeronautics and Space Administration (NASA). The program's objectives are to facilitate the deployment of the first generation commercial mobile satellite by the private sector, and to technologically enable future generations by developing advanced and high risk ground and space segment technologies. These technologies are aimed at mitigating severe shortages of spectrum, orbital slot, and spacecraft EIRP which are expected to plague the high capacity mobile satellite systems of the future. After a brief introduction of the concept of mobile satellite systems and their expected evolution, this paper outlines the critical ground and space segment technologies. Next, the Mobile Satellite Experiment (MSAT-X) is described. MSAT-X is the framework through which NASA will develop advanced ground segment technologies. An approach is outlined for the development of conformal vehicle antennas, spectrum and power-efficient speech codecs, and modulation techniques for use in the non-linear faded channels and efficient multiple access schemes. Finally, the paper concludes with a description of the current and planned NASA activities aimed at developing complex large multibeam spacecraft antennas needed for future generation mobile satellite systems.

  17. FASP, an analytic resource appraisal program for petroleum play analysis

    Science.gov (United States)

    Crovelli, R.A.; Balay, R.H.

    1986-01-01

    An analytic probabilistic methodology for resource appraisal of undiscovered oil and gas resources in play analysis is presented in a FORTRAN program termed FASP. This play-analysis methodology is a geostochastic system for petroleum resource appraisal in explored as well as frontier areas. An established geologic model considers both the uncertainty of the presence of the assessed hydrocarbon and its amount if present. The program FASP produces resource estimates of crude oil, nonassociated gas, dissolved gas, and gas for a geologic play in terms of probability distributions. The analytic method is based upon conditional probability theory and many laws of expectation and variance. © 1986.
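
    The kind of conditional-probability bookkeeping mentioned above can be illustrated with a short calculation using the laws of total expectation and variance; the numbers are hypothetical and unrelated to FASP output.

```python
# Hypothetical numbers: resource R = A if hydrocarbon is present
# (probability p), and 0 otherwise.
p = 0.3                          # probability the play contains hydrocarbons
mean_A, var_A = 120.0, 40.0**2   # conditional mean and variance of the amount

mean_R = p * mean_A
# Law of total variance for R = B * A with B ~ Bernoulli(p) independent of A:
# Var[R] = p * Var[A] + p * (1 - p) * E[A]^2
var_R = p * var_A + p * (1 - p) * mean_A**2
print(mean_R, var_R**0.5)        # mean and standard deviation of the resource
```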

  18. How MBA Programs Are Using the GMAT's Analytical Writing Assessment.

    Science.gov (United States)

    Noll, Cheryl L.; Stowers, Robert H.

    1998-01-01

    Finds that 86% of the 59 MBA programs completing a survey used scores from the Analytical Writing Assessment of the Graduate Management Admission Test to refine their admissions decisions, but only a few schools used the test diagnostically in making such other decisions as placing students in writing-development courses, waiving communication…

  19. Analytic Study of the Tadoma Method: Effects of Hand Position on Segmental Speech Perception.

    Science.gov (United States)

    Reed, Charlotte M.; And Others

    1989-01-01

    Small-set segmental identification experiments were conducted with three deaf-blind subjects who were highly experienced users of the Tadoma method. Systematic variations in the positioning of the hand on the speaker's face for Tadoma produced systematic effects on percent-correct scores, information transfer, and perception of individual…

  20. Health informatics and analytics - building a program to integrate business analytics across clinical and administrative disciplines.

    Science.gov (United States)

    Tremblay, Monica Chiarini; Deckard, Gloria J; Klein, Richard

    2016-07-01

    Health care organizations must develop integrated health information systems to respond to the numerous government mandates driving the movement toward reimbursement models emphasizing value-based and accountable care. Success in this transition requires integrated data analytics, supported by the combination of health informatics, interoperability, business process design, and advanced decision support tools. This case study presents the development of a master's level cross- and multidisciplinary informatics program offered through a business school. The program provides students from diverse backgrounds with the knowledge, leadership, and practical application skills of health informatics, information systems, and data analytics that bridge the interests of clinical and nonclinical professionals. This case presents the actions taken and challenges encountered in navigating intra-university politics, specifying curriculum, recruiting the requisite interdisciplinary faculty, innovating the educational format, managing students with diverse educational and professional backgrounds, and balancing multiple accreditation agencies. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  1. Segmentation of Portuguese customers’ expectations from fitness programs

    Directory of Open Access Journals (Sweden)

    Ricardo Gouveia Rodrigues

    2017-10-01

    Expectations towards fitness exercises are the major factor in customer satisfaction in the service sector in question. The purpose of this study is to present a segmentation framework for fitness customers, based on their individual expectations. The survey was designed and validated to evaluate individual expectations towards exercises. The study included a randomly recruited sample of 723 subjects (53% males; 47% females; 42.1±19.7 years). Factor analysis and cluster analysis with Ward’s cluster method with squared Euclidean distance were used to analyse the data obtained. Four components were extracted (performance, enjoyment, beauty and health) explaining 68.7% of the total variance, and three distinct segments were found: Exercise Lovers (n=312), Disinterested (n=161) and Beauty Seekers (n=250). All the factors identified have a significant contribution to differentiate the clusters, the first and third clusters being most similar. The segmentation framework obtained based on customer expectations allows better understanding of customers’ profiles, thus helping the fitness industry develop services more suitable for each type of customer. A follow-up study was conducted 5 years later and the results concur with the initial study.
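
    A brief sketch of the clustering step described above, using SciPy's Ward linkage on hypothetical factor scores; the real study derived the four components from survey data before clustering.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(2)
# Hypothetical factor scores (e.g., performance, enjoyment, beauty, health)
# for a set of respondents; placeholders for the survey-derived components.
scores = rng.normal(size=(200, 4))

Z = linkage(scores, method="ward", metric="euclidean")
segments = fcluster(Z, t=3, criterion="maxclust")   # cut into three segments
for k in range(1, 4):
    # segment size and mean factor profile
    print(k, (segments == k).sum(), scores[segments == k].mean(axis=0).round(2))
```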

  2. Visual programming for next-generation sequencing data analytics.

    Science.gov (United States)

    Milicchio, Franco; Rose, Rebecca; Bian, Jiang; Min, Jae; Prosperi, Mattia

    2016-01-01

    High-throughput or next-generation sequencing (NGS) technologies have become an established and affordable experimental framework in biological and medical sciences for all basic and translational research. Processing and analyzing NGS data is challenging. NGS data are big, heterogeneous, sparse, and error prone. Although a plethora of tools for NGS data analysis has emerged in the past decade, (i) software development is still lagging behind data generation capabilities, and (ii) there is a 'cultural' gap between the end user and the developer. Generic software template libraries specifically developed for NGS can help in dealing with the former problem, whilst coupling template libraries with visual programming may help with the latter. Here we scrutinize the state-of-the-art low-level software libraries implemented specifically for NGS and graphical tools for NGS analytics. An ideal developing environment for NGS should be modular (with a native library interface), scalable in computational methods (i.e. serial, multithread, distributed), transparent (platform-independent), interoperable (with external software interface), and usable (via an intuitive graphical user interface). These characteristics should facilitate both the run of standardized NGS pipelines and the development of new workflows based on technological advancements or users' needs. We discuss in detail the potential of a computational framework blending generic template programming and visual programming that addresses all of the current limitations. In the long term, a proper, well-developed (although not necessarily unique) software framework will bridge the current gap between data generation and hypothesis testing. This will eventually facilitate the development of novel diagnostic tools embedded in routine healthcare.

  3. Large Space Telescope (LST) Pointing Control System (PCS) analytical Advanced Technical Development (ATD) program

    Science.gov (United States)

    Seltzer, S. M.

    1974-01-01

    The large space telescope (LST) pointing control system (PCS) advanced technical development (ATD) program is described. The approach used is to describe the overall PCS development effort, showing how the analytical ATD program elements fit into it. Then the analytical ATD program elements are summarized.

  4. Fully automated segmentation of left ventricle using dual dynamic programming in cardiac cine MR images

    Science.gov (United States)

    Jiang, Luan; Ling, Shan; Li, Qiang

    2016-03-01

    Cardiovascular diseases are becoming a leading cause of death all over the world. The cardiac function could be evaluated by global and regional parameters of left ventricle (LV) of the heart. The purpose of this study is to develop and evaluate a fully automated scheme for segmentation of LV in short axis cardiac cine MR images. Our fully automated method consists of three major steps, i.e., LV localization, LV segmentation at end-diastolic phase, and LV segmentation propagation to the other phases. First, the maximum intensity projection image along the time phases of the midventricular slice, located at the center of the image, was calculated to locate the region of interest of LV. Based on the mean intensity of the roughly segmented blood pool in the midventricular slice at each phase, end-diastolic (ED) and end-systolic (ES) phases were determined. Second, the endocardial and epicardial boundaries of LV of each slice at ED phase were synchronously delineated by use of a dual dynamic programming technique. The external costs of the endocardial and epicardial boundaries were defined with the gradient values obtained from the original and enhanced images, respectively. Finally, with the advantages of the continuity of the boundaries of LV across adjacent phases, we propagated the LV segmentation from the ED phase to the other phases by use of dual dynamic programming technique. The preliminary results on 9 clinical cardiac cine MR cases show that the proposed method can obtain accurate segmentation of LV based on subjective evaluation.
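
    A simplified sketch of the localization step described above: a maximum intensity projection along the time phases, followed by a square region of interest around the brightest pixel near the image centre. The thresholding and phase-selection details of the actual method are omitted, and the array below is random placeholder data.

```python
import numpy as np

def locate_lv_roi(cine, half_size=40):
    """Rough LV localization via a maximum intensity projection (MIP)
    over the time phases of one mid-ventricular slice.

    cine: 3-D array shaped (n_phases, height, width).
    Returns the MIP and a square ROI around its brightest pixel in the
    central third of the image (a simplification of the paper's step).
    """
    mip = cine.max(axis=0)                      # projection along time
    h, w = mip.shape
    cy, cx = slice(h // 3, 2 * h // 3), slice(w // 3, 2 * w // 3)
    sub = mip[cy, cx]
    iy, ix = np.unravel_index(np.argmax(sub), sub.shape)
    y, x = iy + h // 3, ix + w // 3
    roi = (slice(max(0, y - half_size), min(h, y + half_size)),
           slice(max(0, x - half_size), min(w, x + half_size)))
    return mip, roi

cine = np.random.default_rng(3).random((20, 192, 192))   # placeholder data
mip, roi = locate_lv_roi(cine)
print(mip.shape, roi)
```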

  5. Using Maps in Web Analytics to Evaluate the Impact of Web-Based Extension Programs

    Science.gov (United States)

    Veregin, Howard

    2015-01-01

    Maps can be a valuable addition to the Web analytics toolbox for Extension programs that use the Web to disseminate information. Extension professionals use Web analytics tools to evaluate program impacts. Maps add a unique perspective through visualization and analysis of geographic patterns and their relationships to other variables. Maps can…

  6. Inorganic Analytical Service within the Superfund Contract Laboratory Program

    Science.gov (United States)

    This page contains information about the ISM02.3 and ISM02.4 statement of work for the analysis of metals and cyanide at hazardous waste sites. The SOW contains the analytical method and contractual requirements for laboratories.

  7. Cavity contour segmentation in chest radiographs using supervised learning and dynamic programming

    Energy Technology Data Exchange (ETDEWEB)

    Maduskar, Pragnya, E-mail: pragnya.maduskar@radboudumc.nl; Hogeweg, Laurens; Sánchez, Clara I.; Ginneken, Bram van [Diagnostic Image Analysis Group, Radboud University Medical Center, Nijmegen, 6525 GA (Netherlands); Jong, Pim A. de [Department of Radiology, University Medical Center Utrecht, 3584 CX (Netherlands); Peters-Bax, Liesbeth [Department of Radiology, Radboud University Medical Center, Nijmegen, 6525 GA (Netherlands); Dawson, Rodney [University of Cape Town Lung Institute, Cape Town 7700 (South Africa); Ayles, Helen [Department of Infectious and Tropical Diseases, London School of Hygiene and Tropical Medicine, London WC1E 7HT (United Kingdom)

    2014-07-15

    Purpose: Efficacy of tuberculosis (TB) treatment is often monitored using chest radiography. Monitoring size of cavities in pulmonary tuberculosis is important as the size predicts severity of the disease and its persistence under therapy predicts relapse. The authors present a method for automatic cavity segmentation in chest radiographs. Methods: A two-stage method is proposed to segment the cavity borders, given a user-defined seed point close to the center of the cavity. First, a supervised learning approach is employed to train a pixel classifier using texture and radial features to identify the border pixels of the cavity. A likelihood value of belonging to the cavity border is assigned to each pixel by the classifier. The authors experimented with four different classifiers: k-nearest neighbor (kNN), linear discriminant analysis (LDA), GentleBoost (GB), and random forest (RF). Next, the constructed likelihood map was used as an input cost image in the polar transformed image space for dynamic programming to trace the optimal maximum cost path. This constructed path corresponds to the segmented cavity contour in image space. Results: The method was evaluated on 100 chest radiographs (CXRs) containing 126 cavities. The reference segmentation was manually delineated by an experienced chest radiologist. An independent observer (a chest radiologist) also delineated all cavities to estimate interobserver variability. Jaccard overlap measure Ω was computed between the reference segmentation and the automatic segmentation; and between the reference segmentation and the independent observer's segmentation for all cavities. A median overlap Ω of 0.81 (0.76 ± 0.16), and 0.85 (0.82 ± 0.11) was achieved between the reference segmentation and the automatic segmentation, and between the segmentations by the two radiologists, respectively. The best reported mean contour distance and Hausdorff distance between the reference and the automatic segmentation were
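
    The Jaccard overlap Ω reported above is straightforward to compute from two binary masks; a short sketch with synthetic masks:

```python
import numpy as np

def jaccard(a, b):
    """Jaccard overlap between two binary masks: |A ∩ B| / |A ∪ B|."""
    a, b = a.astype(bool), b.astype(bool)
    union = np.logical_or(a, b).sum()
    return np.logical_and(a, b).sum() / union if union else 1.0

ref = np.zeros((64, 64), dtype=bool);  ref[20:40, 20:40] = True   # synthetic reference
auto = np.zeros((64, 64), dtype=bool); auto[22:42, 22:42] = True  # synthetic result
print(round(jaccard(ref, auto), 3))
```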

  8. Defining Audience Segments for Extension Programming Using Reported Water Conservation Practices

    Science.gov (United States)

    Monaghan, Paul; Ott, Emily; Wilber, Wendy; Gouldthorpe, Jessica; Racevskis, Laila

    2013-01-01

    A tool from social marketing can help Extension agents understand distinct audience segments among their constituents. Defining targeted audiences for Extension programming is a first step to influencing behavior change among the public. An online survey was conducted using an Extension email list for urban households receiving a monthly lawn and…

  9. Fast segmentation of the left ventricle in cardiac MRI using dynamic programming.

    Science.gov (United States)

    Santiago, Carlos; Nascimento, Jacinto C; Marques, Jorge S

    2018-02-01

    The segmentation of the left ventricle (LV) in cardiac magnetic resonance imaging is a necessary step for the analysis and diagnosis of cardiac function. In most clinical setups, this step is still manually performed by cardiologists, which is time-consuming and laborious. This paper proposes a fast system for the segmentation of the LV that significantly reduces human intervention. A dynamic programming approach is used to obtain the border of the LV. Using very simple assumptions about the expected shape and location of the segmentation, this system is able to deal with many of the challenges associated with this problem. The system was evaluated on two public datasets: one with 33 patients, comprising a total of 660 magnetic resonance volumes and another with 45 patients, comprising a total of 90 volumes. Quantitative evaluation of the segmentation accuracy and computational complexity was performed. The proposed system is able to segment a whole volume in 1.5 seconds and achieves an average Dice similarity coefficient of 86.0% and an average perpendicular distance of 2.4 mm, which compares favorably with other state-of-the-art methods. A system for the segmentation of the left ventricle in cardiac magnetic resonance imaging is proposed. It is a fast framework that significantly reduces the amount of time and work required of cardiologists. Copyright © 2017 Elsevier B.V. All rights reserved.
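
    The Dice similarity coefficient used for evaluation above can be computed directly from binary masks; a short sketch with synthetic masks:

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient between two binary masks."""
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

manual = np.zeros((64, 64), dtype=bool);    manual[10:50, 15:45] = True   # synthetic
automatic = np.zeros((64, 64), dtype=bool); automatic[12:50, 15:47] = True
print(f"Dice = {dice(manual, automatic):.1%}")
```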

  10. Improved dynamic-programming-based algorithms for segmentation of masses in mammograms.

    Science.gov (United States)

    Rojas Domínguez, Alfonso; Nandi, Asoke K

    2007-11-01

    In this paper, two new boundary tracing algorithms for segmentation of breast masses are presented. These new algorithms are based on the dynamic programming-based boundary tracing (DPBT) algorithm proposed by Timp and Karssemeijer [S. Timp and N. Karssemeijer, Med. Phys. 31, 958-971 (2004)]. The DPBT algorithm contains two main steps: (1) construction of a local cost function, and (2) application of dynamic programming to the selection of the optimal boundary based on the local cost function. The validity of some assumptions used in the design of the DPBT algorithm is tested in this paper using a set of 349 mammographic images. Based on the results of the tests, modifications to the computation of the local cost function have been designed and have resulted in the Improved-DPBT (IDPBT) algorithm. A procedure for the dynamic selection of the strength of the components of the local cost function is presented that makes these parameters independent of the image dataset. Incorporation of this dynamic selection procedure has produced another new algorithm which we have called ID2PBT. Methods for the determination of some other parameters of the DPBT algorithm that were not covered in the original paper are presented as well. The merits of the new IDPBT and ID2PBT algorithms are demonstrated experimentally by comparison against the DPBT algorithm. The segmentation results are evaluated based on the area overlap measure and other segmentation metrics. Both of the new algorithms outperform the original DPBT; the improvements in the algorithms' performance are more noticeable around the values of the segmentation metrics corresponding to the highest segmentation accuracy, i.e., the new algorithms produce more optimally segmented regions, rather than a pronounced increase in the average quality of all the segmented regions.

  11. Dynamic programming in parallel boundary detection with application to ultrasound intima-media segmentation.

    Science.gov (United States)

    Zhou, Yuan; Cheng, Xinyao; Xu, Xiangyang; Song, Enmin

    2013-12-01

    Segmentation of carotid artery intima-media in longitudinal ultrasound images for measuring its thickness to predict cardiovascular diseases can be simplified as detecting two nearly parallel boundaries within a certain distance range, when plaque with irregular shapes is not considered. In this paper, we improve the implementation of two dynamic programming (DP) based approaches to parallel boundary detection, dual dynamic programming (DDP) and piecewise linear dual dynamic programming (PL-DDP). Then, a novel DP based approach, dual line detection (DLD), which translates the original 2-D curve position to a 4-D parameter space representing two line segments in a local image segment, is proposed to solve the problem while maintaining efficiency and rotation invariance. To apply the DLD to ultrasound intima-media segmentation, it is embedded in a framework that employs an edge map obtained from multiplication of the responses of two edge detectors with different scales and a coupled snake model that simultaneously deforms the two contours for maintaining parallelism. The experimental results on synthetic images and carotid arteries of clinical ultrasound images indicate improved performance of the proposed DLD compared to DDP and PL-DDP, with respect to accuracy and efficiency. Copyright © 2013 Elsevier B.V. All rights reserved.

  12. Endocardium and Epicardium Segmentation in MR Images Based on Developed Otsu and Dynamic Programming

    Directory of Open Access Journals (Sweden)

    Shengzhou XU

    2014-03-01

    In order to accurately extract the endocardium and epicardium of the left ventricle from cardiac magnetic resonance (MR) images, a method based on a developed Otsu algorithm and dynamic programming has been proposed. First, regions with high gray value are divided into several left ventricle candidate regions by the developed Otsu algorithm, which is based on constraining the search range of the ideal segmentation threshold. Then, the left ventricular blood pool is selected from the candidate regions and its convex hull is taken as the endocardium. The epicardium is derived by applying a dynamic programming method to find a closed path with minimum local cost. The local cost function of the dynamic programming method consists of two factors: boundary gradient and shape features. In order to improve the accuracy of segmentation, a non-maxima gradient suppression technique is adopted to obtain the boundary gradient. Experimental results on 138 MR images show that the proposed method has high accuracy and robustness.
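
    A simplified sketch of the "developed Otsu" idea described above: ordinary Otsu between-class variance maximisation, but with the candidate threshold restricted to a supplied range. The histogram handling and the range itself are illustrative, not the paper's exact procedure.

```python
import numpy as np

def constrained_otsu(image, lo, hi, bins=256):
    """Otsu thresholding with the threshold search restricted to [lo, hi]."""
    hist, edges = np.histogram(image, bins=bins, range=(image.min(), image.max()))
    p = hist / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    best_t, best_var = None, -1.0
    for i, t in enumerate(centers):
        if not (lo <= t <= hi):
            continue                      # constrain the search range
        w0, w1 = p[:i + 1].sum(), p[i + 1:].sum()
        if w0 == 0 or w1 == 0:
            continue
        m0 = (p[:i + 1] * centers[:i + 1]).sum() / w0
        m1 = (p[i + 1:] * centers[i + 1:]).sum() / w1
        var_between = w0 * w1 * (m0 - m1) ** 2   # between-class variance
        if var_between > best_var:
            best_t, best_var = t, var_between
    return best_t

# Synthetic bimodal intensities (modes near 60 and 160), hypothetical range.
img = np.random.default_rng(4).normal(loc=[[60.0], [160.0]], scale=15.0,
                                      size=(2, 5000)).ravel()
print(constrained_otsu(img, lo=80, hi=140))
```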

  13. System and Method for Providing a Climate Data Analytic Services Application Programming Interface Distribution Package

    Science.gov (United States)

    Schnase, John L. (Inventor); Duffy, Daniel Q. (Inventor); Tamkin, Glenn S. (Inventor)

    2016-01-01

    A system, method and computer-readable storage devices for providing a climate data analytic services application programming interface distribution package. The example system can provide various components. The system provides a climate data analytic services application programming interface library that enables software applications running on a client device to invoke the capabilities of a climate data analytic service. The system provides a command-line interface that provides a means of interacting with a climate data analytic service by issuing commands directly to the system's server interface. The system provides sample programs that call on the capabilities of the application programming interface library and can be used as templates for the construction of new client applications. The system can also provide test utilities, build utilities, service integration utilities, and documentation.

  14. Automatic Nuclear Segmentation Using Multiscale Radial Line Scanning With Dynamic Programming.

    Science.gov (United States)

    Xu, Hongming; Lu, Cheng; Berendt, Richard; Jha, Naresh; Mandal, Mrinal

    2017-10-01

    In the diagnosis of various cancers by analyzing histological images, automatic nuclear segmentation is an important step. However, nuclear segmentation is a difficult problem because of overlapping nuclei, inhomogeneous staining, and presence of noisy pixels and other tissue components. In this paper, we present an automatic technique for nuclear segmentation in skin histological images. The proposed technique first applies a bank of generalized Laplacian of Gaussian kernels to detect nuclear seeds. Based on the detected nuclear seeds, a multiscale radial line scanning method combined with dynamic programming is applied to extract a set of candidate nuclear boundaries. The gradient, intensity, and shape information are then integrated to determine the optimal boundary for each nucleus in the image. Nuclear overlap limitation is finally imposed based on a Dice coefficient measure such that the obtained nuclear contours do not severely intersect with each other. Experiments have been thoroughly performed on two datasets with H&E and Ki-67 stained images, which show that the proposed technique is superior to conventional schemes of nuclear segmentation.

  15. One Size (Never) Fits All: Segment Differences Observed Following a School-Based Alcohol Social Marketing Program

    Science.gov (United States)

    Dietrich, Timo; Rundle-Thiele, Sharyn; Leo, Cheryl; Connor, Jason

    2015-01-01

    Background: According to commercial marketing theory, a market orientation leads to improved performance. Drawing on the social marketing principles of segmentation and audience research, the current study seeks to identify segments to examine responses to a school-based alcohol social marketing program. Methods: A sample of 371 year 10 students…

  16. Fully automated segmentation of whole breast using dynamic programming in dynamic contrast enhanced MR images.

    Science.gov (United States)

    Jiang, Luan; Hu, Xiaoxin; Xiao, Qin; Gu, Yajia; Li, Qiang

    2017-06-01

    Amount of fibroglandular tissue (FGT) and level of background parenchymal enhancement (BPE) in breast dynamic contrast enhanced magnetic resonance images (DCE-MRI) are suggested as strong indices for assessing breast cancer risk. Whole breast segmentation is the first important task for quantitative analysis of FGT and BPE in three-dimensional (3-D) DCE-MRI. The purpose of this study is to develop and evaluate a fully automated technique for accurate segmentation of the whole breast in 3-D fat-suppressed DCE-MRI. The whole breast segmentation consisted of two major steps, i.e., the delineation of chest wall line and breast skin line. First, a sectional dynamic programming method was employed to trace the upper and/or lower boundaries of the chest wall by use of the positive and/or negative gradient within a band along the chest wall in each 2-D slice. Second, another dynamic programming was applied to delineate the skin-air boundary slice-by-slice based on the saturated gradient of the enhanced image obtained with the prior statistical distribution of gray levels of the breast skin line. Starting from the central slice, these two steps employed a Gaussian function to limit the search range of boundaries in adjacent slices based on the continuity of chest wall line and breast skin line. Finally, local breast skin line detection was applied around armpit to complete the whole breast segmentation. The method was validated with a representative dataset of 100 3-D breast DCE-MRI scans through objective quantification and subjective evaluation. The MR scans in the dataset were acquired with four MR scanners in five spatial resolutions. The cases were assessed with four breast density ratings by radiologists based on Breast Imaging Reporting and Data System (BI-RADS) of American College of Radiology. Our segmentation algorithm achieved a Dice volume overlap measure of 95.8 ± 1.2% and volume difference measure of 8.4 ± 2.4% between the automatically and manually

  17. A national analytical quality assurance program: Developing guidelines and analytical tools for the forest inventory and analysis program

    Science.gov (United States)

    Phyllis C. Adams; Glenn A. Christensen

    2012-01-01

    A rigorous quality assurance (QA) process assures that the data and information provided by the Forest Inventory and Analysis (FIA) program meet the highest possible standards of precision, completeness, representativeness, comparability, and accuracy. FIA relies on its analysts to check the final data quality prior to release of a State’s data to the national FIA...

  18. A chance-constrained programming level set method for longitudinal segmentation of lung tumors in CT.

    Science.gov (United States)

    Rouchdy, Youssef; Bloch, Isabelle

    2011-01-01

    This paper presents a novel stochastic level set method for the longitudinal tracking of lung tumors in computed tomography (CT). The proposed model addresses the limitations of registration-based and segmentation-based methods for longitudinal tumor tracking. It combines the advantages of each approach using a new probabilistic framework, namely Chance-Constrained Programming (CCP). Lung tumors can shrink or grow over time, which can be reflected in large changes of shape, appearance and volume in CT images. Traditional level set methods with a priori knowledge about shape are not suitable since the tumors are undergoing random and large changes in shape. Our CCP level set model allows a flexible prior to be introduced to track structures with a highly variable shape by permitting a constraint violation of the prior up to a specified probability level. The chance constraints are computed from two points given by the user or from segmented tumors from a reference image. The reference image can be one of the images studied or an external template. We present a numerical scheme to approximate the solution of the proposed model and apply it to track lung tumors in CT. Finally, we compare our approach with a Bayesian level set. The CCP level set model gives the best results: it is more coherent with the manual segmentation.

  19. Semi-automatic 3D lung nodule segmentation in CT using dynamic programming

    Science.gov (United States)

    Sargent, Dustin; Park, Sun Young

    2017-02-01

    We present a method for semi-automatic segmentation of lung nodules in chest CT that can be extended to general lesion segmentation in multiple modalities. Most semi-automatic algorithms for lesion segmentation or similar tasks use region-growing or edge-based contour finding methods such as level-set. However, lung nodules and other lesions are often connected to surrounding tissues, which makes these algorithms prone to growing the nodule boundary into the surrounding tissue. To solve this problem, we apply a 3D extension of the 2D edge linking method with dynamic programming to find a closed surface in a spherical representation of the nodule ROI. The algorithm requires a user to draw a maximal diameter across the nodule in the slice in which the nodule cross section is the largest. We report the lesion volume estimation accuracy of our algorithm on the FDA lung phantom dataset, and the RECIST diameter estimation accuracy on the lung nodule dataset from the SPIE 2016 lung nodule classification challenge. The phantom results in particular demonstrate that our algorithm has the potential to mitigate the disparity in measurements performed by different radiologists on the same lesions, which could improve the accuracy of disease progression tracking.

  20. The Solid* toolset for software visual analytics of program structure and metrics comprehension: From research prototype to product

    NARCIS (Netherlands)

    Reniers, Dennie; Voinea, Lucian; Ersoy, Ozan; Telea, Alexandru

    2014-01-01

    Software visual analytics (SVA) tools combine static program analysis and fact extraction with information visualization to support program comprehension. However, building efficient and effective SVA tools is highly challenging, as it involves extensive software development in program analysis,

  1. Standard guide for establishing a quality assurance program for analytical chemistry laboratories within the nuclear industry

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2006-01-01

    1.1 This guide covers the establishment of a quality assurance (QA) program for analytical chemistry laboratories within the nuclear industry. Reference to key elements of ANSI/ISO/ASQC Q9001, Quality Systems, provides guidance to the functional aspects of analytical laboratory operation. When implemented as recommended, the practices presented in this guide will provide a comprehensive QA program for the laboratory. The practices are grouped by functions, which constitute the basic elements of a laboratory QA program. 1.2 The essential, basic elements of a laboratory QA program appear in the following order: Organization (Section 5); Quality Assurance Program (Section 6); Training and Qualification (Section 7); Procedures (Section 8); Laboratory Records (Section 9); Control of Records (Section 10); Control of Procurement (Section 11); Control of Measuring Equipment and Materials (Section 12); Control of Measurements (Section 13); Deficiencies and Corrective Actions (Section 14).

  2. Online Programs as Tools to Improve Parenting: A meta-analytic review

    NARCIS (Netherlands)

    dr. Christa C.C. Nieuwboer; Prof. Dr. Ruben R.G. Fukkink; Prof. Dr. Jo J.M.A Hermanns

    2013-01-01

    Background. A number of parenting programs, aimed at improving parenting competencies, have recently been adapted or designed with the use of online technologies. Although web-based services have been claimed to hold promise for parent support, a meta-analytic review of online parenting interventions

  3. Waste Tank Organic Safety Program: Analytical methods development. Progress report, FY 1994

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, J.A.; Clauss, S.A.; Grant, K.E. [and others]

    1994-09-01

    The objectives of this task are to develop and document extraction and analysis methods for organics in waste tanks, and to extend these methods to the analysis of actual core samples to support the Waste Tank Organic Safety Program. This report documents progress at Pacific Northwest Laboratory during FY 1994 on methods development, the analysis of waste from Tank 241-C-103 (Tank C-103) and T-111, and the transfer of documented, developed analytical methods to personnel in the Analytical Chemistry Laboratory (ACL) and 222-S laboratory. This report is intended as an annual report, not a completed work.

  4. A meta-analytic review of eating disorder prevention programs: encouraging findings.

    Science.gov (United States)

    Stice, Eric; Shaw, Heather; Marti, C Nathan

    2007-01-01

    This meta-analytic review found that 51% of eating disorder prevention programs reduced eating disorder risk factors and 29% reduced current or future eating pathology. Larger effects occurred for programs that were selected (versus universal), interactive (versus didactic), multisession (versus single session), solely offered to females (versus both sexes), offered to participants over 15 years of age (versus younger ones), and delivered by professional interventionists (versus endogenous providers). Programs with body acceptance and dissonance-induction content and without psychoeducational content and programs evaluated in trials using validated measures and a shorter follow-up period also produced larger effects. Results identify promising programs and delineate sample, format, and design features associated with larger effects, which may inform the design of more effective prevention programs in the future.

  5. Requirements for quality control of analytical data for the Environmental Restoration Program

    Energy Technology Data Exchange (ETDEWEB)

    Engels, J.

    1992-12-01

    The Environmental Restoration (ER) Program was established for the investigation and remediation of inactive US Department of Energy (DOE) sites and facilities that have been declared surplus in terms of their previous uses. The purpose of this document is to specify ER requirements for quality control (QC) of analytical data. Activities throughout all phases of the investigation may affect the quality of the final data product, and thus are subject to control specifications. Laboratory control is emphasized in this document, and field concerns will be addressed in a companion document. Energy Systems, in its role of technical coordinator and at the request of DOE-OR, extends the application of these requirements to all participants in ER activities. Because every instance and concern may not be addressed in this document, participants are encouraged to discuss any questions with the ER Quality Assurance (QA) Office, the Analytical Environmental Support Group (AESG), or the Analytical Project Office (APO).

  6. Handling movement epenthesis and hand segmentation ambiguities in continuous sign language recognition using nested dynamic programming.

    Science.gov (United States)

    Yang, Ruiduo; Sarkar, Sudeep; Loeding, Barbara

    2010-03-01

    We consider two crucial problems in continuous sign language recognition from unaided video sequences. At the sentence level, we consider the movement epenthesis (me) problem and at the feature level, we consider the problem of hand segmentation and grouping. We construct a framework that can handle both of these problems based on an enhanced, nested version of the dynamic programming approach. To address movement epenthesis, a dynamic programming (DP) process employs a virtual me option that does not need explicit models. We call this the enhanced level building (eLB) algorithm. This formulation also allows the incorporation of grammar models. Nested within this eLB is another DP that handles the problem of selecting among multiple hand candidates. We demonstrate our ideas on four American Sign Language data sets with simple background, with the signer wearing short sleeves, with complex background, and across signers. We compared the performance with Conditional Random Fields (CRF) and Latent Dynamic-CRF-based approaches. The experiments show more than 40 percent improvement over CRF or LDCRF approaches in terms of the frame labeling rate. We show the flexibility of our approach when handling a changing context. We also find a 70 percent improvement in sign recognition rate over the unenhanced DP matching algorithm that does not accommodate the me effect.
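
    A toy version of the level-building idea described above (much simpler than the authors' enhanced, nested formulation): a segmental dynamic program partitions a 1-D feature sequence into labelled segments, where each segment is either matched to a hypothetical sign prototype or treated as an unmodelled movement-epenthesis segment with a fixed per-frame penalty. Grammar models and the nested hand-candidate selection are omitted.

```python
import numpy as np

def segment_sequence(frames, prototypes, me_penalty=1.0, min_len=3, max_len=12):
    """Toy level-building DP: partition a 1-D feature sequence into labelled
    segments; each segment is a 'sign' (scored against a prototype mean) or
    an unmodelled 'me' segment with a fixed per-frame cost."""
    n = len(frames)
    best = np.full(n + 1, np.inf)
    best[0] = 0.0
    choice = [None] * (n + 1)                 # (start, label) achieving best[j]
    for j in range(1, n + 1):
        for length in range(min_len, max_len + 1):
            i = j - length
            if i < 0:
                break
            seg_mean = frames[i:j].mean()
            # unmodelled movement-epenthesis option
            cands = [("me", me_penalty * length)]
            # sign models: cost = length * distance of segment mean to prototype
            cands += [(lab, length * abs(seg_mean - mu)) for lab, mu in prototypes.items()]
            for lab, cost in cands:
                if best[i] + cost < best[j]:
                    best[j] = best[i] + cost
                    choice[j] = (i, lab)
    # backtrack the optimal labelling
    out, j = [], n
    while j > 0:
        i, lab = choice[j]
        out.append((i, j, lab))
        j = i
    return out[::-1]

protos = {"SIGN_A": 0.0, "SIGN_B": 5.0}       # hypothetical sign prototypes
seq = np.concatenate([np.zeros(8), np.full(4, 2.5), np.full(8, 5.0)])  # A, me, B
print(segment_sequence(seq, protos))
```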

  7. Analytical Services Fiscal Year 1996 Multi-year Program Plan Fiscal Year Work Plan WBS 1.5.1, Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-09-01

    This document contains the Fiscal Year 1996 Work Plan and Multi-Year Program Plan for the Analytical Services Program at the Hanford Reservation in Richland, Washington. The Analytical Services Program provides vital support to the Hanford Site mission and provides technically sound, defensible, cost effective, high quality analytical chemistry data for the site programs. This report describes the goals and strategies for continuance of the Analytical Services Program through fiscal year 1996 and beyond.

  8. A Comparison of Four Software Programs for Implementing Decision Analytic Cost-Effectiveness Models.

    Science.gov (United States)

    Hollman, Chase; Paulden, Mike; Pechlivanoglou, Petros; McCabe, Christopher

    2017-08-01

    The volume and technical complexity of both academic and commercial research using decision analytic modelling has increased rapidly over the last two decades. The range of software programs used for their implementation has also increased, but it remains true that a small number of programs account for the vast majority of cost-effectiveness modelling work. We report a comparison of four software programs: TreeAge Pro, Microsoft Excel, R and MATLAB. Our focus is on software commonly used for building Markov models and decision trees to conduct cohort simulations, given their predominance in the published literature around cost-effectiveness modelling. Our comparison uses three qualitative criteria as proposed by Eddy et al.: "transparency and validation", "learning curve" and "capability". In addition, we introduce the quantitative criterion of processing speed. We also consider the cost of each program to academic users and commercial users. We rank the programs based on each of these criteria. We find that, whilst Microsoft Excel and TreeAge Pro are good programs for educational purposes and for producing the types of analyses typically required by health technology assessment agencies, the efficiency and transparency advantages of programming languages such as MATLAB and R become increasingly valuable when more complex analyses are required.
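
    For readers unfamiliar with the kind of model being compared, the sketch below runs a tiny Markov cohort simulation in Python (used here only for consistency with the rest of this listing; it is not one of the four programs compared above). The three-state structure, transition probabilities, costs, and utilities are entirely hypothetical.

```python
import numpy as np

# Hypothetical three-state model (Well, Sick, Dead); all numbers illustrative.
P = np.array([[0.85, 0.10, 0.05],      # annual transition probabilities
              [0.00, 0.80, 0.20],
              [0.00, 0.00, 1.00]])
cost = np.array([100.0, 2500.0, 0.0])  # annual cost per state
qaly = np.array([0.95, 0.60, 0.0])     # annual utility per state
discount = 0.035

state = np.array([1.0, 0.0, 0.0])      # cohort starts in 'Well'
total_cost = total_qaly = 0.0
for year in range(40):                 # 40 annual cycles
    d = 1.0 / (1.0 + discount) ** year
    total_cost += d * state @ cost
    total_qaly += d * state @ qaly
    state = state @ P                  # advance the cohort one cycle
print(round(total_cost, 0), round(total_qaly, 2))
```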

  9. Evaluating the use of programming games for building early analytical thinking skills

    Directory of Open Access Journals (Sweden)

    H. Tsalapatas

    2015-11-01

    Analytical thinking is a transversal skill that helps learners excel academically independently of the subject area. It is in high demand in the world of work, especially in innovation-related sectors. It involves finding a viable solution to a problem by identifying goals, parameters, and resources available for deployment. These are strategy elements in game play, and they further constitute good practices in programming. This work evaluates how serious games based on visual programming, used as a solution synthesis tool within exploration, inquiry, and collaboration, can help learners build structured mindsets. It analyses how a visual programming environment that supports experimentation for building intuition on potential solutions to logical puzzles, and then encourages learners to synthesize a solution interactively, helps learners through gaming principles to build self-esteem in their problem-solving ability, to develop algorithmic thinking capacity, and to stay engaged in learning.

  10. Community-Based Mental Health and Behavioral Programs for Low-Income Urban Youth: A Meta-Analytic Review

    Science.gov (United States)

    Farahmand, Farahnaz K.; Duffy, Sophia N.; Tailor, Megha A.; Dubois, David L.; Lyon, Aaron L.; Grant, Kathryn E.; Zarlinski, Jennifer C.; Masini, Olivia; Zander, Keith J.; Nathanson, Alison M.

    2012-01-01

    A meta-analytic review of 33 studies and 41 independent samples was conducted of the effectiveness of community-based mental health and behavioral programs for low-income urban youth. Findings indicated positive effects, with an overall mean effect of 0.25 at post-test. While this is comparable to previous meta-analytic intervention research with…

  11. Generalized Analytical Program of Thyristor Phase Control Circuit with Series and Parallel Resonance Load

    OpenAIRE

    Nakanishi, Sen-ichiro; Ishida, Hideaki; Himei, Toyoji

    1981-01-01

    The systematic analytical method is required for the ac phase control circuit by means of an inverse parallel thyristor pair which has a series and parallel L-C resonant load, because the phase control action causes abnormal and interesting phenomena, such as an extreme increase of voltage and current, a unique increase and decrease of contained higher harmonics, and a wide variation of power factor, etc. In this paper, the program for the analysis of the thyristor phase control circuit with...

  12. Segmentation of antiperspirants and deodorants

    OpenAIRE

    Král, Tomáš

    2009-01-01

    The goal of this Master's thesis on the segmentation of antiperspirants and deodorants is to discover differences in consumer behaviour, to determine and describe segments of consumers based on these differences, and to propose a marketing strategy for the most attractive segments. The theoretical part describes market segmentation in general, the segmentation process and segmentation criteria. The analytic part characterizes the Czech market of antiperspirants and deodorants, analyzes ACNielsen market data and d...

  13. Respiratory Protection Program Compliance in Iranian Hospitals: Application of Fuzzy Analytical Hierarchy Process.

    Science.gov (United States)

    Honarbakhsh, Marzieh; Jahangiri, Mehdi; Ghaem, Haleh; Farhadi, Payam

    2017-07-01

    In hospitals, health care workers (HCWs) are exposed to a wide range of respiratory hazards, which requires the use of respiratory protective equipment and the implementation of Respiratory Protection Programs (RPPs). The aim of this cross-sectional study was to investigate RPP implementation in 36 teaching hospitals located in the Fars province of Iran. A researcher-developed checklist, including nine components of the RPP standard, was completed by industrial hygienists in the study hospitals. The Fuzzy Analytical Hierarchy Process (FAHP) was used to determine the weight coefficients of the RPP components. Finally, a Respiratory Protection Program Index (RPPI) was developed to calculate hospital compliance with RPPs. The results showed that RPPs were not fully implemented in the studied hospitals, and the highest and lowest RPPI scores were related to training and fit testing, respectively. To promote the implementation of RPPs, significant efforts are required for all components, especially fit testing and worker evaluation.
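
    Once the FAHP weight coefficients are available, the index itself reduces to a weighted aggregation of component compliance scores. The Python sketch below illustrates that final step with invented component names, weights and scores; it is not the study's data or its exact scoring rule.

    ```python
    # Hypothetical weighted compliance index (illustrative values, not the study's data):
    # each RPP component gets a compliance score in [0, 1] and an FAHP-derived weight.
    weights = {"training": 0.20, "fit_testing": 0.18, "worker_evaluation": 0.15,
               "program_administration": 0.12, "equipment_selection": 0.35}
    scores = {"training": 0.80, "fit_testing": 0.35, "worker_evaluation": 0.40,
              "program_administration": 0.65, "equipment_selection": 0.55}

    # Weighted average; weights are normalised defensively in case they do not sum to 1.
    total_weight = sum(weights.values())
    rppi = sum(weights[c] * scores[c] for c in weights) / total_weight
    print(f"RPPI = {rppi:.2f}")  # overall compliance on a 0-1 scale
    ```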

  14. Analytical approaches used in stream benthic macroinvertebrate biomonitoring programs of State agencies in the United States

    Science.gov (United States)

    Carter, James L.; Resh, Vincent H.

    2013-01-01

    Biomonitoring programs based on benthic macroinvertebrates are well-established worldwide. Their value, however, depends on the appropriateness of the analytical techniques used. All United States State benthic macroinvertebrate biomonitoring programs were surveyed regarding the purposes of their programs, quality-assurance and quality-control procedures used, habitat and water-chemistry data collected, treatment of macroinvertebrate data prior to analysis, statistical methods used, and data-storage considerations. State regulatory mandates (59 percent of programs), biotic index development (17 percent), and Federal requirements (15 percent) were the most frequently reported purposes of State programs, with the specific tasks of satisfying the requirements for 305b/303d reports (89 percent), establishment and monitoring of total maximum daily loads, and developing biocriteria being the purposes most often mentioned. Most states establish reference sites (81 percent), but classify them using State-specific methods. The most often used technique for determining the appropriateness of a reference site was Best Professional Judgment (86 percent of these states). Macroinvertebrate samples are almost always collected by using a D-frame net, and duplicate samples are collected from approximately 10 percent of sites for quality assurance and quality control purposes. Most programs have macroinvertebrate samples processed by contractors (53 percent) and have identifications confirmed by a second taxonomist (85 percent). All States collect habitat data, with most using the Rapid Bioassessment Protocol visual-assessment approach, which requires ~1 h/site. Dissolved oxygen, pH, and conductivity are measured in more than 90 percent of programs. Wide variation exists in which taxa are excluded from analyses and the level of taxonomic resolution used. Species traits, such as functional feeding groups, are commonly used (96 percent), as are tolerance values for organic pollution

  15. Quality assurance programs developed and implemented by the US Department of Energy's Analytical Services Program for environmental restoration and waste management activities

    Energy Technology Data Exchange (ETDEWEB)

    Lillian, D.; Bottrell, D. [Dept. of Energy, Germantown, MD (United States)]

    1993-12-31

    The U.S. Department of Energy's (DOE's) Office of Environmental Restoration and Waste Management (EM) has been tasked with addressing environmental contamination and waste problems facing the Department. A key element of any environmental restoration or waste management program is environmental data. An effective and efficient sampling and analysis program is required to generate credible environmental data. The bases for DOE's EM Analytical Services Program (ASP) are contained in the charter and commitments in Secretary of Energy Notice SEN-13-89, EM program policies and requirements, and commitments to Congress and the Office of Inspector General (IG). The Congressional commitment by DOE to develop and implement an ASP was in response to concerns raised by the Chairman of the Congressional Environment, Energy, and Natural Resources Subcommittee, and the Chairman of the Congressional Oversight and Investigations Subcommittee of the Committee on Energy and Commerce, regarding the production of analytical data. The development and implementation of an ASP also satisfies the IG's audit report recommendations on environmental analytical support, including development and implementation of a national strategy for acquisition of quality sampling and analytical services. These recommendations were endorsed in Departmental positions, which further emphasize the importance of the ASP to EM's programs. In September 1990, EM formed the Laboratory Management Division (LMD) in the Office of Technology Development to provide the programmatic direction needed to establish and operate an EM-wide ASP program. In January 1992, LMD issued the "Analytical Services Program Five-Year Plan." This document described LMD's strategy to ensure the production of timely, cost-effective, and credible environmental data. This presentation describes the overall LMD Analytical Services Program and, specifically, the various QA programs.

  16. Analytical Chemistry Laboratory Quality Assurance Project Plan for the Transuranic Waste Characterization Program

    Energy Technology Data Exchange (ETDEWEB)

    Sailer, S.J.

    1996-08-01

    This Quality Assurance Project Plan (QAPjP) specifies the quality of data necessary and the characterization techniques employed at the Idaho National Engineering Laboratory (INEL) to meet the objectives of the Department of Energy (DOE) Waste Isolation Pilot Plant (WIPP) Transuranic Waste Characterization Quality Assurance Program Plan (QAPP) requirements. This QAPjP is written to conform with the requirements and guidelines specified in the QAPP and the associated documents referenced in the QAPP. This QAPjP is one of a set of five interrelated QAPjPs that describe the INEL Transuranic Waste Characterization Program (TWCP). Each of the five facilities participating in the TWCP has a QAPjP that describes the activities applicable to that particular facility. This QAPjP describes the roles and responsibilities of the Idaho Chemical Processing Plant (ICPP) Analytical Chemistry Laboratory (ACL) in the TWCP. Data quality objectives and quality assurance objectives are explained. Sample analysis procedures and associated quality assurance measures are also addressed; these include: sample chain of custody; data validation; usability and reporting; documentation and records; audits and assessments; laboratory QC samples; and instrument testing, inspection, maintenance and calibration. Finally, administrative quality control measures, such as document control, control of nonconformances, variances, and QA status reporting, are described.

  17. Analytical quality of environmental analysis: Recent results and future trends of the IAEA-ILMR's Analytical Quality Control Program

    Energy Technology Data Exchange (ETDEWEB)

    Ballestra, S.; Vas, D.; Holm, E.; Lopez, J.J.; Parsi, P. (International Laboratory of Marine Radioactivity (Monaco))

    1988-01-01

    The Analytical Quality Control Services Program of the IAEA-ILMR covers a wide variety of intercalibration and reference materials. The purpose of the program is to ensure the comparability of the results obtained by the different participants and to enable laboratories engaged in low-level analyses of marine environmental materials to control their analytical performance. Within the past five years, the International Laboratory of Marine Radioactivity in Monaco has organized eight intercomparison exercises, on a world-wide basis, on natural materials of marine origin comprising sea water, sediment, seaweed and fish flesh. Results on artificial (fission and activation products, transuranium elements) and natural radionuclides were compiled and evaluated. Reference concentration values were established for a number of the intercalibration samples allowing them to become certified as reference materials available for general distribution. The results of the fish flesh sample and those of the deep-sea sediment are reviewed. The present status of three on-going intercomparison exercises on post-Chernobyl samples IAEA-306 (Baltic Sea sediment), IAEA-307 (Mediterranean sea-plant Posidonia oceanica) and IAEA-308 (Mediterranean mixed seaweed) is also described. 1 refs., 4 tabs.

  18. Impact of a workplace exercise program on neck and shoulder segments in office workers

    OpenAIRE

    Machado-Matos, M.; Arezes, P.

    2016-01-01

    Work-related musculoskeletal disorders are a common problem among office workers. The purpose of this study is to evaluate the impact of a workplace exercise program on neck and shoulder pain and flexibility in office workers. The workstation assessment was performed using Rapid Office Strain Assessment. Workers were assessed for pain pre- and post-implementation of the workplace exercise program using the Nordic Questionnaire for Musculoskeletal Symptoms, and for flexibility. The program las...

  19. On the location selection problem using analytic hierarchy process and multi-choice goal programming

    Science.gov (United States)

    Ho, Hui-Ping; Chang, Ching-Ter; Ku, Cheng-Yuan

    2013-01-01

    Location selection is a crucial decision in the cost/benefit analysis of restaurants, coffee shops and similar businesses. However, it is difficult to solve because the location selection problem involves many conflicting goals. To solve the problem, this study integrates the analytic hierarchy process (AHP) and multi-choice goal programming (MCGP) as a decision aid to obtain an appropriate house from many alternative locations that better suits the preferences and needs of renters. This study obtains weights from AHP and applies them to each goal using MCGP for the location selection problem. Using the multi-aspiration function provided by MCGP, decision makers can set multiple aspiration levels for each location goal to rank the candidate locations. Compared with unaided selection processes, the integrated AHP and MCGP approach is a more scientific and efficient method than traditional methods for finding a suitable location for buying or renting a house for business, especially under multiple qualitative and quantitative criteria and within a shorter evaluation time. In addition, a real case is provided to demonstrate the usefulness of the proposed method. The results show that the proposed method is able to provide better quality decisions than normal manual methods.
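
    For readers unfamiliar with how AHP produces the weights that are then fed into MCGP, the sketch below shows the widely used geometric-mean approximation of the principal eigenvector for a hypothetical 3x3 pairwise comparison matrix; the criteria and judgments are invented and are not taken from the record.

    ```python
    import numpy as np

    # Hypothetical AHP pairwise comparison matrix for three location criteria
    # (e.g. rent, accessibility, competition); A[i, j] expresses how much more
    # important criterion i is than criterion j on Saaty's 1-9 scale.
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    # Geometric-mean (row) approximation of the principal eigenvector, then normalise.
    gm = np.prod(A, axis=1) ** (1.0 / A.shape[0])
    weights = gm / gm.sum()
    print("criterion weights:", np.round(weights, 3))

    # Rough consistency check via the principal eigenvalue.
    lambda_max = (A @ weights / weights).mean()
    ci = (lambda_max - A.shape[0]) / (A.shape[0] - 1)
    print("consistency index:", round(ci, 3))
    ```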

  20. Adaptive elastic segmentation of brain MRI via shape-model-guided evolutionary programming.

    Science.gov (United States)

    Pitiot, Alain; Toga, Arthur W; Thompson, Paul M

    2002-08-01

    This paper presents a fully automated segmentation method for medical images. The goal is to localize and parameterize a variety of types of structure in these images for subsequent quantitative analysis. We propose a new hybrid strategy that combines a general elastic template matching approach and an evolutionary heuristic. The evolutionary algorithm uses prior statistical information about the shape of the target structure to control the behavior of a number of deformable templates. Each template, modeled in the form of a B-spline, is warped in a potential field which is itself dynamically adapted. Such a hybrid scheme proves to be promising: by maintaining a population of templates, we cover a large domain of the solution space under the global guidance of the evolutionary heuristic, and thoroughly explore interesting areas. We address key issues of automated image segmentation systems. The potential fields are initially designed based on the spatial features of the edges in the input image, and are subjected to spatially adaptive diffusion to guarantee the deformation of the template. This also improves its global consistency and convergence speed. The deformation algorithm can modify the internal structure of the templates to allow a better match. We investigate in detail the preprocessing phase that the images undergo before they can be used more effectively in the iterative elastic matching procedure: a texture classifier, trained via linear discriminant analysis of a learning set, is used to enhance the contrast of the target structure with respect to surrounding tissues. We show how these techniques interact within a statistically driven evolutionary scheme to achieve a better tradeoff between template flexibility and sensitivity to noise and outliers. We focus on understanding the features of template matching that are most beneficial in terms of the achieved match. Examples from simulated and real image data are discussed, with considerations of

  1. On the Multilevel Nature of Meta-Analysis: A Tutorial, Comparison of Software Programs, and Discussion of Analytic Choices.

    Science.gov (United States)

    Pastor, Dena A; Lazowski, Rory A

    2017-09-27

    The term "multilevel meta-analysis" is encountered not only in applied research studies, but in multilevel resources comparing traditional meta-analysis to multilevel meta-analysis. In this tutorial, we argue that the term "multilevel meta-analysis" is redundant since all meta-analysis can be formulated as a special kind of multilevel model. To clarify the multilevel nature of meta-analysis the four standard meta-analytic models are presented using multilevel equations and fit to an example data set using four software programs: two specific to meta-analysis (metafor in R and SPSS macros) and two specific to multilevel modeling (PROC MIXED in SAS and HLM). The same parameter estimates are obtained across programs underscoring that all meta-analyses are multilevel in nature. Despite the equivalent results, not all software programs are alike and differences are noted in the output provided and estimators available. This tutorial also recasts distinctions made in the literature between traditional and multilevel meta-analysis as differences between meta-analytic choices, not between meta-analytic models, and provides guidance to inform choices in estimators, significance tests, moderator analyses, and modeling sequence. The extent to which the software programs allow flexibility with respect to these decisions is noted, with metafor emerging as the most favorable program reviewed.

  2. Learning About Love: A Meta-Analytic Study of Individually-Oriented Relationship Education Programs for Adolescents and Emerging Adults.

    Science.gov (United States)

    Simpson, David M; Leonhardt, Nathan D; Hawkins, Alan J

    2018-03-01

    Despite recent policy initiatives and substantial federal funding of individually oriented relationship education programs for youth, there have been no meta-analytic reviews of this growing field. This meta-analytic study draws on 17 control-group studies and 13 one-group/pre-post studies to evaluate the effectiveness of relationship education programs on adolescents' and emerging adults' relationship knowledge, attitudes, and skills. Overall, control-group studies produced a medium effect (d = .36); one-group/pre-post studies also produced a medium effect (d = .47). However, the lack of studies with long-term follow-ups of relationship behaviors in the young adult years is a serious weakness in the field, limiting what we can say about the value of these programs for helping youth achieve their aspirations for healthy romantic relationships and stable marriages.

  3. ORBITALES. A program for the calculation of wave functions with an analytical central potential; ORBITALES. Programa de calculo de Funciones de Onda para una Potencial Central Analitico

    Energy Technology Data Exchange (ETDEWEB)

    Yunta Carretero; Rodriguez Mayquez, E.

    1974-07-01

    This paper describes the objective, basis, FORTRAN implementation, and use of the program ORBITALES. The program calculates atomic wave functions for an analytical central potential. (Author) 8 refs.

  4. Novel Programs, International Adoptions, or Contextual Adaptations? Meta-Analytical Results From German and Swedish Intervention Research.

    Science.gov (United States)

    Sundell, Knut; Beelmann, Andreas; Hasson, Henna; von Thiele Schwarz, Ulrica

    2016-01-01

    One of the major dilemmas in intervention and implementation research is adaptation versus adherence. High fidelity to an intervention protocol is essential for internal validity. At the same time, it has been argued that adaptation is necessary for improving the adoption and use of interventions by, for example, improving the match between an intervention and its cultural context, thus improving external validity. This study explores the origins of intervention programs (i.e., novel programs, programs adopted from other contexts with or without adaptation) in two meta-analytic intervention data sets from two European countries and compares the effect sizes of the outcomes of the interventions evaluated. Results are based on two samples of studies evaluating German child and youth preventative interventions (k = 158), and Swedish evaluations of a variety of psychological and social interventions (k = 139). The studies were categorized as novel programs, international adoption and contextual adaptation, with a total of six subcategories. In the German sample, after statistically controlling for some crucial methodological aspects, novel programs were significantly more effective than adopted programs. In the Swedish sample, a trend was found suggesting that adopted programs were less effective than adapted and novel programs. If these results are generalizable and unbiased, they favor novel and adapted programs over adopted programs with no adaptation and indicate that adoption of transported programs should not be done without considering adaptation.

  5. European specialist porphyria laboratories: diagnostic strategies, analytical quality, clinical interpretation, and reporting as assessed by an external quality assurance program.

    Science.gov (United States)

    Aarsand, Aasne K; Villanger, Jørild H; Støle, Egil; Deybach, Jean-Charles; Marsden, Joanne; To-Figueras, Jordi; Badminton, Mike; Elder, George H; Sandberg, Sverre

    2011-11-01

    The porphyrias are a group of rare metabolic disorders whose diagnosis depends on identification of specific patterns of porphyrin precursor and porphyrin accumulation in urine, blood, and feces. Diagnostic tests for porphyria are performed by specialized laboratories in many countries. Data regarding the analytical and diagnostic performance of these laboratories are scarce. We distributed 5 sets of multispecimen samples from different porphyria patients accompanied by clinical case histories to 18-21 European specialist porphyria laboratories/centers as part of a European Porphyria Network organized external analytical and postanalytical quality assessment (EQA) program. The laboratories stated which analyses they would normally have performed given the case histories and reported results of all porphyria-related analyses available, interpretative comments, and diagnoses. Reported diagnostic strategies initially showed considerable diversity, but the number of laboratories applying adequate diagnostic strategies increased during the study period. We found an average interlaboratory CV of 50% (range 12%-152%) for analytes in absolute concentrations. Result normalization by forming ratios to the upper reference limits did not reduce this variation. Sixty-five percent of reported results were within biological variation-based analytical quality specifications. Clinical interpretation of the obtained analytical results was accurate, and most laboratories established the correct diagnosis in all distributions. Based on a case-based EQA scheme, variations were apparent in analytical and diagnostic performance between European specialist porphyria laboratories. Our findings reinforce the use of EQA schemes as an essential tool to assess both analytical and diagnostic processes and thereby to improve patient care in rare diseases.

  6. Reporting trends and outcomes in ST-segment-elevation myocardial infarction national hospital quality assessment programs.

    Science.gov (United States)

    McCabe, James M; Kennedy, Kevin F; Eisenhauer, Andrew C; Waldman, Howard M; Mort, Elizabeth A; Pomerantsev, Eugene; Resnic, Frederic S; Yeh, Robert W

    2014-01-14

    For patients who undergo primary percutaneous coronary intervention (PCI) for ST-segment-elevation myocardial infarction, the door-to-balloon time is an important performance measure reported to the Centers for Medicare & Medicaid Services (CMS) and tied to hospital quality assessment and reimbursement. We sought to assess the use and impact of exclusion criteria associated with the CMS measure of door-to-balloon time in primary PCI. All primary PCI-eligible patients at 3 Massachusetts hospitals (Brigham and Women's, Massachusetts General, and North Shore Medical Center) were evaluated for CMS reporting status. Rates of CMS reporting exclusion were the primary end points of interest. Key secondary end points were between-group differences in patient characteristics, door-to-balloon times, and 1-year mortality rates. From 2005 to 2011, 26% (408) of the 1548 primary PCI cases were excluded from CMS reporting. This percentage increased over the study period, from 13.9% in 2005 to 36.7% in the first 3 quarters of 2011. Whereas most CMS-reported cases met door-to-balloon time goals in 2011, this was true of only 61% of CMS-excluded cases and consequently 82.6% of all primary PCI cases performed that year. The 1-year mortality for CMS-excluded patients was double that of CMS-included patients (13.5% versus 6.6%). A substantial proportion of primary PCI cases is thus excluded from the reports collected by CMS, and this percentage has grown substantially over time. These findings may have significant implications for our understanding of process improvement in primary PCI and mechanisms for reimbursement through Medicare.

  7. Analytical Results for Agricultural Soils Samples from a Monitoring Program Near Deer Trail, Colorado (USA)

    Science.gov (United States)

    Crock, J.G.; Smith, D.B.; Yager, T.J.B.

    2009-01-01

    Since late 1993, Metro Wastewater Reclamation District of Denver (Metro District, MWRD), a large wastewater treatment plant in Denver, Colorado, has applied Grade I, Class B biosolids to about 52,000 acres of nonirrigated farmland and rangeland near Deer Trail, Colorado, USA. In cooperation with the Metro District in 1993, the U.S. Geological Survey (USGS) began monitoring groundwater at part of this site. In 1999, the USGS began a more comprehensive monitoring study of the entire site to address stakeholder concerns about the potential chemical effects of biosolids applications to water, soil, and vegetation. This more comprehensive monitoring program has recently been extended through 2010. Monitoring components of the more comprehensive study include biosolids collected at the wastewater treatment plant, soil, crops, dust, alluvial and bedrock groundwater, and stream bed sediment. Soils for this study were defined as the plow zone of the dry land agricultural fields - the top twelve inches of the soil column. This report presents analytical results for the soil samples collected at the Metro District farm land near Deer Trail, Colorado, during three separate sampling events during 1999, 2000, and 2002. Soil samples taken in 1999 were to be a representation of the original baseline of the agricultural soils prior to any biosolids application. The soil samples taken in 2000 represent the soils after one application of biosolids to the middle field at each site and those taken in 2002 represent the soils after two applications. There have been no biosolids applied to any of the four control fields. The next soil sampling is scheduled for the spring of 2010. Priority parameters for biosolids identified by the stakeholders and also regulated by Colorado when used as an agricultural soil amendment include the total concentrations of nine trace elements (arsenic, cadmium, copper, lead, mercury, molybdenum, nickel, selenium, and zinc), plutonium isotopes, and gross

  8. Segmenting the Adult Education Market.

    Science.gov (United States)

    Aurand, Tim

    1994-01-01

    Describes market segmentation and how the principles of segmentation can be applied to the adult education market. Indicates that applying segmentation techniques to adult education programs results in programs that are educationally and financially satisfying and serve an appropriate population. (JOW)

  9. Data Acquisition Programming (LabVIEW): An Aid to Teaching Instrumental Analytical Chemistry.

    Science.gov (United States)

    Gostowski, Rudy

    A course was developed at Austin Peay State University (Tennessee) which offered an opportunity for hands-on experience with the essential components of modern analytical instruments. The course aimed to provide college students with the skills necessary to construct a simple model instrument, including the design and fabrication of electronic…

  10. Multidendritic sensory neurons in the adult Drosophila abdomen: origins, dendritic morphology, and segment- and age-dependent programmed cell death

    Directory of Open Access Journals (Sweden)

    Sugimura Kaoru

    2009-10-01

    Full Text Available Abstract Background For the establishment of functional neural circuits that support a wide range of animal behaviors, initial circuits formed in early development have to be reorganized. One way to achieve this is local remodeling of the circuitry hardwiring. To genetically investigate the underlying mechanisms of this remodeling, one model system employs a major group of Drosophila multidendritic sensory neurons - the dendritic arborization (da) neurons - which exhibit dramatic dendritic pruning and subsequent growth during metamorphosis. The 15 da neurons are identified in each larval abdominal hemisegment and are classified into four categories - classes I to IV - in order of increasing size of their receptive fields and/or arbor complexity at the mature larval stage. Our knowledge regarding the anatomy and developmental basis of adult da neurons is still fragmentary. Results We identified multidendritic neurons in the adult Drosophila abdomen, visualized the dendritic arbors of the individual neurons, and traced the origins of those cells back to the larval stage. There were six da neurons in abdominal hemisegment 3 or 4 (A3/4) of the pharate adult and the adult just after eclosion, five of which were persistent larval da neurons. We quantitatively analyzed dendritic arbors of three of the six adult neurons and examined expression in the pharate adult of key transcription factors that result in the larval class-selective dendritic morphologies. The 'baseline design' of A3/4 in the adult was further modified in a segment-dependent and age-dependent manner. One of our notable findings is that a larval class I neuron, ddaE, completed dendritic remodeling in A2 to A4 and then underwent caspase-dependent cell death within 1 week after eclosion, while homologous neurons in A5 and in more posterior segments degenerated at pupal stages. Another finding is that the dendritic arbor of a class IV neuron, v'ada, was immediately reshaped during post

  11. Supporting Imagers' VOICE: A National Training Program in Comparative Effectiveness Research and Big Data Analytics.

    Science.gov (United States)

    Kang, Stella K; Rawson, James V; Recht, Michael P

    2017-12-05

    Provided with methodologic training, more imagers can contribute to the evidence base on improved health outcomes and value in diagnostic imaging. The Value of Imaging Through Comparative Effectiveness Research Program was developed to provide hands-on, practical training in five core areas for comparative effectiveness and big biomedical data research: decision analysis, cost-effectiveness analysis, evidence synthesis, big data principles, and applications of big data analytics. The program's mixed format consists of web-based modules for asynchronous learning as well as in-person sessions for practical skills and group discussion. Seven diagnostic radiology subspecialties and cardiology are represented in the first group of program participants, showing the collective potential for greater depth of comparative effectiveness research in the imaging community. Copyright © 2017 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  12. AI based HealthCare Platform for Real Time, Predictive and Prescriptive Analytics using Reactive Programming

    Science.gov (United States)

    Kaur, Jagreet; Singh Mann, Kulwinder, Dr.

    2018-01-01

    AI in healthcare needs to bring real, actionable, and individualized insights in real time to patients and doctors to support treatment decisions. A patient-centred platform is needed for integrating EHR data, patient data, prescriptions, monitoring, and clinical research data. This paper proposes a generic architecture for an AI-based healthcare analytics platform using the open-source technologies Apache Beam, Apache Flink, Apache Spark, Apache NiFi, Kafka, Tachyon, GlusterFS, and the NoSQL stores Elasticsearch and Cassandra. The paper shows the importance of applying AI-based predictive and prescriptive analytics techniques in the health sector. The system will be able to extract useful knowledge that supports decision making and medical monitoring in real time through intelligent process analysis and big data processing.

  13. School-Based Mental Health and Behavioral Programs for Low-Income, Urban Youth: A Systematic and Meta-Analytic Review

    Science.gov (United States)

    Farahmand, Farahnaz K.; Grant, Kathryn E.; Polo, Antonio J.; Duffy, Sophia N.; Dubois, David L.

    2011-01-01

    A systematic and meta-analytic review was conducted of the effectiveness of school-based mental health and behavioral programs for low-income, urban youth. Applying criteria from an earlier systematic review (Rones & Hoagwood, 2000) of such programs for all populations indicated substantially fewer effective programs for low-income, urban…

  14. A hybrid analytical network process and fuzzy goal programming for supplier selection: A case study of auto part maker

    Directory of Open Access Journals (Sweden)

    Hesam Zande Hesami

    2011-10-01

    Full Text Available The aim of this research is to present a hybrid model to select auto part suppliers. The proposed method uses factor analysis to find the factors most influencing part maker selection, and the results are validated using statistical tests such as Cronbach's alpha and Kaiser-Meyer. The hybrid model uses the analytical network process to rank different part maker suppliers and fuzzy goal programming to choose the appropriate alternative among various choices. The proposed model is applied to a real-world case study and the results are discussed.
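
    The record pairs ANP-derived weights with fuzzy goal programming. As a much-simplified, crisp analogue, the Python sketch below solves a two-supplier quota problem as a weighted goal program with SciPy's linear programming routine; the suppliers, costs, aspiration levels and penalty weights are invented for illustration, and the fuzzy treatment of the goals is omitted.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Hypothetical crisp goal-programming sketch for two suppliers (not the
    # paper's fuzzy formulation); the penalty weights stand in for ANP priorities.
    cost = np.array([10.0, 8.0])     # unit purchase cost per supplier
    quality = np.array([0.9, 0.7])   # expected defect-free fraction per supplier
    demand = 1000.0                  # total units required
    capacity = [700.0, 600.0]        # per-supplier capacity
    cost_goal = 8800.0               # aspiration level for total cost
    quality_goal = 820.0             # aspiration level for quality-weighted units
    w_cost, w_quality = 1.0, 15.0    # penalty weights on goal deviations

    # Decision vector: [x1, x2, d_cost_over, d_quality_under]
    c = [0.0, 0.0, w_cost, w_quality]               # minimise weighted goal deviations
    A_ub = [[cost[0], cost[1], -1.0, 0.0],          # total cost - overshoot <= cost goal
            [-quality[0], -quality[1], 0.0, -1.0]]  # -(quality units) - shortfall <= -quality goal
    b_ub = [cost_goal, -quality_goal]
    A_eq = [[1.0, 1.0, 0.0, 0.0]]                   # quotas must cover total demand
    b_eq = [demand]
    bounds = [(0, capacity[0]), (0, capacity[1]), (0, None), (0, None)]

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    x1, x2, d_cost, d_qual = res.x
    print(f"quotas: supplier 1 = {x1:.0f}, supplier 2 = {x2:.0f}")
    print(f"cost overshoot = {d_cost:.0f}, quality shortfall = {d_qual:.1f}")
    ```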

  15. Vendor Selection and Supply Quotas Determination by Using the Analytic Hierarchy Process and a New Multi-objective Programming Method

    Directory of Open Access Journals (Sweden)

    Tunjo Perić

    2017-03-01

    Full Text Available In this article, we propose a new methodology for solving the vendor selection and the supply quotas determination problem. The proposed methodology combines the Analytic Hierarchy Process (AHP) for determining the coefficients of the objective functions and a new multiple objective programming method based on the cooperative game theory for vendor selection and supply quotas determination. The proposed methodology is tested on the problem of flour purchase by a company that manufactures bakery products. For vendor selection and supply quotas determination we use three complex criteria: (1) purchasing costs, (2) product quality, and (3) vendor reliability.

  16. An Analytical Framework for Internationalization through English-Taught Degree Programs: A Dutch Case Study

    Science.gov (United States)

    Kotake, Masako

    2017-01-01

    The growing importance of internationalization and the global dominance of English in higher education mean pressures on expanding English-taught degree programs (ETDPs) in non-English-speaking countries. Strategic considerations are necessary to successfully integrate ETDPs into existing programs and to optimize the effects of…

  17. Development and implementation of information systems for the DOE's National Analytical Management Program (NAMP).

    Energy Technology Data Exchange (ETDEWEB)

    Streets, W. E.

    1999-01-29

    The Department of Energy (DOE) faces a challenging environmental management effort, including environmental protection, environmental restoration, waste management, and decommissioning. This effort requires extensive sampling and analysis to determine the type and level of contamination and the appropriate technology for cleanup, and to verify compliance with environmental regulations. Data obtained from these sampling and analysis activities are used to support environmental management decisions. Confidence in the data is critical, having legal, regulatory, and therefore, economic impact. To promote quality in the planning, management, and performance of these sampling and analysis operations, DOE's Office of Environmental Management (EM) has established the National Analytical Management Program (NAMP). With a focus on reducing the estimated costs of over $200M per year for EM's analytical services, NAMP has been charged with developing products that will decrease the costs for DOE complex-wide environmental management while maintaining quality in all aspects of the analytical data generation. As part of this thrust to streamline operations, NAMP is developing centralized information systems that will allow DOE complex personnel to share information about EM contacts at the various sites, pertinent methodologies for environmental restoration and waste management, costs of analyses, and performance of contracted laboratories.

  18. Analytical results for municipal biosolids samples from a monitoring program near Deer Trail, Colorado (U.S.A.), 2010

    Science.gov (United States)

    Crock, J.G.; Smith, D.B.; Yager, T.J.B.; Berry, C.J.; Adams, M.G.

    2011-01-01

    Since late 1993, Metro Wastewater Reclamation District of Denver (Metro District), a large wastewater treatment plant in Denver, Colo., has applied Grade I, Class B biosolids to about 52,000 acres of nonirrigated farmland and rangeland near Deer Trail, Colo., U.S.A. In cooperation with the Metro District in 1993, the U.S. Geological Survey (USGS) began monitoring groundwater at part of this site. In 1999, the USGS began a more comprehensive monitoring study of the entire site to address stakeholder concerns about the potential chemical effects of biosolids applications to water, soil, and vegetation. This more comprehensive monitoring program was recently extended through the end of 2010 and is now completed. Monitoring components of the more comprehensive study include biosolids collected at the wastewater treatment plant, soil, crops, dust, alluvial and bedrock groundwater, and stream-bed sediment. Streams at the site are dry most of the year, so samples of stream-bed sediment deposited after rain were used to indicate surface-water runoff effects. This report summarizes analytical results for the biosolids samples collected at the Metro District wastewater treatment plant in Denver and analyzed for 2010. In general, the objective of each component of the study was to determine whether concentrations of nine trace elements ("priority analytes") (1) were higher than regulatory limits, (2) were increasing with time, or (3) were significantly higher in biosolids-applied areas than in a similar farmed area where biosolids were not applied (background). Previous analytical results indicate that the elemental composition of biosolids from the Denver plant was consistent during 1999-2009, and this consistency continues with the samples for 2010. Total concentrations of regulated trace elements remain consistently lower than the regulatory limits for the entire monitoring period. Concentrations of none of the priority analytes appear to have increased during the 12 years

  19. Analytical results for municipal biosolids samples from a monitoring program near Deer Trail, Colorado (U.S.A.), 2009

    Science.gov (United States)

    Crock, J.G.; Smith, D.B.; Yager, T.J.B.; Berry, C.J.; Adams, M.G.

    2010-01-01

    Since late 1993, Metro Wastewater Reclamation District of Denver, a large wastewater treatment plant in Denver, Colo., has applied Grade I, Class B biosolids to about 52,000 acres of nonirrigated farmland and rangeland near Deer Trail, Colo., U.S.A. In cooperation with the Metro District in 1993, the U.S. Geological Survey began monitoring groundwater at part of this site. In 1999, the Survey began a more comprehensive monitoring study of the entire site to address stakeholder concerns about the potential chemical effects of biosolids applications to water, soil, and vegetation. This more comprehensive monitoring program has recently been extended through the end of 2010. Monitoring components of the more comprehensive study include biosolids collected at the wastewater treatment plant, soil, crops, dust, alluvial and bedrock groundwater, and stream-bed sediment. Streams at the site are dry most of the year, so samples of stream-bed sediment deposited after rain were used to indicate surface-water effects. This report presents analytical results for the biosolids samples collected at the Metro District wastewater treatment plant in Denver and analyzed for 2009. In general, the objective of each component of the study was to determine whether concentrations of nine trace elements ('priority analytes') (1) were higher than regulatory limits, (2) were increasing with time, or (3) were significantly higher in biosolids-applied areas than in a similar farmed area where biosolids were not applied. Previous analytical results indicate that the elemental composition of biosolids from the Denver plant was consistent during 1999-2008, and this consistency continues with the samples for 2009. Total concentrations of regulated trace elements remain consistently lower than the regulatory limits for the entire monitoring period. Concentrations of none of the priority analytes appear to have increased during the 11 years of this study.

  20. A Concept of Constructing a Common Information Space for High Tech Programs Using Information Analytical Systems

    Science.gov (United States)

    Zakharova, Alexandra A.; Kolegova, Olga A.; Nekrasova, Maria E.

    2016-04-01

    The paper deals with issues in program management for engineering innovative products. Existing project management tools were analyzed. The aim is to develop a decision support system that takes into account the features of program management for high-tech products: research intensity, a high level of technical risk, unpredictable results due to the impact of various external factors, and the availability of several implementing agencies. The need for involving experts and using intelligent techniques for information processing is demonstrated. A conceptual model of a common information space to support communication between members of the collaboration on high-tech programs has been developed. The structure and objectives of the information analysis system “Geokhod” were formulated with the purpose of implementing the conceptual model of a common information space in the program “Development and production of new class mining equipment - Geokhod”.

  1. An evaluation of graduated driver licensing programs in North America using a meta-analytic approach

    OpenAIRE

    VANLAAR, Ward; Mayhew, Dan; Marcoux, Kyla; WETS, Geert; BRIJS, Tom; SHOPE, J.

    2009-01-01

    Most jurisdictions in North America have some version of graduated driver licensing (GDL). A sound body of evidence documenting the effectiveness of GDL programs in reducing collisions, fatalities and injuries among novice drivers is available. However, information about the relative importance of individual components of GDL is lacking. The objectives of this study are to calculate a summary statistic of GDL effectiveness and to identify the most effective components of GDL programs using a ...

  2. Chesapeake Bay coordinated split sample program annual report, 1990-1991: Analytical methods and quality assurance workgroup of the Chesapeake Bay program monitoring subcommittee

    Energy Technology Data Exchange (ETDEWEB)

    1991-01-01

    The Chesapeake Bay Program is a federal-state partnership with a goal of restoring the Chesapeake Bay. Its ambient water quality monitoring programs, started in 1984, sample over 150 monitoring stations once or twice a month. Due to the size of the Bay watershed (64,000 square miles) and the cooperative nature of the CBP, these monitoring programs involve 10 different analytical laboratories. The Chesapeake Bay Coordinated Split Sample Program (CSSP), initiated in 1988, assesses the comparability of the water quality results from these laboratories. The report summarizes CSSP results for 1990 and 1991, its second and third full years of operation. The CSSP has two main objectives: identifying parameters with low inter-organization agreement, and estimating measurement system variability. The identification of parameters with low agreement is used as part of the overall Quality Assurance program. Laboratory and program personnel use the information to investigate possible causes of the differences, and take action to increase agreement if possible. Later CSSP results will document any improvements in inter-organization agreement. The variability estimates are most useful to data analysts and modelers who need confidence estimates for monitoring data.

  3. What Can Schools, Colleges, and Youth Programs Do with Predictive Analytics? Practitioner Brief

    Science.gov (United States)

    Balu, Rekha; Porter, Kristin

    2017-01-01

    Many low-income young people are not reaching important milestones for success (for example, completing a program or graduating from school on time). But the social-service organizations and schools that serve them often struggle to identify who is at more or less risk. These institutions often either over- or underestimate risk, missing…

  4. Advanced organic analysis and analytical methods development: FY 1995 progress report. Waste Tank Organic Safety Program

    Energy Technology Data Exchange (ETDEWEB)

    Wahl, K.L.; Campbell, J.A.; Clauss, S.A. [and others

    1995-09-01

    This report describes the work performed during FY 1995 by Pacific Northwest Laboratory in developing and optimizing analysis techniques for identifying organics present in Hanford waste tanks. The main focus was to provide a means for rapidly obtaining the most useful information concerning the organics present in tank waste, with minimal sample handling and with minimal waste generation. One major focus has been to optimize analytical methods for organic speciation. Select methods, such as atmospheric pressure chemical ionization mass spectrometry and matrix-assisted laser desorption/ionization mass spectrometry, were developed to increase the speciation capabilities, while minimizing sample handling. A capillary electrophoresis method was developed to improve separation capabilities while minimizing additional waste generation. In addition, considerable emphasis has been placed on developing a rapid screening tool, based on Raman and infrared spectroscopy, for determining organic functional group content when complete organic speciation is not required. This capability would allow for a cost-effective means to screen the waste tanks to identify tanks that require more specialized and complete organic speciation to determine tank safety.

  5. Analytic programming with FMRI data: a quick-start guide for statisticians using R.

    Directory of Open Access Journals (Sweden)

    Ani Eloyan

    Full Text Available Functional magnetic resonance imaging (fMRI is a thriving field that plays an important role in medical imaging analysis, biological and neuroscience research and practice. This manuscript gives a didactic introduction to the statistical analysis of fMRI data using the R project, along with the relevant R code. The goal is to give statisticians who would like to pursue research in this area a quick tutorial for programming with fMRI data. References of relevant packages and papers are provided for those interested in more advanced analysis.

  6. Segmental neurofibromatosis

    OpenAIRE

    Galhotra, Virat; Sheikh, Soheyl; Jindal, Sanjeev; Singla, Anshu

    2014-01-01

    Segmental neurofibromatosis is a rare disorder, characterized by neurofibromas or café-au-lait macules limited to one region of the body. Its occurrence on the face is extremely rare and only a few cases of segmental neurofibromatosis over the face have been described so far. We present a case of segmental neurofibromatosis involving the buccal mucosa, tongue, cheek, ear, and neck on the right side of the face.

  7. Performance of genetic programming optimised Bowtie2 on genome comparison and analytic testing (GCAT) benchmarks.

    Science.gov (United States)

    Langdon, W B

    2015-01-01

    Genetic studies are increasingly based on short, noisy sequences from next-generation scanners. Typically, complete DNA sequences are assembled by matching short NextGen sequences against reference genomes. Despite considerable algorithmic gains since the turn of the millennium, matching both single-ended and paired-end strings to a reference remains computationally demanding. Further tailoring bioinformatics tools to each new task or scanner remains highly skilled and labour intensive. With this in mind, we recently demonstrated a genetic programming based automated technique which generated a version of the state-of-the-art alignment tool Bowtie2 that was considerably faster on short sequences produced by a scanner at the Broad Institute and released as part of the 1000 Genomes Project. Bowtie2GP and the original Bowtie2 release were compared on bioplanet's GCAT synthetic benchmarks. The Bowtie2GP enhancements were also applied to the latest Bowtie2 release (2.2.3, 29 May 2014) and retained both the GP and the manually introduced improvements. On both single-ended and paired-end synthetic next-generation DNA sequence GCAT benchmarks, Bowtie2GP runs up to 45% faster than Bowtie2. The loss in accuracy can be as little as 0.2-0.5%, but up to 2.5% for longer sequences.

  8. Segmental Neurofibromatosis

    Directory of Open Access Journals (Sweden)

    Yesudian Devakar

    1997-01-01

    Full Text Available Segmental neurofibromatosis is a rare variant of neurofibromatosis in which the lesions are confined to one segment or dermatome of the body. They resemble classical neurofibromas in their morphology, histopathology and electron microscopy. However, systemic associations are usually absent. We report one such case with these classical features.

  9. Segmental Vitiligo.

    Science.gov (United States)

    van Geel, Nanja; Speeckaert, Reinhart

    2017-04-01

    Segmental vitiligo is characterized by its early onset, rapid stabilization, and unilateral distribution. Recent evidence suggests that segmental and nonsegmental vitiligo could represent variants of the same disease spectrum. Observational studies with respect to its distribution pattern point to a possible role of cutaneous mosaicism, whereas the original stated dermatomal distribution seems to be a misnomer. Although the exact pathogenic mechanism behind the melanocyte destruction is still unknown, increasing evidence has been published on the autoimmune/inflammatory theory of segmental vitiligo. Copyright © 2016 Elsevier Inc. All rights reserved.

  10. Mentoring Programs to Affect Delinquency and Associated Outcomes of Youth At-Risk: A Comprehensive Meta-Analytic Review

    Science.gov (United States)

    Tolan, Patrick H.; Henry, David B.; Schoeny, Michael S.; Lovegrove, Peter; Nichols, Emily

    2013-01-01

    Objectives To conduct a meta-analytic review of selective and indicated mentoring interventions for effects on delinquency and key associated outcomes (aggression, drug use, academic functioning) among youth at risk. We also undertook the first systematic evaluation of intervention implementation features and organization and tested for effects of theorized key processes of mentor program effects. Methods Campbell Collaboration review inclusion criteria and procedures were used to search and evaluate the literature. Criteria included a sample defined as at risk for delinquency due to individual behavior such as aggression or conduct problems or environmental characteristics such as residence in a high-crime community. Studies were required to use random assignment or a strong quasi-experimental design. Of 163 identified studies published 1970 - 2011, 46 met criteria for inclusion. Results Mean effect sizes were significant and positive for each outcome category (ranging from d = .11 for Academic Achievement to d = .29 for Aggression). Heterogeneity in effect sizes was noted for all four outcomes. Stronger effects resulted when mentor motivation was professional development, but effects did not vary with other implementation features. Significant improvements in effects were found when advocacy and emotional support mentoring processes were emphasized. Conclusions This popular approach has significant impact on delinquency and associated outcomes for youth at risk for delinquency. While there is some evidence that particular features may relate to effects, the body of literature is remarkably lacking in details about specific program features and procedures. This persistent state of limited reporting seriously impedes understanding of how mentoring is beneficial and the ability to maximize its utility. PMID:25386111

  11. Dynamic performance of frictionless fast shutters for ITER: Numerical and analytical sensitivity study for the development of a test program

    Energy Technology Data Exchange (ETDEWEB)

    Panin, Anatoly, E-mail: a.panin@fz-juelich.de [Forschungszentrum Jülich GmbH, Institut für Energie- und Klimaforschung – Plasmaphysik, 52425 Jülich (Germany); Khovayko, Mikhail [St. Petersburg Polytechnic University, Mechanics and Control Processes Department, Computational Mechanics Laboratory, 195251 St. Petersburg (Russian Federation); Krasikov, Yury [Forschungszentrum Jülich GmbH, Institut für Energie- und Klimaforschung – Plasmaphysik, 52425 Jülich (Germany); Nemov, Alexander [St. Petersburg Polytechnic University, Mechanics and Control Processes Department, Computational Mechanics Laboratory, 195251 St. Petersburg (Russian Federation); Biel, Wolfgang; Mertens, Philippe; Neubauer, Olaf; Schrader, Michael [Forschungszentrum Jülich GmbH, Institut für Energie- und Klimaforschung – Plasmaphysik, 52425 Jülich (Germany)

    2015-10-15

    To prolong the lifetime of the ITER first diagnostic mirrors, protective shutters can be engaged. A concept of an elastic shutter that operates frictionlessly in vacuum has been studied at the Forschungszentrum Jülich, Germany. Under actuation, two shutter arms (∼2 m long) bend laterally between two pairs of limiting bumpers, thus shielding the optical aperture or opening it for measurements. To increase the shutter efficiency, the transition time between its open and closed states can be minimized. This demands a fast shutter that operates in fractions of a second and exhibits essentially dynamic behavior, such as impacts with the bumpers that cause the shutter arms to bounce and oscillate. The paper presents numerical studies of the shutter dynamic behavior using explicit and implicit 3D FE transient structural modeling. A simple 1D analytical model was developed to predict the shutter impact kinetic energy, which largely determines its further dynamic response. The sensitivity of the structure to different parameters was studied and ways for its optimization were laid out. A parametric shutter mockup with easily changeable mechanical characteristics was manufactured. A test program aimed at further shutter optimization, based on the analysis performed and engaging the powerful capabilities of the parametric shutter mockup, is discussed in the paper.
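
    The record mentions a simple 1D analytical model used to predict the impact kinetic energy. The sketch below is a toy energy-balance calculation in the same spirit; the force, travel, stiffness and effective-mass values are assumed for illustration and are not taken from the paper.

    ```python
    import math

    # Toy 1D energy-balance sketch (assumed numbers, not the paper's model):
    # an elastic arm of effective mass m_eff is driven by a constant actuator
    # force F over a lateral travel s against its own bending stiffness k.
    F = 200.0      # actuator force, N (assumed)
    s = 0.05       # lateral travel to the bumper, m (assumed)
    k = 3000.0     # effective lateral stiffness of the arm, N/m (assumed)
    m_eff = 4.0    # effective (modal) mass of the arm, kg (assumed)

    work_in = F * s                  # work done by the actuator
    strain_energy = 0.5 * k * s**2   # energy stored in bending the arm
    ke_impact = work_in - strain_energy
    v_impact = math.sqrt(2.0 * ke_impact / m_eff)
    print(f"impact kinetic energy ~ {ke_impact:.1f} J, impact speed ~ {v_impact:.2f} m/s")
    ```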

  12. Tank 241-S-102, Core 232 analytical results for the final report

    Energy Technology Data Exchange (ETDEWEB)

    STEEN, F.H.

    1998-11-04

    This document is the analytical laboratory report for tank 241-S-102 push mode core segments collected between March 5, 1998 and April 2, 1998. The segments were subsampled and analyzed in accordance with the Tank 241-S-102 Retained Gas Sampler System Sampling and Analysis Plan (TSAP) (McCain, 1998), Letter of Instruction for Compatibility Analysis of Samples from Tank 241-S-102 (LOI) (Thompson, 1998) and the Data Quality Objectives for Tank Farms Waste Compatibility Program (DQO) (Mulkey and Miller, 1998). The analytical results are included in the data summary table (Table 1).

  13. Analytical Results for Municipal Biosolids Samples from a Monitoring Program near Deer Trail, Colorado (U.S.A.), 2007

    Science.gov (United States)

    Crock, J.G.; Smith, D.B.; Yager, T.J.B.; Berry, C.J.; Adams, M.G.

    2008-01-01

    Since late 1993, the Metro Wastewater Reclamation District of Denver (Metro District), a large wastewater treatment plant in Denver, Colorado, has applied Grade I, Class B biosolids to about 52,000 acres of nonirrigated farmland and rangeland near Deer Trail, Colorado (U.S.A.). In cooperation with the Metro District in 1993, the U.S. Geological Survey (USGS) began monitoring ground water at part of this site. In 1999, the USGS began a more comprehensive monitoring study of the entire site to address stakeholder concerns about the potential chemical effects of biosolids applications to water, soil, and vegetation. This more comprehensive monitoring program recently has been extended through 2010. Monitoring components of the more comprehensive study include biosolids collected at the wastewater treatment plant, soil, crops, dust, alluvial and bedrock ground water, and streambed sediment. Streams at the site are dry most of the year, so samples of streambed sediment deposited after rain were used to indicate surface-water effects. This report will present only analytical results for the biosolids samples collected at the Metro District wastewater treatment plant in Denver and analyzed during 2007. We have presented earlier a compilation of analytical results for the biosolids samples collected and analyzed for 1999 through 2006. More information about the other monitoring components is presented elsewhere in the literature. Priority parameters for biosolids identified by the stakeholders and also regulated by Colorado when used as an agricultural soil amendment include the total concentrations of nine trace elements (arsenic, cadmium, copper, lead, mercury, molybdenum, nickel, selenium, and zinc), plutonium isotopes, and gross alpha and beta activity. Nitrogen and chromium also were priority parameters for ground water and sediment components. In general, the objective of each component of the study was to determine whether concentrations of priority parameters (1

  14. Segmentation: Identification of consumer segments

    DEFF Research Database (Denmark)

    Høg, Esben

    2005-01-01

    It is very common to categorise people, especially in the advertising business. Also traditional marketing theory has taken in consumer segments as a favorite topic. Segmentation is closely related to the broader concept of classification. From a historical point of view, classification has its...... a basic understanding of grouping people. Advertising agencies may use segmentation to target advertisements, while food companies may use segmentation to develop products for various groups of consumers. MAPP has for example investigated the positioning of fish in relation to other food products....... The traditionalists are characterised by favouring pork, poultry and beef. Since it is difficult to change consumers' tastes, the short-term consequence may be to focus on the "fish lovers" and target the communication towards these consumers. In the long run, "traditionalists" may be persuaded to revise...

  15. O papel dos programas interlaboratoriais para a qualidade dos resultados analíticos Interlaboratorial programs for improving the quality of analytical results

    Directory of Open Access Journals (Sweden)

    Queenie Siu Hang Chui

    2004-12-01

    Interlaboratory programs are conducted for a number of purposes: to identify problems related to the calibration of instruments, to assess the degree of equivalence of analytical results among several laboratories, to assign quantity values and their uncertainties in the development of a certified reference material, and to verify the performance of laboratories, as in proficiency testing, a key quality assurance technique which is sometimes used in conjunction with accreditation. Several statistical tools are employed to assess the analytical results of laboratories participating in an intercomparison program. Among them are the z-score technique, the ellipse of confidence and the Grubbs and Cochran tests. This work presents the experience of coordinating an intercomparison exercise to determine Ca, Al, Fe, Ti and Mn as impurities in samples of chemical-grade silicon metal prepared as a candidate reference material.
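
    The z-score technique mentioned in this abstract is straightforward to reproduce. The sketch below uses an entirely hypothetical set of laboratory results and a robust assigned value, and applies the conventional proficiency-testing interpretation (|z| <= 2 satisfactory, 2 < |z| < 3 questionable, |z| >= 3 unsatisfactory); it is an illustration, not the statistical treatment used in the cited exercise.

```python
# Minimal sketch of proficiency-testing z-scores (hypothetical data).
# z_i = (x_i - x_assigned) / sigma_pt, with the usual |z| cut-offs.
import statistics

def z_scores(results, assigned_value, sigma_pt):
    """Return {lab: (z, verdict)} for one measurand."""
    scores = {}
    for lab, x in results.items():
        z = (x - assigned_value) / sigma_pt
        if abs(z) <= 2:
            verdict = "satisfactory"
        elif abs(z) < 3:
            verdict = "questionable"
        else:
            verdict = "unsatisfactory"
        scores[lab] = (round(z, 2), verdict)
    return scores

# Example: Fe content (mg/kg) reported by five labs for the same sample.
fe_results = {"lab_A": 102.0, "lab_B": 98.5, "lab_C": 110.2,
              "lab_D": 95.1, "lab_E": 123.7}
assigned = statistics.median(fe_results.values())   # robust assigned value
print(z_scores(fe_results, assigned_value=assigned, sigma_pt=5.0))
```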

  16. Assessment of accuracy and repeatability of anterior segment optical coherence tomography and reproducibility of measurements using a customised software program.

    Science.gov (United States)

    Kim, Eon; Ehrmann, Klaus

    2012-07-01

    The aim was to study the reliability of measurements of the RTVue (Optovue, Fremont, CA, USA) anterior segment optical coherence tomographer (AS-OCT) and assess how results can be improved by analysing raw optical coherence tomography data with customised image analysis software and applying correction factors. Five RTVue AS-OCT instruments (ver. 4.0) were assessed by imaging gauge blocks of three different lengths, single/stepped glass plate (microscope slides) and flat window glass to check for width, depth and linearity of the measurement scans. Five repeats per calibration tool were imaged and averaged. Raw data were exported and loaded into customised image analysis software written in LabWindows/CVI for further analysis. Using two calibration balls with different radii, measurement scans were validated. Repeatability of the optical coherence tomographs and the edge detection procedure were checked and statistical analyses performed. Variations ranging from 0.01 to 1.93 mm in scan width and 0.1 to 0.17 mm in scan depth were found between the five instruments. Slight curvature distortion of 0.06 ± 0.01 mm (mean and standard deviation) was found in the raw images. By isolating the three sources of image distortion and applying individual correction factors, accuracy for corneal curvature measurements could be improved to better than 0.1 mm. Manual edge detection limited the coefficient of repeatability value to 0.06 and 0.08 mm for anterior and posterior radii of curvature, respectively. The coefficient of repeatability of corneal thickness measurements was less than 8 µm. Accuracy of the RTVue AS-OCT varied between instruments. By applying calibration scale factors calculated by customised software, accuracy of thickness and curvature values of the anterior eye was improved. The achievable precision is sufficient to detect clinically relevant corneal curvature variations. © 2012 Vision Cooperative Research Centre.
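
    As a rough illustration of how a coefficient of repeatability can be obtained from repeated scans, the sketch below uses hypothetical paired corneal-thickness readings and a common Bland-Altman-style definition (1.96 times the standard deviation of the paired differences); the exact computation in the study above may differ.

```python
# Hedged sketch: coefficient of repeatability from two repeated readings.
# CR = 1.96 * SD of the differences between repeats (one common definition);
# the thickness data below are hypothetical.
import numpy as np

def coefficient_of_repeatability(first, second):
    diffs = np.asarray(first, dtype=float) - np.asarray(second, dtype=float)
    return 1.96 * diffs.std(ddof=1)

# Hypothetical central corneal thickness readings (micrometres), two repeats.
scan1 = [541, 552, 538, 560, 547, 533, 555, 549]
scan2 = [544, 549, 541, 557, 551, 530, 552, 551]
print(f"CR = {coefficient_of_repeatability(scan1, scan2):.1f} um")
```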

  17. Validating Analytical Methods

    Science.gov (United States)

    Ember, Lois R.

    1977-01-01

    The procedures utilized by the Association of Official Analytical Chemists (AOAC) to develop, evaluate, and validate analytical methods for the analysis of chemical pollutants are detailed. Methods validated by AOAC are used by the EPA and FDA in their enforcement programs and are granted preferential treatment by the courts. (BT)

  18. Mixed segmentation

    DEFF Research Database (Denmark)

    Hansen, Allan Grutt; Bonde, Anders; Aagaard, Morten

    This book is about using recent developments in the fields of data analytics and data visualization to frame new ways of identifying target groups in media communication. Based on a mixed-methods approach, the authors combine psychophysiological monitoring (galvanic skin response) with textual...

  19. ADVANCED CLUSTER BASED IMAGE SEGMENTATION

    Directory of Open Access Journals (Sweden)

    D. Kesavaraja

    2011-11-01

    This paper presents efficient and portable implementations of a useful image segmentation technique based on a faster variant of the conventional connected components algorithm, which we call parallel components. Many medical applications need image segmentation as a service and expect it to run fast and securely, yet conventional image segmentation algorithms often do not run fast enough, despite several ongoing research efforts on conventional segmentation and its algorithms. We therefore propose a cluster computing environment for parallel image segmentation to provide faster results. This paper describes a real-time implementation of distributed image segmentation on a cluster of nodes. We demonstrate the effectiveness and feasibility of our method on a set of medical CT scan images. Our general framework is a single address space, distributed memory programming model. We use efficient techniques for distributing and coalescing data as well as efficient combinations of task and data parallelism. The image segmentation algorithm makes use of an efficient cluster process which uses a novel approach for parallel merging. Our experimental results are consistent with the theoretical analysis and show faster execution times for segmentation when compared with the conventional method. Our test data are different CT scan images from a medical database. More efficient implementations of image segmentation will likely result in even faster execution times.
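
    The connected-components step at the heart of this approach can be illustrated with a small single-node sketch; scipy.ndimage.label is used here only as a stand-in for the paper's parallel variant, and the thresholded toy image is hypothetical.

```python
# Sketch of label-based segmentation on one node (the paper parallelises
# this step across a cluster; scipy.ndimage.label stands in for it here).
import numpy as np
from scipy import ndimage

# Hypothetical pre-thresholded CT slice: 1 = tissue of interest, 0 = background.
binary = np.array([[0, 1, 1, 0, 0],
                   [0, 1, 0, 0, 1],
                   [0, 0, 0, 1, 1],
                   [1, 0, 0, 1, 0]])

labels, n_regions = ndimage.label(binary)      # 4-connectivity by default
sizes = ndimage.sum(binary, labels, index=np.arange(1, n_regions + 1))

print(n_regions)        # number of connected regions found
print(labels)           # per-pixel region ids
print(sizes)            # pixel count of each region
```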

  20. Mixed segmentation

    DEFF Research Database (Denmark)

    Bonde, Anders; Aagaard, Morten; Hansen, Allan Grutt

    content analysis and audience segmentation in a single-source perspective. The aim is to explain and understand target groups in relation to, on the one hand, emotional response to commercials or other forms of audio-visual communication and, on the other hand, living preferences and personality traits....... Innovatively, the research process is documented via an interactive data-visualization tool by which readers and fellow peers can access and, by using various filtering options, further analyze the results and, ultimately, reformulate the problem field....

  1. [Segmental neurofibromatosis].

    Science.gov (United States)

    Wagner, G; Meyer, V; Sachse, M M

    2017-11-08

    Thirteen years ago, a 48-year-old man developed numerous neurofibromas in a circumscribed area on the right side of the chest. At the same time, a bilateral seminoma was diagnosed and treated curatively. There was no evidence of other complications of neurofibromatosis. The family history was unremarkable. The segmental neurofibromatosis (SN) presented in this patient is the result of a mosaic arising from a mutation of the NF1 gene, a tumor suppressor gene. Concomitant diseases typical of neurofibromatosis generalisata (NFG), including malignant neoplasms, are the exception in SN.

  2. Segmental neurofibromatosis.

    Science.gov (United States)

    Sobjanek, Michał; Dobosz-Kawałko, Magdalena; Michajłowski, Igor; Pęksa, Rafał; Nowicki, Roman

    2014-12-01

    Segmental neurofibromatosis, or type V neurofibromatosis, is a rare genodermatosis characterized by café-au-lait spots and neurofibromas limited to a circumscribed body region. The disease may be associated with systemic involvement and malignancies. The disorder has not yet been reported in the Polish medical literature. A 63-year-old Caucasian woman presented with a 20-year history of multiple, flesh-colored, dome-shaped, soft to firm nodules situated in the right lumbar region. A histopathologic evaluation of three excised tumors revealed neurofibromas. No neurological or ophthalmologic symptoms of neurofibromatosis were found.

  3. Teachers as Producers of Data Analytics: A Case Study of a Teacher-Focused Educational Data Science Program

    Science.gov (United States)

    McCoy, Chase; Shih, Patrick C.

    2016-01-01

    Educational data science (EDS) is an emerging, interdisciplinary research domain that seeks to improve educational assessment, teaching, and student learning through data analytics. Teachers have been portrayed in the EDS literature as users of pre-constructed data dashboards in educational technologies, with little consideration given to them as…

  4. Experimental and analytical program to determine strains in 737 LAP splice joints subjected to normal fuselage pressurization loads

    Energy Technology Data Exchange (ETDEWEB)

    Roach, D.P. [Sandia National Labs., Albuquerque, NM (United States); Jeong, D.Y. [Department of Transportation, Cambridge, MA (United States). John A. Volpe National Transportation Systems Center

    1996-02-01

    The Federal Aviation Administration Technical Center (FAATC) has initiated several research projects to assess the structural integrity of the aging commercial aircraft fleet. One area of research involves the understanding of a phenomenon known as "Widespread Fatigue Damage" or WFD, which refers to a type of multiple element cracking that degrades the damage tolerance capability of an aircraft structure. Research on WFD has been performed both experimentally and analytically, including finite element modeling of fuselage lap splice joints by the Volpe Center. Fuselage pressurization tests have also been conducted at the FAA's Airworthiness Assurance NDI Validation Center (AANC) to obtain strain gage data from select locations on the FAA/AANC 737 Transport Aircraft Test Bed. One hundred strain channels were used to monitor five different lap splice bays including the fuselage skin and substructure elements. These test results have been used to evaluate the accuracy of the analytical models and to support general aircraft analysis efforts. This paper documents the strain fields measured during the AANC tests and successfully correlates the results with analytical predictions.

  5. Applying Learning Analytics for Improving Students Engagement and Learning Outcomes in an MOOCS Enabled Collaborative Programming Course

    Science.gov (United States)

    Lu, Owen H. T.; Huang, Jeff C. H.; Huang, Anna Y. Q.; Yang, Stephen J. H.

    2017-01-01

    As information technology continues to evolve rapidly, programming skills become increasingly crucial. To build strong programming skills, training must begin before college or even senior high school. However, when developing comprehensive training programs, the learning and teaching processes must be considered. In order to…

  6. Using analytical tools for decision-making and program planning in natural resources: breaking the fear barrier

    Science.gov (United States)

    David L. Peterson; Daniel L. Schmoldt

    1999-01-01

    The National Park Service and other public agencies are increasing their emphasis on inventory and monitoring (I&M) programs to obtain the information needed to infer changes in resource conditions and trigger management responses. A few individuals on a planning team can develop I&M programs, although a focused workshop is more effective. Workshops are...

  7. A Multivariate Approach to a Meta-Analytic Review of the Effectiveness of the D.A.R.E. Program

    Directory of Open Access Journals (Sweden)

    Wei Pan

    2009-01-01

    The Drug Abuse Resistance Education (D.A.R.E.) program is a widespread but controversial school-based drug prevention program in the United States as well as in many other countries. The present multivariate meta-analysis reviewed 20 studies that assessed the effectiveness of the D.A.R.E. program in the United States. The results showed that the effects of the D.A.R.E. program on drug use did not vary across the studies, with a less-than-small overall effect, while the effects on psychosocial behavior varied, also with a less-than-small overall effect. In addition, the characteristics of the studies significantly explained the variation of the heterogeneous effects on psychosocial behavior, which provides empirical evidence for improving school-based drug prevention programs.
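
    For readers who want to see the mechanics behind a random-effects pooled effect, the sketch below implements the standard DerSimonian-Laird estimator on hypothetical study-level effect sizes; it is a generic illustration, not the multivariate model used in the review above.

```python
# DerSimonian-Laird random-effects pooling (generic sketch, hypothetical data).
import numpy as np

def random_effects_pool(effects, variances):
    y = np.asarray(effects, dtype=float)
    v = np.asarray(variances, dtype=float)
    w = 1.0 / v                                   # fixed-effect weights
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)            # Cochran's Q
    df = len(y) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                 # between-study variance
    w_star = 1.0 / (v + tau2)                     # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se, tau2

# Hypothetical standardized mean differences and their variances.
d = [0.05, -0.02, 0.10, 0.01, 0.07]
var = [0.02, 0.03, 0.015, 0.04, 0.025]
pooled, se, tau2 = random_effects_pool(d, var)
print(f"pooled d = {pooled:.3f} +/- {1.96 * se:.3f} (tau^2 = {tau2:.4f})")
```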

  8. Image Segmentation Algorithms Overview

    OpenAIRE

    Yuheng, Song; Hao, Yan

    2017-01-01

    The technology of image segmentation is widely used in medical image processing, face recognition, pedestrian detection, etc. The current image segmentation techniques include region-based segmentation, edge detection segmentation, segmentation based on clustering, segmentation based on weakly-supervised learning in CNN, etc. This paper analyzes and summarizes these algorithms of image segmentation, and compares the advantages and disadvantages of different algorithms. Finally, we make a predi...
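
    As a minimal example of the clustering-based family of methods covered by this survey, the sketch below segments a grayscale image into k intensity clusters with k-means; the synthetic test image and parameter choices are the editor's, not the paper's.

```python
# Minimal clustering-based segmentation: k-means on pixel intensities.
# (Illustrates one family of methods from the survey; the image is synthetic.)
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(64, 64)).astype(float)   # synthetic image

k = 3
pixels = image.reshape(-1, 1)                  # one feature per pixel: intensity
labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(pixels)
segmented = labels.reshape(image.shape)        # per-pixel cluster id

print(np.bincount(labels))                     # pixels per segment
```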

  9. Analytic materials.

    Science.gov (United States)

    Milton, Graeme W

    2016-11-01

    The theory of inhomogeneous analytic materials is developed. These are materials where the coefficients entering the equations involve analytic functions. Three types of analytic materials are identified. The first two types involve an integer p. If p takes its maximum value, then we have a complete analytic material. Otherwise, it is an incomplete analytic material of rank p. For two-dimensional materials, further progress can be made in the identification of analytic materials by using the well-known fact that a 90° rotation applied to a divergence-free field in a simply connected domain yields a curl-free field, which can then be expressed as the gradient of a potential. Other exact results for the fields in inhomogeneous media are reviewed. Also reviewed is the subject of metamaterials, as these materials provide a way of realizing desirable coefficients in the equations.
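
    The two-dimensional rotation argument cited above rests on a one-line identity; the derivation below is standard vector calculus in the editor's notation, not material taken from the paper itself.

```latex
% Let v = (v_1, v_2) satisfy \nabla\cdot v = 0 on a simply connected domain,
% and let w = (-v_2, v_1) be its 90-degree rotation. Then
\[
  \nabla\times w \;=\; \partial_1 w_2 - \partial_2 w_1
                 \;=\; \partial_1 v_1 + \partial_2 v_2
                 \;=\; \nabla\cdot v \;=\; 0 ,
\]
% so w is curl-free and can be written as w = \nabla\varphi for some potential \varphi.
```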

  10. Analytic trigonometry

    CERN Document Server

    Bruce, William J; Maxwell, E A; Sneddon, I N

    1963-01-01

    Analytic Trigonometry details the fundamental concepts and underlying principle of analytic geometry. The title aims to address the shortcomings in the instruction of trigonometry by considering basic theories of learning and pedagogy. The text first covers the essential elements from elementary algebra, plane geometry, and analytic geometry. Next, the selection tackles the trigonometric functions of angles in general, basic identities, and solutions of equations. The text also deals with the trigonometric functions of real numbers. The fifth chapter details the inverse trigonometric functions

  11. Long-term analytical performance of hemostasis field methods as assessed by evaluation of the results of an external quality assessment program for antithrombin.

    Science.gov (United States)

    Meijer, Piet; de Maat, Moniek P M; Kluft, Cornelis; Haverkate, Frits; van Houwelingen, Hans C

    2002-07-01

    It is important for a laboratory to know the stability of performance of laboratory tests over time. The aim of this study was to adapt from the field of clinical chemistry a method to assess the long-term analytical performance of hemostasis field methods. The linear regression model was used to compare the laboratory results with the consensus mean value of a survey. This model was applied to plasma antithrombin activity using the data for 82 laboratories, collected between 1996 and 1999 in the European Concerted Action on Thrombosis (ECAT) external quality assessment program. The long-term total, random, and systematic errors were calculated. The variables introduced to define the long-term performance in this model were the long-term analytical CV (LCVa) and the analytical critical difference (ACD), which indicates the minimum difference necessary between two samples measured on a long-term time-scale to consider them statistically significantly different. The systematic error (bias) ranged from 4.5 to 103 units/L. The random error ranged from 24.4 to 242 units/L. For the majority of the laboratories, random error was the main component (>75%) of the total error. The LCVa, after adjustment for the contribution of the bias, ranged from 2.8% to 48%. The ACD ranged from 78 to 1290 units/L with a median value of 190 units/L. No statistically significant differences were observed for either LCVa or ACD between the two different measurement principles for antithrombin activity based on the inhibition of either thrombin or factor Xa. This linear regression model is useful for assessing the total error, random error, and bias for hemostasis field methods. The LCVa and ACD for measurement on a long-term time-scale appear to be useful for assessing the long-term analytical performance.
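
    The flavour of the LCVa/ACD computation can be conveyed with a small sketch: regress one laboratory's results on the survey consensus means, treat the residual scatter as long-term analytical variation, and convert it to a critical difference with the usual 2.77 (about 1.96 times the square root of 2) factor. The data and this simplified decomposition into bias and random error are the editor's assumptions, not the paper's exact model.

```python
# Hedged sketch of a long-term performance summary for one laboratory.
# Lab results are regressed on survey consensus means; the residual SD is
# treated as long-term analytical variation and converted to a critical
# difference with the conventional 2.77 (= 1.96 * sqrt(2)) factor.
import numpy as np

consensus = np.array([820, 950, 1010, 760, 880, 1100, 930, 990])   # units/L
lab       = np.array([845, 940, 1050, 790, 870, 1150, 920, 1030])  # units/L

slope, intercept = np.polyfit(consensus, lab, 1)
residuals = lab - (slope * consensus + intercept)
sd_random = residuals.std(ddof=2)                  # random error estimate
bias = lab.mean() - consensus.mean()               # crude systematic error

lcv_a = 100 * sd_random / lab.mean()               # long-term analytical CV, %
acd = 2.77 * sd_random                             # analytical critical difference

print(f"bias = {bias:.0f} units/L, LCVa = {lcv_a:.1f} %, ACD = {acd:.0f} units/L")
```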

  12. Optimally segmented magnetic structures

    DEFF Research Database (Denmark)

    Insinga, Andrea Roberto; Bahl, Christian; Bjørk, Rasmus

    We present a semi-analytical algorithm for magnet design problems, which calculates the optimal way to subdivide a given design region into uniformly magnetized segments. The availability of powerful rare-earth magnetic materials such as Nd-Fe-B has broadened the range of applications of permanent...... is not available. We will illustrate the results for magnet design problems from different areas, such as electric motors/generators (as the example in the picture), beam focusing for particle accelerators and magnetic refrigeration devices....... magnets[1][2]. However, the powerful rare-earth magnets are generally expensive, so both the scientific and industrial communities have devoted a lot of effort to developing suitable design methods. Even so, many magnet optimization algorithms either are based on heuristic approaches[3...

  13. ZFITTER: A Semi-analytical program for fermion pair production in e+ e- annihilation, from version 6.21 to version 6.42

    Energy Technology Data Exchange (ETDEWEB)

    Arbuzov, A.B.; Awramik, M.; Czakon, M.; Freitas, A.; Grunewald, M.W.; Monig, K.; Riemann, S.; Riemann, T.; /Dubna, JINR /DESY, Zeuthen /Cracow, INP /Wurzburg U. /Silesia

    2005-07-01

    ZFITTER is a Fortran program for the calculation of fermion pair production and radiative corrections at high energy e+e- colliders; it is also suitable for other applications where electroweak radiative corrections appear. ZFITTER is based on a semi-analytical approach to the calculation of radiative corrections in the Standard Model. The authors present a summary of new features of the ZFITTER program version 6.42 compared to version 6.21. The most important additions are: (1) some higher-order QED corrections to fermion pair production, (2) electroweak one-loop corrections to atomic parity violation, (3) electroweak one-loop corrections to ν̄_e ν_e production, (4) electroweak two-loop corrections to the W boson mass and the effective weak mixing angle.

  14. Data Analytics vs. Data Science: A Study of Similarities and Differences in Undergraduate Programs Based on Course Descriptions

    Science.gov (United States)

    Aasheim, Cheryl L.; Williams, Susan; Rutner, Paige; Gardiner, Adrian

    2015-01-01

    The rate at which data is produced and accumulated today is greater than at any point in history with little prospect of slowing. As organizations attempt to collect and analyze this data, there is a tremendous unmet demand for appropriately skilled knowledge workers. In response, universities are developing degree programs in data science and…

  15. 77 FR 39895 - New Analytic Methods and Sampling Procedures for the United States National Residue Program for...

    Science.gov (United States)

    2012-07-06

    ... Residue Program for Meat, Poultry, and Egg Products AGENCY: Food Safety and Inspection Service, USDA... meat, poultry, and egg products for animal drug residues, pesticides, and environmental contaminants in...- delivered items: Send to U.S. Department of Agriculture (USDA), FSIS, Docket Clerk, Patriots Plaza 3, 1400...

  16. Strategic market segmentation

    National Research Council Canada - National Science Library

    Maricic, Branko; Djordjevic, Aleksandar

    2015-01-01

    ..., requires a segmented approach to the market that appreciates differences in the expectations and preferences of customers. One of the significant activities in the strategic planning of marketing activities is market segmentation...

  17. Analytic geometry

    CERN Document Server

    Burdette, A C

    1971-01-01

    Analytic Geometry covers several fundamental aspects of analytic geometry needed for advanced subjects, including calculus.This book is composed of 12 chapters that review the principles, concepts, and analytic proofs of geometric theorems, families of lines, the normal equation of the line, and related matters. Other chapters highlight the application of graphing, foci, directrices, eccentricity, and conic-related topics. The remaining chapters deal with the concept polar and rectangular coordinates, surfaces and curves, and planes.This book will prove useful to undergraduate trigonometric st

  18. The Representation of Iran’s Nuclear Program in British Newspaper Editorials: A Critical Discourse Analytic Perspective

    Directory of Open Access Journals (Sweden)

    Mahmood Reza Atai

    2013-09-01

    In this study, Van Dijk's (1998) model of CDA was utilized to examine the representation of Iran's nuclear program in editorials published by British newscasting companies. The analysis of the editorials was carried out at the two levels of headlines and full text stories with regard to the linguistic features of lexical choices, nominalization, passivization, overcompleteness, and voice. The results support biased representation in media discourse, in this case of Iran's nuclear program. Likewise, the findings confirm Bloor and Bloor's (2007) ideological circles of Self (i.e., the West) and Other (i.e., Iran), or US and THEM, in the media. The findings may be utilized to increase Critical Language Awareness (CLA) among EFL teachers and students and have implications for ESP materials development and EAP courses for students of journalism.

  19. Compulsion of a linear equation system to the development of analytic formulas for the sumsof some finite series with the help of special computer programming

    Directory of Open Access Journals (Sweden)

    Lenev Vladimir Stepanovitch

    2014-01-01

    The article presents a system of mathematical reasoning that allows us to bypass recurrent formulas and induction methods when developing analytic formulas with computer programs, and it elaborates on how to make the computer derive such formulas. The author offers a generalization that applies the summation method to a wider range of series, as well as to finding approximate particular solutions of some differential equations and to summations that occur, for example, in the finite element method. The suggested method of summing powers with coefficients is generalized to: (a) closed formulas for sums of powers with real, not necessarily rational, exponents, which leads to approximate results; and (b) representations of sums connected to the solutions of certain differential equations (Cauchy problems), where partial solutions can be obtained in the form of power series with rational coefficients.
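
    The idea of letting the computer derive a closed-form sum can be illustrated for the classic case of power sums: postulate that S_p(n) = sum of k^p for k = 1..n is a polynomial of degree p+1, sample enough partial sums, and solve the resulting linear system exactly. The sketch below (the function name and sampling choice are the editor's, not the author's) does this with rational arithmetic.

```python
# Derive a closed-form polynomial for S_p(n) = 1^p + 2^p + ... + n^p
# by solving a small linear system with exact rational arithmetic.
from fractions import Fraction

def power_sum_formula(p):
    """Return coefficients c_0..c_{p+1} with S_p(n) = sum_j c_j * n**j."""
    m = p + 2                       # number of unknown coefficients
    ns = list(range(1, m + 1))      # sample points n = 1..m
    sums, s = [], 0
    for n in ns:                    # exact partial sums S_p(1..m)
        s += n ** p
        sums.append(Fraction(s))
    A = [[Fraction(n) ** j for j in range(m)] for n in ns]   # Vandermonde
    b = sums[:]
    for col in range(m):            # Gauss-Jordan elimination over Fractions
        pivot = next(r for r in range(col, m) if A[r][col] != 0)
        A[col], A[pivot] = A[pivot], A[col]
        b[col], b[pivot] = b[pivot], b[col]
        inv = Fraction(1) / A[col][col]
        A[col] = [a * inv for a in A[col]]
        b[col] *= inv
        for r in range(m):
            if r != col and A[r][col] != 0:
                f = A[r][col]
                A[r] = [a - f * ac for a, ac in zip(A[r], A[col])]
                b[r] -= f * b[col]
    return b

# Example: p = 2 yields [0, 1/6, 1/2, 1/3], i.e. S_2(n) = n(n+1)(2n+1)/6.
print(power_sum_formula(2))
```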

  20. Solving multi-objective facility location problem using the fuzzy analytical hierarchy process and goal programming: a case study on infectious waste disposal centers

    Directory of Open Access Journals (Sweden)

    Narong Wichapa

    The selection of a suitable location for infectious waste disposal is one of the major problems in waste management. Determining the location of infectious waste disposal centers is a difficult and complex process because it requires combining social and environmental factors that are hard to interpret, and cost factors that require the allocation of resources. Additionally, it depends on several regulations. Based on the actual conditions of a case study, forty hospitals and three candidate municipalities in the sub-Northeast region of Thailand, we considered multiple factors such as infrastructure, geological and social & environmental factors, calculating global priority weights using the fuzzy analytical hierarchy process (FAHP). After that, a new multi-objective facility location problem model which combines FAHP and goal programming (GP), namely the FAHP-GP model, was tested. The proposed model can lead to selecting new suitable locations for infectious waste disposal by considering both total cost and final priority weight objectives. The novelty of the proposed model is the simultaneous combination of relevant factors that are difficult to interpret and cost factors, which require the allocation of resources. Keywords: Multi-objective facility location problem, Fuzzy analytic hierarchy process, Infectious waste disposal centers
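
    For readers unfamiliar with how AHP-style priority weights arise, the sketch below derives crisp weights from a hypothetical 3x3 pairwise comparison matrix of candidate sites using the common geometric-mean approximation; the paper itself uses the fuzzy extension (FAHP) and many more criteria.

```python
# Crisp AHP priority weights via the geometric-mean (row product) method.
# The 3x3 pairwise comparison matrix of candidate sites is hypothetical;
# the cited study uses fuzzy AHP with a larger criteria hierarchy.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],     # site 1 compared with sites 1..3
              [1/3, 1.0, 2.0],     # site 2
              [1/5, 1/2, 1.0]])    # site 3

gm = A.prod(axis=1) ** (1.0 / A.shape[1])   # geometric mean of each row
weights = gm / gm.sum()                     # normalised priority weights

# Quick consistency check via the principal eigenvalue.
lam_max = max(np.linalg.eigvals(A).real)
ci = (lam_max - A.shape[0]) / (A.shape[0] - 1)
print(weights.round(3), f"CI = {ci:.3f}")
```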

  1. Advances in analytical chemistry

    Science.gov (United States)

    Arendale, W. F.; Congo, Richard T.; Nielsen, Bruce J.

    1991-01-01

    Implementation of computer programs based on multivariate statistical algorithms makes it possible to obtain reliable information from long data vectors that contain large amounts of extraneous information, for example, noise and/or analytes that we do not wish to control. Three examples are described. Each of these applications requires the use of techniques characteristic of modern analytical chemistry. The first example, using a quantitative or analytical model, describes the determination of the acid dissociation constant for 2,2'-pyridyl thiophene using archived data. The second example describes an investigation to determine the active biocidal species of iodine in aqueous solutions. The third example is taken from a research program directed toward advanced fiber-optic chemical sensors. The second and third examples require heuristic or empirical models.

  2. Optimalisasi Alokasi Penggunaan Lahan di Sub DAS Ambang: Pendekatan Analitikal Hirarki Proses dan Goal Programming (Optimization of Land Use Planning in Ambang Sub-Watershed: Analytical Hierarchy Process and Goal Programming Approach)

    Directory of Open Access Journals (Sweden)

    Kresno Agus Hendarto

    2011-01-01

    The purpose of land use planning in a river basin or watershed is "to promote the accomplishment of service wide objectives and targets" subject to the land's potential and the public's desires. This paper aims to describe a representative formulation of the analytical hierarchy process and linear programming application and to show how it may be modified for goal programming. Purposive sampling was used to collect primary data from five persons representing each stakeholder group in the watershed. Secondary data were collected from the reports of each stakeholder and from the internet. The results show that goal programming has generated considerable interest as a tool for land use planning in multiple-goal situations. It does present problems in terms of somewhat difficult data requirements (linearity in its usual form), possible inferior solutions, and a lack of explicit recognition of tradeoffs. Keywords: ambang watershed, linear program, analytical hierarchy process, goal programming, land use planning
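
    A stripped-down goal programming formulation of the land-allocation idea can be written as a linear program with deviation variables. The sketch below, with entirely hypothetical goals and coefficients, minimises the weighted under-achievement of an income goal and the over-use of an erosion budget across two land uses; it is an illustration of the general technique, not the paper's model.

```python
# Toy goal programming model (hypothetical goals and coefficients) as an LP.
# Decision variables: x1, x2 = hectares assigned to two land uses.
# Goal 1: income 4*x1 + 2*x2 should reach 380 (penalise shortfall n1).
# Goal 2: erosion 3*x1 + 1*x2 should stay under 200 (penalise excess p2).
# Hard constraint: x1 + x2 <= 100 hectares available.
from scipy.optimize import linprog

#            x1   x2   n1   p1   n2   p2
c        = [  0,   0,   1,   0,   0,   2]          # weighted undesired deviations
A_eq     = [[ 4,   2,   1,  -1,   0,   0],         # income + n1 - p1 = 380
            [ 3,   1,   0,   0,   1,  -1]]         # erosion + n2 - p2 = 200
b_eq     = [380, 200]
A_ub     = [[ 1,   1,   0,   0,   0,   0]]         # land availability
b_ub     = [100]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * 6, method="highs")
x1, x2, n1, p1, n2, p2 = res.x
print(f"x1 = {x1:.0f} ha, x2 = {x2:.0f} ha, "
      f"income shortfall = {n1:.0f}, erosion excess = {p2:.0f}")
```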

  3. Analytical quadrics

    CERN Document Server

    Spain, Barry; Ulam, S; Stark, M

    1960-01-01

    Analytical Quadrics focuses on the analytical geometry of three dimensions. The book first discusses the theory of the plane, sphere, cone, cylinder, straight line, and central quadrics in their standard forms. The idea of the plane at infinity is introduced through the homogenous Cartesian coordinates and applied to the nature of the intersection of three planes and to the circular sections of quadrics. The text also focuses on paraboloid, including polar properties, center of a section, axes of plane section, and generators of hyperbolic paraboloid. The book also touches on homogenous coordi

  4. Immersive Analytics (Dagstuhl Seminar 16231)

    OpenAIRE

    Dwyer, Tim; Henry Riche, Nathalie; Klein, Karsten; Stuerzlinger, Wolfgang; Thomas, Bruce

    2016-01-01

    This report documents the program and the outcomes of Dagstuhl Seminar 16231 "Immersive Analytics". Close to 40 researchers and practitioners participated in this seminar to discuss and define the field of Immersive Analytics, to create a community around it, and to identify its research challenges. As the participants had a diverse background in a variety of disciplines, including Human-Computer-Interaction, Augmented and Virtual Reality, Information Visualization, and Visual Analytics, the ...

  5. Integrated Data Repository Toolkit (IDRT). A Suite of Programs to Facilitate Health Analytics on Heterogeneous Medical Data.

    Science.gov (United States)

    Bauer, C R K D; Ganslandt, T; Baum, B; Christoph, J; Engel, I; Löbe, M; Mate, S; Stäubert, S; Drepper, J; Prokosch, H-U; Winter, A; Sax, U

    2016-01-01

    In recent years, research data warehouses have moved increasingly into the focus of interest of medical research. Nevertheless, there are only a few center-independent infrastructure solutions available. They aim to provide a consolidated view on medical data from various sources such as clinical trials, electronic health records, epidemiological registries or longitudinal cohorts. The i2b2 framework is a well-established solution for such repositories, but it lacks support for importing and integrating clinical data and metadata. The goal of this project was to develop a platform for easy integration and administration of data from heterogeneous sources, to provide capabilities for linking them to medical terminologies and to allow for transforming and mapping of data streams for user-specific views. A suite of three tools has been developed: the i2b2 Wizard for simplifying administration of i2b2, the IDRT Import and Mapping Tool for loading clinical data from various formats like CSV, SQL, CDISC ODM or biobanks and the IDRT i2b2 Web Client Plugin for advanced export options. The Import and Mapping Tool also includes an ontology editor for rearranging and mapping patient data and structures as well as annotating clinical data with medical terminologies, primarily those used in Germany (ICD-10-GM, OPS, ICD-O, etc.). With the three tools functional, new i2b2-based research projects can be created, populated and customized to researchers' needs in a few hours. Amalgamating data and metadata from different databases can be managed easily. With regard to data privacy, a pseudonymization service can be plugged in. Using common ontologies and reference terminologies rather than project-specific ones leads to a consistent understanding of the data semantics. i2b2's promise is to enable clinical researchers to devise and test new hypotheses even without a deep knowledge of statistical programming. The approach presented here has been tested in a number of scenarios with millions

  6. SRL online Analytical Development

    Energy Technology Data Exchange (ETDEWEB)

    Jenkins, C.W.

    1991-01-01

    The Savannah River Site is operated by the Westinghouse Savannah River Co. for the Department of Energy to produce special nuclear materials for defense. R&D support for site programs is provided by the Savannah River Laboratory, which I represent. The site is known primarily for its nuclear reactors, but actually three fourths of the efforts at the site are devoted to fuel/target fabrication, fuel/target reprocessing, and waste management. All of these operations rely heavily on chemical processes. The site is therefore a large chemical plant. There are then many potential applications for process analytical chemistry at SRS. The Savannah River Laboratory (SRL) has an Analytical Development Section of roughly 65 personnel that perform analyses for R&D efforts at the lab, act as backup to the site Analytical Laboratories Department and develop analytical methods and instruments. I manage a subgroup of the Analytical Development Section called the Process Control & Analyzer Development Group. The Prime mission of this group is to develop online/at-line analytical systems for site applications.

  7. SRL online Analytical Development

    Energy Technology Data Exchange (ETDEWEB)

    Jenkins, C.W.

    1991-12-31

    The Savannah River Site is operated by the Westinghouse Savannah River Co. for the Department of Energy to produce special nuclear materials for defense. R&D support for site programs is provided by the Savannah River Laboratory, which I represent. The site is known primarily for its nuclear reactors, but actually three fourths of the efforts at the site are devoted to fuel/target fabrication, fuel/target reprocessing, and waste management. All of these operations rely heavily on chemical processes. The site is therefore a large chemical plant. There are then many potential applications for process analytical chemistry at SRS. The Savannah River Laboratory (SRL) has an Analytical Development Section of roughly 65 personnel that perform analyses for R&D efforts at the lab, act as backup to the site Analytical Laboratories Department and develop analytical methods and instruments. I manage a subgroup of the Analytical Development Section called the Process Control & Analyzer Development Group. The Prime mission of this group is to develop online/at-line analytical systems for site applications.

  8. Video Analytics

    DEFF Research Database (Denmark)

    This book collects the papers presented at two workshops during the 23rd International Conference on Pattern Recognition (ICPR): the Third Workshop on Video Analytics for Audience Measurement (VAAM) and the Second International Workshop on Face and Facial Expression Recognition (FFER) from Real W...

  9. An analytic thomism?

    Directory of Open Access Journals (Sweden)

    Daniel Alejandro Pérez Chamorro.

    2012-12-01

    For 50 years the philosophers of the Anglo-Saxon analytic tradition (E. Anscombe, P. Geach, A. Kenny, P. Foot) have tried to follow the school of Thomas Aquinas, which they use as a source to surpass Cartesian epistemology and to develop virtue ethics. Recently, J. Haldane has inaugurated a program of "analytical Thomism" whose main result until now has been his "theory of identity mind/world". Nevertheless, none of Thomas's admirers has yet found the means of assimilating his metaphysics of being.

  10. Segmented trapped vortex cavity

    Science.gov (United States)

    Grammel, Jr., Leonard Paul (Inventor); Pennekamp, David Lance (Inventor); Winslow, Jr., Ralph Henry (Inventor)

    2010-01-01

    An annular trapped vortex cavity assembly segment includes a cavity forward wall, a cavity aft wall, and a cavity radially outer wall therebetween defining a cavity segment therein. A cavity opening extends between the forward and aft walls at a radially inner end of the assembly segment. Radially spaced apart pluralities of air injection first and second holes extend through the forward and aft walls respectively. The segment may include first and second expansion joint features at distal first and second ends respectively of the segment. The segment may include a forward subcomponent including the cavity forward wall attached to an aft subcomponent including the cavity aft wall. The forward and aft subcomponents include forward and aft portions of the cavity radially outer wall respectively. A ring of the segments may be circumferentially disposed about an axis to form an annular segmented vortex cavity assembly.

  11. Video Analytics

    DEFF Research Database (Denmark)

    This book collects the papers presented at two workshops during the 23rd International Conference on Pattern Recognition (ICPR): the Third Workshop on Video Analytics for Audience Measurement (VAAM) and the Second International Workshop on Face and Facial Expression Recognition (FFER) from Real...... include: re-identification, consumer behavior analysis, utilizing pupillary response for task difficulty measurement, logo detection, saliency prediction, classification of facial expressions, face recognition, face verification, age estimation, super-resolution, pose estimation, and pain recognition...

  12. Video Analytics

    DEFF Research Database (Denmark)

    include: re-identification, consumer behavior analysis, utilizing pupillary response for task difficulty measurement, logo detection, saliency prediction, classification of facial expressions, face recognition, face verification, age estimation, super-resolution, pose estimation, and pain recognition......This book collects the papers presented at two workshops during the 23rd International Conference on Pattern Recognition (ICPR): the Third Workshop on Video Analytics for Audience Measurement (VAAM) and the Second International Workshop on Face and Facial Expression Recognition (FFER) from Real...

  13. Analytical history

    OpenAIRE

    Bertrand M. Roehner

    2017-01-01

    The purpose of this note is to explain what is "analytical history", a modular and testable analysis of historical events introduced in a book published in 2002 (Roehner and Syme 2002). Broadly speaking, it is a comparative methodology for the analysis of historical events. Comparison is the keystone and hallmark of science. For instance, the extrasolar planets are crucial for understanding our own solar system. Until their discovery, astronomers could observe only one instance. Single instan...

  14. Impact of time of presentation on process performance and outcomes in ST-segment-elevation myocardial infarction: a report from the American Heart Association: Mission Lifeline program.

    Science.gov (United States)

    Dasari, Tarun W; Roe, Matthew T; Chen, Anita Y; Peterson, Eric D; Giugliano, Robert P; Fonarow, Gregg C; Saucedo, Jorge F

    2014-09-01

    Prior studies demonstrated that patients with ST-segment-elevation myocardial infarction presenting during off-hours (weeknights, weekends, and holidays) have slower reperfusion times. Recent nationwide initiatives have emphasized 24/7 quality care in ST-segment-elevation myocardial infarction. It remains unclear whether patients presenting off-hours versus on-hours receive similar quality care in contemporary practice. Using the Acute Coronary Treatment and Intervention Outcomes Network-Get With The Guidelines (ACTION-GWTG) database, we examined ST-segment-elevation myocardial infarction performance measures in patients presenting off-hours (n=27 270) versus on-hours (n=15 972; January 2007 to September 2010) at 447 US centers. Key quality measures assessed were aspirin use within the first 24 hours, door-to-balloon time, door-to-ECG time, and door-to-needle time. In-hospital risk-adjusted all-cause mortality was calculated. Baseline demographic and clinical characteristics were similar. Aspirin use within 24 hours approached 99% in both groups. Among patients undergoing primary percutaneous coronary intervention (n=41 979; 97.1%), median door-to-balloon times were 56 versus 72 minutes. Overall, performance on quality measures for ST-segment-elevation myocardial infarction was high, regardless of time of presentation. Door-to-balloon time was, however, slightly delayed (by an average of 16 minutes), and risk-adjusted in-hospital mortality was 13% higher in patients presenting off-hours. © 2014 American Heart Association, Inc.

  15. Multi-segmental neurofibromatosis

    OpenAIRE

    Kumar Sudhir; Kumar Ravi

    2004-01-01

    Neurofibromatosis (NF), one of the commonest phakomatoses, is characterized by varied clinical manifestations. Segmental NF is one of the uncommon subtypes of NF. We report a young adult presenting with asymptomatic skin lesions (neurofibromas and café-au-lait macules) over localized areas of the lower back, affecting more than one segment. None of the family members were found to have features of segmental NF. Segmental NF may be misdiagnosed as a birthmark or remain undiagnosed for l...

  16. Automatic Melody Segmentation

    NARCIS (Netherlands)

    Rodríguez López, Marcelo

    2016-01-01

    The work presented in this dissertation investigates music segmentation. In the field of Musicology, segmentation refers to a score analysis technique, whereby notated pieces or passages of these pieces are divided into “units” referred to as sections, periods, phrases, and so on. Segmentation

  17. Selecting university undergraduate student activities via compromised-analytical hierarchy process and 0-1 integer programming to maximize SETARA points

    Science.gov (United States)

    Nazri, Engku Muhammad; Yusof, Nur Ai'Syah; Ahmad, Norazura; Shariffuddin, Mohd Dino Khairri; Khan, Shazida Jan Mohd

    2017-11-01

    Prioritizing and deciding which student activities should be selected and conducted to fulfill the aspirations of a university, as translated in its strategic plan, must be executed with transparency and accountability. This is becoming even more crucial, particularly for universities in Malaysia, with the recent budget cut imposed by the Malaysian government. In this paper, we illustrate how a 0-1 integer programming (0-1 IP) model was implemented to select which of the forty activities proposed by the student body of Universiti Utara Malaysia (UUM) should be implemented for the 2017/2018 academic year. Two different models were constructed. The first model determines the minimum total budget that the UUM management should give the student body to conduct all the activities needed to fulfill the minimum targeted number of activities stated in its strategic plan. The second model determines which activities to select based on the total budget already allocated by the UUM management towards fulfilling the requirements set in its strategic plan. The selection of activities for the second model was also based on the preferences of the members of the student body, whereby the preference value for each activity was determined using the Compromised-Analytical Hierarchy Process. The outputs from both models were compared and discussed. The technique used in this study will be useful and suitable for organizations with key performance indicator-oriented programs and limited budget allocations.
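
    The second model described above is, at its core, a budget-constrained 0-1 selection. The sketch below solves a toy instance by dynamic programming, maximising total preference weight subject to a budget; the activity names, weights and costs are hypothetical, and the paper's actual model includes additional constraints.

```python
# Toy 0-1 selection of activities under a budget (knapsack-style DP).
# Maximises total preference weight; all data below are hypothetical.

activities = [            # (name, cost, preference weight)
    ("leadership camp",      12000, 0.18),
    ("innovation week",       8000, 0.15),
    ("sports carnival",      15000, 0.22),
    ("community outreach",    5000, 0.12),
    ("career fair",          10000, 0.20),
    ("cultural night",        7000, 0.13),
]
budget = 30000
step = 1000                                    # track the budget in 1000-unit steps

W = budget // step
best = [0.0] * (W + 1)                         # best weight at each budget level
choice = [set() for _ in range(W + 1)]

for name, cost, weight in activities:
    c = cost // step
    for b in range(W, c - 1, -1):              # iterate budget downwards (0-1 rule)
        if best[b - c] + weight > best[b]:
            best[b] = best[b - c] + weight
            choice[b] = choice[b - c] | {name}

print(f"total weight = {best[W]:.2f}")
print(sorted(choice[W]))
```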

  18. Do programs designed to train working memory, other executive functions, and attention benefit children with ADHD? A meta-analytic review of cognitive, academic, and behavioral outcomes.

    Science.gov (United States)

    Rapport, Mark D; Orban, Sarah A; Kofler, Michael J; Friedman, Lauren M

    2013-12-01

    Children with ADHD are characterized frequently as possessing underdeveloped executive functions and sustained attentional abilities, and recent commercial claims suggest that computer-based cognitive training can remediate these impairments and provide significant and lasting improvement in their attention, impulse control, social functioning, academic performance, and complex reasoning skills. The present review critically evaluates these claims through meta-analysis of 25 studies of facilitative intervention training (i.e., cognitive training) for children with ADHD. Random effects models corrected for publication bias and sampling error revealed that studies training short-term memory alone resulted in moderate magnitude improvements in short-term memory (d=0.63), whereas training attention did not significantly improve attention and training mixed executive functions did not significantly improve the targeted executive functions (both nonsignificant: 95% confidence intervals include 0.0). Far transfer effects of cognitive training on academic functioning, blinded ratings of behavior (both nonsignificant), and cognitive tests (d=0.14) were nonsignificant or negligible. Unblinded raters (d=0.48) reported significantly larger benefits relative to blinded raters and objective tests. Critical examination of training targets revealed incongruence with empirical evidence regarding the specific executive functions that are (a) most impaired in ADHD, and (b) functionally related to the behavioral and academic outcomes these training programs are intended to ameliorate. Collectively, meta-analytic results indicate that claims regarding the academic, behavioral, and cognitive benefits associated with extant cognitive training programs are unsupported in ADHD. The methodological limitations of the current evidence base, however, leave open the possibility that cognitive training techniques designed to improve empirically documented executive function

  19. Analytical and Mathematical Modeling and Optimization of Fiber Metal Laminates (FMLs) subjected to low-velocity impact via combined response surface regression and zero-one programming

    Directory of Open Access Journals (Sweden)

    Faramarz Ashenai Ghasemi

    This paper presents analytical and mathematical modeling and optimization of the dynamic behavior of fiber metal laminates (FMLs) subjected to low-velocity impact. The deflection-to-thickness (w/h) ratio has been identified through the governing equations of the plate, which are solved using first-order shear deformation theory as well as the Fourier series method. With the help of a two-degrees-of-freedom spring-mass system and Choi's linearized Hertzian contact model, the interaction between the impactor and the plate is modeled. Thirty-one experiments were conducted on samples with different layer sequences and volume fractions of Al plies in the composite structures. A reliable fitness function in the form of a strict linear mathematical function was constructed. Using an ordinary least squares method, response regression coefficients were estimated, and a zero-one programming technique is proposed to optimize the FML plate behavior subject to any technological or cost restrictions. The results indicated that FML plate behavior is highly affected by the layer sequences and volume fractions of Al plies. The results also showed that embedding Al plies in the outer layers of the structure results in a significantly better response under low-velocity impact than embedding them in the middle, or in the middle and outer, layers of the structure.

  20. Penerapan Metode Analytic Network Process (ANP) Untuk Pendukung Keputusan Pemilihan Tema Tugas Akhir (Studi Kasus: Program Studi S1 Informatika ST3 Telkom) (Application of the Analytic Network Process Method for Decision Support in Final Project Theme Selection: A Case Study of the S1 Informatics Program at ST3 Telkom)

    Directory of Open Access Journals (Sweden)

    Dila Nurlaila

    2017-07-01

    Based on a survey of 30 Informatics students who were about to take the final project (Tugas Akhir) course, more than 80% answered that they did not yet have a final project concept. This indicates that many students still do not know which final project theme, matching their interests and competencies, they will eventually take. For this reason, a study was carried out applying the Analytic Network Process (ANP) method to decision support for final project theme selection. ANP is a decision-making method that takes the relationships between criteria into account. This study aims to test how successfully the ANP method addresses the problem of students who do not yet have a final project concept. First, the criteria that determine the selection of a final project theme in the S1 Informatika study program were identified. From these criteria, an ANP network model was built using the Super Decisions software, and pairwise comparisons were carried out for each criterion to obtain weights for every criterion and sub-criterion. Expert judgement for this decision was provided by the heads of the ICM and DESTI specialization groups of the study program. After testing by comparing the choices made manually with the choices based on the ANP calculation, 46.6% of the students' final project themes matched accurately; the remaining 53.4% of accuracy was lost because of inconsistencies in the students' answers when rating their interests.

  1. Probabilistic Segmentation of Folk Music Recordings

    Directory of Open Access Journals (Sweden)

    Ciril Bohak

    2016-01-01

    The paper presents a novel method for automatic segmentation of folk music field recordings. The method is based on a distance measure that uses dynamic time warping to cope with tempo variations and a dynamic programming approach to handle pitch drift when finding similarities and estimating the length of the repeating segment. A probabilistic framework based on HMM is used to find segment boundaries, searching for an optimal match between the expected segment length, between-segment similarities, and likely locations of segment beginnings. An evaluation of several current state-of-the-art approaches for the segmentation of commercial music is presented, and their weaknesses when dealing with folk music, such as intolerance to pitch drift and variable tempo, are exposed. The proposed method is evaluated and its performance analyzed on a collection of 206 folk songs of different ensemble types: solo, two- and three-voiced, choir, instrumental, and instrumental with singing. It outperforms current commercial music segmentation methods for non-instrumental music and is on a par with the best for instrumental recordings. The method is also comparable to a more specialized method for the segmentation of solo singing folk music recordings.
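
    The dynamic time warping distance underlying the similarity measure can be sketched in a few lines. The implementation below is a textbook DTW on two hypothetical pitch contours, without the tempo- and pitch-drift handling added in the paper.

```python
# Textbook dynamic time warping between two pitch contours (hypothetical data);
# the cited method builds tempo- and pitch-drift handling on top of this idea.
import numpy as np

def dtw_distance(a, b):
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j],      # insertion
                                 D[i, j - 1],      # deletion
                                 D[i - 1, j - 1])  # match
    return D[n, m]

ref  = [60, 62, 64, 62, 60, 60, 57]      # MIDI pitches of a reference segment
sung = [60, 60, 62, 64, 63, 62, 60, 57]  # a slower, slightly varied rendition
print(dtw_distance(ref, sung))
```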

  2. Video Analytics

    DEFF Research Database (Denmark)

    This book collects the papers presented at two workshops during the 23rd International Conference on Pattern Recognition (ICPR): the Third Workshop on Video Analytics for Audience Measurement (VAAM) and the Second International Workshop on Face and Facial Expression Recognition (FFER) from Real...... World Videos. The workshops were run on December 4, 2016, in Cancun in Mexico. The two workshops together received 13 papers. Each paper was then reviewed by at least two expert reviewers in the field. In all, 11 papers were accepted to be presented at the workshops. The topics covered in the papers...

  3. Biochemical Technology Program progress report for the period January 1--June 30, 1976. [Centrifugal analyzers and advanced analytical systems for blood and body fluids

    Energy Technology Data Exchange (ETDEWEB)

    Mrochek, J.E.; Burtis, C.A.; Scott, C.D. (comps.)

    1976-09-01

    This document, which covers the period January 1-June 30, 1976, describes progress in the following areas: (1) advanced analytical techniques for the clinical laboratory, (2) fast clinical analyzers, (3) development of a miniaturized analytical clinical laboratory system, (4) centrifugal fast analyzers for animal toxicological studies, and (5) chemical profile of body fluids.

  4. Improved model reduction and tuning of fractional-order PI(λ)D(μ) controllers for analytical rule extraction with genetic programming.

    Science.gov (United States)

    Das, Saptarshi; Pan, Indranil; Das, Shantanu; Gupta, Amitava

    2012-03-01

    Genetic algorithm (GA) has been used in this study for a new approach of suboptimal model reduction in the Nyquist plane and optimal time domain tuning of proportional-integral-derivative (PID) and fractional-order (FO) PI(λ)D(μ) controllers. Simulation studies show that the new Nyquist-based model reduction technique outperforms the conventional H(2)-norm-based reduced parameter modeling technique. With the tuned controller parameters and reduced-order model parameter dataset, optimum tuning rules have been developed with a test-bench of higher-order processes via genetic programming (GP). The GP performs a symbolic regression on the reduced process parameters to evolve a tuning rule which provides the best analytical expression to map the data. The tuning rules are developed for a minimum time domain integral performance index described by a weighted sum of error index and controller effort. From the reported Pareto optimal front of the GP-based optimal rule extraction technique, a trade-off can be made between the complexity of the tuning formulae and the control performance. The efficacy of the single-gene and multi-gene GP-based tuning rules has been compared with the original GA-based control performance for the PID and PI(λ)D(μ) controllers, handling four different classes of representative higher-order processes. These rules are very useful for process control engineers, as they inherit the power of the GA-based tuning methodology, but can be easily calculated without the requirement for running the computationally intensive GA every time. Three-dimensional plots of the required variation in PID/fractional-order PID (FOPID) controller parameters with reduced process parameters have been shown as a guideline for the operator. Parametric robustness of the reported GP-based tuning rules has also been shown with credible simulation examples. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.

  5. Analytical mechanics

    CERN Document Server

    Helrich, Carl S

    2017-01-01

    This advanced undergraduate textbook begins with the Lagrangian formulation of Analytical Mechanics and then passes directly to the Hamiltonian formulation and the canonical equations, with constraints incorporated through Lagrange multipliers. Hamilton's Principle and the canonical equations remain the basis of the remainder of the text. Topics considered for applications include small oscillations, motion in electric and magnetic fields, and rigid body dynamics. The Hamilton-Jacobi approach is developed with special attention to the canonical transformation in order to provide a smooth and logical transition into the study of complex and chaotic systems. Finally the text has a careful treatment of relativistic mechanics and the requirement of Lorentz invariance. The text is enriched with an outline of the history of mechanics, which particularly outlines the importance of the work of Euler, Lagrange, Hamilton and Jacobi. Numerous exercises with solutions support the exceptionally clear and concise treatment...

  6. THE PROPOSAL MODEL OF RATIONAL WORKFORCE ASSIGNMENT IN DOKUZ EYLUL UNIVERSITY BY ANALYTIC HIERARCHY PROCESS BASED 0-1 INTEGER PROGRAMMING

    Directory of Open Access Journals (Sweden)

    Yılmaz GÖKŞEN

    2016-11-01

    Production factors in traditional organization management include labour, capital, nature and technology; nowadays, creativity, innovation and ability can be added to these. Competition is intense in both the private and public sectors. The workforce can be said to be the main resource enriching the organization and making it complex. What makes an organization powerful is that talented personnel can turn new ideas into productive ones by using their creativity. From this point of view, workforce productivity is an important parameter of organizational efficiency. Workforce productivity in organizations with many employees is primarily related to employing personnel in the right positions. The assignment of employees to work in accordance with their capabilities increases efficiency. This strengthens the structure of the workforce and makes the organization capable of competing with its rivals. The assignment model, a type of linear programming model, is a mathematical method used to match the right person to the right job. The coefficients of the variables in the objective function of the assignment model constitute the potential contributions of employees. The factors that determine these contributions differ across job types, and expert opinion is needed for their evaluation. In this study, more than 1000 people who work for Dokuz Eylul University as drivers, food handlers, technicians, office workers, security staff and servants are taken into consideration. Gender, level of education, distance from the workplace, marital status, number of children and tenure of these employees have been included in the analysis. The analytical hierarchy process, a multi-criteria decision-making method, has been preferred. Specific criteria have been determined for each occupational group based on this method. Later, weighted averages for the criteria in question have been found. With these values belonging
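
    Once AHP has produced a contribution score for each person-job pair, the core assignment step can be illustrated with a small example. The sketch below uses SciPy's Hungarian-algorithm solver on a hypothetical score matrix rather than the paper's full AHP-based 0-1 integer programming model.

```python
# Toy workforce assignment: maximise total AHP-style contribution scores.
# scipy's linear_sum_assignment (Hungarian algorithm) stands in for the
# paper's 0-1 integer programming model; the score matrix is hypothetical.
import numpy as np
from scipy.optimize import linear_sum_assignment

# scores[i, j] = contribution of employee i in job j (4 employees, 4 jobs).
scores = np.array([[0.62, 0.41, 0.55, 0.30],
                   [0.48, 0.70, 0.35, 0.52],
                   [0.33, 0.29, 0.66, 0.61],
                   [0.57, 0.44, 0.40, 0.68]])

rows, cols = linear_sum_assignment(scores, maximize=True)
for i, j in zip(rows, cols):
    print(f"employee {i} -> job {j} (score {scores[i, j]:.2f})")
print(f"total contribution = {scores[rows, cols].sum():.2f}")
```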

  7. Segmental neurofibromatosis and malignancy.

    Science.gov (United States)

    Dang, Julie D; Cohen, Philip R

    2010-01-01

    Segmental neurofibromatosis is an uncommon variant of neurofibromatosis type I characterized by neurofibromas and/or café-au-lait macules localized to one sector of the body. Although patients with neurofibromatosis type I have an associated increased risk of certain malignancies, malignancy has only occasionally been reported in patients with segmental neurofibromatosis. The published reports of patients with segmental neurofibromatosis who developed malignancy were reviewed and the characteristics of these patients and their cancers were summarized. Ten individuals (6 women and 4 men) with segmental neurofibromatosis and malignancy have been reported. The malignancies include malignant peripheral nerve sheath tumor (3), malignant melanoma (2), breast cancer (1), colon cancer (1), gastric cancer (1), lung cancer (1), and Hodgkin lymphoma (1). The most common malignancies in patients with segmental neurofibromatosis are derived from neural crest cells: malignant peripheral nerve sheath tumor and malignant melanoma. The incidence of malignancy in patients with segmental neurofibromatosis may approach that of patients with neurofibromatosis type I.

  8. Segmenting patients and physicians using preferences from discrete choice experiments.

    Science.gov (United States)

    Deal, Ken

    2014-01-01

    People often form groups or segments that have similar interests and needs and seek similar benefits from health providers. Health organizations need to understand whether the same health treatments, prevention programs, services, and products should be applied to everyone in the relevant population or whether different treatments need to be provided to each of several segments that are relatively homogeneous internally but heterogeneous among segments. Our objective was to explain the purposes, benefits, and methods of segmentation for health organizations, and to illustrate the process of segmenting health populations based on preference coefficients from a discrete choice conjoint experiment (DCE) using an example study of prevention of cyberbullying among university students. We followed a two-level procedure for investigating segmentation incorporating several methods for forming segments in Level 1 using DCE preference coefficients and testing their quality, reproducibility, and usability by health decision makers. Covariates (demographic, behavioral, lifestyle, and health state variables) were included in Level 2 to further evaluate quality and to support the scoring of large databases and developing typing tools for assigning those in the relevant population, but not in the sample, to the segments. Several segmentation solution candidates were found during the Level 1 analysis, and the relationship of the preference coefficients to the segments was investigated using predictive methods. Those segmentations were tested for their quality and reproducibility and three were found to be very close in quality. While one seemed better than others in the Level 1 analysis, another was very similar in quality and proved ultimately better in predicting segment membership using covariates in Level 2. The two segments in the final solution were profiled for attributes that would support the development and acceptance of cyberbullying prevention programs among university
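    As a rough, hypothetical illustration of the Level-1 clustering of DCE preference coefficients described above (not the authors' procedure or data), the sketch below clusters synthetic part-worths with k-means and scores candidate segment counts with a silhouette index; the respondent counts, attribute means, and the choice of k-means are assumptions.

```python
# Hypothetical sketch: k-means on synthetic DCE preference coefficients,
# with a silhouette score as a simple quality check for the segment count.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
# 300 respondents x 6 attribute part-worths drawn from two latent segments.
coefs = np.vstack([rng.normal([1.0, 0.5, -0.3, 0.8, 0.0, -0.5], 0.3, (150, 6)),
                   rng.normal([-0.5, 1.2, 0.6, -0.2, 0.9, 0.1], 0.3, (150, 6))])

for k in (2, 3, 4):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(coefs)
    print(k, "segments, silhouette =", round(silhouette_score(coefs, labels), 3))
```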

  9. Segmenting the MBA Market: An Australian Strategy.

    Science.gov (United States)

    Everett, James E.; Armstrong, Robert W.

    1990-01-01

    A University of Western Australia market segmentation study for the masters program in business administration examined the relationship between Graduate Management Admission Test scores, work experience, faculty of undergraduate degree, gender, and academic success in the program. Implications of the results for establishing admission criteria…

  10. Unsupervised Segmentation Methods of TV Contents

    Directory of Open Access Journals (Sweden)

    Elie El-Khoury

    2010-01-01

    Full Text Available We present a generic algorithm to address various temporal segmentation topics of audiovisual contents such as speaker diarization, shot, or program segmentation. Based on a GLR approach, involving the ΔBIC criterion, this algorithm requires the value of only a few parameters to produce segmentation results at a desired scale and on most typical low-level features used in the field of content-based indexing. Results obtained on various corpora are of the same quality level as those obtained by other dedicated and state-of-the-art methods.
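    The ΔBIC test at the core of such GLR-based segmentation can be sketched in a few lines. The example below is an illustrative, assumption-laden prototype, not the paper's implementation: it hypothesizes a boundary at frame t, compares one full-covariance Gaussian against two, subtracts the BIC penalty, and picks the frame with the largest positive score on a synthetic feature stream.

```python
# Illustrative Delta-BIC boundary test (assumptions, not the paper's code).
import numpy as np

def delta_bic(features, t, lam=1.0):
    """features: (N, d) frame-level features; t: candidate boundary index."""
    n, d = features.shape
    def logdet_cov(x):
        cov = np.cov(x, rowvar=False) + 1e-6 * np.eye(d)   # regularised covariance
        return np.linalg.slogdet(cov)[1]
    penalty = 0.5 * lam * (d + 0.5 * d * (d + 1)) * np.log(n)
    return (0.5 * n * logdet_cov(features)
            - 0.5 * t * logdet_cov(features[:t])
            - 0.5 * (n - t) * logdet_cov(features[t:])
            - penalty)

# A positive Delta-BIC at the best t is taken as evidence of a segment boundary.
rng = np.random.default_rng(0)
stream = np.vstack([rng.normal(0, 1, (200, 13)), rng.normal(3, 1, (200, 13))])
scores = [delta_bic(stream, t) for t in range(50, 350)]
print("best boundary at frame", 50 + int(np.argmax(scores)))
```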

  11. A Perspective On Segment Reporting Choices And Segment Reconciliations

    OpenAIRE

    Dana Hollie; Shaokun (Carol) Yu

    2015-01-01

    In 2014, segment reporting gained third place in SEC comment letters. This article reviews the history of segment reporting including segment reporting choices and segment reconciliations, the current concerns as the level of detail in segment disclosures varies widely across organizations, the value relevance of segment reconciliations and its market consequences, and the importance of segment reporting to management. The following are highlights of the manuscript: The third-most-common area...

  12. Methodological Options in International Market Segmentation

    OpenAIRE

    Bastian, Iryna

    2007-01-01

    The last decades have been marked by an increasing involvement of multi-product manufacturers in cross-border business activities. Dealing with the heterogeneous needs of consumers in different countries is one of the biggest challenges in modern business. Correspondingly, special importance is attached to international market segmentation. Finding transnational segments and developing standardized marketing programs for targeting them are gaining in popularity. The doctoral research of Dr. Iry...

  13. Speech segmentation in aphasia.

    Science.gov (United States)

    Peñaloza, Claudia; Benetello, Annalisa; Tuomiranta, Leena; Heikius, Ida-Maria; Järvinen, Sonja; Majos, Maria Carmen; Cardona, Pedro; Juncadella, Montserrat; Laine, Matti; Martin, Nadine; Rodríguez-Fornells, Antoni

    2015-01-01

    Speech segmentation is one of the initial and mandatory phases of language learning. Although some people with aphasia have shown a preserved ability to learn novel words, their speech segmentation abilities have not been explored. We examined the ability of individuals with chronic aphasia to segment words from running speech via statistical learning. We also explored the relationships between speech segmentation and aphasia severity, and short-term memory capacity. We further examined the role of lesion location in speech segmentation and short-term memory performance. The experimental task was first validated with a group of young adults (n = 120). Participants with chronic aphasia (n = 14) were exposed to an artificial language and were evaluated in their ability to segment words using a speech segmentation test. Their performance was contrasted against chance level and compared to that of a group of elderly matched controls (n = 14) using group and case-by-case analyses. As a group, participants with aphasia were significantly above chance level in their ability to segment words from the novel language and did not significantly differ from the group of elderly controls. Speech segmentation ability in the aphasic participants was not associated with aphasia severity although it significantly correlated with word pointing span, a measure of verbal short-term memory. Case-by-case analyses identified four individuals with aphasia who performed above chance level on the speech segmentation task, all with predominantly posterior lesions and mild fluent aphasia. Their short-term memory capacity was also better preserved than in the rest of the group. Our findings indicate that speech segmentation via statistical learning can remain functional in people with chronic aphasia and suggest that this initial language learning mechanism is associated with the functionality of the verbal short-term memory system and the integrity of the left inferior frontal region.

  14. Schedule Analytics

    Science.gov (United States)

    2016-04-30

    led several cost research initiatives in cloud computing, service-oriented architecture, and agile development and various independent schedule ... systems and platforms. Manring is trained and experienced on a number of commercial parametric software cost models and risk analysis tools. She has ... and he supports DoD and federal acquisition efforts with a focus on rapid and agile practices to speed solutions with the lowest practical program

  15. Segmented conjugated polymers

    Indian Academy of Sciences (India)

    Segmented conjugated polymers, wherein the conjugation is randomly truncated by varying lengths of non-conjugated segments, form an interesting class of polymers as they not only represent systems of varying stiffness, but also ones where the backbone can be construed as being made up of chromophores of ...

  16. Segmentation, advertising and prices

    NARCIS (Netherlands)

    Galeotti, Andrea; Moraga-González, José Luis

    This paper explores the implications of market segmentation on firm competitiveness. In contrast to earlier work, here market segmentation is minimal in the sense that it is based on consumer attributes that are completely unrelated to tastes. We show that when the market is comprised by two

  17. Segmented conjugated polymers

    Indian Academy of Sciences (India)

    Segmented conjugated polymers, wherein the conjugation is randomly truncated by varying lengths of non-conjugated segments, form an interesting class of polymers as they not only represent systems of varying stiffness, but also ones where the backbone can be construed as being made up of chromophores of varying ...

  18. Visual Analytics Applied to Image Analysis : From Segmentation to Classification

    NARCIS (Netherlands)

    Rauber, Paulo

    2017-01-01

    Image analysis is the field of study concerned with extracting information from images. This field is immensely important for commercial and scientific applications, from identifying people in photographs to recognizing diseases in medical images. The goal behind the work presented in this thesis is

  19. Multi-segmental neurofibromatosis

    Directory of Open Access Journals (Sweden)

    Kumar Sudhir

    2004-01-01

    Full Text Available Neurofibromatosis (NF), one of the commonest phakomatoses, is characterized by varied clinical manifestations. Segmental NF is one of the uncommon subtypes of NF. We report a young adult presenting with asymptomatic skin lesions, neurofibromas and café-au-lait macules, over localized areas of the lower back, affecting more than one segment. None of the family members were found to have features of segmental NF. Segmental NF may be misdiagnosed as a birthmark or remain undiagnosed for long periods of time, as the patients are often asymptomatic. Moreover, the clinical features are highly variable and range from a small area of skin involvement to involvement of the entire half of the body. This variation is explained by the fact that segmental NF is thought to arise from a postzygotic NF1 gene mutation, leading to somatic mosaicism. We have also reviewed the relevant literature on this subject.

  20. Multi-segmental neurofibromatosis

    Directory of Open Access Journals (Sweden)

    Kumar Sudhir

    2004-11-01

    Full Text Available Neurofibromatosis (NF), one of the commonest phakomatoses, is characterized by varied clinical manifestations. Segmental NF is one of the uncommon subtypes of NF. We report a young adult presenting with asymptomatic skin lesions, neurofibromas and café-au-lait macules, over localized areas of the lower back, affecting more than one segment. None of the family members were found to have features of segmental NF. Segmental NF may be misdiagnosed as a birthmark or remain undiagnosed for long periods of time, as the patients are often asymptomatic. Moreover, the clinical features are highly variable and range from a small area of skin involvement to involvement of the entire half of the body. This variation is explained by the fact that segmental NF is thought to arise from a postzygotic NF1 gene mutation, leading to somatic mosaicism. We have also reviewed the relevant literature on this subject.

  1. Pancreas and cyst segmentation

    Science.gov (United States)

    Dmitriev, Konstantin; Gutenko, Ievgeniia; Nadeem, Saad; Kaufman, Arie

    2016-03-01

    Accurate segmentation of abdominal organs from medical images is an essential part of surgical planning and computer-aided disease diagnosis. Many existing algorithms are specialized for the segmentation of healthy organs. Cystic pancreas segmentation is especially challenging due to its low contrast boundaries, variability in shape, location and the stage of the pancreatic cancer. We present a semi-automatic segmentation algorithm for pancreata with cysts. In contrast to existing automatic segmentation approaches for healthy pancreas segmentation which are amenable to atlas/statistical shape approaches, a pancreas with cysts can have even higher variability with respect to the shape of the pancreas due to the size and shape of the cyst(s). Hence, fine results are better attained with semi-automatic steerable approaches. We use a novel combination of random walker and region growing approaches to delineate the boundaries of the pancreas and cysts with respective best Dice coefficients of 85.1% and 86.7%, and respective best volumetric overlap errors of 26.0% and 23.5%. Results show that the proposed algorithm for pancreas and pancreatic cyst segmentation is accurate and stable.
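    For readers who want to experiment with the seeded, steerable style of segmentation described above, the sketch below runs scikit-image's random-walker segmentation on a synthetic image with one foreground and one background seed. It only illustrates the random-walker ingredient under assumed data and parameters, not the authors' combined random-walker and region-growing pipeline.

```python
# Illustration only: seeded random-walker segmentation on synthetic data,
# using scikit-image; seed placement and parameters are assumptions.
import numpy as np
from skimage.segmentation import random_walker

rng = np.random.default_rng(0)
image = rng.normal(0.0, 0.1, (128, 128))
image[40:90, 40:90] += 0.8                 # bright "organ" region

labels = np.zeros(image.shape, dtype=np.int32)
labels[64, 64] = 1                         # foreground seed (user click inside organ)
labels[5, 5] = 2                           # background seed

segmentation = random_walker(image, labels, beta=250, mode='bf')
print("foreground pixels:", int((segmentation == 1).sum()))
```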

  2. Familial segmental neurofibromatosis.

    Science.gov (United States)

    Oguzkan, Sibel; Cinbis, Mine; Ayter, Sükriye; Anlar, Banu; Aysun, Sabiha

    2004-05-01

    Segmental neurofibromatosis is considered to be the result of postzygotic NF1 gene mutations. We present a family in which the proband has generalized neurofibromatosis 1, whereas members of previous generations manifest segmental skin lesions. All, including the clinically asymptomatic grandmother, carry the same haplotype. This is the only case in the literature in which a parent with segmental skin findings has a child with full-blown neurofibromatosis 1 disease. The genetic mechanisms underlying this association are discussed. This family can be further investigated by examination of tissue samples from affected and unaffected sites for mutations.

  3. Automated determination of segment positions in a high-purity 32-fold segmented germanium detector

    CERN Document Server

    Miller, K L; Campbell, C; Morris, L; Müller, W F; Strahler, E A

    2002-01-01

    An automated system for determining detector segment positions in a high-purity 32-fold segmented germanium detector has been developed. To determine segment positions as they would appear in an experiment, positions must be measured while the 32-fold segmented germanium crystal is kept at liquid nitrogen temperatures. A collimated 57Co gamma-ray source is moved around the surface of the detector cryostat, and the response of the germanium crystal is measured. Motion of the source is driven by two Slo-Syn motors and BEI incremental optical encoders, which are controlled through LabVIEW programming and a National Instruments PCStep board. The collected data is analyzed to determine the position of the center of each of the 32 segments.

  4. Sclerosing segmental neurofibromatosis.

    Science.gov (United States)

    Lee, Joong Sun; Kim, You Chan

    2005-04-01

    Segmental neurofibromatosis is a rare disorder characterized by cafe-au-lait macules and/or neurofibromas limited to a single body segment. The neurofibromas in segmental neurofibromatosis are usually soft, non-tender nodules as in other types of neurofibromatosis. Histopathologically, they are usually non-encapsulated, loosely textured dermal tumors. We report a case of sclerosing segmental neurofibromatosis, in which the patient presented with several grouped, erythematous to brownish, firm tender nodules on the left side of the posterior neck. Histopathologically, the stroma was mostly very fibrotic, especially around hair follicles, in addition to the usual features of neurofibroma. The atypical clinical feature, hardness, and tenderness of the lesions may be associated with the fibrosis.

  5. Understanding Business Analytics

    Science.gov (United States)

    2015-01-05

    analytic resource for the enterprise, individual Air Force domain areas see the value of integrating analytic capabilities into their own decision ... Business Analytics, Decision Analytics, Business Intelligence, Advanced Analytics, Data Science ... to a certain degree, to label is to limit - if only ... providing a quantitative basis for complex decisions. Decision Analysis: a systematic, quantitative and visual approach to addressing and evaluating important

  6. Remote sensing image segmentation based on Hadoop cloud platform

    Science.gov (United States)

    Li, Jie; Zhu, Lingling; Cao, Fubin

    2018-01-01

    To address the slow speed and poor real-time performance of remote sensing image segmentation, this paper studies a method of remote sensing image segmentation based on the Hadoop platform. On the basis of analyzing the structural characteristics of the Hadoop cloud platform and its MapReduce programming component, this paper proposes an image segmentation method based on the combination of OpenCV and the Hadoop cloud platform. Firstly, the MapReduce image processing model for the Hadoop cloud platform is designed, the image input and output are customized, and the segmentation method for the data file is rewritten. Then the Mean Shift image segmentation algorithm is implemented. Finally, this paper carries out a segmentation experiment on a remote sensing image and uses MATLAB to implement the Mean Shift image segmentation algorithm on the same image for comparison. The experimental results show that, while maintaining a good segmentation result, the segmentation rate of remote sensing image segmentation based on the Hadoop cloud platform is greatly improved compared with single-machine MATLAB image segmentation, and the effectiveness of image segmentation also improves.
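    The per-tile Mean Shift step can be prototyped on a single machine with OpenCV's pyramid mean-shift filter; the sketch below treats each image tile as the body of a hypothetical mapper. The file names, tile size, and filter radii are assumptions, and the Hadoop/MapReduce plumbing described in the paper is omitted.

```python
# Single-machine sketch of the per-tile "map" step: OpenCV mean-shift filtering.
# File names and tile size are hypothetical; the MapReduce plumbing is omitted.
import cv2

def segment_tile(tile, spatial_radius=21, color_radius=30):
    """Mean-shift filter one BGR tile; in a MapReduce job this would be the mapper body."""
    return cv2.pyrMeanShiftFiltering(tile, spatial_radius, color_radius)

image = cv2.imread("scene.tif")            # hypothetical remote-sensing image
if image is not None:
    tile_size = 512
    for y in range(0, image.shape[0], tile_size):
        for x in range(0, image.shape[1], tile_size):
            tile = image[y:y + tile_size, x:x + tile_size]
            image[y:y + tile_size, x:x + tile_size] = segment_tile(tile)
    cv2.imwrite("scene_segmented.tif", image)
```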

  7. Universal Numeric Segmented Display

    OpenAIRE

    Azad, Md. Abul Kalam; Sharmeen, Rezwana; S. M. Kamruzzaman

    2010-01-01

    Segmented displays play a vital role in displaying numerals. In today's world, matrix displays are also used to display numerals, because numerals have many curved edges that are better supported by a matrix display. However, as matrix displays are costly and complex to implement and also need more memory, segmented displays are generally used to display numerals. But as there is not yet a proposed compact display architecture for displaying numerals of multiple languages at a time, this paper proposes uniform...

  8. Strategic market segmentation

    Directory of Open Access Journals (Sweden)

    Maričić Branko R.

    2015-01-01

    Full Text Available Strategic planning of marketing activities is the basis of business success in the modern business environment. Customers are not homogeneous in their preferences and expectations. Formulating an adequate marketing strategy, focused on the realization of the company's strategic objectives, requires a segmented approach to the market that appreciates differences in the expectations and preferences of customers. One of the significant activities in the strategic planning of marketing activities is market segmentation. Strategic planning imposes a need to plan marketing activities according to strategically important segments on a long-term basis. At the same time, there is a need to revise and adapt marketing activities on a short-term basis. There are a number of criteria on which market segmentation can be based. The paper considers the effectiveness and efficiency of different market segmentation criteria based on empirical research of customer expectations and preferences. The analysis includes traditional criteria and criteria based on a behavioral model. The research implications are analyzed from the perspective of selecting the most adequate market segmentation criteria in the strategic planning of marketing activities.

  9. Gamifying Video Object Segmentation.

    Science.gov (United States)

    Spampinato, Concetto; Palazzo, Simone; Giordano, Daniela

    2017-10-01

    Video object segmentation can be considered as one of the most challenging computer vision problems. Indeed, so far, no existing solution is able to effectively deal with the peculiarities of real-world videos, especially in cases of articulated motion and object occlusions; limitations that appear more evident when we compare the performance of automated methods with the human one. However, manually segmenting objects in videos is largely impractical as it requires a lot of time and concentration. To address this problem, in this paper we propose an interactive video object segmentation method, which exploits, on one hand, the capability of humans to identify correctly objects in visual scenes, and on the other hand, the collective human brainpower to solve challenging and large-scale tasks. In particular, our method relies on a game with a purpose to collect human inputs on object locations, followed by an accurate segmentation phase achieved by optimizing an energy function encoding spatial and temporal constraints between object regions as well as human-provided location priors. Performance analysis carried out on complex video benchmarks, and exploiting data provided by over 60 users, demonstrated that our method shows a better trade-off between annotation times and segmentation accuracy than interactive video annotation and automated video object segmentation approaches.

  10. Guide to Savannah River Laboratory Analytical Services Group

    Energy Technology Data Exchange (ETDEWEB)

    1990-04-01

    The mission of the Analytical Services Group (ASG) is to provide analytical support for Savannah River Laboratory Research and Development Programs using onsite and offsite analytical labs as resources. A second mission is to provide Savannah River Site (SRS) operations with analytical support for nonroutine material characterization or special chemical analyses. The ASG provides backup support for the SRS process control labs as necessary.

  11. Rediscovering market segmentation.

    Science.gov (United States)

    Yankelovich, Daniel; Meer, David

    2006-02-01

    In 1964, Daniel Yankelovich introduced in the pages of HBR the concept of nondemographic segmentation, by which he meant the classification of consumers according to criteria other than age, residence, income, and such. The predictive power of marketing studies based on demographics was no longer strong enough to serve as a basis for marketing strategy, he argued. Buying patterns had become far better guides to consumers' future purchases. In addition, properly constructed nondemographic segmentations could help companies determine which products to develop, which distribution channels to sell them in, how much to charge for them, and how to advertise them. But more than 40 years later, nondemographic segmentation has become just as unenlightening as demographic segmentation had been. Today, the technique is used almost exclusively to fulfill the needs of advertising, which it serves mainly by populating commercials with characters that viewers can identify with. It is true that psychographic types like "High-Tech Harry" and "Joe Six-Pack" may capture some truth about real people's lifestyles, attitudes, self-image, and aspirations. But they are no better than demographics at predicting purchase behavior. Thus they give corporate decision makers very little idea of how to keep customers or capture new ones. Now, Daniel Yankelovich returns to these pages, with consultant David Meer, to argue the case for a broad view of nondemographic segmentation. They describe the elements of a smart segmentation strategy, explaining how segmentations meant to strengthen brand identity differ from those capable of telling a company which markets it should enter and what goods to make. And they introduce their "gravity of decision spectrum", a tool that focuses on the form of consumer behavior that should be of the greatest interest to marketers--the importance that consumers place on a product or product category.

  12. Segmental Refinement: A Multigrid Technique for Data Locality

    KAUST Repository

    Adams, Mark F.

    2016-08-04

    We investigate a domain decomposed multigrid technique, termed segmental refinement, for solving general nonlinear elliptic boundary value problems. We extend the method first proposed in 1994 by analytically and experimentally investigating its complexity. We confirm that communication of traditional parallel multigrid is eliminated on fine grids, with modest amounts of extra work and storage, while maintaining the asymptotic exactness of full multigrid. We observe an accuracy dependence on the segmental refinement subdomain size, which was not considered in the original analysis. We present a communication complexity analysis that quantifies the communication costs ameliorated by segmental refinement and report performance results with up to 64K cores on a Cray XC30.

  13. Segmental neurofibromatosis in childhood.

    Science.gov (United States)

    Listernick, Robert; Mancini, Anthony J; Charrow, Joel

    2003-08-30

    Segmental neurofibromatosis refers to individuals who have manifestations of neurofibromatosis type 1 (NF-1) limited to one area of the body. It results from a post-conceptional mutation in the NF-1 gene leading to somatic mosaicism. Although it is generally considered a rare condition, this report of 39 children with segmental NF-1 demonstrates that it is commonly seen in a pediatric NF-1 referral center. The mean age at diagnosis was 7.8 years (range: 2-25 years). Twenty-nine patients had only pigmentary manifestations of segmental NF-1, including seven who had only café-au-lait macules and 22 who had café-au-lait macules and freckling. Two patients had isolated plexiform neurofibromas; a third patient had a plexiform neurofibroma of the eyelid in addition to ipsilateral dysplasia of the sphenoid wing and Lisch nodules. A 12-year-old girl had an isolated tibial pseudarthrosis. An 8-year-old boy had an isolated optic pathway tumor, which behaved both biologically and radiographically as an NF1-associated tumor. While most children with segmental NF-1 have only localized pigmentary changes, some children will have isolated plexiform neurofibromas, pseudarthroses, or optic pathway tumors. Accurate diagnosis of segmental NF-1 is crucial for both management and genetic counseling. Copyright 2003 Wiley-Liss, Inc.

  14. Cooperative processes in image segmentation

    Science.gov (United States)

    Davis, L. S.

    1982-01-01

    Research into the role of cooperative, or relaxation, processes in image segmentation is surveyed. Cooperative processes can be employed at several levels of the segmentation process as a preprocessing enhancement step, during supervised or unsupervised pixel classification and, finally, for the interpretation of image segments based on segment properties and relations.

  15. Modeling of market segmentation for new IT product development

    Science.gov (United States)

    Nasiopoulos, Dimitrios K.; Sakas, Damianos P.; Vlachos, D. S.; Mavrogianni, Amanda

    2015-02-01

    Businesses from all Information Technology sectors use market segmentation[1] in their product development[2] and strategic planning[3]. Many studies have concluded that market segmentation is considered the norm of modern marketing. With the rapid development of technology, customer needs are becoming increasingly diverse. These needs can no longer be satisfied by a mass marketing approach that follows a single rule. IT businesses can cope with this diversity by pooling customers[4] with similar requirements, buying behavior, and strength into segments. The best choices about which segments are the most appropriate to serve can then be made, thus making the best of finite resources. Despite the attention that segmentation gathers and the resources that are invested in it, growing evidence suggests that businesses have problems operationalizing segmentation[5]. These problems take various forms. There may have been an assumption that the segmentation process necessarily results in homogeneous groups of customers for whom appropriate marketing programs and procedures can be developed. Then the segmentation process that a company follows can fail. This increases concerns about what causes segmentation failure and how it might be overcome. To help prevent such failure, we created a dynamic simulation model of market segmentation[6] based on the basic factors leading to this segmentation.

  16. Chan-Vese Segmentation

    Directory of Open Access Journals (Sweden)

    Pascal Getreuer

    2012-08-01

    Full Text Available While many segmentation methods rely heavily in some way on edge detection, the "Active Contours Without Edges" method by Chan and Vese ignores edges completely. Instead, the method optimally fits a two-phase piecewise constant model to the given image. The segmentation boundary is represented implicitly with a level set function, which allows the segmentation to handle topological changes more easily than explicit snake methods. This article describes the level set formulation of the Chan–Vese model and its numerical solution using a semi-implicit gradient descent. We also discuss the Chan–Sandberg–Vese method, a straightforward extension of Chan–Vese for vector-valued images.
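    A readily available implementation of the Chan-Vese model is provided by scikit-image; the short example below (which uses scikit-image's implementation, not the code accompanying this article) runs it on a sample grayscale image with illustrative parameter choices.

```python
# Example using scikit-image's Chan-Vese implementation (not this article's code);
# parameter values are illustrative.
from skimage import data, img_as_float
from skimage.segmentation import chan_vese

image = img_as_float(data.camera())
# mu weights the contour-length penalty; lambda1/lambda2 weight the inside/outside
# fit of the two-phase piecewise-constant model.
segmentation = chan_vese(image, mu=0.25, lambda1=1.0, lambda2=1.0,
                         init_level_set="checkerboard")
print("foreground fraction:", segmentation.mean())
```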

  17. Segmentation of complex document

    Directory of Open Access Journals (Sweden)

    Souad Oudjemia

    2014-06-01

    Full Text Available In this paper we present a method for the segmentation of document images with complex structure. This technique, based on the GLCM (Grey Level Co-occurrence Matrix), is used to segment this type of document into three regions, namely 'graphics', 'background' and 'text'. Very briefly, the method divides the document image into blocks of a size chosen after a series of tests, and then applies the co-occurrence matrix to each block in order to extract five textural parameters: energy, entropy, sum entropy, difference entropy and standard deviation. These parameters are then used to classify the image into three regions using the k-means algorithm; the final step of the segmentation is obtained by grouping connected pixels. Two performance measurements are made for both the graphics and text zones; we obtained a classification rate of 98.3% and a misclassification rate of 1.79%.
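    A simplified version of this block-wise GLCM-plus-k-means pipeline can be sketched as follows. The block size, the reduced feature set (energy, entropy, and standard deviation rather than all five parameters), and the input file name are assumptions made for illustration.

```python
# Simplified sketch of block-wise GLCM features followed by 3-class k-means.
# The input file, block size, and reduced feature set are assumptions.
import numpy as np
from skimage import io, img_as_ubyte
from skimage.feature import graycomatrix, graycoprops
from sklearn.cluster import KMeans

def block_features(block):
    glcm = graycomatrix(block, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    p = glcm[:, :, 0, 0]
    energy = graycoprops(glcm, "energy")[0, 0]
    entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))
    return [energy, entropy, block.std()]

page = img_as_ubyte(io.imread("document_page.png", as_gray=True))  # hypothetical scan
size = 32
features, coords = [], []
for y in range(0, page.shape[0] - size + 1, size):
    for x in range(0, page.shape[1] - size + 1, size):
        features.append(block_features(page[y:y + size, x:x + size]))
        coords.append((y, x))

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(np.array(features))
# Each block is now assigned to one of three texture classes (text / graphics / background).
```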

  18. Segmental Neurofibromatosis: Atypical Localisation

    Directory of Open Access Journals (Sweden)

    Filiz Topaloğlu Demir

    2015-06-01

    Full Text Available Neurofibromatosis (NF) is a genetic disease that leads to pathological findings in the skin, soft tissue, bone and nervous system by affecting neural crest cells. Due to its heterogeneity, neurofibromatosis was divided into eight different subgroups (NF-I to NF-VIII) by Riccardi. Segmental neurofibromatosis (NF type V) is characterized by cutaneous neurofibromas and café-au-lait spots limited to a single dermatomal segment. Here we report a case with numerous, painless cutaneous nodules extending from the shoulder to the dorsal aspect of the right hand, as it is a rare presentation.

  19. Marketing Education Through Benefit Segmentation. AIR Forum 1981 Paper.

    Science.gov (United States)

    Goodnow, Wilma Elizabeth

    The applicability of the "benefit segmentation" marketing technique to education was tested at the College of DuPage in 1979. Benefit segmentation identified target markets homogeneous in benefits expected from a program offering and may be useful in combatting declining enrollments. The 487 randomly selected students completed the 223…

  20. On the Segmentation of the Response Surfaces for Super ...

    African Journals Online (AJOL)

    The Solutions of Linear and Quadratic Programming Problems using Super Convergent Line Series involving the Segmentation of the Response Surface are presented in the paper. It is verified that the number of segments, S for which optimal solutions of these problems selected for verification are obtained are 2 and 4 for ...

  1. Loading effects of anterior cervical spine fusion on adjacent segments

    Directory of Open Access Journals (Sweden)

    Chien-Shiung Wang

    2012-11-01

    Full Text Available Adjacent segment degeneration typically follows anterior cervical spine fusion. However, the primary cause of adjacent segment degeneration remains unknown. Therefore, in order to identify the loading effects that cause adjacent segment degeneration, this study examined the loading effects on the superior segments adjacent to the fused bone following anterior cervical spine fusion. The C3–C6 cervical spine segments of 12 sheep were examined. Specimens were divided into the following groups: intact spine (group 1); and C5–C6 segments that were fused via cage-instrumented plate fixation (group 2). Specimens were cycled between 20° flexion and 15° extension with a displacement control of 1°/second. The tested parameters included the range of motion (ROM) of each segment, torque and strain on both the body and inferior articular process of the superior segments (C3–C4) adjacent to the fused bone, and the position of the neutral axis of stress under 20° flexion and 15° extension. Under flexion in Group 2, torque, ROM, and strain on both the bodies and facets of the superior segments adjacent to the fused bone were higher than those of Group 1. Under extension in Group 2, ROM for the fused segment was less than that of Group 1; torque, ROM, and stress on both the bodies and facets of the superior segments adjacent to the fused bone were higher than those of Group 1. These analytical results indicate that the muscles and ligaments require greater force to achieve cervical motion following anterior cervical spine fusion than in the intact spine. In addition, ROM and stress on the bodies and facets of the joint segments adjacent to the fused bone were significantly increased. Under flexion, the neutral axis of stress on the adjacent segment moved backward, and the stress on the bodies of the segments adjacent to the fused bone increased. These comparative results indicate that increased stress on the adjacent segments is caused by stress-shielding effects

  2. Strategies for regular segmented reductions on GPU

    DEFF Research Database (Denmark)

    Larsen, Rasmus Wriedt; Henriksen, Troels

    2017-01-01

    We present and evaluate an implementation technique for regular segmented reductions on GPUs. Existing techniques tend to be either consistent in performance but relatively inefficient in absolute terms, or optimised for specific workloads and thereby exhibiting bad performance for certain input...... is in the context of the Futhark compiler, the implementation technique is applicable to any library or language that has a need for segmented reductions. We evaluate the technique on four microbenchmarks, two of which we also compare to implementations in the CUB library for GPU programming, as well as on two...

  3. Calibration exercise for the Community Aquatic Monitoring Program (CAMP) nutrient analyses: establishing variability between filtered and unfiltered water samples and two analytical laboratories

    National Research Council Canada - National Science Library

    Thériault, M.-H; Courtenay, S.C

    2012-01-01

    As part of the Community Aquatic Monitoring Program (CAMP) unfiltered water samples were collected between 2006 and 2008 and analyzed for dissolved inorganic nutrients (i.e., nitrate + nitrite (NO3 + NO2...

  4. Color Image Segmentation in a Quaternion Framework.

    Science.gov (United States)

    Subakan, Ozlem N; Vemuri, Baba C

    2009-01-01

    In this paper, we present a feature/detail preserving color image segmentation framework using Hamiltonian quaternions. First, we introduce a novel Quaternionic Gabor Filter (QGF) which can combine the color channels and the orientations in the image plane. Using the QGFs, we extract the local orientation information in the color images. Second, in order to model this derived orientation information, we propose a continuous mixture of appropriate hypercomplex exponential basis functions. We derive a closed form solution for this continuous mixture model. This analytic solution is in the form of a spatially varying kernel which, when convolved with the signed distance function of an evolving contour (placed in the color image), yields a detail preserving segmentation.

  5. Dictionary Based Image Segmentation

    DEFF Research Database (Denmark)

    Dahl, Anders Bjorholm; Dahl, Vedrana Andersen

    2015-01-01

    We propose a method for weakly supervised segmentation of natural images, which may contain both textured or non-textured regions. Our texture representation is based on a dictionary of image patches. To divide an image into separated regions with similar texture we use an implicit level sets...

  6. Thoracoscopic Subsuperior Segment Segmentectomy.

    Science.gov (United States)

    Shimizu, Kimihiro; Mogi, Akira; Yajima, Toshiki; Nagashima, Toshiteru; Ohtaki, Yoichi; Obayashi, Kai; Nakazawa, Seshiru; Kosaka, Takayuki; Kuwano, Hiroyuki

    2017-11-01

    To date, anatomic subsuperior segment (S∗) segmentectomy has not yet been reported. Herein we report the technical details of thoracoscopic anatomic S∗ segmentectomy and the anatomic features of the S∗. Copyright © 2017 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.

  7. Segmentation and crustal structure

    Indian Academy of Sciences (India)

    Kamesh Raju K A, Ramprasad T and Subrahmanyam C 1997 Geophysical investigations over a segment of the Central Indian Ridge, Indian Ocean; Geo-Marine Lett. 17 195–201. Kamesh Raju K A, Chaubey A K, Amarnath Dileep and Mudholkar Abhay 2008 Morphotectonics of the ...

  8. Connecting textual segments

    DEFF Research Database (Denmark)

    Brügger, Niels

    2017-01-01

    history than just the years of the emergence of the web, the chapter traces the history of how segments of text have deliberately been connected to each other by the use of specific textual and media features, from clay tablets, manuscripts on parchment, and print, among others, to hyperlinks on stand......-alone computers and in local and global digital networks....

  9. Sipunculans and segmentation

    DEFF Research Database (Denmark)

    Wanninger, Andreas; Kristof, Alen; Brinkmann, Nora

    2009-01-01

    Comparative molecular, developmental and morphogenetic analyses show that the three major segmented animal groups- Lophotrochozoa, Ecdysozoa and Vertebrata-use a wide range of ontogenetic pathways to establish metameric body organization. Even in the life history of a single specimen, different m...

  10. CORNEA AND ANTERIOR SEGMENT

    African Journals Online (AJOL)

    2016-11-04

    Nigerian Journal of Ophthalmology, Supplement 1 - 2014 - Volume 22, S24. Cornea and Anterior Segment: A Comparison of Visual Outcomes after Extracapsular Cataract Surgery and Phacoemulsification in Eye Foundation Hospital, Lagos, Nigeria. Oderinlo O. O., Hassan A. O., Oluyadi F. O., ...

  11. Labor market segmentation

    OpenAIRE

    Berndt Christian

    2017-01-01

    This contribution to the International Encyclopedia of Geography is a reinterpretation of labor market segmentation theories mapping the evolution of this perspective on labor markets and using the findings of the care market project to reflect on the rising importance of female migrant labor in the domestic sphere and the question of diversity and inequalities in the labor market.

  12. Text line Segmentation of Curved Document Images

    Directory of Open Access Journals (Sweden)

    Anusree.M

    2014-05-01

    Full Text Available Document image analysis has been widely used in historical and heritage studies, education and digital libraries. Document image analytical techniques are mainly used to improve the human readability and the OCR quality of a document. During digitization, camera-captured images contain warped documents due to perspective and geometric distortions. The main difficulty is text line detection in the document. Many algorithms have been proposed to address the problem of text line detection in printed documents, but they fail to extract text lines in curved documents. This paper describes a segmentation technique that detects curled text lines in camera-captured document images.

  13. Segmentation in Tardigrada and diversification of segmental patterns in Panarthropoda.

    Science.gov (United States)

    Smith, Frank W; Goldstein, Bob

    2017-05-01

    The origin and diversification of segmented metazoan body plans has fascinated biologists for over a century. The superphylum Panarthropoda includes three phyla of segmented animals-Euarthropoda, Onychophora, and Tardigrada. This superphylum includes representatives with relatively simple and representatives with relatively complex segmented body plans. At one extreme of this continuum, euarthropods exhibit an incredible diversity of serially homologous segments. Furthermore, distinct tagmosis patterns are exhibited by different classes of euarthropods. At the other extreme, all tardigrades share a simple segmented body plan that consists of a head and four leg-bearing segments. The modular body plans of panarthropods make them a tractable model for understanding diversification of animal body plans more generally. Here we review results of recent morphological and developmental studies of tardigrade segmentation. These results complement investigations of segmentation processes in other panarthropods and paleontological studies to illuminate the earliest steps in the evolution of panarthropod body plans. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. The VERCORS safety program, source term analytical study with special emphasis on the release of non volatile fission products and transuranic elements

    Energy Technology Data Exchange (ETDEWEB)

    Ducros, G.; Andre, B.; Tourasse, M. [CEA Centre d'Etudes de Grenoble, 38 (France). Direction des Technologies Avancees]; Maro, D. [CEA Centre d'Etudes de Fontenay-aux-Roses, 92 (France). Inst. de Protection et de Surete Nucleaire]

    1995-12-31

    This document is not a true report but a succession of transparencies listing the main titles of the subjects that were presented orally at the CSARP Meeting. The main interest of the document lies in its large computer-designed figures. The subject is the VERCORS program (which extends the HEVA experimental program), devoted to determining the source term of fission products released from PWR fuel samples during a severe accident sequence. The experiment is performed in a shielded hot cell at the CEA Grenoble plant. Measurements aim at characterizing the release of fission products and structural materials as a function of fuel temperature and the oxidising/reducing conditions of the environment. (author).

  15. Reactor Section standard analytical methods. Part 1

    Energy Technology Data Exchange (ETDEWEB)

    Sowden, D.

    1954-07-01

    The Standard Analytical Methods manual was prepared for the purpose of consolidating and standardizing all current analytical methods and procedures used in the Reactor Section for routine chemical analyses. All procedures are established in accordance with accepted practice and the general analytical methods specified by the Engineering Department. These procedures are specifically adapted to the requirements of the water treatment process and related operations. The methods included in this manual are organized alphabetically within the following five sections, which correspond to the various phases of the analytical control program in which these analyses are to be used: water analyses, essential material analyses, cotton plug analyses, boiler water analyses, and miscellaneous control analyses.

  16. Bayesian segmentation of protein secondary structure.

    Science.gov (United States)

    Schmidler, S C; Liu, J S; Brutlag, D L

    2000-01-01

    We present a novel method for predicting the secondary structure of a protein from its amino acid sequence. Most existing methods predict each position in turn based on a local window of residues, sliding this window along the length of the sequence. In contrast, we develop a probabilistic model of protein sequence/structure relationships in terms of structural segments, and formulate secondary structure prediction as a general Bayesian inference problem. A distinctive feature of our approach is the ability to develop explicit probabilistic models for alpha-helices, beta-strands, and other classes of secondary structure, incorporating experimentally and empirically observed aspects of protein structure such as helical capping signals, side chain correlations, and segment length distributions. Our model is Markovian in the segments, permitting efficient exact calculation of the posterior probability distribution over all possible segmentations of the sequence using dynamic programming. The optimal segmentation is computed and compared to a predictor based on marginal posterior modes, and the latter is shown to provide significant improvement in predictive accuracy. The marginalization procedure provides exact secondary structure probabilities at each sequence position, which are shown to be reliable estimates of prediction uncertainty. We apply this model to a database of 452 nonhomologous structures, achieving accuracies as high as the best currently available methods. We conclude by discussing an extension of this framework to model nonlocal interactions in protein structures, providing a possible direction for future improvements in secondary structure prediction accuracy.
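    Because the model is Markovian in the segments, the optimal segmentation can be recovered by a simple dynamic program over the position of the last segment boundary. The sketch below illustrates that recursion with a toy segment-scoring function; it is not the authors' probabilistic model, only the generic dynamic program such models rely on.

```python
# Generic dynamic program over segmentations (illustrative, not the authors' model).
import math

def best_segmentation(n, segment_score, max_len=30):
    """segment_score(i, j) -> score of making positions [i, j) one segment."""
    best = [-math.inf] * (n + 1)
    back = [0] * (n + 1)
    best[0] = 0.0
    for j in range(1, n + 1):
        for i in range(max(0, j - max_len), j):
            s = best[i] + segment_score(i, j)
            if s > best[j]:
                best[j], back[j] = s, i
    # Recover segment boundaries by walking the back-pointers from the end.
    cuts, j = [], n
    while j > 0:
        cuts.append((back[j], j))
        j = back[j]
    return best[n], cuts[::-1]

# Toy scoring: reward segments whose length is close to 7.
score, segments = best_segmentation(40, lambda i, j: -abs((j - i) - 7))
print(score, segments)
```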

  17. Market segmentation: Venezuelan ADRs

    Directory of Open Access Journals (Sweden)

    Urbi Garay

    2012-12-01

    Full Text Available The control on foreign exchange imposed by Venezuela in 2003 constitutes a natural experiment that allows researchers to observe the effects of exchange controls on stock market segmentation. This paper provides empirical evidence that although the Venezuelan capital market as a whole was highly segmented before the controls were imposed, the shares in the firm CANTV were, through their American Depositary Receipts (ADRs), partially integrated with the global market. Following the imposition of the exchange controls this integration was lost. The research also documents the spectacular and apparently contradictory rise experienced by the Caracas Stock Exchange during the serious economic crisis of 2003. It is argued that, as happened in Argentina in 2002, the rise in share prices occurred because the depreciation of the Bolívar in the parallel currency market increased the local price of the stocks that had associated ADRs, which were negotiated in dollars.

  18. Labor market segmentation

    OpenAIRE

    Berndt, Christian

    2017-01-01

    Labor market segmentation theories arose as an alternative to neoclassical notions of labor and labor markets in the 1970s. After briefly revisiting the strengths and the weaknesses of this approach, the article discusses more recent developments around the question of difference and diversity in labor markets, directing attention to three key developments associated with the rise of neoliberal capitalism: (i) the formation of entrepreneurial subjectivities and the treatment of labor as a div...

  19. Benefits of a comprehensive quality program for cryopreserved PBMC covering 28 clinical trials sites utilizing an integrated, analytical web-based portal.

    Science.gov (United States)

    Ducar, Constance; Smith, Donna; Pinzon, Cris; Stirewalt, Michael; Cooper, Cristine; McElrath, M Juliana; Hural, John

    2014-07-01

    The HIV Vaccine Trials Network (HVTN) is a global network of 28 clinical trial sites dedicated to identifying an effective HIV vaccine. Cryopreservation of high-quality peripheral blood mononuclear cells (PBMC) is critical for the assessment of vaccine-induced cellular immune functions. The HVTN PBMC Quality Management Program is designed to ensure that viable PBMC are processed, stored and shipped for clinical trial assays from all HVTN clinical trial sites. The program has evolved by developing and incorporating best practices for laboratory and specimen quality and implementing automated, web-based tools. These tools allow the site-affiliated processing laboratories and the central Laboratory Operations Unit to rapidly collect, analyze and report PBMC quality data. The HVTN PBMC Quality Management Program includes five key components: 1) Laboratory Assessment, 2) PBMC Training and Certification, 3) Internal Quality Control, 4) External Quality Control (EQC), and 5) Assay Specimen Quality Control. Fresh PBMC processing data is uploaded from each clinical site processing laboratory to a central HVTN Statistical and Data Management Center database for access and analysis on a web portal. Samples are thawed at a central laboratory for assay or specimen quality control and sample quality data is uploaded directly to the database by the central laboratory. Four year cumulative data covering 23,477 blood draws reveals an average fresh PBMC yield of 1.45×10(6)±0.48 cells per milliliter of useable whole blood. 95% of samples were within the acceptable range for fresh cell yield of 0.8-3.2×10(6) cells/ml of usable blood. Prior to full implementation of the HVTN PBMC Quality Management Program, the 2007 EQC evaluations from 10 international sites showed a mean day 2 thawed viability of 83.1% and a recovery of 67.5%. Since then, four year cumulative data covering 3338 specimens used in immunologic assays shows that 99.88% had acceptable viabilities (>66%) for use in

  20. Croatian Analytical Terminology

    OpenAIRE

    Kastelan-Macan, M.

    2008-01-01

    Results of analytical research are necessary in all human activities. They are inevitable in making decisions in the environmental chemistry, agriculture, forestry, veterinary medicine, pharmaceutical industry, and biochemistry. Without analytical measurements the quality of materials and products cannot be assessed, so that analytical chemistry is an essential part of technical sciences and disciplines. The language of Croatian science, and analytical chemistry within it, was one of the goals...

  1. Clustering in analytical chemistry.

    Science.gov (United States)

    Drab, Klaudia; Daszykowski, Michal

    2014-01-01

    Data clustering plays an important role in the exploratory analysis of analytical data, and the use of clustering methods has been acknowledged in different fields of science. In this paper, principles of data clustering are presented with a direct focus on clustering of analytical data. The role of the clustering process in the analytical workflow is underlined, and its potential impact on the analytical workflow is emphasized.

  2. Spinal segmental dysgenesis CASE SERIES

    African Journals Online (AJOL)

    Spinal segmental dysgenesis is a rare congenital spinal abnormality seen in neonates and infants, in which a segment of the spine and spinal cord fails to develop normally. The condition is segmental in nature, with vertebrae above and below the malformation. It is commonly associated with various abnormalities that ...

  3. Market Segmentation for Information Services.

    Science.gov (United States)

    Halperin, Michael

    1981-01-01

    Discusses the advantages and limitations of market segmentation as strategy for the marketing of information services made available by nonprofit organizations, particularly libraries. Market segmentation is defined, a market grid for libraries is described, and the segmentation of information services is outlined. A 16-item reference list is…

  4. Preliminary recommendations on the design of the characterization program for the Hanford Site single-shell tanks: A system analysis. Volume 2, Closure-related analyte priorities, concentration thresholds, and detection limit goals based on public health concerns

    Energy Technology Data Exchange (ETDEWEB)

    Buck, J.W.; Peffers, M.S.; Hwang, S.T.

    1991-11-01

    The work described in this volume was conducted by Pacific Northwest Laboratory to provide preliminary recommendations on data quality objectives (DQOs) to support the Waste Characterization Plan (WCP) and closure decisions for the Hanford Site single-shell tanks (SSTs). The WCP describes the first of a two-phase characterization program that will obtain information to assess and implement disposal options for SSTs. This work was performed for the Westinghouse Hanford Company (WHC), the current operating contractor on the Hanford Site. The preliminary DQOs contained in this volume deal with the analysis of SST wastes in support of the WCP and final closure decisions. These DQOs include information on significant contributors and detection limit goals (DLGs) for SST analytes based on public health risk.

  5. Validation tools for image segmentation

    Science.gov (United States)

    Padfield, Dirk; Ross, James

    2009-02-01

    A large variety of image analysis tasks require the segmentation of various regions in an image. For example, segmentation is required to generate accurate models of brain pathology that are important components of modern diagnosis and therapy. While the manual delineation of such structures gives accurate information, the automatic segmentation of regions such as the brain and tumors from such images greatly enhances the speed and repeatability of quantifying such structures. The ubiquitous need for such algorithms has led to a wide range of image segmentation algorithms with various assumptions, parameters, and robustness. The evaluation of such algorithms is an important step in determining their effectiveness. Therefore, rather than developing new segmentation algorithms, we here describe validation methods for segmentation algorithms. Using similarity metrics comparing the automatic to manual segmentations, we demonstrate methods for optimizing the parameter settings for individual cases and across a collection of datasets using the Design of Experiment framework. We then employ statistical analysis methods to compare the effectiveness of various algorithms. We investigate several region-growing algorithms from the Insight Toolkit and compare their accuracy to that of a separate statistical segmentation algorithm. The segmentation algorithms are used with their optimized parameters to automatically segment the brain and tumor regions in MRI images of 10 patients. The validation tools indicate that none of the ITK algorithms studied are able to outperform the statistical segmentation algorithm with statistical significance, although they perform reasonably well considering their simplicity.
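    The overlap metrics that such validation typically relies on are easy to compute directly; the sketch below implements Dice and Jaccard scores for a pair of binary masks, with placeholder arrays standing in for real automatic and manual segmentations.

```python
# Minimal overlap metrics for comparing an automatic and a manual binary mask;
# the masks below are placeholders for real segmentations.
import numpy as np

def dice(a, b):
    a, b = a.astype(bool), b.astype(bool)
    inter = np.logical_and(a, b).sum()
    total = a.sum() + b.sum()
    return 2.0 * inter / total if total else 1.0

def jaccard(a, b):
    a, b = a.astype(bool), b.astype(bool)
    union = np.logical_or(a, b).sum()
    return np.logical_and(a, b).sum() / union if union else 1.0

auto = np.zeros((64, 64), dtype=bool)
auto[20:50, 20:50] = True
manual = np.zeros((64, 64), dtype=bool)
manual[24:54, 22:52] = True
print("Dice:", round(dice(auto, manual), 3), "Jaccard:", round(jaccard(auto, manual), 3))
```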

  6. FRAMEWORK FOR COMPARING SEGMENTATION ALGORITHMS

    Directory of Open Access Journals (Sweden)

    G. Sithole

    2015-05-01

    Full Text Available The notion of a ‘Best’ segmentation does not exist. A segmentation algorithm is chosen based on the features it yields, the properties of the segments (point sets) it generates, and the complexity of its algorithm. The segmentation is then assessed based on a variety of metrics such as homogeneity, heterogeneity, fragmentation, etc. Even after an algorithm is chosen, its performance is still uncertain because the landscape/scenarios represented in a point cloud have a strong influence on the eventual segmentation. Thus selecting an appropriate segmentation algorithm is a process of trial and error. Automating the selection of segmentation algorithms and their parameters first requires methods to evaluate segmentations. Three common approaches for evaluating segmentation algorithms are ‘goodness methods’, ‘discrepancy methods’ and ‘benchmarks’. Benchmarks are considered the most comprehensive method of evaluation. In this paper, shortcomings in current benchmark methods are identified and a framework is proposed that permits both a visual and a numerical evaluation of segmentations for different algorithms, algorithm parameters and evaluation metrics. The concept of the framework is demonstrated on a real point cloud. Current results are promising and suggest that it can be used to predict the performance of segmentation algorithms.

  7. Weighted entropy for segmentation evaluation

    Science.gov (United States)

    Khan, Jesmin F.; Bhuiyan, Sharif M.

    2014-04-01

    In many image, video and computer vision systems, image segmentation is an essential part. Significant research has been done on image segmentation, and a number of quantitative evaluation methods have already been proposed in the literature. However, segmentation evaluation is often subjective, meaning that it is done visually or qualitatively. A segmentation evaluation method based on entropy, which is objective and simple to implement, is proposed in this work. Weighted self and mutual entropies are proposed to measure the dissimilarity of the pixels among the segmented regions and the similarity within a region. This evaluation technique gives a score that can be used to compare different segmentation algorithms for the same image, to compare the segmentation results of a given algorithm on different images, or to find the best-suited values of the parameters of a segmentation algorithm for a given image. The simulation results show that the proposed method can identify over-segmentation, under-segmentation, and good segmentation.
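    The paper's exact weighting of self and mutual entropy is not reproduced here; as a rough illustration of an entropy-style segmentation score, the sketch below computes, for each segmented region, the entropy of its gray-level histogram and weights it by region size. The function name, bin count, and the simple size-weighted combination are assumptions for the example.

```python
import numpy as np

def region_entropy_score(image, labels, bins=32):
    """Size-weighted average gray-level entropy of the segmented regions.

    Lower scores indicate more homogeneous regions; this is only a generic
    entropy-style criterion, not the weighted self/mutual entropy of the paper.
    """
    total_pixels = labels.size
    score = 0.0
    for region_id in np.unique(labels):
        values = image[labels == region_id]
        hist, _ = np.histogram(values, bins=bins, range=(0, 255))
        p = hist / hist.sum()
        p = p[p > 0]
        score += (values.size / total_pixels) * (-np.sum(p * np.log2(p)))
    return score

# Toy example: a two-region image and its (perfect) segmentation.
img = np.zeros((64, 64), dtype=np.uint8)
img[:, 32:] = 200
seg = (img > 100).astype(int)
print(region_entropy_score(img, seg))  # ~0.0 for perfectly uniform regions
```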

  8. Interactive segmentation of dynamic cardiac images (original title: Segmentation interactive d'images cardiaques dynamiques).

    OpenAIRE

    Bianchi, Kevin

    2014-01-01

    This thesis focuses on the spatio-temporal and interactive segmentation of dynamic cardiac images. It is part of the ANR 3DSTRAIN project of the program "Technologies for Health and Autonomy", which aims to estimate the strain, the deformation index of the heart muscle, as a full, dense field on several 3D+t imaging modalities (such as Magnetic Resonance Imaging (MRI), Single Photon Emission Computed Tomography (SPECT) and echocardiography). The strain estimation requires a segmentation step which must be...

  9. Segmental neurofibromatosis of face.

    Science.gov (United States)

    Agarwal, Anuja; Thappa, Devinder M; Jayanthi, S; Shivaswamy, K N

    2005-12-01

    A 38-year-old man presented with asymptomatic skin lesions over the left side of the face of 5 years' duration. He had multiple discrete soft-to-firm papules and nodules on the left side of the face along the distribution of the mandibular division of the trigeminal nerve. Histopathological examination of one of the nodules (face) showed a non-encapsulated tumor of the dermis with normal overlying epidermis. The tumor consisted of loosely spaced spindle cells and wavy collagenous strands in a clear matrix. These features were consistent with our clinical diagnosis of segmental neurofibromatosis. This case is reported for its rarity and typical manifestations.

  10. Segmented Target Design

    Science.gov (United States)

    Merhi, Abdul Rahman; Frank, Nathan; Gueye, Paul; Thoennessen, Michael; MoNA Collaboration

    2013-10-01

    A proposed segmented target would improve decay energy measurements of neutron-unbound nuclei. Experiments like this have been performed at the National Superconducting Cyclotron Laboratory (NSCL) located at Michigan State University. Many different nuclei are produced in such experiments, some of which immediately decay into a charged particle and neutron. The charged particles are bent by a large magnet and measured by a suite of charged particle detectors. The neutrons are measured by the Modular Neutron Array (MoNA) and Large Multi-Institutional Scintillation Array (LISA). With the current target setup, a nucleus in a neutron-unbound state is produced with a radioactive beam impinged upon a beryllium target. The resolution of these measurements is very dependent on the target thickness since the nuclear interaction point is unknown. In a segmented target using alternating layers of silicon detectors and Be-targets, the Be-target in which the nuclear reaction takes place would be determined. Thus the experimental resolution would improve. This poster will describe the improvement over the current target along with the status of the design. Work supported by Augustana College and the National Science Foundation grant #0969173.

  11. Segmentation of the Infant Food Market

    OpenAIRE

    Hrůzová, Daniela

    2015-01-01

    The theoretical part covers general market segmentation, namely the marketing importance of differences among consumers, the essence of market segmentation, its main conditions and the process of segmentation, which consists of four consecutive phases - defining the market, determining important criteria, uncovering segments and developing segment profiles. The segmentation criteria, segmentation approaches, methods and techniques for the process of market segmentation are also described in t...

  12. Analytical Chemistry in Russia.

    Science.gov (United States)

    Zolotov, Yuri

    2016-09-06

    Research in Russian analytical chemistry (AC) is carried out on a significant scale, and the analytical service solves practical tasks in geological survey, environmental protection, medicine, industry, agriculture, etc. The education system trains highly skilled professionals in AC. The development, and especially the manufacturing, of analytical instruments should be improved; in spite of this, there are several good domestic instruments, and others satisfy some requirements. Russian AC has rather good historical roots.

  13. A mathematical analysis to address the 6 degree-of-freedom segmental power imbalance.

    Science.gov (United States)

    Ebrahimi, Anahid; Collins, John D; Kepple, Thomas M; Takahashi, Kota Z; Higginson, Jill S; Stanhope, Steven J

    2018-01-03

    Segmental power is used in human movement analyses to indicate the source and net rate of energy transfer between the rigid bodies of biomechanical models. Segmental power calculations are performed using segment endpoint dynamics (kinetic method). A theoretically equivalent method is to measure the rate of change in a segment's mechanical energy state (kinematic method). However, these two methods have not produced experimentally equivalent results for segments proximal to the foot, with the difference between methods deemed the "power imbalance." In a 6 degree-of-freedom model, segments move independently, resulting in relative segment endpoint displacement and non-equivalent segment endpoint velocities at a joint. In the kinetic method, a segment's distal end translational velocity may be defined either at the anatomical end of the segment or at the location of the joint center (defined here as the proximal end of the adjacent distal segment). Our mathematical derivations revealed that the power imbalance between the kinetic method using the anatomical definition and the kinematic method can be explained by power due to relative segment endpoint displacement. In this study, we tested this analytical prediction using experimental gait data from nine healthy subjects walking at a typical speed. The average absolute segmental power imbalance was reduced from 0.023-0.046 W/kg using the anatomical definition to ≤0.001 W/kg using the joint center definition in the kinetic method (a 95.56-98.39% reduction). Power due to relative segment endpoint displacement in segmental power analyses is substantial and should be considered in analyzing energetic flow into and between segments. Copyright © 2017 Elsevier Ltd. All rights reserved.
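    For orientation, the kinetic method sums, at each segment endpoint, the translational power of the joint contact force and the rotational power of the joint moment. A minimal sketch of that bookkeeping is given below; the variable names and the single-instant example values are assumptions, not the study's data.

```python
import numpy as np

def endpoint_power(force, endpoint_velocity, moment, segment_angular_velocity):
    """Kinetic-method power at one segment endpoint: F.v + M.w (watts)."""
    return float(np.dot(force, endpoint_velocity) +
                 np.dot(moment, segment_angular_velocity))

# Hypothetical single instant for one endpoint of a shank segment.
F = np.array([12.0, 640.0, -8.0])   # joint reaction force, N
v = np.array([0.9, 0.05, 0.0])      # endpoint translational velocity, m/s
M = np.array([0.0, 0.0, 35.0])      # joint moment, N*m
w = np.array([0.0, 0.0, 2.1])       # segment angular velocity, rad/s

print(endpoint_power(F, v, M, w))   # net rate of energy transfer at this endpoint
```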

  14. Analytical Electron Microscope

    Data.gov (United States)

    Federal Laboratory Consortium — The Titan 80-300 is a transmission electron microscope (TEM) equipped with spectroscopic detectors to allow chemical, elemental, and other analytical measurements to...

  15. Segmentation-Driven Tomographic Reconstruction

    DEFF Research Database (Denmark)

    Kongskov, Rasmus Dalgas

    ), the classical reconstruction methods suffer from their inability to handle limited and/or corrupted data. For many analysis tasks computationally demanding segmentation methods are used to automatically segment an object, after using a simple reconstruction method as a first step. In the literature, methods...... such that the segmentation subsequently can be carried out by use of a simple segmentation method, for instance just a thresholding method. We tested the advantages of going from a two-stage reconstruction method to a one-stage segmentation-driven reconstruction method for the phase contrast tomography reconstruction...... problem. The tests showed a clear improvement for realistic materials simulations and that the one-stage method was clearly more robust toward noise. The noise-robustness result could be a step toward making this method more applicable for lab-scale experiments. We have introduced a segmentation...

  16. Identifying and Coordinating Care for Complex Patients: Findings from the Leading Edge of Analytics and Health Information Technology.

    Science.gov (United States)

    Rudin, Robert S; Gidengil, Courtney A; Predmore, Zachary; Schneider, Eric C; Sorace, James; Hornstein, Rachel

    2017-06-01

    In the United States, a relatively small proportion of complex patients (defined as having multiple comorbidities, high risk for poor outcomes, and high cost) incurs most of the nation's health care costs. Improved care coordination and management of complex patients could reduce costs while increasing quality of care. However, care coordination efforts face multiple challenges, such as segmenting populations of complex patients to better match their needs with the design of specific interventions, understanding how to reduce spending, and integrating care coordination programs into providers' care delivery processes. Innovative uses of analytics and health information technology (HIT) may address these challenges. Rudin and colleagues at RAND completed a literature review and held discussions with subject matter experts, reaching the conclusion that analytics and HIT are being used in innovative ways to coordinate care for complex patients, but that the capabilities are limited, evidence of their effectiveness is lacking, challenges are substantial, and important foundational work is still needed.

  17. Segmented heat exchanger

    Science.gov (United States)

    Baldwin, Darryl Dean; Willi, Martin Leo; Fiveland, Scott Byron; Timmons, Kristine Ann

    2010-12-14

    A segmented heat exchanger system for transferring heat energy from an exhaust fluid to a working fluid. The heat exchanger system may include a first heat exchanger for receiving incoming working fluid and the exhaust fluid. The working fluid and exhaust fluid may travel through at least a portion of the first heat exchanger in a parallel flow configuration. In addition, the heat exchanger system may include a second heat exchanger for receiving working fluid from the first heat exchanger and exhaust fluid from a third heat exchanger. The working fluid and exhaust fluid may travel through at least a portion of the second heat exchanger in a counter flow configuration. Furthermore, the heat exchanger system may include a third heat exchanger for receiving working fluid from the second heat exchanger and exhaust fluid from the first heat exchanger. The working fluid and exhaust fluid may travel through at least a portion of the third heat exchanger in a parallel flow configuration.

  18. Segmentation Using Symmetry Deviation

    DEFF Research Database (Denmark)

    Hollensen, Christian; Højgaard, L.; Specht, L.

    2011-01-01

    Purpose: The manual delineation of gross tumour volume (GTV) for radiation therapy for head and neck cancer patients relies to some degree on pathological deviation from normal anatomical symmetry. The purpose of this study is to introduce a novel method for 3-dimensional determination of GTV...... hypopharyngeal cancer patients to find anatomical symmetry and evaluate it against the standard deviation of the normal patients to locate pathologic volumes. Combining this information with an absolute PET threshold of 3 standardized uptake values (SUV), a volume was automatically delineated. The overlap of automated...... segmentations on manual contours was evaluated using the concordance index and sensitivity for the hypopharyngeal patients. The resulting concordance index and sensitivity were compared with the result of using a threshold of 3 SUV using a paired t-test. Results: The anatomical and symmetrical atlas was constructed...

  19. Generalizing cell segmentation and quantification.

    Science.gov (United States)

    Wang, Zhenzhou; Li, Haixing

    2017-03-23

    In recent years, the microscopy technology for imaging cells has developed greatly and rapidly. The accompanying requirements for automatic segmentation and quantification of the imaged cells are growing accordingly. After being studied widely in both scientific research and industrial applications for many decades, cell segmentation has achieved great progress, especially in segmenting some specific types of cells, e.g. muscle cells. However, a framework that addresses cell segmentation problems in general is still lacking. On the contrary, different segmentation methods were proposed for different types of cells, which makes the research work divergent. In addition, most of the popular segmentation and quantification tools usually require a great deal of manual work. To make cell segmentation work more convergent, we propose in this paper a framework that is able to segment different kinds of cells automatically and robustly. This framework evolves the previously proposed method for segmenting muscle cells and generalizes it to be suitable for segmenting and quantifying a variety of cell images by adding more union cases. Compared to the previous methods, the segmentation and quantification accuracy of the proposed framework is also improved by three novel procedures: (1) a simplified calibration method is proposed and added for the threshold selection process; (2) a noise blob filter is proposed to get rid of noise blobs; (3) a boundary smoothing filter is proposed to reduce the false seeds produced by the iterative erosion. As a result, the quantification accuracy of the proposed framework increases from 93.4 to 96.8% compared to the previous method. In addition, the accuracy of the proposed framework is also better in quantifying muscle cells than two available state-of-the-art methods. The proposed framework is able to automatically segment and quantify more types of cells than state-of-the-art methods.
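    The paper's specific filters are not available here; as a generic illustration of the kind of post-processing it describes (threshold selection followed by removal of small noise blobs before counting cells), the following sketch uses scikit-image. The Otsu threshold choice and the minimum blob size are assumptions.

```python
import numpy as np
from skimage import filters, measure, morphology

def segment_and_count(image, min_blob_area=50):
    """Threshold a grayscale cell image, drop small noise blobs, count cells."""
    threshold = filters.threshold_otsu(image)          # global threshold selection
    binary = image > threshold
    cleaned = morphology.remove_small_objects(binary, min_size=min_blob_area)
    labels = measure.label(cleaned)                     # connected-component labelling
    return labels, labels.max()

# Synthetic example: two bright "cells" plus a tiny noise blob.
img = np.zeros((128, 128), dtype=float)
img[20:50, 20:50] = 1.0
img[80:110, 70:100] = 1.0
img[5:7, 120:122] = 1.0                                 # noise blob, removed by the filter
labels, n_cells = segment_and_count(img)
print(n_cells)                                          # -> 2
```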

  20. Market segmentation and positioning: matching creativity with fiscal responsibility.

    Science.gov (United States)

    Kiener, M E

    1989-01-01

    This paper describes an approach to continuing professional education (CPE) program development in nursing within a university environment that utilizes the concepts of market segmentation and positioning. Use of these strategies enables the academic CPE enterprise to move beyond traditional needs assessment practices to create more successful and better-managed CPE programs.

  1. Tank 241-BY-109, cores 201 and 203, analytical results for the final report

    Energy Technology Data Exchange (ETDEWEB)

    Esch, R.A.

    1997-11-20

    This document is the final laboratory report for tank 241-BY-109 push mode core segments collected between June 6, 1997 and June 17, 1997. The segments were subsampled and analyzed in accordance with the Tank Push Mode Core Sampling and Analysis Plan (Bell, 1997) and the Tank Safety Screening Data Quality Objective (Dukelow et al., 1995). The analytical results are included.

  2. Image segmentation by graph partitioning

    Science.gov (United States)

    Torres, Ana Sofia; Monteiro, Fernando C.

    2012-09-01

    In this paper we propose a hybrid method for image segmentation which combines edge-based, region-based and morphological techniques through a spectral clustering approach. An initial partitioning of the image into atomic regions is obtained by applying a watershed method to the image gradient magnitude. This initial partition is the input to a computationally efficient region segmentation process which produces the final segmentation. We have applied our approach to several images of the Berkeley Segmentation Dataset. The results demonstrate the accuracy of the proposed method.
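    The spectral grouping stage of the paper is not reproduced here, but the first step it describes (an over-segmentation into atomic regions via a watershed of the gradient magnitude) can be sketched with scikit-image as follows; the marker strategy, gradient threshold, and sample image are assumptions.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage import color, data, filters, segmentation

# Load a sample image and compute its gradient magnitude.
image = color.rgb2gray(data.coffee())
gradient = filters.sobel(image)

# Markers from low-gradient (flat) areas, then watershed on the gradient image.
markers, _ = ndi.label(gradient < 0.05)
atomic_regions = segmentation.watershed(gradient, markers)

print(atomic_regions.max())  # number of atomic regions in the over-segmentation
```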

  3. Automated medical image segmentation techniques

    Directory of Open Access Journals (Sweden)

    Sharma Neeraj

    2010-01-01

    Full Text Available Accurate segmentation of medical images is a key step in contouring during radiotherapy planning. Computed tomography (CT) and magnetic resonance (MR) imaging are the most widely used radiographic techniques in diagnosis, clinical studies and treatment planning. This review provides details of automated segmentation methods, specifically discussed in the context of CT and MR images. The motive is to discuss the problems encountered in segmentation of CT and MR images, and the relative merits and limitations of methods currently available for segmentation of medical images.

  4. 14 CFR 91.1075 - Training program: Special rules.

    Science.gov (United States)

    2010-01-01

    ... Operations Program Management § 91.1075 Training program: Special rules. Other than the program manager, only... approved curriculums, curriculum segments, and portions of curriculum segments applicable for use in...

  5. A More Accurate Model for Finding Tutorial Segments Explaining APIs

    OpenAIRE

    Jiang, He; Zhang, Jingxuan; Li, Xiaochen; Ren, Zhilei; Lo, David

    2017-01-01

    Developers prefer to utilize third-party libraries when they implement some functionalities, and Application Programming Interfaces (APIs) are frequently used by them. Facing an unfamiliar API, developers tend to consult tutorials as learning resources. Unfortunately, the segments explaining a specific API are scattered across tutorials. Hence, it remains a challenging issue to find the relevant segments. In this study, we propose a more accurate model to find the exact tutorial fragments explaining...

  6. Flexible Segmentation and Matching for Optical Character Recognition

    Science.gov (United States)

    Sun, San-Wei; Kung, Sun-Yuan

    1989-11-01

    This paper presents a flexible image segmentation and feature matching method based on dynamic programming techniques to resolve the spatial deformation encountered in Optical Character Recognition (OCR) problems. A 2-subcycle thinning algorithm is presented to extract a character skeleton which is 8-connected. In addition, two feature extraction methods are devised, which extract the projected 1-D profiles of stroke distributions and the 2-D background distribution, respectively. The performance of the scheme is superior to that of an equally divided segmentation scheme.
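    As a generic illustration of the kind of dynamic-programming matching used here (not the paper's own algorithm), the sketch below aligns two 1-D stroke-projection profiles with a simple dynamic time warping recurrence, which tolerates the spatial deformation that a rigid, equally divided segmentation cannot. The profile values are made up for the example.

```python
import numpy as np

def dtw_distance(profile_a, profile_b):
    """Dynamic-programming alignment cost between two 1-D profiles (DTW)."""
    n, m = len(profile_a), len(profile_b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(profile_a[i - 1] - profile_b[j - 1])
            # Allow stretching or compression of either profile.
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]

# Hypothetical vertical-projection profiles of two deformed instances of a glyph.
template = np.array([0, 2, 5, 9, 5, 2, 0, 0, 3, 7, 3, 0], dtype=float)
observed = np.array([0, 1, 3, 6, 9, 6, 3, 1, 0, 4, 7, 4, 1], dtype=float)
print(dtw_distance(template, observed))  # lower cost = better match
```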

  7. Public Interest Energy Research (PIER) Program Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Tengfang [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Flapper, Joris [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ke, Jing [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Kramer, Klaas [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sathaye, Jayant [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-02-01

    water usage in individual dairy plants, augment benchmarking activities in the marketplace, and facilitate implementation of efficiency measures and strategies to save energy and water in the dairy industry. Industrial adoption of this emerging tool and technology in the market is expected to benefit dairy plants, which are important customers of California utilities. Further demonstration of this benchmarking tool is recommended, to facilitate its commercialization and the expansion of the tool's functions. Wider use of this BEST-Dairy tool and its continuous expansion (in functionality) will help to reduce the actual consumption of energy and water in the dairy industry sector. The outcomes comply very well with the goals set by AB 1250 for the PIER program.

  8. Geochemical evolution processes and water-quality observations based on results of the National Water-Quality Assessment Program in the San Antonio segment of the Edwards aquifer, 1996-2006

    Science.gov (United States)

    Musgrove, MaryLynn; Fahlquist, Lynne; Houston, Natalie A.; Lindgren, Richard J.; Ging, Patricia B.

    2010-01-01

    As part of the National Water-Quality Assessment Program, the U.S. Geological Survey collected and analyzed groundwater samples during 1996-2006 from the San Antonio segment of the Edwards aquifer of central Texas, a productive karst aquifer developed in Cretaceous-age carbonate rocks. These National Water-Quality Assessment Program studies provide an extensive dataset of groundwater geochemistry and water quality, consisting of 249 groundwater samples collected from 136 sites (wells and springs), including (1) wells completed in the shallow, unconfined, and urbanized part of the aquifer in the vicinity of San Antonio (shallow/urban unconfined category), (2) wells completed in the unconfined (outcrop area) part of the regional aquifer (unconfined category), and (3) wells completed in and springs discharging from the confined part of the regional aquifer (confined category). This report evaluates these data to assess geochemical evolution processes, including local- and regional-scale processes controlling groundwater geochemistry, and to make water-quality observations pertaining to sources and distribution of natural constituents and anthropogenic contaminants, the relation between geochemistry and hydrologic conditions, and groundwater age tracers and travel time. Implications for monitoring water-quality trends in karst are also discussed. Geochemical and isotopic data are useful tracers of recharge, groundwater flow, fluid mixing, and water-rock interaction processes that affect water quality. Sources of dissolved constituents to Edwards aquifer groundwater include dissolution of and geochemical interaction with overlying soils and calcite and dolomite minerals that compose the aquifer. Geochemical tracers such as magnesium to calcium and strontium to calcium ratios and strontium isotope compositions are used to evaluate and constrain progressive fluid-evolution processes. Molar ratios of magnesium to calcium and strontium to calcium in groundwater typically

  9. Quine's "Strictly Vegetarian" Analyticity

    NARCIS (Netherlands)

    Decock, L.B.

    2017-01-01

    I analyze Quine’s later writings on analyticity from a linguistic point of view. In Word and Object Quine made room for a “strictly vegetarian” notion of analyticity. In later years, he developed this notion into two more precise notions, which I have coined “stimulus analyticity” and “behaviorist

  10. Some Heterodox Analytic Philosophy

    Directory of Open Access Journals (Sweden)

    Guillermo E. Rosado Haddock

    2013-04-01

    Full Text Available Analytic philosophy has been the most influential philosophical movement in 20th century philosophy. It has surely contributed like no other movement to the elucidation and demarcation of philosophical problems. Nonetheless, the empiricist and sometimes even nominalist convictions of orthodox analytic philosophers have led them to render inadequately even philosophers they consider their own and to propound very questionable conceptions.

  11. The Analytical Hierarchy Process

    DEFF Research Database (Denmark)

    Barfod, Michael Bruhn

    2007-01-01

    The technical note gathers the theory behind the Analytical Hierarchy Process (AHP) and presents its advantages and disadvantages in practical use.
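    For readers unfamiliar with AHP, the core computation takes a pairwise comparison matrix of criteria and derives priority weights (commonly via the principal eigenvector), plus a consistency ratio that flags contradictory judgements. A minimal sketch follows; the example judgements and the use of Saaty's standard random-index values are illustrative assumptions, not content from the note itself.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights and consistency ratio from an AHP pairwise matrix."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                        # principal eigenvector, normalized
    ci = (eigvals[k].real - n) / (n - 1)                # consistency index
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random index
    cr = ci / ri if ri > 0 else 0.0                     # consistency ratio (< 0.1 is acceptable)
    return w, cr

# Example: three criteria compared on the 1-9 scale (hypothetical judgements).
A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
weights, cr = ahp_weights(A)
print(weights.round(3), round(cr, 3))
```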

  12. Characterizing and reaching high-risk drinkers using audience segmentation.

    Science.gov (United States)

    Moss, Howard B; Kirby, Susan D; Donodeo, Fred

    2009-08-01

    Market or audience segmentation is widely used in social marketing efforts to help planners identify segments of a population to target for tailored program interventions. Market-based segments are typically defined by behaviors, attitudes, knowledge, opinions, or lifestyles. They are more helpful to health communication and marketing planning than epidemiologically defined groups because market-based segments are similar in how they behave or might react to marketing and communication efforts. However, market segmentation has rarely been used in alcohol research. As an illustration of its utility, we employed commercial data that describe the sociodemographic characteristics of high-risk drinkers as an audience segment, including where they tend to live, lifestyles, interests, consumer behaviors, alcohol consumption behaviors, other health-related behaviors, and cultural values. Such information can be extremely valuable in targeting and planning public health campaigns, targeted mailings, prevention interventions, and research efforts. We describe the results of a segmentation analysis of individuals who self-reported consuming 5 or more drinks per drinking episode at least twice in the last 30 days. The study used the proprietary PRIZM (Claritas, Inc., San Diego, CA) audience segmentation database merged with the Centers for Disease Control and Prevention's (CDC) Behavioral Risk Factor Surveillance System (BRFSS) database. The top 10 of the 66 PRIZM audience segments for this risky drinking pattern are described. For five of these segments we provide additional in-depth details about consumer behavior and estimates of the market areas where these risky drinkers reside. The top 10 audience segments (PRIZM clusters) most likely to engage in high-risk drinking are described. The cluster with the highest concentration of binge-drinking behavior is referred to as the "Cyber Millenials." This cluster is characterized as "the nation's tech

  13. Quo vadis, analytical chemistry?

    Science.gov (United States)

    Valcárcel, Miguel

    2016-01-01

    This paper presents an open, personal, fresh approach to the future of Analytical Chemistry in the context of the deep changes Science and Technology are anticipated to experience. Its main aim is to challenge young analytical chemists, because the future of our scientific discipline is in their hands. A description of not completely accurate overall conceptions of our discipline, both past and present, to be avoided is followed by a flexible, integral definition of Analytical Chemistry and its cornerstones (viz., aims and objectives, quality trade-offs, the third basic analytical reference, the information hierarchy, social responsibility, independent research, transfer of knowledge and technology, interfaces to other scientific-technical disciplines, and well-oriented education). Obsolete paradigms, and the more accurate general and specific ones that can be expected to provide the framework for our discipline in the coming years, are described. Finally, the three possible responses of analytical chemists to the proposed changes in our discipline are discussed.

  14. European Analytical Column

    DEFF Research Database (Denmark)

    Karlberg, B.; Grasserbauer, M.; Andersen, Jens Enevold Thaulov

    2009-01-01

    The European Analytical Column has once more invited a guest columnist to give his views on various matters related to analytical chemistry in Europe. This year, we have invited Professor Manfred Grasserbauer of the Vienna University of Technology to present some of the current challenges...... for European analytical chemistry. During the period 2002–07, Professor Grasserbauer was Director of the Institute for Environment and Sustainability, Joint Research Centre of the European Commission (EC), Ispra, Italy. There is no doubt that many challenges exist at the present time for all of us representing...... a major branch of chemistry, namely analytical chemistry. The global financial crisis is affecting all branches of chemistry, but analytical chemistry, in particular, since our discipline by tradition has many close links to industry. We have already noticed decreased industrial commitment with respect...

  15. A Guide to the Use of Market Segmentation for the Dissemination of Educational Innovations. Final Report of a Project to Study the Effectiveness of Marketing Programming for Educational Change.

    Science.gov (United States)

    Wrausmann, Gale L.; And Others

    Markets can be defined as groups of people or organizations that have resources that could be exchanged for distinct benefits. Market segmentation is one strategy for market management and involves describing the market in terms of the subgroups that compose it so that exchanges with those subgroups can be more effectively promoted or facilitated.…

  16. Load curve modelling of the residential segment electric power consumption applying a demand side energy management program; Modelagem da curva de carga das faixas de consumo de energia eletrica residencial a partir da aplicacao de um programa de gerenciamento de energia pelo lado da demanda

    Energy Technology Data Exchange (ETDEWEB)

    Rahde, Sergio Barbosa [Pontificia Univ. Catolica do Rio Grande do Sul, Porto Alegre (Brazil). Dept. de Engenharia Mecanica e Mecatronica]. E-mail: sergio@em.pucrs.br; Kaehler, Jose Wagner [Pontificia Univ. Catolica do Rio Grande do Sul, Porto Alegre (Brazil). Faculdade de Engenharia]. E-mail: kaehlerjw@pucrs.br

    2000-07-01

    The dissertation aims to offer a current vision of the use of electrical energy inside CEEE's newly defined area of operation. It also intends to propose different alternatives to set up a Demand Side Management (DSM) project to be carried out on the same market segment, through a Residential Load Management program. Starting from studies developed by DNAEE (the Brazilian federal government's agency for electrical energy) to establish the load curve characteristics, as well as from research on electrical equipment ownership and electricity consumption habits, along with the contribution supplied by other utilities, especially in the US, an evaluation is offered concerning several approaches to residential energy management, setting up conditions that simulate the residential segment's scenarios and their influence on the general system's load. (author)

  17. IFRS 8 – OPERATING SEGMENTS

    Directory of Open Access Journals (Sweden)

    BOCHIS LEONICA

    2009-05-01

    Full Text Available Segment reporting in accordance with IFRS 8 will be mandatory for annual financial statements covering periods beginning on or after 1 January 2009. The standard replaces IAS 14, Segment Reporting, from that date. The objective of IFRS 8 is to require

  18. Segmental neurofibromatosis [NF type - v].

    Science.gov (United States)

    Arfan-ul-Bari; Simeen-ber-Rahman

    2003-12-01

    Segmental neurofibromatosis is a rare variant of neurofibromatosis in which skin lesions are confined to a circumscribed body segment. A case of a 39-year-old man with this condition is presented, who had multiple soft skin tumours over a localized area of the back with no associated cafe au lait spots, axillary freckles or Lisch nodules. Histology confirmed the diagnosis of neurofibroma.

  19. Market Segmentation: An Instructional Module.

    Science.gov (United States)

    Wright, Peter H.

    A concept-based introduction to market segmentation is provided in this instructional module for undergraduate and graduate transportation-related courses. The material can be used in many disciplines including engineering, business, marketing, and technology. The concept of market segmentation is primarily a transportation planning technique by…

  20. Adaptive segmentation for scientific databases

    NARCIS (Netherlands)

    Ivanova, M.; Kersten, M.L.; Nes, N.

    2008-01-01

    In this paper we explore database segmentation in the context of a column-store DBMS targeted at a scientific database. We present a novel hardware- and scheme-oblivious segmentation algorithm, which learns and adapts to the workload immediately. The approach taken is to capitalize on (intermediate)

  1. Region segmentation along image sequence

    Energy Technology Data Exchange (ETDEWEB)

    Monchal, L.; Aubry, P.

    1995-12-31

    A method to extract regions in a sequence of images is proposed. Regions are not matched from one image to the following one. Instead, the result of a region segmentation is used as an initialization to segment the following image and to track the region along the sequence. The image sequence is exploited as a spatio-temporal event. (authors). 12 refs., 8 figs.

  2. Market segmentation using perceived constraints

    Science.gov (United States)

    Jinhee Jun; Gerard Kyle; Andrew Mowen

    2008-01-01

    We examined the practical utility of segmenting potential visitors to Cleveland Metroparks using their constraint profiles. Our analysis identified three segments based on their scores on the dimensions of constraints: Other priorities--visitors who scored the highest on 'other priorities' dimension; Highly Constrained--visitors who scored relatively high on...

  3. Using Predictability for Lexical Segmentation

    Science.gov (United States)

    Çöltekin, Çagri

    2017-01-01

    This study investigates a strategy based on predictability of consecutive sub-lexical units in learning to segment a continuous speech stream into lexical units using computational modeling and simulations. Lexical segmentation is one of the early challenges during language acquisition, and it has been studied extensively through psycholinguistic…
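    As a generic illustration of the predictability idea (not the study's model), the sketch below estimates forward transitional probabilities between adjacent syllables in an unsegmented stream and posits a word boundary wherever predictability dips below its neighbours; the toy lexicon, ordering, and dip rule are assumptions.

```python
from collections import Counter

def segment_by_predictability(sylls):
    """Posit word boundaries at local dips in forward transitional probability."""
    pair_counts = Counter(zip(sylls, sylls[1:]))
    first_counts = Counter(sylls[:-1])
    tp = [pair_counts[(a, b)] / first_counts[a] for a, b in zip(sylls, sylls[1:])]
    words, current = [], [sylls[0]]
    for i in range(1, len(sylls)):
        before = tp[i - 2] if i >= 2 else float("inf")
        after = tp[i] if i < len(tp) else float("inf")
        if tp[i - 1] < before and tp[i - 1] < after:   # low-predictability transition
            words.append("".join(current))
            current = []
        current.append(sylls[i])
    words.append("".join(current))
    return words

# Toy corpus: three artificial words concatenated with no pauses.
lexicon = ["tupiro", "golabu", "bidaku"]
order = [0, 1, 2, 1, 0, 2, 0, 1, 2, 2, 0, 1]
stream = "".join(lexicon[k] for k in order)
sylls = [stream[i:i + 2] for i in range(0, len(stream), 2)]
print(segment_by_predictability(sylls))
```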

  4. IFRS 8 – OPERATING SEGMENTS

    OpenAIRE

    BOCHIS LEONICA; Sucala Lucia; DUMBRAVA PARTENIE; BREBAN LUDOVICA

    2009-01-01

    Segment reporting in accordance with IFRS 8 will be mandatory for annual financial statements covering periods beginning on or after 1 January 2009. The standard replaces IAS 14, Segment Reporting, from that date. The objective of IFRS 8 is to require

  5. The Importance of Marketing Segmentation

    Science.gov (United States)

    Martin, Gillian

    2011-01-01

    The rationale behind marketing segmentation is to allow businesses to focus on their consumers' behaviors and purchasing patterns. If done effectively, marketing segmentation allows an organization to achieve its highest return on investment (ROI) in return for its marketing and sales expenses. If an organization markets its products or services to…

  6. Essays in international market segmentation

    NARCIS (Netherlands)

    Hofstede, ter F.

    1999-01-01

    The primary objective of this thesis is to develop and validate new methodologies to improve the effectiveness of international segmentation strategies. The current status of international market segmentation research is reviewed in an introductory chapter, which provided a number of

  7. Intestinal, segmented, filamentous bacteria.

    Science.gov (United States)

    Klaasen, H L; Koopman, J P; Poelma, F G; Beynen, A C

    1992-06-01

    Segmented, filamentous bacteria (SFBs) are autochthonous, apathogenic bacteria, occurring in the ileum of mice and rats. Although the application of formal taxonomic criteria is impossible due to the lack of an in vitro technique to culture SFBs, microbes with a similar morphology, found in the intestine of a wide range of vertebrate and invertebrate host species, are considered to be related. SFBs are firmly attached to the epithelial cells of the distal ileal mucosa, their preferential ecological niche being the epithelium covering the Peyer's patches. Electron microscopic studies have demonstrated a considerable morphological diversity of SFBs, which may relate to different stages of a life cycle. Determinants of SFB colonization in vivo are host species, genotypical and phenotypical characteristics of the host, diet composition, environmental stress and antimicrobial drugs. SFBs can survive in vitro incubation, but do not multiply. On the basis of their apathogenic character and intimate relationship with the host, it is suggested that SFBs contribute to development and/or maintenance of host resistance to enteropathogens.

  8. [Orbitotemporal segmental neurofibromatosis].

    Science.gov (United States)

    Montard, R; Putz, C; Barrali, M; Kantelip, B; Montard, M

    2007-11-01

    Neurofibromatosis is a rare pathology with a heterogeneous clinical presentation. We report a case of a right orbitotemporal plexiform neurofibroma in a 64-year-old woman with von Recklinghausen's neurofibromatosis. A craniofacial CT scan with injection showed a heterogeneous tumor in front of the skull base and the temporoparietal bone, with no intracranial extension but an extension into the maxillary sinus and nasal cavity. In summary, she presented with orbitotemporal segmental neurofibromatosis type 1, given the unilateral lesion. She had a first surgery to remove her jugal and preauricular tumor with an exenteration, which provided eye histology. The histology found no Lisch nodules but a cellular proliferation causing choroidal hyperplasia. We noted neurofibromin on choroidal cells and normal cells in addition to pathologic cells (Schwann cells and melanocytes), meaning that two cell populations were present in the same tissue: a somatic mosaicism. We advance the hypothesis that there was a regulation of cellular growth in a particular microenvironment because of the absence of tumor. To identify and confirm the somatic mosaicism, we would need a FISH analysis (probes containing sequences of the NF1 gene with a probe specific for the chromosome 17 centromere).

  9. Deriving Earth Science Data Analytics Requirements

    Science.gov (United States)

    Kempler, Steven J.

    2015-01-01

    Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science datasets has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to be able to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide an early analysis of the data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  10. Visual Analytics 101

    Energy Technology Data Exchange (ETDEWEB)

    Scholtz, Jean; Burtner, Edwin R.; Cook, Kristin A.

    2016-06-13

    This course will introduce the field of visual analytics to HCI researchers and practitioners, highlighting the contributions they can make to this field. Topics will include a definition of visual analytics along with examples of current systems, types of tasks and end users, issues in defining user requirements, design of visualizations and interactions, guidelines and heuristics, the current state of user-centered evaluations, and metrics for evaluation. We encourage designers, HCI researchers, and HCI practitioners to attend to learn how their skills can contribute to advancing the state of the art of visual analytics.

  11. Google analytics integrations

    CERN Document Server

    Waisberg, Daniel

    2015-01-01

    A roadmap for turning Google Analytics into a centralized marketing analysis platform With Google Analytics Integrations, expert author Daniel Waisberg shows you how to gain a more meaningful, complete view of customers that can drive growth opportunities. This in-depth guide shows not only how to use Google Analytics, but also how to turn this powerful data collection and analysis tool into a central marketing analysis platform for your company. Taking a hands-on approach, this resource explores the integration and analysis of a host of common data sources, including Google AdWords, AdSens

  12. Satellite Image Classification and Segmentation by Using JSEG Segmentation Algorithm

    OpenAIRE

    Khamael Abbas; Mustafa Rydh

    2012-01-01

    In this paper, an adopted approach to fully automatic satellite image segmentation, called JSEG ("JPEG image segmentation"), is presented. First, colors in the image are quantized to classes that can be used to differentiate regions in the image. Then image pixel colors are replaced by their corresponding color class labels, thus forming a class-map of the image. A criterion for "good" segmentation using this class-map is proposed. Applying the criterion to local windows in the class-map results in the "J-image"...
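    As commonly described in the JSEG literature, the criterion in a window compares the total spatial scatter of pixel positions with the within-class scatter of the color-class labels, J = (S_T - S_W) / S_W: high values occur near region boundaries, low values inside homogeneous regions. The implementation below is an illustrative reading of that criterion, not the reference code; the toy class-maps are assumptions.

```python
import numpy as np

def j_value(class_map):
    """J = (S_T - S_W) / S_W over a window of color-class labels."""
    h, w = class_map.shape
    ys, xs = np.mgrid[0:h, 0:w]
    positions = np.column_stack([ys.ravel(), xs.ravel()]).astype(float)
    labels = class_map.ravel()
    s_t = ((positions - positions.mean(axis=0)) ** 2).sum()   # total position scatter
    s_w = 0.0
    for c in np.unique(labels):
        pts = positions[labels == c]
        s_w += ((pts - pts.mean(axis=0)) ** 2).sum()          # within-class scatter
    return (s_t - s_w) / s_w

# Two class-maps: classes split into halves (region boundary) vs. finely mixed.
split = np.zeros((16, 16), dtype=int); split[:, 8:] = 1
mixed = np.indices((16, 16)).sum(axis=0) % 2
print(j_value(split), j_value(mixed))   # high J near boundaries, low J inside regions
```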

  13. Evaluation of the analytical performance of the novel NS-Prime system and examination of temperature stability of fecal transferrin compared with fecal hemoglobin as biomarkers in a colon cancer screening program.

    Science.gov (United States)

    Demian, Wael L L; Collins, Stacy; Fowler, Candace; McGrath, Jerry; Antle, Scott; Moores, Zoë; Hollohan, Deborah; Lacey, Suzanne; Banoub, Joseph; Randell, Edward

    2015-08-01

    To examine the analytical aspects of fecal transferrin (Tf) and hemoglobin (Hb) measured on the NS-Prime analyzer for use in a colon cancer screening program. Method evaluation and temperature stability studies for fecal Tf and Hb were completed. A method comparison was carried out against the NS-Plus system using samples collected from 254 screening program participants. A further 200 samples were analyzed to help determine suitable reference limits for fecal Tf using these systems. The assay for fecal Tf showed acceptable linearity, precision, and recovery, and showed minimal carryover with low potential for impact by the prozone effect. The 95th percentile for fecal Tf obtained for the reference population was 4.9 µg/g feces. The collection device sufficiently maintained fecal Tf and Hb stability for at least 7 days at room temperature, 4 °C, and -20 °C. Fecal Tf and Hb were most stable at 4 °C and -20 °C, but showed considerable loss (20-40%) of both proteins at 37 °C within the first 7 days. Mixing small amounts of blood into diluted fecal samples maintained at 37 °C for various time periods showed >50% loss of both proteins within 1 h of incubation. The NS-Prime analyzer showed acceptable performance for fecal Tf and Hb. These studies suggest that use of both Tf and Hb together as biomarkers will result in higher positivity rates, but this may not be attributed to greater stability of Tf over Hb in human feces.

  14. Towards Secure and Trustworthy Cyberspace: Social Media Analytics on Hacker Communities

    Science.gov (United States)

    Li, Weifeng

    2017-01-01

    Social media analytics is a critical research area spawned by the increasing availability of rich and abundant online user-generated content. So far, social media analytics has had a profound impact on organizational decision making in many aspects, including product and service design, market segmentation, customer relationship management, and…

  15. Sometimes spelling is easier than phonemic segmentation

    NARCIS (Netherlands)

    Bon, W.H.J. van; Duighuisen, H.C.M.

    1995-01-01

    Poor spellers from the Netherlands segmented and spelled the same words on different occasions. If they base their spellings on the segmentations that they produce in the segmentation task, the correlation between segmentation and spelling scores should be high, and segmentation should not be more

  16. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics.

    Science.gov (United States)

    Hoyt, Robert Eugene; Snider, Dallas; Thompson, Carla; Mantravadi, Sarita

    2016-10-11

    We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix

  17. Pesticide Analytical Methods

    Science.gov (United States)

    Pesticide manufacturers must develop and submit analytical methods for their pesticide products to support registration of their products under FIFRA. Learn about these methods as well as SOPs for testing of antimicrobial products against three organisms.

  18. Mobility Data Analytics Center.

    Science.gov (United States)

    2016-01-01

    Mobility Data Analytics Center aims at building a centralized data engine to efficiently manipulate : large-scale data for smart decision making. Integrating and learning the massive data are the key to : the data engine. The ultimate goal of underst...

  19. Direct volume estimation without segmentation

    Science.gov (United States)

    Zhen, X.; Wang, Z.; Islam, A.; Bhaduri, M.; Chan, I.; Li, S.

    2015-03-01

    Volume estimation plays an important role in clinical diagnosis. For example, cardiac ventricular volumes, including the left ventricle (LV) and right ventricle (RV), are important clinical indicators of cardiac function. Accurate and automatic estimation of the ventricular volumes is essential to the assessment of cardiac function and the diagnosis of heart disease. Conventional methods depend on an intermediate segmentation step which is performed either manually or automatically. However, manual segmentation is extremely time-consuming, subjective and highly non-reproducible; automatic segmentation is still challenging, computationally expensive, and completely unsolved for the RV. Towards accurate and efficient direct volume estimation, our group has been researching learning-based methods without segmentation, leveraging state-of-the-art machine learning techniques. Our direct estimation methods remove the additional step of segmentation and can naturally deal with various volume estimation tasks. Moreover, they are extremely flexible and can be used for volume estimation of either the joint bi-ventricles (LV and RV) or the individual LV/RV. We comparatively study the performance of direct methods on cardiac ventricular volume estimation by comparing with segmentation-based methods. Experimental results show that direct estimation methods provide more accurate estimation of cardiac ventricular volumes than segmentation-based methods. This indicates that direct estimation methods not only provide a convenient and mature clinical tool for cardiac volume estimation but also enable diagnosis of cardiac diseases to be conducted in a more efficient and reliable way.
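    As a schematic of the direct (segmentation-free) idea rather than the authors' method, the sketch below regresses a ventricular volume directly from per-image feature vectors using scikit-learn; the synthetic features, target values, and choice of regressor are all assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in data: one feature vector per cardiac image, one volume label.
n_images, n_features = 200, 64
X = rng.normal(size=(n_images, n_features))                   # e.g. intensity/shape descriptors
true_w = rng.normal(size=n_features)
y = 80 + X @ true_w + rng.normal(scale=2.0, size=n_images)    # "LV volume" in ml

# Direct estimation: learn image features -> volume, with no segmentation step.
model = RandomForestRegressor(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="neg_mean_absolute_error")
print("MAE (ml):", -scores.mean())
```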

  20. Position sensors for segmented mirror

    Science.gov (United States)

    Rozière, Didier; Buous, Sébastien; Courteville, Alain

    2004-09-01

    There are currently several projects for giant telescopes with segmented mirrors under way. These future telescopes will have their primary mirror made of several thousand segments. The main advantage of segmentation is that it enables active control of the whole mirror, so as to suppress the deformations of the support structure due to wind, gravity, thermal inhomogeneities, etc., thus getting the best possible stigmatism. However, providing active control of segmented mirrors requires numerous accurate edge sensors. It is acknowledged that capacitance-based technology nowadays offers the best metrological performance-to-cost ratio. As the leader in capacitive technology, FOGALE nanotech offers an original concept which reduces the cost of instrumentation, sensors and electronics, while keeping a very high level of performance, with a completely industrialised manufacturing process. We present here the sensors developed for the Segment Alignment Measurement System (SAMS) of the Southern African Large Telescope (SALT). This patented solution represents an important improvement in terms of cost for marketing the position sensors for segmented mirrors of ELTs, whilst maintaining a very high performance level. We present the concept, the laboratory qualification, and the first trials on the 7 central segments of SALT. The laboratory results are good, and we are now working on the on-site implementation to improve the immunity of the sensors to the environment.

  1. Intermediate algebra & analytic geometry

    CERN Document Server

    Gondin, William R

    1967-01-01

    Intermediate Algebra & Analytic Geometry Made Simple focuses on the principles, processes, calculations, and methodologies involved in intermediate algebra and analytic geometry. The publication first offers information on linear equations in two unknowns and variables, functions, and graphs. Discussions focus on graphic interpretations, explicit and implicit functions, first quadrant graphs, variables and functions, determinate and indeterminate systems, independent and dependent equations, and defective and redundant systems. The text then examines quadratic equations in one variable, system

  2. Learning analytics in education

    OpenAIRE

    Štrukelj, Tajda

    2015-01-01

    Learning analytics is a young field in computer-supported learning which could have a great impact on education in the future. It is a set of analytical tools that measure, collect, analyze and report students' data for the purpose of understanding and optimizing students' learning and the environments in which this learning occurs. Today, more and more learning-related activities take place on the web. Teachers are creating virtual learning environments (VLE), in which a great set of...

  3. Encyclopedia of analytical surfaces

    CERN Document Server

    Krivoshapko, S N

    2015-01-01

    This encyclopedia presents an all-embracing collection of analytical surface classes. It provides concise definitions and descriptions for more than 500 surfaces and categorizes them into 38 classes of analytical surfaces. All classes are cross-referenced to the original literature in an excellent bibliography. The encyclopedia is of particular interest to structural and civil engineers and serves as a valuable reference for mathematicians.

  4. ENVIRONMENTAL ANALYTICAL CHEMISTRY OF ...

    Science.gov (United States)

    Within the scope of a number of emerging contaminant issues in environmental analysis, one area that has received a great deal of public interest has been the assessment of the role of pharmaceuticals and personal care products (PPCPs) as stressors and agents of change in ecosystems, as well as their role in unplanned human exposure. The relationship between personal actions and the occurrence of PPCPs in the environment is clear-cut and comprehensible to the public. In this overview, we attempt to examine the separations aspect of the analytical approach to the vast array of potential analytes among this class of compounds. We also highlight the relationship between these compounds and endocrine disrupting compounds (EDCs), and between PPCPs and EDCs and the more traditional environmental analytes such as the persistent organic pollutants (POPs). Although the spectrum of chemical behavior extends from hydrophobic to hydrophilic, the current focus has shifted to moderately and highly polar analytes. Thus, emphasis on HPLC and LC/MS has grown, and MS/MS has become a detection technique of choice with either electrospray ionization or atmospheric pressure chemical ionization. This contrasts markedly with the benchmark approach of capillary GC, GC/MS and electron ionization in traditional environmental analysis. The expansion of the analyte list has fostered new vigor in the development of environmental analytical chemistry, modernized the range of tools appli

  5. Multiple Segmentation of Image Stacks

    DEFF Research Database (Denmark)

    Smets, Jonathan; Jaeger, Manfred

    2014-01-01

    We propose a method for the simultaneous construction of multiple image segmentations by combining a recently proposed “convolution of mixtures of Gaussians” model with a multi-layer hidden Markov random field structure. The resulting method constructs for a single image several, alternative...... segmentations that capture different structural elements of the image. We also apply the method to collections of images with identical pixel dimensions, which we call image stacks. Here it turns out that the method is able to both identify groups of similar images in the stack, and to provide segmentations...

  6. A software framework for preprocessing and level set segmentation of medical image data

    Science.gov (United States)

    Fritscher, Karl David; Schubert, Rainer

    2005-04-01

    In this work a software platform for semiautomatic segmentation of medical images based on geometric deformable models is presented. Including filters for image preprocessing, image segmentation and 3D visualization, this toolkit offers the possibility of creating highly effective segmentation pipelines by combining classic segmentation techniques like seeded region growing and manual segmentation with modern level set segmentation algorithms. By individually combining the input and output of different segmentation methods, specific and at the same time easy-to-use segmentation pipelines can be created. Using open source libraries for the implementation of a number of frequently used preprocessing and segmentation algorithms allowed effective programming while providing stable and highly effective algorithms. The usage of modern programming standards and the development of cross-platform algorithm classes guarantee extensibility and flexible implementation in different hard- and software settings. Segmentation results created in different research projects will be presented and the efficient usage of this framework will be demonstrated. The implementation of parts of the framework in a clinical setting is in progress, and currently we are working on embedding statistical models and prior knowledge in the segmentation framework.
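    The framework itself is not specified here; as a rough illustration of the kind of preprocessing-plus-segmentation pipeline described (edge-preserving smoothing followed by seeded region growing, built on an open source toolkit), the sketch below uses SimpleITK. The file name, seed point, and intensity bounds are assumptions to be replaced with real data.

```python
import SimpleITK as sitk

# Hypothetical input volume; replace with a real file and coordinates.
image = sitk.ReadImage("patient_volume.nii.gz", sitk.sitkFloat32)

# Preprocessing: edge-preserving smoothing.
smoothed = sitk.CurvatureFlow(image1=image, timeStep=0.125, numberOfIterations=5)

# Classic seeded region growing as one stage of the pipeline.
seed = (128, 128, 60)                      # (x, y, z) index inside the target structure
region = sitk.ConnectedThreshold(smoothed, seedList=[seed],
                                 lower=80.0, upper=180.0, replaceValue=1)

# The binary result could initialize a level set stage or be saved directly.
sitk.WriteImage(region, "segmentation_mask.nii.gz")
```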

  7. Hanford analytical services quality assurance requirements documents

    Energy Technology Data Exchange (ETDEWEB)

    Hyatt, J.E.

    1997-09-25

    Hanford Analytical Services Quality Assurance Requirements Document (HASQARD) is issued by the Analytical Services Program of the Waste Management Division, US Department of Energy (US DOE), Richland Operations Office (DOE-RL). The HASQARD establishes quality requirements in response to DOE Order 5700.6C (DOE 1991b). The HASQARD is designed to meet the needs of DOE-RL for maintaining a consistent level of quality for sampling and field and laboratory analytical services provided by contractor and commercial field and laboratory analytical operations. The HASQARD serves as the quality basis for all sampling and field/laboratory analytical services provided to DOE-RL through the Analytical Services Program of the Waste Management Division in support of Hanford Site environmental cleanup efforts. This includes work performed by contractor and commercial laboratories and covers radiological and nonradiological analyses. The HASQARD applies to field sampling, field analysis, and research and development activities that support work conducted under the Hanford Federal Facility Agreement and Consent Order (Tri-Party Agreement) and regulatory permit applications and applicable permit requirements described in subsections of this volume. The HASQARD applies to work done to support process chemistry analysis (e.g., ongoing site waste treatment and characterization operations) and research and development projects related to Hanford Site environmental cleanup activities. This ensures a uniform quality umbrella for analytical site activities, predicated on the concepts contained in the HASQARD. Using the HASQARD will ensure data of known quality and technical defensibility of the methods used to obtain that data. The HASQARD is made up of four volumes: Volume 1, Administrative Requirements; Volume 2, Sampling Technical Requirements; Volume 3, Field Analytical Technical Requirements; and Volume 4, Laboratory Technical Requirements. Volume 1 describes the administrative requirements

  8. Croatian Analytical Terminology

    Directory of Open Access Journals (Sweden)

    Kastelan-Macan; M.

    2008-04-01

    Full Text Available Results of analytical research are necessary in all human activities. They are inevitable in making decisions in environmental chemistry, agriculture, forestry, veterinary medicine, the pharmaceutical industry, and biochemistry. Without analytical measurements the quality of materials and products cannot be assessed, so analytical chemistry is an essential part of technical sciences and disciplines. The language of Croatian science, and of analytical chemistry within it, was one of the goals of our predecessors. Due to the political situation, they did not succeed entirely, but for the scientists in independent Croatia this is a duty, because language is one of the most important features of Croatian identity. The awareness of the need to introduce Croatian terminology developed systematically in the second half of the 19th century, along with the founding of scientific societies and the wish of scientists to write their scientific works in Croatian, so that the results of their research might be applied in the economy. Many authors of textbooks from the 19th and the first half of the 20th century contributed to Croatian analytical terminology (F. Rački, B. Šulek, P. Žulić, G. Pexidr, J. Domac, G. Janeček, F. Bubanović, V. Njegovan and others). M. Deželić published the first systematic chemical terminology in 1940, adjusted to the IUPAC recommendations. In the second half of the 20th century textbooks in classic analytical chemistry were written by V. Marjanović-Krajovan, M. Gyiketta-Ogrizek, S. Žilić and others. I. Filipović wrote the General and Inorganic Chemistry textbook and the Laboratory Handbook (in collaboration with P. Sabioncello) and contributed greatly to establishing the terminology of instrumental analytical methods. The sources of Croatian nomenclature in modern analytical chemistry today are translated textbooks by Skoog, West and Holler, as well as by Günzler and Gremlich, and original textbooks by S. Turina, Z. …

  9. Metrology of IXO Mirror Segments

    Science.gov (United States)

    Chan, Kai-Wing

    2011-01-01

    For future x-ray astrophysics missions that demand optics with large throughput and excellent angular resolution, many telescope concepts are built around assembling thin mirror segments in a Wolter I geometry, such as that originally proposed for the International X-ray Observatory. The arc-second resolution requirement poses unique challenges not just for fabrication and mounting but also for metrology of these mirror segments. In this paper, we shall discuss the metrology of these segments using a normal-incidence metrological method with interferometers and null lenses. We present results of the calibration of the metrology systems we are currently using, discuss their accuracy, and address the precision in measuring near-cylindrical mirror segments and the stability of the measurements.

  10. Analytical Thinking, Analytical Action: Using Prelab Video Demonstrations and e-Quizzes to Improve Undergraduate Preparedness for Analytical Chemistry Practical Classes

    Science.gov (United States)

    Jolley, Dianne F.; Wilson, Stephen R.; Kelso, Celine; O'Brien, Glennys; Mason, Claire E.

    2016-01-01

    This project utilizes visual and critical thinking approaches to develop a higher-education synergistic prelab training program for a large second-year undergraduate analytical chemistry class, directing more of the cognitive learning to the prelab phase. This enabled students to engage in more analytical thinking prior to engaging in the…

  11. Information theory in analytical chemistry

    National Research Council Canada - National Science Library

    Eckschlager, Karel; Danzer, Klaus

    1994-01-01

    Contents: The aim of analytical chemistry - Basic concepts of information theory - Identification of components - Qualitative analysis - Quantitative analysis - Multicomponent analysis - Optimum analytical...

  12. 40 CFR 140.5 - Analytical procedures.

    Science.gov (United States)

    2010-07-01

    40 CFR § 140.5 (Protection of Environment; Environmental Protection Agency; Water Programs; Marine Sanitation Device Standard), Analytical procedures. In determining the composition and quality of effluent discharge from marine sanitation devices, the procedures contained in 40 CFR part 136...

  13. Doing social media analytics

    Directory of Open Access Journals (Sweden)

    Phillip Brooker

    2016-07-01

    Full Text Available In the few years since the advent of ‘Big Data’ research, social media analytics has begun to accumulate studies drawing on social media as a resource and tool for research work. Yet, there has been relatively little attention paid to the development of methodologies for handling this kind of data. The few works that exist in this area often reflect upon the implications of ‘grand’ social science methodological concepts for new social media research (i.e. they focus on general issues such as sampling, data validity, ethics, etc.). By contrast, we advance an abductively oriented methodological suite designed to explore the construction of phenomena played out through social media. To do this, we use a software tool – Chorus – to illustrate a visual analytic approach to data. Informed by visual analytic principles, we posit a two-by-two methodological model of social media analytics, combining two data collection strategies with two analytic modes. We go on to demonstrate each of these four approaches ‘in action’, to help clarify how and why they might be used to address various research questions.

  14. Volume Segmentation and Analysis of Biological Materials Using SuRVoS (Super-region Volume Segmentation) Workbench

    Science.gov (United States)

    Darrow, Michele C.; Luengo, Imanol; Basham, Mark; Spink, Matthew C.; Irvine, Sarah; French, Andrew P.; Ashton, Alun W.; Duke, Elizabeth M.H.

    2017-01-01

    Segmentation is the process of isolating specific regions or objects within an imaged volume, so that further study can be undertaken on these areas of interest. When considering the analysis of complex biological systems, the segmentation of three-dimensional image data is a time consuming and labor intensive step. With the increased availability of many imaging modalities and with automated data collection schemes, this poses an increased challenge for the modern experimental biologist to move from data to knowledge. This publication describes the use of SuRVoS Workbench, a program designed to address these issues by providing methods to semi-automatically segment complex biological volumetric data. Three datasets of differing magnification and imaging modalities are presented here, each highlighting different strategies of segmenting with SuRVoS. Phase contrast X-ray tomography (microCT) of the fruiting body of a plant is used to demonstrate segmentation using model training, cryo electron tomography (cryoET) of human platelets is used to demonstrate segmentation using super- and megavoxels, and cryo soft X-ray tomography (cryoSXT) of a mammalian cell line is used to demonstrate the label splitting tools. Strategies and parameters for each datatype are also presented. By blending a selection of semi-automatic processes into a single interactive tool, SuRVoS provides several benefits. Overall time to segment volumetric data is reduced by a factor of five when compared to manual segmentation, a mainstay in many image processing fields. This is a significant savings when full manual segmentation can take weeks of effort. Additionally, subjectivity is addressed through the use of computationally identified boundaries, and splitting complex collections of objects by their calculated properties rather than on a case-by-case basis. PMID:28872144
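
    SuRVoS itself is an interactive workbench; as a minimal sketch of the underlying super-region idea it describes (grouping voxels into supervoxels and propagating sparse annotations to whole supervoxels rather than labelling voxel by voxel), the following Python example uses scikit-image SLIC on a synthetic volume. The volume, annotation coordinates and parameters are placeholders, and scikit-image >= 0.19 is assumed for the channel_axis argument.

    # Sketch of supervoxel-based annotation propagation (not SuRVoS code).
    import numpy as np
    from skimage.segmentation import slic

    rng = np.random.default_rng(0)
    volume = rng.normal(size=(60, 60, 60))          # placeholder for a tomogram
    volume[20:40, 20:40, 20:40] += 2.0              # a brighter "object" region

    # Partition the volume into roughly 500 compact supervoxels.
    supervoxels = slic(volume, n_segments=500, compactness=0.1, channel_axis=None)

    # Sparse manual annotations: a few voxels marked as object (1) or background (2).
    annotations = {(30, 30, 30): 1, (5, 5, 5): 2}

    # Propagate each annotation to the entire supervoxel containing it.
    segmentation = np.zeros_like(supervoxels)
    for (z, y, x), label in annotations.items():
        segmentation[supervoxels == supervoxels[z, y, x]] = label

    print("supervoxels:", int(supervoxels.max()), "labelled voxels:", int(np.count_nonzero(segmentation)))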

  15. Competing on talent analytics.

    Science.gov (United States)

    Davenport, Thomas H; Harris, Jeanne; Shapiro, Jeremy

    2010-10-01

    Do investments in your employees actually affect workforce performance? Who are your top performers? How can you empower and motivate other employees to excel? Leading-edge companies such as Google, Best Buy, Procter & Gamble, and Sysco use sophisticated data-collection technology and analysis to answer these questions, leveraging a range of analytics to improve the way they attract and retain talent, connect their employee data to business performance, differentiate themselves from competitors, and more. The authors present the six key ways in which companies track, analyze, and use data about their people, ranging from a simple baseline of metrics to monitor the organization's overall health to custom modeling for predicting future head count depending on various "what if" scenarios. They go on to show that companies competing on talent analytics manage data and technology at an enterprise level, support what analytical leaders do, choose realistic targets for analysis, and hire analysts with strong interpersonal skills as well as broad expertise.

  16. Advanced business analytics

    CERN Document Server

    Lev, Benjamin

    2015-01-01

    The book describes advanced business analytics and shows how to apply them to many different professional areas of engineering and management. Each chapter of the book is contributed by a different author and covers a different area of business analytics. The book connects the analytic principles with business practice and provides an interface between the main disciplines of engineering/technology and the organizational, administrative and planning abilities of management. It also refers to other disciplines such as economics, finance, marketing, behavioral economics and risk analysis. This book is of special interest to engineers, economists and researchers who are developing new advances in engineering management, as well as to practitioners working on this subject.

  17. Automatic Music Boundary Detection Using Short Segmental Acoustic Similarity in a Music Piece

    Directory of Open Access Journals (Sweden)

    Yoshiaki Itoh

    2008-07-01

    Full Text Available The present paper proposes a new approach for detecting music boundaries, such as the boundary between music pieces or the boundary between a music piece and a speech section for automatic segmentation of musical video data and retrieval of a designated music piece. The proposed approach is able to capture each music piece using acoustic similarity defined for short-term segments in the music piece. The short segmental acoustic similarity is obtained by means of a new algorithm called segmental continuous dynamic programming, or segmental CDP. The location of each music piece and its music boundaries are then identified by referring to multiple similar segments and their location information, avoiding oversegmentation within a music piece. The performance of the proposed method is evaluated for music boundary detection using actual music datasets. The present paper demonstrates that the proposed method enables accurate detection of music boundaries for both the evaluation data and a real broadcasted music program.

  18. Automatic Music Boundary Detection Using Short Segmental Acoustic Similarity in a Music Piece

    Directory of Open Access Journals (Sweden)

    Tanaka Kazuyo

    2008-01-01

    Full Text Available The present paper proposes a new approach for detecting music boundaries, such as the boundary between music pieces or the boundary between a music piece and a speech section for automatic segmentation of musical video data and retrieval of a designated music piece. The proposed approach is able to capture each music piece using acoustic similarity defined for short-term segments in the music piece. The short segmental acoustic similarity is obtained by means of a new algorithm called segmental continuous dynamic programming, or segmental CDP. The location of each music piece and its music boundaries are then identified by referring to multiple similar segments and their location information, avoiding oversegmentation within a music piece. The performance of the proposed method is evaluated for music boundary detection using actual music datasets. The present paper demonstrates that the proposed method enables accurate detection of music boundaries for both the evaluation data and a real broadcasted music program.
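
    The segmental CDP algorithm itself is not reproduced in these records. As a much simpler, classic alternative that illustrates the general idea of detecting boundaries from short-term acoustic similarity, the sketch below implements Foote's checkerboard-kernel novelty on a self-similarity matrix in plain numpy; the feature matrix (e.g. MFCC frames) is assumed to be computed elsewhere and is replaced here by random data.

    # Not segmental CDP: a simple self-similarity novelty detector (Foote-style)
    # used only to illustrate boundary detection from short-term similarity.
    import numpy as np

    def novelty_boundaries(features, kernel_size=16, threshold=0.5):
        # Cosine self-similarity matrix between all pairs of frames.
        norms = np.linalg.norm(features, axis=1, keepdims=True) + 1e-9
        unit = features / norms
        ssm = unit @ unit.T

        # Checkerboard kernel: +1 within past/future blocks, -1 across them.
        k = kernel_size
        kernel = np.ones((2 * k, 2 * k))
        kernel[:k, k:] = -1
        kernel[k:, :k] = -1

        # Slide the kernel along the main diagonal to obtain a novelty curve.
        n = ssm.shape[0]
        novelty = np.zeros(n)
        for i in range(k, n - k):
            novelty[i] = np.sum(kernel * ssm[i - k:i + k, i - k:i + k])
        novelty = np.maximum(novelty, 0)
        if novelty.max() > 0:
            novelty /= novelty.max()

        # Local maxima above a threshold are reported as candidate boundary frames.
        return [i for i in range(1, n - 1)
                if novelty[i] > threshold
                and novelty[i] >= novelty[i - 1] and novelty[i] >= novelty[i + 1]]

    print(novelty_boundaries(np.random.rand(500, 13)))   # random demo input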

  19. Foundations of predictive analytics

    CERN Document Server

    Wu, James

    2012-01-01

    Drawing on the authors' two decades of experience in applied modeling and data mining, Foundations of Predictive Analytics presents the fundamental background required for analyzing data and building models for many practical applications, such as consumer behavior modeling, risk and marketing analytics, and other areas. It also discusses a variety of practical topics that are frequently missing from similar texts. The book begins with the statistical and linear algebra/matrix foundation of modeling methods, from distributions to cumulant and copula functions to Cornish-Fisher expansion and o...

  20. Social network data analytics

    CERN Document Server

    Aggarwal, Charu C

    2011-01-01

    Social network analysis applications have experienced tremendous advances within the last few years due in part to increasing trends towards users interacting with each other on the internet. Social networks are organized as graphs, and the data on social networks takes on the form of massive streams, which are mined for a variety of purposes. Social Network Data Analytics covers an important niche in the social network analytics field. This edited volume, contributed by prominent researchers in this field, presents a wide selection of topics on social network data mining such as Structural Pr

  1. Analytic number theory

    CERN Document Server

    Iwaniec, Henryk

    2004-01-01

    Analytic Number Theory distinguishes itself by the variety of tools it uses to establish results, many of which belong to the mainstream of arithmetic. One of the main attractions of analytic number theory is the vast diversity of concepts and methods it includes. The main goal of the book is to show the scope of the theory, both in classical and modern directions, and to exhibit its wealth and prospects, its beautiful theorems and powerful techniques. The book is written with graduate students in mind, and the authors tried to balance between clarity, completeness, and generality. The exercis

  2. Flurry Analytics as an aid in game development

    OpenAIRE

    Kuusisto, Rami

    2015-01-01

    Flurry Analytics is the analytics-focused component of the Yahoo Mobile Developer Suite. This thesis describes the implementation of the Flurry Analytics SDK in an application, the use of the web portal provided by Flurry Analytics, and how these features were used in building the analytics implementation for the game Cabals: Legends. The thesis also examines how the analytics implementation already developed could be used as the basis for an even more advanced analytics implementation and how it would be possible...

  3. Business Analytics in Practice and in Education: A Competency-Based Perspective

    Science.gov (United States)

    Mamonov, Stanislav; Misra, Ram; Jain, Rashmi

    2015-01-01

    Business analytics is a fast-growing area in practice. The rapid growth of business analytics in practice in the recent years is mirrored by a corresponding fast evolution of new educational programs. While more than 130 graduate and undergraduate degree programs in business analytics have been launched in the past 5 years, no commonly accepted…

  4. A burst segmentation-deflection routing contention resolution mechanism in OBS networks

    Science.gov (United States)

    Guan, Ai-hong; Wang, Bo-yun

    2012-01-01

    One of the key problems hindering the realization of optical burst switching (OBS) technology in core networks is the loss due to contention among bursts at the core nodes. Burst segmentation is an effective contention resolution technique used to reduce the number of packets lost when bursts contend. In our work, a burst segmentation-deflection routing contention resolution mechanism in OBS networks is proposed. When contention occurs, the bursts are first segmented so as to minimize the packet loss probability of the network, and the segmented burst is then deflected onto the optimal route. An analytical model is proposed to evaluate the contention resolution mechanism. Simulation results show that high-priority bursts have significantly lower packet loss probability and transmission delay than low-priority bursts. The scheme also performs better when the number of segments per burst is geometrically distributed than when burst lengths are deterministically distributed.
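
    The paper's analytical model is not given in the abstract. As a hedged illustration of the kind of loss-probability calculation that commonly serves as a baseline in OBS contention analysis, the sketch below computes the classic Erlang-B blocking probability for a bufferless link; it is not the segmentation-deflection model proposed by the authors, and the wavelength counts and loads are arbitrary.

    # Classic Erlang-B blocking probability: a standard baseline in OBS loss
    # analysis, NOT the segmentation-deflection analytical model of the paper.
    def erlang_b(offered_load, wavelengths):
        """Blocking probability for `wavelengths` channels and an offered load in
        Erlangs, via the stable recurrence B(k) = A*B(k-1) / (k + A*B(k-1))."""
        b = 1.0
        for k in range(1, wavelengths + 1):
            b = offered_load * b / (k + offered_load * b)
        return b

    # Example: 16 wavelengths per link, offered load swept from light to heavy.
    for load in (4, 8, 12, 16):
        print(f"load={load} Erlangs, 16 wavelengths -> burst loss ~ {erlang_b(load, 16):.4f}")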

  5. Segmented Capacitance Sensor with Partially Released Inactive Segments

    Directory of Open Access Journals (Sweden)

    Lev Jakub

    2015-09-01

    Full Text Available Material throughput measurement is important for many applications, for example yield map creation or control of mass flow in stationary lines. The capacitive throughput method is quite promising in this respect. The segmented capacitance sensor (SCS) is discussed in this paper. The SCS is a compromise between simple capacitive throughput sensors and electrical capacitance tomography sensors. The SCS variant with partially released inactive segments is presented. A mathematical model of the SCS was created and verified by measurements. A good correspondence between measured and computed values was found, so the proposed mathematical model can be considered verified. During measurement the voltage values on the inactive segments were monitored as well. On the basis of these measurements it was found that the values are significantly influenced by the material distribution.

  6. Methods of evaluating segmentation characteristics and segmentation of major faults

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Kie Hwa; Chang, Tae Woo; Kyung, Jai Bok [Seoul National Univ., Seoul (Korea, Republic of)] (and others)

    2000-03-15

    Seismological, geological, and geophysical studies were made for reasonable segmentation of the Ulsan fault, and the results are as follows. One- and two-dimensional electrical surveys revealed clearly that the fault fracture zone enlarges systematically northward and southward from the vicinity of Mohwa-ri, indicating that Mohwa-ri is at the seismic segment boundary. Field geological survey and microscope observation of fault gouge indicate that the Quaternary faults in the area are reactivated products of preexisting faults. A trench survey of the Chonbuk fault at Galgok-ri revealed thrust faults and cumulative vertical displacement due to faulting during the late Quaternary, with about 1.1-1.9 m displacement per event; the latest event occurred between 14,000 and 25,000 yr BP. The seismic survey showed that the basement surface is cut by numerous reverse faults and indicated the possibility that the boundary between Kyeongsangbukdo and Kyeongsangnamdo may be a segment boundary.

  7. Segmentation by Large Scale Hypothesis Testing - Segmentation as Outlier Detection

    DEFF Research Database (Denmark)

    Darkner, Sune; Dahl, Anders Lindbjerg; Larsen, Rasmus

    2010-01-01

    … locally. We propose a method based on large scale hypothesis testing with a consistent method for selecting an appropriate threshold for the given data. By estimating the background distribution we characterize the segment of interest as a set of outliers with a certain probability based on the estimated … a microscope, and we show how the method can handle transparent particles with significant glare points. The method generalizes to other problems. This is illustrated by applying the method to camera calibration images and MRI of the midsagittal plane for gray and white matter separation and segmentation…
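
    The record above is truncated, so the authors' threshold-selection procedure is not shown. As a minimal sketch of the general idea of segmentation as outlier detection, the following example models the background intensity distribution robustly, converts each pixel to a p-value, and keeps pixels that survive a multiple-testing threshold; Bonferroni correction is used here only for brevity and is an assumption, not the paper's method. The image is synthetic.

    # Sketch of "segmentation as outlier detection" via large-scale testing.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    image = rng.normal(100.0, 5.0, size=(256, 256))   # synthetic background
    image[100:120, 100:140] += 30.0                    # bright object = outliers

    # Robust background estimate (median / MAD) so the object barely biases it.
    med = np.median(image)
    mad = np.median(np.abs(image - med)) * 1.4826      # MAD -> sigma for a Gaussian

    # One-sided p-value per pixel under the background model.
    p = stats.norm.sf(image, loc=med, scale=mad)

    # Bonferroni-corrected threshold at family-wise error rate 0.05.
    alpha = 0.05 / image.size
    segment = p < alpha

    print("segmented pixels:", int(segment.sum()))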

  8. User Behavior Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Turcotte, Melissa [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Moore, Juston Shane [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-02-28

    User Behaviour Analytics is the tracking, collecting and assessing of user data and activities. The goal is to detect misuse of user credentials by developing models for the normal behaviour of user credentials within a computer network and detect outliers with respect to their baseline.

  9. Big Data Analytics

    Indian Academy of Sciences (India)

    IAS Admin

    2016-08-20

    Aug 20, 2016 ... sharpener (with picture and prices) when you place an order for a knife. If you order a book it will give you a list of other books you would probably like to buy ... use of predictive analytics is in marketing by comprehending customers' needs and preferences. An example is the advertisement on socks that ...

  10. Analytical Chemistry Laboratory

    Science.gov (United States)

    Anderson, Mark

    2013-01-01

    The Analytical Chemistry and Material Development Group maintains a capability in chemical analysis, materials R&D, failure analysis and contamination control. The uniquely qualified staff and facility support the needs of flight projects, science instrument development and various technical tasks, as well as Cal Tech.

  11. Analytic number theory

    CERN Document Server

    Matsumoto, Kohji

    2002-01-01

    The book includes several survey articles on prime numbers, divisor problems, and Diophantine equations, as well as research papers on various aspects of analytic number theory such as additive problems, Diophantine approximations and the theory of zeta and L-functions. Audience: researchers and graduate students interested in recent developments in number theory

  12. Explanatory analytics in OLAP

    NARCIS (Netherlands)

    Caron, E.A.M.; Daniëls, H.A.M.

    2013-01-01

    In this paper the authors describe a method to integrate explanatory business analytics in OLAP information systems. This method supports the discovery of exceptional values in OLAP data and the explanation of such values by giving their underlying causes. OLAP applications offer a support tool for

  13. Big Data Analytics

    Indian Academy of Sciences (India)

    But analysing massive amounts of data available in the Internet has the potential of impinging on our privacy. Inappropriate analysis of big data can lead to misleading conclusions. In this article, we explain what is big data, how it is analysed, and give some case studies illustrating the potentials and pitfalls of big data analytics ...

  14. History of analytic geometry

    CERN Document Server

    Boyer, Carl B

    2012-01-01

    Designed as an integrated survey of the development of analytic geometry, this study presents the concepts and contributions from before the Alexandrian Age through the eras of the great French mathematicians Fermat and Descartes, and on through Newton and Euler to the "Golden Age," from 1789 to 1850.

  15. Analytics for Customer Engagement

    NARCIS (Netherlands)

    Bijmolt, Tammo H. A.; Leeflang, Peter S. H.; Block, Frank; Eisenbeiss, Maik; Hardie, Bruce G. S.; Lemmens, Aurelie; Saffert, Peter

    In this article, we discuss the state of the art of models for customer engagement and the problems that are inherent to calibrating and implementing these models. The authors first provide an overview of the data available for customer analytics and discuss recent developments. Next, the authors

  16. Volume Segmentation and Ghost Particles

    Science.gov (United States)

    Ziskin, Isaac; Adrian, Ronald

    2011-11-01

    Volume Segmentation Tomographic PIV (VS-TPIV) is a type of tomographic PIV in which images of particles in a relatively thick volume are segmented into images on a set of much thinner volumes that may be approximated as planes, as in 2D planar PIV. The planes of images can be analysed by standard mono-PIV, and the volume of flow vectors can be recreated by assembling the planes of vectors. The interrogation process is similar to a Holographic PIV analysis, except that the planes of image data are extracted from two-dimensional camera images of the volume of particles instead of three-dimensional holographic images. Like the tomographic PIV method using the MART algorithm, Volume Segmentation requires at least two cameras and works best with three or four. Unlike the MART method, Volume Segmentation does not require reconstruction of individual particle images one pixel at a time and it does not require an iterative process, so it operates much faster. As in all tomographic reconstruction strategies, ambiguities known as ghost particles are produced in the segmentation process. The effect of these ghost particles on the PIV measurement is discussed. This research was supported by Contract 79419-001-09, Los Alamos National Laboratory.

  17. Segmental NF: A Guide for Patients

    Science.gov (United States)

    ... the body can show signs of segmental NF. Segmental NF1 - Individuals with segmental NF1 most commonly have the skin findings associated with ... cases, can include severe complications. Many individuals with segmental NF1 never develop any complications other than café-au- ...

  18. Liver segmentation: indications, techniques and future directions.

    Science.gov (United States)

    Gotra, Akshat; Sivakumaran, Lojan; Chartrand, Gabriel; Vu, Kim-Nhien; Vandenbroucke-Menu, Franck; Kauffmann, Claude; Kadoury, Samuel; Gallix, Benoît; de Guise, Jacques A; Tang, An

    2017-08-01

    Liver volumetry has emerged as an important tool in clinical practice. Liver volume is assessed primarily via organ segmentation of computed tomography (CT) and magnetic resonance imaging (MRI) images. The goal of this paper is to provide an accessible overview of liver segmentation targeted at radiologists and other healthcare professionals. Using images from CT and MRI, this paper reviews the indications for liver segmentation, technical approaches used in segmentation software and the developing roles of liver segmentation in clinical practice. Liver segmentation for volumetric assessment is indicated prior to major hepatectomy, portal vein embolisation, associating liver partition and portal vein ligation for staged hepatectomy (ALPPS) and transplant. Segmentation software can be categorised according to amount of user input involved: manual, semi-automated and fully automated. Manual segmentation is considered the "gold standard" in clinical practice and research, but is tedious and time-consuming. Increasingly automated segmentation approaches are more robust, but may suffer from certain segmentation pitfalls. Emerging applications of segmentation include surgical planning and integration with MRI-based biomarkers. Liver segmentation has multiple clinical applications and is expanding in scope. Clinicians can employ semi-automated or fully automated segmentation options to more efficiently integrate volumetry into clinical practice. • Liver volume is assessed via organ segmentation on CT and MRI examinations. • Liver segmentation is used for volume assessment prior to major hepatic procedures. • Segmentation approaches may be categorised according to the amount of user input involved. • Emerging applications include surgical planning and integration with MRI-based biomarkers.

  19. An interactive segmentation method based on superpixel

    DEFF Research Database (Denmark)

    Yang, Shu; Zhu, Yaping; Wu, Xiaoyu

    2015-01-01

    This paper proposes an interactive image-segmentation method which is based on superpixels. To achieve fast segmentation, the method establishes a graph-cut model using superpixels as nodes, and a new energy function is proposed. Experimental results demonstrate that the authors' method has … excellent performance in terms of segmentation accuracy and computational efficiency compared with other segmentation algorithms based on pixels…
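
    The record does not give the authors' energy function. The sketch below illustrates the general superpixels-as-nodes graph-cut construction only: SLIC superpixels become graph nodes, a simple colour-difference capacity stands in for the paper's energy term, two placeholder seed pixels play the role of user scribbles, and an s-t minimum cut separates foreground from background (scikit-image and networkx assumed).

    # Sketch of interactive segmentation with superpixels as graph nodes and an
    # s-t minimum cut; the paper's actual energy function is not reproduced.
    import numpy as np
    import networkx as nx
    from skimage import data, img_as_float
    from skimage.segmentation import slic

    image = img_as_float(data.astronaut())
    labels = slic(image, n_segments=300, compactness=10)

    # Mean colour of each superpixel (the nodes of the graph).
    means = {int(i): image[labels == i].mean(axis=0) for i in np.unique(labels)}

    G = nx.DiGraph()
    def add_link(a, b, cap):
        G.add_edge(a, b, capacity=cap)
        G.add_edge(b, a, capacity=cap)

    # Neighbouring superpixels found by comparing the label image with shifted
    # copies; edge capacity decays with colour difference (stand-in energy).
    for shift in ((1, 0), (0, 1)):
        shifted = np.roll(labels, shift, axis=(0, 1))
        for a, b in np.unique(np.stack([labels.ravel(), shifted.ravel()], axis=1), axis=0):
            if a != b:
                w = float(np.exp(-np.sum((means[int(a)] - means[int(b)]) ** 2) / 0.05))
                add_link(int(a), int(b), w)

    # "Scribbles": superpixels containing these placeholder pixels are tied to the
    # source/sink with a large capacity, acting as hard seeds.
    SOURCE, SINK, HARD = "S", "T", 1e9
    G.add_edge(SOURCE, int(labels[200, 250]), capacity=HARD)   # foreground seed
    G.add_edge(int(labels[10, 10]), SINK, capacity=HARD)       # background seed

    cut_value, (fg_nodes, _) = nx.minimum_cut(G, SOURCE, SINK)
    mask = np.isin(labels, [n for n in fg_nodes if n != SOURCE])
    print("cut value:", cut_value, "foreground pixels:", int(mask.sum()))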

  20. Discourse segmentation and ambiguity in discourse structure

    NARCIS (Netherlands)

    Hoek, J.|info:eu-repo/dai/nl/375290605; Evers-Vermeul, J.|info:eu-repo/dai/nl/191644684; Sanders, T.J.M.|info:eu-repo/dai/nl/075243911

    2016-01-01

    Discourse relations hold between two or more text segments. The process of discourse annotation not only involves determining what type of relation holds between segments, but also indicating the segments themselves. Often, segmentation and annotation are treated as individual steps, and separate

  1. Skip segment Hirschsprung disease and Waardenburg syndrome

    Directory of Open Access Journals (Sweden)

    Erica R. Gross

    2015-04-01

    Full Text Available Skip segment Hirschsprung disease describes a segment of ganglionated bowel between two segments of aganglionated bowel. It is a rare phenomenon that is difficult to diagnose. We describe a recent case of skip segment Hirschsprung disease in a neonate with a family history of Waardenburg syndrome and the genetic profile that was identified.

  2. 47 CFR 95.853 - Frequency segments.

    Science.gov (United States)

    2010-10-01

    47 CFR § 95.853 (Telecommunication; 218-219 MHz Service Technical Standards), Frequency segments. There are two frequency segments available for assignment to the 218-219 MHz Service in each service area. Frequency segment A is...

  3. Numerical and analytical methods with Matlab

    CERN Document Server

    Bober, William; Masory, Oren

    2013-01-01

    Numerical and Analytical Methods with MATLAB® presents extensive coverage of the MATLAB programming language for engineers. It demonstrates how the built-in functions of MATLAB can be used to solve systems of linear equations, ODEs, roots of transcendental equations, statistical problems, optimization problems, control systems problems, and stress analysis problems. These built-in functions are essentially black boxes to students. By combining MATLAB with basic numerical and analytical techniques, the mystery of what these black boxes might contain is somewhat alleviated. This classroom-tested

  4. A Novel Iris Segmentation Scheme

    Directory of Open Access Journals (Sweden)

    Chen-Chung Liu

    2014-01-01

    Full Text Available One of the key steps in an iris recognition system is the accurate segmentation of the iris from its surrounding noise, including the pupil, sclera, eyelashes, and eyebrows, in a captured eye image. This paper presents a novel iris segmentation scheme which initially utilizes the orientation matching transform to outline the outer and inner iris boundaries. It then employs Delogne-Kåsa circle fitting (instead of the traditional Hough transform) to further eliminate outlier points and to extract a more precise iris area from the eye image. In the extracted iris region, the proposed scheme further utilizes the differences in the intensity and positional characteristics of the iris, eyelid, and eyelashes to detect and delete these noises. The scheme is then applied to the iris image database UBIRIS.v1. The experimental results show that the presented scheme provides more effective and efficient iris segmentation than other conventional methods.
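
    The Delogne-Kåsa estimator reduces circle fitting to a linear least-squares problem, which is what makes it an attractive replacement for a Hough transform here. The sketch below fits a circle to synthetic noisy boundary points with numpy; the data are placeholders, not iris edge points.

    # Minimal Delogne-Kasa (Kasa) least-squares circle fit on synthetic points.
    import numpy as np

    def kasa_circle_fit(x, y):
        """Solve the linear system from x^2 + y^2 = 2*a*x + 2*b*y + c,
        where (a, b) is the centre and r = sqrt(c + a^2 + b^2)."""
        A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
        rhs = x ** 2 + y ** 2
        (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
        return a, b, np.sqrt(c + a ** 2 + b ** 2)

    # Synthetic "boundary" points: a noisy circle of radius 50 centred at (120, 130).
    rng = np.random.default_rng(0)
    theta = rng.uniform(0, 2 * np.pi, 200)
    x = 120 + 50 * np.cos(theta) + rng.normal(0, 1.0, theta.size)
    y = 130 + 50 * np.sin(theta) + rng.normal(0, 1.0, theta.size)
    print(kasa_circle_fit(x, y))   # approximately (120, 130, 50)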

  5. Estimation procedure of the efficiency of the heat network segment

    Science.gov (United States)

    Polivoda, F. A.; Sokolovskii, R. I.; Vladimirov, M. A.; Shcherbakov, V. P.; Shatrov, L. A.

    2017-07-01

    An extensive city heat network contains many segments, and each segment operates with a different efficiency of heat energy transfer. This work proposes an original technical approach based on evaluating the energy efficiency function of a heat network segment and interpreting two hyperbolic functions in the form of a transcendental equation. In essence, the problem studied is how the efficiency of the heat network changes with the ambient temperature. Criterial dependences used for evaluating the efficiency of a given segment of the heat network, and for finding the parameters for optimal control of the heat supply to remote users, were derived with the help of functional analysis methods. Generally, the efficiency function of the heat network segment is interpreted as a multidimensional surface, which allows it to be illustrated graphically. It was shown that solution of the inverse problem is possible as well: the required consumption of the heating agent and its temperature may be found from the specified segment efficiency and ambient temperature, and requirements on heat insulation and pipe diameters may be formulated. Calculation results were obtained in a strict analytical form, which allows the functional dependences found to be examined for extremums (maximums) under the given external parameters. A conclusion was made that it is expedient to apply this calculation procedure in two practically important cases: for an already built network, when only the heat agent consumption and temperatures in the pipe can be changed, and for a network under design, when changes to the material parameters of the network are possible. This procedure allows clarifying the diameter and length of the pipes, types of insulation, etc. The length of the pipes may be considered as the independent parameter for calculations; optimization of this parameter is made in

  6. CNES solution for a reusable payload ground segment

    Science.gov (United States)

    Pradels, Grégory; Baroukh, Julien; Queyrut, Olivier; Sellé, Arnaud; Malapert, Jean-Christophe

    2012-12-01

    The MYRIADE program of the French Space Agency (CNES) has been developed for research institutes proposing pertinent scientific space experiments. It is composed of a satellite bus with independent and configurable functional chains able to carry a scientific payload of 50 kg/60 W for at least a 3-year duration. CNES is responsible for the launch and the development of the satellite bus and the satellite ground segment, while the research institute manages the development of the payload and the payload ground segment. This paper discusses the means to enhance the development of a payload ground segment for small missions such as MYRIADE. The needs of this kind of experimental mission are very specific, and attempting to develop a reusable segment has often been considered inefficient. However, the lessons learnt from previous missions show the key role played by the payload ground segment in the success of the mission and also demonstrate that the payload ground segment of a small mission is more complex than expected. After a presentation of the MYRIADE program, the ground segments of the three first CNES missions are compared. This analysis first highlights that 15 full-time equivalents on average are required to develop a payload ground segment. This is not compatible with the resources of the research institutes, which are mainly devoted to the development of the payload. The comparison emphasises invariants in the functional architecture and operational concept, independent of the mission, that make it possible to provide a tested framework adapted to small missions. The engineering processes are also compared and a solution is proposed to introduce flexibility and efficiency into the development. The approach, compliant with the functional architecture presented before, consists of using tools developed for previous missions and thereby limiting new developments to the mission-specific functions. This proposition is currently being tested at CNES with the TARANIS

  7. Tank 241-S-106, cores 183, 184 and 187 analytical results for the final report

    Energy Technology Data Exchange (ETDEWEB)

    Esch, R.A.

    1997-06-30

    This document is the final laboratory report for tank 241-S-106 push mode core segments collected between February 12, 1997 and March 21, 1997. The segments were subsampled and analyzed in accordance with the Tank Push Mode Core Sampling and Analysis Plan (TSAP), the Tank Safety Screening Data Quality Objective (Safety DQO), the Historical Model Evaluation Data Requirements (Historical DQO) and the Data Quality Objective to Support Resolution of the Organic Complexant Safety Issue (Organic DQO). The analytical results are included in Table 1. Six of the twenty-four subsamples submitted for the differential scanning calorimetry (DSC) analysis exceeded the notification limit of 480 Joules/g stated in the DQO. Appropriate notifications were made. Total Organic Carbon (TOC) analyses were performed on all samples that produced exotherms during the DSC analysis. All results were less than the notification limit of three weight percent TOC. No cyanide analysis was performed, per agreement with the Tank Safety Program. None of the samples submitted for Total Alpha Activity exceeded notification limits as stated in the TSAP. Statistical evaluation of results by calculating the 95% upper confidence limit is not performed by the 222-S Laboratory and is not considered in this report. No core composites were created because there was insufficient solid material from any of the three core sampling events to generate a composite that would be representative of the tank contents.

  8. Learning to Segment Moving Objects in Videos

    OpenAIRE

    Fragkiadaki, Katerina; Arbelaez, Pablo; Felsen, Panna; Malik, Jitendra

    2014-01-01

    We segment moving objects in videos by ranking spatio-temporal segment proposals according to "moving objectness": how likely they are to contain a moving object. In each video frame, we compute segment proposals using multiple figure-ground segmentations on per frame motion boundaries. We rank them with a Moving Objectness Detector trained on image and motion fields to detect moving objects and discard over/under segmentations or background parts of the scene. We extend the top ranked segmen...

  9. Optimally segmented permanent magnet structures

    DEFF Research Database (Denmark)

    Insinga, Andrea Roberto; Bjørk, Rasmus; Smith, Anders

    2016-01-01

    We present an optimization approach which can be employed to calculate the globally optimal segmentation of a two-dimensional magnetic system into uniformly magnetized pieces. For each segment the algorithm calculates the optimal shape and the optimal direction of the remanent flux density vector, with respect to a linear objective functional. We illustrate the approach with results for magnet design problems from different areas, such as a permanent magnet electric motor, a beam focusing quadrupole magnet for particle accelerators and a rotary device for magnetic refrigeration.

  10. Segmenting Brain Tumors with Symmetry

    OpenAIRE

    Zhang, Hejia; Zhu, Xia; Willke, Theodore L.

    2017-01-01

    We explore encoding brain symmetry into a neural network for a brain tumor segmentation task. A healthy human brain is symmetric at a high level of abstraction, and the high-level asymmetric parts are more likely to be tumor regions. Paying more attention to asymmetries has the potential to boost the performance in brain tumor segmentation. We propose a method to encode brain symmetry into existing neural networks and apply the method to a state-of-the-art neural network for medical imaging s...
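
    The network architecture is not described in this record. One simple way to expose symmetry to a segmentation model, sketched below as an assumption rather than the authors' actual encoding, is to mirror the volume about the mid-sagittal plane and stack the left-right intensity difference as an extra input channel; this presumes the volume has already been rigidly aligned so that the central slice approximates the mid-sagittal plane.

    # Sketch: add a left-right asymmetry map as an extra input channel.
    import numpy as np

    def add_asymmetry_channel(volume, sagittal_axis=0):
        """Return an array of shape (2, *volume.shape): original + asymmetry map."""
        mirrored = np.flip(volume, axis=sagittal_axis)
        asymmetry = np.abs(volume - mirrored)
        return np.stack([volume, asymmetry], axis=0)

    vol = np.random.rand(64, 64, 64).astype(np.float32)   # placeholder MRI volume
    net_input = add_asymmetry_channel(vol)
    print(net_input.shape)   # (2, 64, 64, 64)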

  11. Nanofiber-segment ring resonator

    CERN Document Server

    Jones, D E; Franson, J D; Pittman, T B

    2016-01-01

    We describe a fiber ring resonator comprised of a relatively long loop of standard single-mode fiber with a short nanofiber segment. The evanescent mode of the nanofiber segment allows the cavity-enhanced field to interact with atoms in close proximity to the nanofiber surface. We report on an experiment using a warm atomic vapor and low-finesse cavity, and briefly discuss the potential for reaching the strong coupling regime of cavity QED by using trapped atoms and a high-finesse cavity of this kind.

  12. Four segment piezo based micropump

    Science.gov (United States)

    Haldkar, Rakesh Kumar; Sheorey, Tanuja; Gupta, Vijay Kumar; Ansari, M. Zahid

    2017-06-01

    In recent years, micropumps have been investigated by various researchers as drug delivery and disease diagnostic devices. Many of these micropumps have been designed considering available microfabrication technologies rather than appropriate pump performance analysis. Piezoelectric-based micropumps are more popular than those based on the other smart materials being explored. In this paper, a four-segment piezoelectric bimorph actuator (FSPB) based pump is compared with a circular disc piezoelectric bimorph actuator (CDPB) based pump. The static and transient behaviors under various electric fields have been analyzed using ANSYS 12.1(R) finite element software. Simulation results show that dividing the actuator into segments can amplify the deflection and improve the performance of the pump.

  13. Increasing Enrollment by Better Serving Your Institution's Target Audiences through Benefit Segmentation.

    Science.gov (United States)

    Goodnow, Betsy

    The marketing technique of benefit segmentation may be effective in increasing enrollment in adult educational programs, according to a study at the College of DuPage, Glen Ellyn, Illinois. The study was conducted to test the applicability of benefit segmentation to enrollment generation. The measuring instrument used in this study, the course improvement…

  14. The implement of Talmud property allocation algorithm based on graphic point-segment way

    Science.gov (United States)

    Cen, Haifeng

    2017-04-01

    Under the guidance of the theory of the Talmud allocation scheme, the paper analyzes the algorithm implementation process from the perspective of a graphic point-segment representation and designs a point-segment based Talmud property allocation algorithm. The core of the allocation algorithm is then implemented in Java, with Android programming used to build a visual interface.
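
    For readers unfamiliar with the rule being implemented, the sketch below gives the Aumann-Maschler "Talmud rule" (the contested-garment generalisation) in plain Python rather than the paper's Java/Android code; the point-segment visualisation is not reproduced.

    # Minimal Talmud-rule allocation (Aumann-Maschler contested-garment rule).
    def talmud_allocation(estate, claims, tol=1e-9):
        """Divide `estate` among claimants with the given `claims`: constrained
        equal awards on half-claims when the estate is at most half the total
        claim, constrained equal losses on half-claims otherwise."""
        half = [c / 2.0 for c in claims]
        total = sum(claims)

        def cea(amount, caps):
            # Constrained equal awards: everyone gets min(cap, lam); lam by bisection.
            lo, hi = 0.0, max(caps)
            while hi - lo > tol:
                lam = (lo + hi) / 2.0
                if sum(min(c, lam) for c in caps) < amount:
                    lo = lam
                else:
                    hi = lam
            return [min(c, (lo + hi) / 2.0) for c in caps]

        if estate <= total / 2.0:
            return cea(estate, half)
        # Otherwise: full claim minus an equal-losses division of the shortfall.
        losses = cea(total - estate, half)
        return [c - l for c, l in zip(claims, losses)]

    # The classic Talmud example: claims 100, 200, 300 against estates 100, 200, 300.
    for e in (100, 200, 300):
        print(e, [round(x, 2) for x in talmud_allocation(e, [100, 200, 300])])
    # Expected: 100 -> [33.33, 33.33, 33.33]; 200 -> [50, 75, 75]; 300 -> [50, 100, 150]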

  15. Developments in analytical instrumentation

    Science.gov (United States)

    Petrie, G.

    The situation regarding photogrammetric instrumentation has changed quite dramatically over the last 2 or 3 years with the withdrawal of most analogue stereo-plotting machines from the market place and their replacement by analytically based instrumentation. While there have been few new developments in the field of comparators, there has been an explosive development in the area of small, relatively inexpensive analytical stereo-plotters based on the use of microcomputers. In particular, a number of new instruments have been introduced by manufacturers who mostly have not been associated previously with photogrammetry. Several innovative concepts have been introduced in these small but capable instruments, many of which are aimed at specialised applications, e.g. in close-range photogrammetry (using small-format cameras); for thematic mapping (by organisations engaged in environmental monitoring or resources exploitation); for map revision, etc. Another innovative and possibly significant development has been the production of conversion kits to convert suitable analogue stereo-plotting machines such as the Topocart, PG-2 and B-8 into fully fledged analytical plotters. The larger and more sophisticated analytical stereo-plotters are mostly being produced by the traditional mainstream photogrammetric systems suppliers with several new instruments and developments being introduced at the top end of the market. These include the use of enlarged photo stages to handle images up to 25 × 50 cm format; the complete integration of graphics workstations into the analytical plotter design; the introduction of graphics superimposition and stereo-superimposition; the addition of correlators for the automatic measurement of height, etc. The software associated with this new analytical instrumentation is now undergoing extensive re-development with the need to supply photogrammetric data as input to the more sophisticated G.I.S. systems now being installed by clients, instead

  16. Analytical Chemistry Laboratory, progress report for FY 1993

    Energy Technology Data Exchange (ETDEWEB)

    1993-12-01

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 1993 (October 1992 through September 1993). This annual report is the tenth for the ACL and describes continuing effort on projects, work on new projects, and contributions of the ACL staff to various programs at ANL. The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. The ACL also has research programs in analytical chemistry, conducts instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems. Some routine or standard analyses are done, but it is common for the Argonne programs to generate unique problems that require development or modification of methods and adaption of techniques to obtain useful analytical data. The ACL is administratively within the Chemical Technology Division (CMT), its principal ANL client, but provides technical support for many of the technical divisions and programs at ANL. The ACL has four technical groups--Chemical Analysis, Instrumental Analysis, Organic Analysis, and Environmental Analysis--which together include about 45 technical staff members. Talents and interests of staff members cross the group lines, as do many projects within the ACL.

  17. How Predictive Analytics and Choice Architecture Can Improve Student Success

    National Research Council Canada - National Science Library

    Tristan Denley

    2014-01-01

      This article explores the challenges that students face in navigating the curricular structure of post-secondary degree programs, and how predictive analytics and choice architecture can play a role...

  18. Segmentation in cohesive systems constrained by elastic environments

    Science.gov (United States)

    Novak, I.; Truskinovsky, L.

    2017-04-01

    The complexity of fracture-induced segmentation in elastically constrained cohesive (fragile) systems originates from the presence of competing interactions. The role of discreteness in such phenomena is of interest in a variety of fields, from hierarchical self-assembly to developmental morphogenesis. In this paper, we study the analytically solvable example of segmentation in a breakable mass-spring chain elastically linked to a deformable lattice structure. We explicitly construct the complete set of local minima of the energy in this prototypical problem and identify among them the states corresponding to the global energy minima. We show that, even in the continuum limit, the dependence of the segmentation topology on the stretching/pre-stress parameter in this problem takes the form of a devil's type staircase. The peculiar nature of this staircase, characterized by locking in rational microstructures, is of particular importance for biological applications, where its structure may serve as an explanation of the robustness of stress-driven segmentation. This article is part of the themed issue 'Patterning through instabilities in complex media: theory and applications.'

  19. Analytical applications of aptamers

    Science.gov (United States)

    Tombelli, S.; Minunni, M.; Mascini, M.

    2007-05-01

    Aptamers are single stranded DNA or RNA ligands which can be selected for different targets starting from a library of molecules containing randomly created sequences. Aptamers have been selected to bind very different targets, from proteins to small organic dyes. Aptamers are proposed as alternatives to antibodies as biorecognition elements in analytical devices with ever increasing frequency, in order to satisfy the demand for quick, cheap, simple and highly reproducible analytical devices, especially for protein detection in the medical field or for the detection of smaller molecules in environmental and food analysis. In our recent experience, DNA and RNA aptamers, specific for three different proteins (Tat, IgE and thrombin), have been exploited as bio-recognition elements to develop specific biosensors (aptasensors). These recognition elements have been coupled to piezoelectric quartz crystals and surface plasmon resonance (SPR) devices as transducers, where the aptamers have been immobilized on the gold surface of the crystal electrodes or on SPR chips, respectively.

  20. Identifying uniformly mutated segments within repeats.

    Science.gov (United States)

    Sahinalp, S Cenk; Eichler, Evan; Goldberg, Paul; Berenbrink, Petra; Friedetzky, Tom; Ergun, Funda

    2004-12-01

    Given a long string of characters from a constant size alphabet we present an algorithm to determine whether its characters have been generated by a single i.i.d. random source. More specifically, consider all possible n-coin models for generating a binary string S, where each bit of S is generated via an independent toss of one of the n coins in the model. The choice of which coin to toss is decided by a random walk on the set of coins where the probability of a coin change is much lower than the probability of using the same coin repeatedly. We present a procedure to evaluate the likelihood of an n-coin model for a given S, subject to a uniform prior distribution over the parameters of the model (which represent mutation rates and probabilities of copying events). In the absence of detailed prior knowledge of these parameters, the algorithm can be used to determine whether the a posteriori probability for n=1 is higher than for any other n>1. Our algorithm runs in time O(l^4 log l), where l is the length of S, through a dynamic programming approach which exploits the assumed convexity of the a posteriori probability for n. Our test can be used in the analysis of long alignments between pairs of genomic sequences in a number of ways. For example, functional regions in genome sequences exhibit much lower mutation rates than non-functional regions. Because our test provides a means for determining variations in the mutation rate, it may be used to distinguish functional regions from non-functional ones. Another application is in determining whether two highly similar, thus evolutionarily related, genome segments are the result of a single copy event or of a complex series of copy events. This is particularly an issue in evolutionary studies of genome regions rich in repeat segments (especially tandemly repeated segments).
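
    As a greatly simplified illustration of the single-source test, the sketch below compares the Bayesian evidence of a one-coin i.i.d. model for a binary string against the best two-segment model (one change point, independent coins per segment), each with a uniform prior on the coin biases, using the closed-form Beta-Bernoulli integral. It is not the paper's full O(l^4 log l) dynamic program over the random-walk coin-change model.

    # Simplified one-coin vs. best two-segment model comparison (not the paper's DP).
    from math import lgamma

    def log_evidence_single(bits):
        """log P(bits | one coin, uniform prior on its bias) = log Beta(k+1, m+1)."""
        k = sum(bits)               # number of ones
        m = len(bits) - k           # number of zeros
        return lgamma(k + 1) + lgamma(m + 1) - lgamma(len(bits) + 2)

    def log_evidence_best_split(bits):
        """Best log evidence over all two-segment splits with independent coins."""
        return max(log_evidence_single(bits[:i]) + log_evidence_single(bits[i:])
                   for i in range(1, len(bits)))

    s = [1, 1, 1, 0, 1, 1, 0, 1] + [0, 0, 1, 0, 0, 0, 0, 1]   # biased then anti-biased
    one = log_evidence_single(s)
    two = log_evidence_best_split(s)
    print(f"log evidence: 1 coin = {one:.2f}, best 2-segment = {two:.2f}")
    # Here the two-segment model wins, suggesting the string is not uniformly generated.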

  1. Inorganic Analytical Chemistry

    DEFF Research Database (Denmark)

    Berg, Rolf W.

    The book is a treatise on inorganic analytical reactions in aqueous solution. It covers about half of the elements in the periodic table, i.e. the most important ones : H, Li, B, C, N, O, Na, Mg, Al, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Co, Ni, Cu, Zn, As, Se, Br, Sr, Mo, Ag, Cd, Sn, Sb, I, Ba, W,...

  2. Social Data Analytics Tool

    DEFF Research Database (Denmark)

    Hussain, Abid; Vatrapu, Ravi

    2014-01-01

    This paper presents the design, development and demonstrative case studies of the Social Data Analytics Tool, SODATO. Adopting the Action Design Framework [1], the objective of SODATO [2] is to collect, store, analyze, and report big social data emanating from the social media engagement of, and social media conversations about, organizations. We report and discuss results from two demonstrative case studies that were conducted using SODATO and conclude with implications and future work.

  3. Analytical strategies for phosphoproteomics

    DEFF Research Database (Denmark)

    Thingholm, Tine E; Jensen, Ole N; Larsen, Martin R

    2009-01-01

    Protein phosphorylation is a key regulator of cellular signaling pathways. It is involved in most cellular events in which the complex interplay between protein kinases and protein phosphatases strictly controls biological processes such as proliferation, differentiation, and apoptosis. Defective...... sensitive and specific strategies. Today, most phosphoproteomic studies are conducted by mass spectrometric strategies in combination with phospho-specific enrichment methods. This review presents an overview of different analytical strategies for the characterization of phosphoproteins. Emphasis...

  4. Teaching Analytical Skills

    OpenAIRE

    Anderson, John E.

    1981-01-01

    The maintenance of individual practice quality requires that the family physician continually evaluate and improve his performance, selectively using new information. This means that the physician must possess certain basic analytical abilities. The minimum skills necessary for critical appraisal are outlined in the CFPC's educational objectives. These objectives are used as a background to discuss curriculum content and teaching methods. Despite obstacles, there is a growing stimulus to expa...

  5. Doing social media analytics

    OpenAIRE

    Brooker, P; Barnett, J; Cribbin, TF

    2016-01-01

    'The era of Big Data has begun' (boyd and Crawford, 2012: 662). In the few years since this statement, social media analytics has begun to accumulate studies drawing on social media as a resource and tool for research work. Yet, there has been relatively little attention paid to the development of methodologies for handling this kind of data. The few works that exist in this area often reflect upon the implications of 'grand' social science methodological concepts for new social media researc...

  6. Analytical and physical electrochemistry

    CERN Document Server

    Girault, Hubert H

    2004-01-01

    The study of electrochemistry is pertinent to a wide variety of fields, including bioenergetics, environmental sciences, and engineering sciences. In addition, electrochemistry plays a fundamental role in specific applications as diverse as the conversion and storage of energy and the sequencing of DNA.Intended both as a basic course for undergraduate students and as a reference work for graduates and researchers, Analytical and Physical Electrochemistry covers two fundamental aspects of electrochemistry: electrochemistry in solution and interfacial electrochemistry. By bringing these two subj

  7. Digital collection of photographic surveys of beach profiles and animals taken as part of the Beach Watch program at Dillon Beach (segment 1-10), California from 1996-04-14 to 1996-10-25 (NCEI Accession 0071545)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — NOAA's Gulf of the Farallones National Marine Sanctuary (GFNMS) Beach Watch Program, administered by the Farallones Marine Sanctuary Association (FMSA), is a...

  8. Digital collection of photographic surveys of beach profiles and animals taken as part of the Beach Watch program at Doran Beach (segment 1-06), California from 1997-12-27 to 1998-12-11 (NODC Accession 0071352)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — NOAA's Gulf of the Farallones National Marine Sanctuary (GFNMS) Beach Watch Program, administered by the Farallones Marine Sanctuary Association (FMSA), is a...

  9. Measuring Data Quality in Analytical Projects

    Directory of Open Access Journals (Sweden)

    Anca Ioana ANDREESCU

    2014-05-01

    Full Text Available Measuring and assuring data quality in analytical projects are very important issues, and overlooking them may have serious consequences for the efficiency of organizations. Data profiling and data cleaning are two essential activities in a data quality process, along with data integration, enrichment and monitoring. Data warehouses require and provide extensive support for data cleaning: they continuously load and renew huge amounts of data from a variety of sources, so the probability that some of the sources contain "dirty data" is high. Analytics tools also offer, to some extent, facilities for assessing and assuring data quality, either as built-in support or through their proprietary programming languages. This paper emphasizes the scope and relevance of data quality measurement in analytical projects by means of two intensively used tools, Oracle Warehouse Builder and SAS 9.3.
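
    The profiling and cleaning activities described above can also be scripted outside dedicated tools such as Oracle Warehouse Builder or SAS. A minimal Python/pandas sketch of per-column profiling and basic cleaning (the column names and toy data are illustrative assumptions, not taken from the paper):

    import pandas as pd

    def profile(df: pd.DataFrame) -> pd.DataFrame:
        """Return a simple per-column data-quality profile."""
        return pd.DataFrame({
            "dtype": df.dtypes.astype(str),
            "null_ratio": df.isna().mean(),           # completeness
            "distinct": df.nunique(dropna=True),      # cardinality / uniqueness
            "duplicated_rows": [df.duplicated().sum()] * df.shape[1],
        })

    def basic_cleaning(df: pd.DataFrame) -> pd.DataFrame:
        """Illustrative cleaning: drop exact duplicates, trim string columns."""
        out = df.drop_duplicates().copy()
        for col in out.select_dtypes(include="object"):
            out[col] = out[col].str.strip()
        return out

    if __name__ == "__main__":
        # Hypothetical toy data, only to show the data flow.
        df = pd.DataFrame({"customer": [" a", "b", "b", None], "amount": [10, 20, 20, 5]})
        print(profile(df))
        print(basic_cleaning(df))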

  10. Supramolecular analytical chemistry.

    Science.gov (United States)

    Anslyn, Eric V

    2007-02-02

    A large fraction of the field of supramolecular chemistry has focused in previous decades upon the study and use of synthetic receptors as a means of mimicking natural receptors. Recently, the demand for synthetic receptors is rapidly increasing within the analytical sciences. These classes of receptors are finding uses in simple indicator chemistry, cellular imaging, and enantiomeric excess analysis, while also being involved in various truly practical assays of bodily fluids. Moreover, one of the most promising areas for the use of synthetic receptors is in the arena of differential sensing. Although many synthetic receptors have been shown to yield exquisite selectivities, in general, this class of receptor suffers from cross-reactivities. Yet, cross-reactivity is an attribute that is crucial to the success of differential sensing schemes. Therefore, both selective and nonselective synthetic receptors are finding uses in analytical applications. Hence, a field of chemistry that herein is entitled "Supramolecular Analytical Chemistry" is emerging, and is predicted to undergo increasingly rapid growth in the near future.

  11. Dictionary Based Segmentation in Volumes

    DEFF Research Database (Denmark)

    Emerson, Monica Jane; Jespersen, Kristine Munk; Jørgensen, Peter Stanley

    Method for supervised segmentation of volumetric data. The method is trained from manual annotations, and these annotations make the method very flexible, which we demonstrate in our experiments. Our method infers label information locally by matching the pattern in a neighborhood around a voxel to a dictionary, and hereby accounts for the volume texture.

  12. Segmental Colitis Complicating Diverticular Disease

    Directory of Open Access Journals (Sweden)

    Guido Ma Van Rosendaal

    1996-01-01

    Full Text Available Two cases of idiopathic colitis affecting the sigmoid colon in elderly patients with underlying diverticulosis are presented. Segmental resection has permitted close review of the histopathology in this syndrome which demonstrates considerable similarity to changes seen in idiopathic ulcerative colitis. The reported experience with this syndrome and its clinical features are reviewed.

  13. Leaf segmentation in plant phenotyping

    NARCIS (Netherlands)

    Scharr, Hanno; Minervini, Massimo; French, Andrew P.; Klukas, Christian; Kramer, David M.; Liu, Xiaoming; Luengo, Imanol; Pape, Jean Michel; Polder, Gerrit; Vukadinovic, Danijela; Yin, Xi; Tsaftaris, Sotirios A.

    2016-01-01

    Image-based plant phenotyping is a growing application area of computer vision in agriculture. A key task is the segmentation of all individual leaves in images. Here we focus on the most common rosette model plants, Arabidopsis and young tobacco. Although leaves do share appearance and shape

  14. Body segments and growth hormone.

    OpenAIRE

    Bundak, R; Hindmarsh, P C; Brook, C G

    1988-01-01

    The effects of human growth hormone treatment for five years on sitting height and subischial leg length of 35 prepubertal children with isolated growth hormone deficiency were investigated. Body segments reacted equally to treatment with human growth hormone; this is important when comparing the effect of growth hormone on the growth of children with skeletal dysplasias or after spinal irradiation.

  15. Keratoplasty following anterior segment trauma.

    Science.gov (United States)

    Robinson, L P

    1981-02-01

    This paper reports and analyses 20 keratoplasties with or without anterior segment reconstruction carried out for penetrating injuries of the anterior segment. The results show that 80% clear grafts were achieved and 65% of eyes had vision restored to 6/18 or better. No eyes were lost. The complications were retinal detachment (2 cases), corneal graft rejection (2 cases), glaucoma (4 cases, 2 of them mild and easily controlled), and one case each of amblyopia and retinal folds through the macular area. Eyes that have "quietened" following severe penetrating injuries of the anterior segment should be considered for penetrating keratoplasty and anterior segment reconstruction if they retain normal intraocular pressures and have vision of at least accurate projection of light in all quadrants. As well as achieving clear grafts and improvement of vision as above, all eyes had a better cosmetic appearance. Two eyes had an ipsilateral rotational autokeratoplasty. This technique has a role to play when central scarring can be rotated to the periphery, provided sufficient undamaged cornea remains and interference with angle structures can be minimised.

  16. Inductive Generalization of Analytically Learned Goal Hierarchies

    Science.gov (United States)

    Könik, Tolga; Nejati, Negin; Kuter, Ugur

    We describe a new approach for learning procedural knowledge represented as teleoreactive logic programs using relational behavior traces as input. This representation organizes task decomposition skills hierarchically and associates explicitly defined goals with them. Our approach integrates analytical learning with inductive generalization in order to learn these skills. The analytical component predicts the goal dependencies in a successful solution and generates a teleoreactive logic program that can solve similar problems by determining the structure of the skill hierarchy and skill applicability conditions (preconditions), which may be overgeneral. The inductive component experiments with these skills on new problems and uses the data collected in this process to refine the preconditions. Our system achieves this by converting the data collected during the problem solving experiments into the positive and negative examples of preconditions that can be learned with a standard Inductive Logic Programming system. We show that this conversion uses one of the main commitments of teleoreactive logic programs: associating all skills with explicitly defined goals. We claim that our approach uses less expert effort compared to a purely inductive approach and performs better compared to a purely analytical approach.

  17. Business analytics a practitioner's guide

    CERN Document Server

    Saxena, Rahul

    2013-01-01

    This book provides a guide to businesses on how to use analytics to help drive from ideas to execution. Analytics used in this way provides "full lifecycle support" for business and helps during all stages of management decision-making and execution.The framework presented in the book enables the effective interplay of business, analytics, and information technology (business intelligence) both to leverage analytics for competitive advantage and to embed the use of business analytics into the business culture. It lays out an approach for analytics, describes the processes used, and provides gu

  18. Conflation of Short Identity-by-Descent Segments Bias Their Inferred Length Distribution

    Directory of Open Access Journals (Sweden)

    Charleston W. K. Chiang

    2016-05-01

    Full Text Available Identity-by-descent (IBD is a fundamental concept in genetics with many applications. In a common definition, two haplotypes are said to share an IBD segment if that segment is inherited from a recent shared common ancestor without intervening recombination. Segments several cM long can be efficiently detected by a number of algorithms using high-density SNP array data from a population sample, and there are currently efforts to detect shorter segments from sequencing. Here, we study a problem of identifiability: because existing approaches detect IBD based on contiguous segments of identity-by-state, inferred long segments of IBD may arise from the conflation of smaller, nearby IBD segments. We quantified this effect using coalescent simulations, finding that significant proportions of inferred segments 1–2 cM long are results of conflations of two or more shorter segments, each at least 0.2 cM or longer, under demographic scenarios typical for modern humans for all programs tested. The impact of such conflation is much smaller for longer (> 2 cM segments. This biases the inferred IBD segment length distribution, and so can affect downstream inferences that depend on the assumption that each segment of IBD derives from a single common ancestor. As an example, we present and analyze an estimator of the de novo mutation rate using IBD segments, and demonstrate that unmodeled conflation leads to underestimates of the ages of the common ancestors on these segments, and hence a significant overestimate of the mutation rate. Understanding the conflation effect in detail will make its correction in future methods more tractable.

  19. APPLICATION SEGMENT ANALYSIS FOR THE DEVELOPMENT STRATEGY OF AN EDUCATIONAL INSTITUTION

    Directory of Open Access Journals (Sweden)

    G. V. Alekseev

    2015-01-01

    Full Text Available Summary. The methods currently used to shape development strategies for educational institutions do not always objectively take into account the mutual influence and succession of the separate structural and organizational blocks of the educational process, in particular work with applicants. The article discusses the possibility of using segment analysis to develop strategies for educational institutions, with the aim of increasing the supply of trained specialists to the labour market of the real sector of the economy. It describes how the choice of marketing methods can be formalized within the framework of stochastic programming, regarded as a branch of fuzzy logic, which generalizes classical set theory and classical formal logic. The main reason for using this approach is the presence of vague and approximate statements in descriptions of applicants' preferences, of the quality of education and, consequently, of the mission of the educational institution. The fuzzy approach to modelling complex systems, which has gained recognition worldwide, substantially helps to solve these problems by capturing the most important factors and methods for determining the value of a balanced marketing approach on the basis of segment analysis and basic expert estimates, for which a corresponding computer program implementing these approaches has been developed.

  20. Analyzing Array Manipulating Programs by Program Transformation

    Science.gov (United States)

    Cornish, J. Robert M.; Gange, Graeme; Navas, Jorge A.; Schachte, Peter; Sondergaard, Harald; Stuckey, Peter J.

    2014-01-01

    We explore a transformational approach to the problem of verifying simple array-manipulating programs. Traditionally, verification of such programs requires intricate analysis machinery to reason with universally quantified statements about symbolic array segments, such as "every data item stored in the segment A[i] to A[j] is equal to the corresponding item stored in the segment B[i] to B[j]." We define a simple abstract machine which allows for set-valued variables and we show how to translate programs with array operations to array-free code for this machine. For the purpose of program analysis, the translated program remains faithful to the semantics of array manipulation. Based on our implementation in LLVM, we evaluate the approach with respect to its ability to extract useful invariants and the cost in terms of code size.
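
    As a rough illustration of the underlying idea (not the paper's actual LLVM-based translation), the effect of a loop on an array segment can be summarized by a set-valued variable, so that the universally quantified property quoted above reduces to a subset check. A hypothetical sketch:

    # Concrete program: copy A[i..j] into B[i..j].
    def copy_segment(A, B, i, j):
        for k in range(i, j + 1):
            B[k] = A[k]

    # Array-free abstraction (illustrative only): each segment is modelled by the
    # *set* of values it may contain, so the quantified fact "every item of
    # B[i..j] comes from A[i..j]" becomes a subset check on set-valued variables.
    def copy_segment_abstract(A_seg_values: set) -> set:
        B_seg_values = set(A_seg_values)   # abstract effect of the copy loop
        return B_seg_values

    if __name__ == "__main__":
        A = [3, 1, 4, 1, 5, 9]
        B = [0] * len(A)
        copy_segment(A, B, 1, 3)
        assert set(B[1:4]) <= set(A[1:4])                          # concrete check
        assert copy_segment_abstract(set(A[1:4])) <= set(A[1:4])   # abstract check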

  1. Competition between influenza A virus genome segments.

    Directory of Open Access Journals (Sweden)

    Ivy Widjaja

    Full Text Available Influenza A virus (IAV contains a segmented negative-strand RNA genome. How IAV balances the replication and transcription of its multiple genome segments is not understood. We developed a dual competition assay based on the co-transfection of firefly or Gaussia luciferase-encoding genome segments together with plasmids encoding IAV polymerase subunits and nucleoprotein. At limiting amounts of polymerase subunits, expression of the firefly luciferase segment was negatively affected by the presence of its Gaussia luciferase counterpart, indicative of competition between reporter genome segments. This competition could be relieved by increasing or decreasing the relative amounts of firefly or Gaussia reporter segment, respectively. The balance between the luciferase expression levels was also affected by the identity of the untranslated regions (UTRs as well as segment length. In general it appeared that genome segments displaying inherent higher expression levels were more efficient competitors of another segment. When natural genome segments were tested for their ability to suppress reporter gene expression, shorter genome segments generally reduced firefly luciferase expression to a larger extent, with the M and NS segments having the largest effect. The balance between different reporter segments was most dramatically affected by the introduction of UTR panhandle-stabilizing mutations. Furthermore, only reporter genome segments carrying these mutations were able to efficiently compete with the natural genome segments in infected cells. Our data indicate that IAV genome segments compete for available polymerases. Competition is affected by segment length, coding region, and UTRs. This competition is probably most apparent early during infection, when limiting amounts of polymerases are present, and may contribute to the regulation of segment-specific replication and transcription.

  2. Big Data Analytics for Demand Response: Clustering Over Space and Time

    Energy Technology Data Exchange (ETDEWEB)

    Chelmis, Charalampos [Univ. of Southern California, Los Angeles, CA (United States); Kolte, Jahanvi [Nirma Univ., Gujarat (India); Prasanna, Viktor K. [Univ. of Southern California, Los Angeles, CA (United States)

    2015-10-29

    The pervasive deployment of advanced sensing infrastructure in Cyber-Physical systems, such as the Smart Grid, has resulted in an unprecedented data explosion. Such data exhibit both large volumes and high velocity, two of the three pillars of Big Data, and have a time-series notion, as datasets in this context typically consist of successive measurements made over a time interval. Time-series data can be valuable for data mining and analytics tasks such as identifying the "right" customers among a diverse population to target for Demand Response programs. However, time series are challenging to mine due to their high dimensionality. In this paper, we motivate this problem using a real application from the smart grid domain. We explore novel representations of time-series data for Big Data analytics, and propose a clustering technique for determining a natural segmentation of customers and identifying temporal consumption patterns. Our method is generalizable to large-scale, real-world scenarios without making any assumptions about the data. We evaluate our technique using real datasets from smart meters, totaling ~18,200,000 data points, and show its efficacy in efficiently detecting the optimal number of clusters.
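
    A minimal sketch of this kind of consumption clustering, using scikit-learn k-means on daily load profiles with a silhouette score to choose the number of clusters (the data shapes and parameter values are assumptions, not the authors' pipeline):

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.metrics import silhouette_score
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    # Assumed input: one row per customer, 24 hourly consumption values (kWh).
    profiles = rng.gamma(shape=2.0, scale=1.0, size=(500, 24))

    X = StandardScaler().fit_transform(profiles)

    best_k, best_score = None, -1.0
    for k in range(2, 9):
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
        score = silhouette_score(X, labels)
        if score > best_score:
            best_k, best_score = k, score

    print(f"chosen k = {best_k} (silhouette = {best_score:.3f})")
    segments = KMeans(n_clusters=best_k, n_init=10, random_state=0).fit_predict(X)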

  3. Developing Healthcare Data Analytics APPs with Open Data Science Tools.

    Science.gov (United States)

    Hao, Bibo; Sun, Wen; Yu, Yiqin; Xie, Guotong

    2017-01-01

    Recent advances in big data analytics provide more flexible, efficient, and open tools for researchers to gain insight from healthcare data. While many of these tools require researchers to write programs in languages such as Python or R, programming is not a skill set that many researchers in the healthcare data analytics area have mastered. To make data science more approachable, we explored existing tools and developed a practice that helps data scientists convert existing analytics pipelines into user-friendly analytics APPs with rich interactions and real-time analysis features. With this practice, data scientists can develop customized analytics pipelines as APPs in Jupyter Notebook and disseminate them to other researchers easily, and researchers can use the shared notebooks to perform analysis tasks or reproduce research results much more easily.
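
    One common way to turn a notebook pipeline into a small interactive APP is Jupyter's ipywidgets. A minimal sketch, where the load_cohort data source and its columns are hypothetical stand-ins for real healthcare data:

    # Run inside a Jupyter Notebook.
    import pandas as pd
    import ipywidgets as widgets

    def load_cohort() -> pd.DataFrame:
        # Hypothetical stand-in for a real healthcare data source.
        return pd.DataFrame({
            "age": [34, 51, 67, 45, 72, 58],
            "readmitted": [0, 1, 1, 0, 1, 0],
        })

    df = load_cohort()

    def readmission_rate(min_age: int = 40):
        cohort = df[df["age"] >= min_age]
        print(f"n = {len(cohort)}, readmission rate = {cohort['readmitted'].mean():.2f}")

    # interact() renders a slider and re-runs the analysis on every change.
    widgets.interact(readmission_rate, min_age=widgets.IntSlider(min=18, max=90, value=40))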

  4. Analytical Chemistry Laboratory progress report for FY 1985

    Energy Technology Data Exchange (ETDEWEB)

    Green, D.W.; Heinrich, R.R.; Jensen, K.J.

    1985-12-01

    The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of technical support services to the scientific and engineering programs at ANL. In addition, ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques. The purpose of this report is to summarize the technical and administrative activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year 1985 (October 1984 through September 1985). This is the second annual report for the ACL. 4 figs., 1 tab.

  5. Stabilized wave segments in an excitable medium with a phase wave at the wave back

    Science.gov (United States)

    Zykov, V. S.; Bodenschatz, E.

    2014-04-01

    The propagation velocity and the shape of a stationary propagating wave segment are determined analytically for excitable media supporting excitation waves with trigger fronts and phase backs. The general relationships between the medium's excitability and the wave segment parameters are obtained in the framework of the free boundary approach under quite usual assumptions. Two universal limits restricting the region of existence of stabilized wave segments are found. The comparison of the analytical results with numerical simulations of the well-known Kessler-Levine model demonstrates their good quantitative agreement. The findings should be applicable to a wide class of systems, such as the propagation of electrical waves in the cardiac muscle or wave propagation in autocatalytic chemical reactions, due to the generality of the free-boundary approach used.

  6. Of the Analytical Engine

    Indian Academy of Sciences (India)

    idea of a program library for common functions. He also understood the space-time tradeoff in programming. He was way ahead of his time as the mechanical systems of the day did not have enough precision to implement his ideas. It was also very expensive, with the result that his proposals remained unimplemented and later.

  7. Mars Analytical Microimager

    Science.gov (United States)

    Batory, Krzysztof J.; Govindjee; Andersen, Dale; Presley, John; Lucas, John M.; Sears, S. Kelly; Vali, Hojatollah

    Unambiguous detection of extraterrestrial nitrogenous hydrocarbon microbiology requires an instrument both to recognize potential biogenic specimens and to successfully discriminate them from geochemical settings. Such detection should ideally be in-situ and not jeopardize other experiments by altering samples. Taken individually most biomarkers are inconclusive. For example, since amino acids can be synthesized abiotically they are not always considered reliable biomarkers. An enantiomeric imbalance, which is characteristic of all terrestrial life, may be questioned because chirality can also be altered abiotically. However, current scientific understanding holds that aggregates of identical proteins or proteinaceous complexes, with their well-defined amino acid residue sequences, are indisputable biomarkers. Our paper describes the Mars Analytical Microimager, an instrument for the simultaneous imaging of generic autofluorescent biomarkers and overall morphology. Autofluorescence from ultraviolet to near-infrared is emitted by all known terrestrial biology, and often as consistent complex bands uncharacteristic of abiotic mineral luminescence. The MAM acquires morphology, and even sub-micron morphogenesis, at a 3-centimeter working distance with resolution approaching a laser scanning microscope. Luminescence is simultaneously collected via a 2.5-micron aperture, thereby permitting accurate correlation of multi-dimensional optical behavior with specimen morphology. A variable wavelength excitation source and photospectrometer serve to obtain steady-state and excitation spectra of biotic and luminescent abiotic sources. We believe this is the first time instrumentation for detecting hydrated or desiccated microbiology non-destructively in-situ has been demonstrated. We have obtained excellent preliminary detection of biota and inorganic matrix discrimination from terrestrial polar analogues, and perimetric morphology of individual magnetotactic bacteria. Proposed

  8. Elements of analytical dynamics

    CERN Document Server

    Kurth, Rudolph; Stark, M

    1976-01-01

    Elements of Analytical Dynamics deals with dynamics, which studies the relationship between motion of material bodies and the forces acting on them. This book is a compilation of lectures given by the author at the Georgia Institute of Technology and formed a part of a course in Topological Dynamics. The book begins by discussing the notions of space and time and their basic properties. It then discusses the Hamilton-Jacobi theory and Hamilton's principle and first integrals. The text concludes with a discussion on Jacobi's geometric interpretation of conservative systems. This book will

  9. Analytical elements of mechanics

    CERN Document Server

    Kane, Thomas R

    2013-01-01

    Analytical Elements of Mechanics, Volume 1, is the first of two volumes intended for use in courses in classical mechanics. The books aim to provide students and teachers with a text consistent in content and format with the author's ideas regarding the subject matter and teaching of mechanics, and to disseminate these ideas. The book opens with a detailed exposition of vector algebra, and no prior knowledge of this subject is required. This is followed by a chapter on the topic of mass centers, which is presented as a logical extension of concepts introduced in connection with centroids. A

  10. Local analytic geometry

    CERN Document Server

    Abhyankar, Shreeram Shankar

    1964-01-01

    This book provides, for use in a graduate course or for self-study by graduate students, a well-motivated treatment of several topics, especially the following: (1) algebraic treatment of several complex variables; (2) geometric approach to algebraic geometry via analytic sets; (3) survey of local algebra; (4) survey of sheaf theory. The book has been written in the spirit of Weierstrass. Power series play the dominant role. The treatment, being algebraic, is not restricted to complex numbers, but remains valid over any complete-valued field. This makes it applicable to situations arising from

  11. Analytic aspects of convexity

    CERN Document Server

    Colesanti, Andrea; Gronchi, Paolo

    2018-01-01

    This book presents the proceedings of the international conference Analytic Aspects in Convexity, which was held in Rome in October 2016. It offers a collection of selected articles, written by some of the world’s leading experts in the field of Convex Geometry, on recent developments in this area: theory of valuations; geometric inequalities; affine geometry; and curvature measures. The book will be of interest to a broad readership, from those involved in Convex Geometry, to those focusing on Functional Analysis, Harmonic Analysis, Differential Geometry, or PDEs. The book is addressed to PhD students and researchers interested in Convex Geometry and its links to analysis.

  12. Analytical chemistry in space

    CERN Document Server

    Wainerdi, Richard E

    1970-01-01

    Analytical Chemistry in Space presents an analysis of the chemical constitution of space, particularly the particles in the solar wind, of the planetary atmospheres, and the surfaces of the moon and planets. Topics range from space engineering considerations to solar system atmospheres and recovered extraterrestrial materials. Mass spectroscopy in space exploration is also discussed, along with lunar and planetary surface analysis using neutron inelastic scattering. This book is comprised of seven chapters and opens with a discussion on the possibilities for exploration of the solar system by

  13. Division of Analytical Chemistry, 1998

    DEFF Research Database (Denmark)

    Hansen, Elo Harald

    1999-01-01

    The article recounts the 1998 activities of the Division of Analytical Chemistry (DAC, formerly the Working Party on Analytical Chemistry, WPAC), which is a division of the Federation of European Chemical Societies (FECS). Elo Harald Hansen is the Danish delegate, representing The Danish Chemical Society/The Society for Analytical Chemistry.

  14. Human body segmentation via data-driven graph cut.

    Science.gov (United States)

    Li, Shifeng; Lu, Huchuan; Shao, Xingqing

    2014-11-01

    Human body segmentation is a challenging and important problem in computer vision. Existing methods usually entail a time-consuming training phase for prior knowledge learning, with complex shape matching for body segmentation. In this paper, we propose a data-driven method that integrates top-down body pose information and bottom-up low-level visual cues for segmenting humans in static images within the graph cut framework. The key idea of our approach is to first exploit human kinematics to search for body part candidates via dynamic programming, providing high-level evidence. Body part classifiers are then used to obtain bottom-up cues about the human body distribution as low-level evidence. All the evidence collected from the top-down and bottom-up procedures is integrated in a graph cut framework for human body segmentation. Qualitative and quantitative experimental results demonstrate the merits of the proposed method in segmenting human bodies with arbitrary poses from cluttered backgrounds.
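
    A toy sketch of the graph cut machinery itself, on a four-pixel 1-D example with hand-set unary and pairwise costs (not the paper's pose-driven energy), using networkx's min-cut:

    import networkx as nx

    # Unary costs for a tiny 1-D "image" of 4 pixels: unary[p] = (cost_fg, cost_bg).
    # A high cost_bg means the pixel looks like foreground (e.g. from a body-part classifier).
    unary = {0: (8, 1), 1: (7, 2), 2: (2, 7), 3: (1, 9)}
    smoothness = 3  # pairwise penalty for neighbouring pixels taking different labels

    G = nx.DiGraph()
    for p, (cost_fg, cost_bg) in unary.items():
        G.add_edge("s", p, capacity=cost_bg)  # cut if p ends up labelled background
        G.add_edge(p, "t", capacity=cost_fg)  # cut if p ends up labelled foreground
    for p, q in [(0, 1), (1, 2), (2, 3)]:     # 1-D neighbourhood
        G.add_edge(p, q, capacity=smoothness)
        G.add_edge(q, p, capacity=smoothness)

    cut_value, (source_side, sink_side) = nx.minimum_cut(G, "s", "t")
    foreground = sorted(n for n in source_side if n != "s")
    print("foreground pixels:", foreground, "energy:", cut_value)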

  15. The perisylvian language network and language analytical abilities.

    Science.gov (United States)

    Kepinska, Olga; Lakke, Egbert A J F; Dutton, Eleanor M; Caspers, Johanneke; Schiller, Niels O

    2017-10-01

    Aiming at exploring the brain's structural organisation underlying successful second language learning, we investigate the anatomy of the perisylvian language network in a group of healthy adults, consisting of participants with high and average language analytical abilities. Utilising deterministic tractography, six tracts per participant (left and right long direct segment, left and right indirect anterior segment and left and right indirect posterior segment) were virtually dissected and measurements pertaining to their microstructural organisation were collected. Our results obtained by means of linear discriminant analysis pointed to mean diffusivity (MD) values of three tracts (right anterior, left long and left anterior segments) as best discriminating between the two groups. By far the highest coefficient was obtained for the MD values of the right anterior segment, pointing to the role of the right white matter fronto-parietal connectivity for superior language learning abilities. The results imply the importance of attentional processes and reasoning abilities for successful L2 acquisition, and support previous findings concerning right-hemispheric involvement in language learning. Copyright © 2017 Elsevier Inc. All rights reserved.
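
    The discriminant analysis step can be reproduced in a few lines with scikit-learn; the feature matrix below (mean diffusivity values for six tracts, two groups of participants) is synthetic and purely illustrative:

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    # Columns: MD values of six tracts (e.g. L/R long, anterior, posterior segments).
    X_high = rng.normal(loc=0.78, scale=0.03, size=(20, 6))   # high-aptitude group
    X_avg  = rng.normal(loc=0.80, scale=0.03, size=(20, 6))   # average-aptitude group
    X = np.vstack([X_high, X_avg])
    y = np.array([1] * 20 + [0] * 20)

    lda = LinearDiscriminantAnalysis()
    print("cross-validated accuracy:", cross_val_score(lda, X, y, cv=5).mean())
    lda.fit(X, y)
    print("discriminant coefficients per tract:", lda.coef_.round(3))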

  16. Dictionary Based Segmentation in Volumes

    DEFF Research Database (Denmark)

    Emerson, Monica Jane; Jespersen, Kristine Munk; Jørgensen, Peter Stanley

    2015-01-01

    We present a method for supervised volumetric segmentation based on a dictionary of small cubes composed of pairs of intensity and label cubes. Intensity cubes are small image volumes where each voxel contains an image intensity. Label cubes are volumes with voxelwise probabilities for a given label. The segmentation process is done by matching a cube from the volume, of the same size as the dictionary intensity cubes, to the most similar intensity dictionary cube, and from the associated label cube we get voxel-wise label probabilities. Probabilities from overlapping cubes are averaged, and hereby we obtain a robust label probability encoding. The dictionary is computed from labeled volumetric image data based on weighted clustering. We experimentally demonstrate our method using two data sets from material science – a phantom data set of a solid oxide fuel cell simulation for detecting...
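
    A stripped-down sketch of the matching step: find the nearest intensity cube in the dictionary, take voxel-wise label probabilities from its paired label cube, and average over overlapping cubes. The random dictionary here only illustrates the data flow:

    import numpy as np

    rng = np.random.default_rng(0)
    CUBE = 5                        # cube side length (voxels)
    N_ATOMS, N_LABELS = 50, 2

    # Dictionary: paired intensity cubes and label-probability cubes (random here).
    intensity_atoms = rng.random((N_ATOMS, CUBE, CUBE, CUBE))
    label_atoms = rng.dirichlet(np.ones(N_LABELS), size=(N_ATOMS, CUBE, CUBE, CUBE))

    def label_probabilities(patch: np.ndarray) -> np.ndarray:
        """Return a (CUBE, CUBE, CUBE, N_LABELS) probability cube for one patch."""
        dists = np.linalg.norm((intensity_atoms - patch).reshape(N_ATOMS, -1), axis=1)
        return label_atoms[np.argmin(dists)]

    volume = rng.random((20, 20, 20))
    prob = np.zeros(volume.shape + (N_LABELS,))
    count = np.zeros(volume.shape)

    # Slide the cube over the volume; average probabilities of overlapping cubes.
    for x in range(volume.shape[0] - CUBE + 1):
        for y in range(volume.shape[1] - CUBE + 1):
            for z in range(volume.shape[2] - CUBE + 1):
                patch = volume[x:x+CUBE, y:y+CUBE, z:z+CUBE]
                prob[x:x+CUBE, y:y+CUBE, z:z+CUBE] += label_probabilities(patch)
                count[x:x+CUBE, y:y+CUBE, z:z+CUBE] += 1

    prob /= count[..., None]
    segmentation = prob.argmax(axis=-1)     # voxel-wise label map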

  17. Text Segmentation Using Exponential Models

    CERN Document Server

    Beeferman, Doug; Berger, Adam; Lafferty, John

    1997-01-01

    This paper introduces a new statistical approach to partitioning text automatically into coherent segments. Our approach enlists both short-range and long-range language models to help it sniff out likely sites of topic changes in text. To aid its search, the system consults a set of simple lexical hints it has learned to associate with the presence of boundaries through inspection of a large corpus of annotated data. We also propose a new probabilistically motivated error metric for use by the natural language processing and information retrieval communities, intended to supersede precision and recall for appraising segmentation algorithms. Qualitative assessment of our algorithm as well as evaluation using this new metric demonstrate the effectiveness of our approach in two very different domains, Wall Street Journal articles and the TDT Corpus, a collection of newswire articles and broadcast news transcripts.
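
    The window-based error metric associated with this line of work is often referred to as P_k: slide a window of width k over the text and count disagreements between reference and hypothesis about whether the two window ends fall in the same segment. A small sketch of a common formulation (not necessarily the paper's exact definition), with segmentations given as lists of segment lengths:

    def to_segment_ids(seg_lengths):
        """Turn segment lengths, e.g. [3, 2, 4], into a per-unit segment id list."""
        ids = []
        for seg_id, length in enumerate(seg_lengths):
            ids.extend([seg_id] * length)
        return ids

    def pk(reference, hypothesis, k=None):
        """Fraction of windows of width k on which reference and hypothesis
        disagree about 'same segment vs. different segment'."""
        ref, hyp = to_segment_ids(reference), to_segment_ids(hypothesis)
        assert len(ref) == len(hyp)
        if k is None:  # common choice: half the mean reference segment length
            k = max(2, round(len(ref) / (2 * len(reference))))
        errors, windows = 0, len(ref) - k
        for i in range(windows):
            same_ref = ref[i] == ref[i + k]
            same_hyp = hyp[i] == hyp[i + k]
            errors += same_ref != same_hyp
        return errors / windows

    print(pk([5, 5, 5], [5, 5, 5]))   # 0.0 - perfect segmentation
    print(pk([5, 5, 5], [15]))        # > 0 - missed boundaries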

  18. Smart city analytics

    DEFF Research Database (Denmark)

    Hansen, Casper; Hansen, Christian; Alstrup, Stephen

    2017-01-01

    We present an ensemble learning method that predicts large increases in the hours of home care received by citizens. The method is supervised, and uses different ensembles of either linear (logistic regression) or non-linear (random forests) classifiers. Experiments with data available from 2013 to 2017 for every citizen in Copenhagen receiving home care (27,775 citizens) show that prediction can achieve state of the art performance as reported in similar health related domains (AUC=0.715). We further find that competitive results can be obtained by using limited information for training, which is very useful when full records are not accessible or available. Smart city analytics does not necessarily require full city records. To our knowledge this preliminary study is the first to predict large increases in home care for smart city analytics.
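
    A minimal sketch of this kind of ensemble (logistic regression and random forest combined by soft voting, evaluated by cross-validated AUC); the synthetic data stand in for the citizen records, which are not public:

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier, VotingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # Synthetic stand-in for per-citizen features and a "large increase in home care" label.
    X, y = make_classification(n_samples=2000, n_features=30, weights=[0.9, 0.1], random_state=0)

    ensemble = VotingClassifier(
        estimators=[
            ("lr", make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))),
            ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
        ],
        voting="soft",                   # average predicted probabilities
    )

    auc = cross_val_score(ensemble, X, y, cv=5, scoring="roc_auc").mean()
    print(f"cross-validated AUC: {auc:.3f}")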

  19. The analytic renormalization group

    Directory of Open Access Journals (Sweden)

    Frank Ferrari

    2016-08-01

    Full Text Available Finite temperature Euclidean two-point functions in quantum mechanics or quantum field theory are characterized by a discrete set of Fourier coefficients G_k, k ∈ Z, associated with the Matsubara frequencies ν_k = 2πk/β. We show that analyticity implies that the coefficients G_k must satisfy an infinite number of model-independent linear equations that we write down explicitly. In particular, we construct “Analytic Renormalization Group” linear maps A_μ which, for any choice of cut-off μ, allow one to express the low energy Fourier coefficients for |ν_k| < μ (with the possible exception of the zero mode G_0), together with the real-time correlators and spectral functions, in terms of the high energy Fourier coefficients for |ν_k| ≥ μ. Operating a simple numerical algorithm, we show that the exact universal linear constraints on G_k can be used to systematically improve any random approximate data set obtained, for example, from Monte-Carlo simulations. Our results are illustrated on several explicit examples.

  20. POST-HOC SEGMENTATION USING MARKETING RESEARCH

    National Research Council Canada - National Science Library

    CRISTINEL CONSTANTIN

    2012-01-01

    .... These methods are K-means cluster and TwoStep cluster, which are available in the SPSS system. Such methods could be used in post-hoc market segmentations, which allow companies to find segments with specific behaviours or attitudes...

  1. Ophthalmological manifestations in segmental neurofibromatosis type 1

    OpenAIRE

    Ruggieri, M; PAVONE, P.; Polizzi, A.; Pietro, M. Di; Scuderi, A.; Gabriele, A; Spalice, A.; Iannetti, P

    2004-01-01

    Aims: To study the ophthalmological manifestations in individuals with the typical features of neurofibromatosis type 1 (NF1) circumscribed to one or more body segments, usually referred to as segmental NF1.

  2. Spectral clustering algorithms for ultrasound image segmentation.

    Science.gov (United States)

    Archip, Neculai; Rohling, Robert; Cooperberg, Peter; Tahmasebpour, Hamid; Warfield, Simon K

    2005-01-01

    Image segmentation algorithms derived from spectral clustering analysis rely on the eigenvectors of the Laplacian of a weighted graph obtained from the image. The NCut criterion was previously used for image segmentation in a supervised manner. We derive a new strategy for unsupervised image segmentation. This article describes an initial investigation to determine the suitability of such segmentation techniques for ultrasound images. The extension of the NCut technique to unsupervised clustering is first described. The novel segmentation algorithm is then applied to simulated ultrasound images. Tests are also performed on abdominal and fetal images, with the segmentation results compared to manual segmentation. Comparisons with the classical NCut algorithm are also presented. Finally, segmentation results on other types of medical images are shown.
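
    scikit-learn provides the building blocks for this style of unsupervised spectral segmentation; a minimal sketch on a synthetic image, where the similarity transform and the number of clusters are illustrative choices rather than values from the paper:

    import numpy as np
    from sklearn.feature_extraction import image
    from sklearn.cluster import spectral_clustering

    # Synthetic "ultrasound-like" image: two bright blobs on a noisy background.
    rng = np.random.default_rng(0)
    img = rng.normal(0, 0.15, size=(60, 60))
    yy, xx = np.mgrid[0:60, 0:60]
    img[(yy - 20) ** 2 + (xx - 20) ** 2 < 80] += 1.0
    img[(yy - 42) ** 2 + (xx - 40) ** 2 < 60] += 1.0

    # Build a graph whose edge weights decay with intensity differences,
    # then cluster the eigenvectors of its Laplacian (normalized-cut style).
    graph = image.img_to_graph(img)
    graph.data = np.exp(-graph.data ** 2 / (2 * graph.data.std() ** 2))

    labels = spectral_clustering(graph, n_clusters=3, eigen_solver="arpack", random_state=0)
    segmentation = labels.reshape(img.shape)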

  3. Pectoral muscle segmentation: a review.

    Science.gov (United States)

    Ganesan, Karthikeyan; Acharya, U Rajendra; Chua, Kuang Chua; Min, Lim Choo; Abraham, K Thomas

    2013-04-01

    Mammograms are X-ray images of breasts which are used to detect breast cancer. The pectoral muscle is a mass of tissue on which the breast rests. During routine mammographic screenings, in medio-lateral oblique (MLO) views, the pectoral muscle appears in the mammograms along with the breast tissues. The pectoral muscle has to be segmented from the mammogram for effective automated computer aided diagnosis (CAD), because pectoral muscles have pixel intensities and texture similar to those of breast tissues, which can lead to erroneous CAD results. As a result, a lot of effort has been put into segmenting pectoral muscles and finding their contour with the breast tissues. To the best of our knowledge, there is currently no definitive literature providing a comprehensive review of the state of research in pectoral muscle segmentation. We try to address this shortcoming by providing a comprehensive review of research papers in this area. A conscious effort has been made to avoid deviating into the area of automated breast cancer detection. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  4. Analytical Business Model for Sustainable Distributed Retail Enterprises in a Competitive Market

    National Research Council Canada - National Science Library

    Courage Matobobo; Isaac O Osunmakinde

    2016-01-01

    .... Although some enterprises have implemented classical business models to address these challenging issues, they still lack analytics-based marketing programs to gain a competitive advantage to deal...

  5. Semiautomatic Segmentation of Glioma on Mobile Devices

    OpenAIRE

    Ya-Ping Wu; Yu-Song Lin; Wei-Guo Wu; Cong Yang; Jian-Qin Gu; Yan Bai; Mei-Yun Wang

    2017-01-01

    Brain tumor segmentation is the first and the most critical step in clinical applications of radiomics. However, segmenting brain images by radiologists is labor intense and prone to inter- and intraobserver variability. Stable and reproducible brain image segmentation algorithms are thus important for successful tumor detection in radiomics. In this paper, we propose a supervised brain image segmentation method, especially for magnetic resonance (MR) brain images with glioma. This paper uses...

  6. Review of segmentation process in consumer markets

    OpenAIRE

    Veronika Jadczaková

    2013-01-01

    Although there has been considerable debate on market segmentation over five decades, attention was mostly devoted to single stages of the segmentation process. Stages such as segmentation base selection or segment profiling have been heavily covered in the extant literature, whereas stages such as implementation of the marketing strategy or market definition have received comparably less interest. Capitalizing on this shortcoming, this paper strives to close the gap and provide each step...

  7. Commercial Midstream Energy Efficiency Incentive Programs: Guidelines for Future Program Design, Implementation, and Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Milostan, Catharina [Argonne National Lab. (ANL), Argonne, IL (United States); Levin, Todd [Argonne National Lab. (ANL), Argonne, IL (United States); Muehleisen, Ralph T. [Argonne National Lab. (ANL), Argonne, IL (United States); Guzowski, Leah Bellah B. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-01-01

    Many electric utilities operate energy efficiency incentive programs that encourage increased dissemination and use of energy-efficient (EE) products in their service territories. The programs can be segmented into three broad categories—downstream incentive programs target product end users, midstream programs target product distributors, and upstream programs target product manufacturers. Traditional downstream programs have had difficulty engaging Small Business/Small Portfolio (SBSP) audiences, and an opportunity exists to expand Commercial Midstream Incentive Programs (CMIPs) to reach this market segment instead.

  8. Stability of a fluid-fluid interface in a biconical pore segment.

    Science.gov (United States)

    Hilpert, Markus; Miller, Cass T; Gray, William G

    2003-11-15

    Pore networks that include biconical pore segments are frequently used to model two-phase flow. In this work, we describe in detail the displacement of a fluid-fluid interface in such a pore segment. We assume sharp edges in the throat, inlet, and outlet of the pore segment to be the limiting cases of round edges, the radii of which vanish. We account for interfacial and lineal tensions that cause nonconstant contact angles. For zero lineal tension, we provide analytical solutions for flow induced by changing infinitesimally slowly either capillary pressure or the volume of one fluid. In diverging and converging cones, the common line among the two fluids and the solid phase slides while it is pinned in the throat, inlet, and outlet. We observe hysteresis within the pore segment, and drainage entry pressures deviate from prior work.

  9. Automatic segmentation of diatom images for classification

    NARCIS (Netherlands)

    Jalba, Andrei C.; Wilkinson, Michael H.F.; Roerdink, Jos B.T.M.

    A general framework for automatic segmentation of diatom images is presented. This segmentation is a critical first step in contour-based methods for automatic identification of diatoms by computerized image analysis. We review existing results, adapt popular segmentation methods to this difficult

  10. Melt spinnable elastane fibres from segmented copolyetheresteramids

    NARCIS (Netherlands)

    Niesten, M.C.E.J.; Krijgsman, J.; Gaymans, R.J.

    2001-01-01

    Spandex fibers were obtained by melt spinning segmented copolyetheresteramides with crystallizable aromatic diamide units of uniform length and poly(tetramethyleneoxide) segments. The aramid content was varied from 3 to 22 wt %, and the molecular weight of the polyether segment ranged from 1000 to

  11. Automatic Segmentation of Spanish Speech Into Syllables

    OpenAIRE

    Mariño Acebal, José Bernardo

    1989-01-01

    This paper presents an algorithm that provides a syllabic segmentation of speech following the syllabification rules of the Spanish language. The implemented algorithm is divided into two parts. First, an initial segmentation is made based on energy contour, sonority and duration. Second, a fine adjustment of syllable boundaries and the final segmentation are made by applying syllabic rules.

  12. Mora or syllable? Speech segmentation in Japanese

    NARCIS (Netherlands)

    Otake, T.; Hatano, G.; Cutler, A.; Mehler, J.

    1993-01-01

    Four experiments examined segmentation of spoken Japanese words by native and non-native listeners. Previous studies suggested that language rhythm determines the segmentation unit most natural to native listeners: French has syllabic rhythm, and French listeners use the syllable in segmentation,

  13. Peptide segments in protein-protein interfaces

    Indian Academy of Sciences (India)

    2006-09-06

    In 1000 Å² of the interface area, contributed by a polypeptide chain, there would be 3.4 segments in homodimers, 5.6 in complexes and 6.3 in crystal contacts. Concomitantly, the segments are the longest (with 8.7 interface residues) in homodimers. Core segments (likely to contribute more towards binding) ...

  14. Market Segmentation from a Behavioral Perspective

    Science.gov (United States)

    Wells, Victoria K.; Chang, Shing Wan; Oliveira-Castro, Jorge; Pallister, John

    2010-01-01

    A segmentation approach is presented using both traditional demographic segmentation bases (age, social class/occupation, and working status) and a segmentation by benefits sought. The benefits sought in this case are utilitarian and informational reinforcement, variables developed from the Behavioral Perspective Model (BPM). Using data from 1,847…

  15. Quick Dissection of the Segmental Bronchi

    Science.gov (United States)

    Nakajima, Yuji

    2010-01-01

    Knowledge of the three-dimensional anatomy of the bronchopulmonary segments is essential for respiratory medicine. This report describes a quick guide for dissecting the segmental bronchi in formaldehyde-fixed human material. All segmental bronchi are easy to dissect, and thus, this exercise will help medical students to better understand the…

  16. Handwriting segmentation of unconstrained Oriya text

    Indian Academy of Sciences (India)

    Keywords: Indian language; Oriya script; character segmentation; handwriting recognition. Segmentation of handwritten text into lines, words and characters is one of the important steps in the handwritten script recognition process. The task of individual text-line segmentation from unconstrained handwritten documents ...

  17. LIFE-STYLE SEGMENTATION WITH TAILORED INTERVIEWING

    NARCIS (Netherlands)

    KAMAKURA, WA; WEDEL, M

    The authors present a tailored interviewing procedure for life-style segmentation. The procedure assumes that a life-style measurement instrument has been designed. A classification of a sample of consumers into life-style segments is obtained using a latent-class model. With these segments, the

  18. The Process of Marketing Segmentation Strategy Selection

    OpenAIRE

    Ionel Dumitru

    2007-01-01

    The process of marketing segmentation strategy selection represents the essence of strategic marketing. We present hereinafter the main forms of marketing segmentation strategy: undifferentiated marketing, differentiated marketing, concentrated marketing and personalized marketing. In practice, companies use a mix of these marketing segmentation methods in order to maximize profit and to satisfy consumers’ needs.

  19. Analytic device including nanostructures

    KAUST Repository

    Di Fabrizio, Enzo M.

    2015-07-02

    A device for detecting an analyte in a sample comprising: an array including a plurality of pixels, each pixel including a nanochain comprising: a first nanostructure, a second nanostructure, and a third nanostructure, wherein size of the first nanostructure is larger than that of the second nanostructure, and size of the second nanostructure is larger than that of the third nanostructure, and wherein the first nanostructure, the second nanostructure, and the third nanostructure are positioned on a substrate such that when the nanochain is excited by an energy, an optical field between the second nanostructure and the third nanostructure is stronger than an optical field between the first nanostructure and the second nanostructure, wherein the array is configured to receive a sample; and a detector arranged to collect spectral data from a plurality of pixels of the array.

  20. Analytical applications of dithizone

    Energy Technology Data Exchange (ETDEWEB)

    Irving, H.M.N.H.

    1980-01-01

    The organic reagent best known under its common name dithizone was introduced into analytical practice by Hellmuth Fischer just over 50 years ago. By virtue of its thiol group, it can form formally uncharged chelate complexes with a small group of metals (notably Co, Ni, Zn, Pd, Ag, Cd, In, Sn, Pt, Au, Hg, Tl, Pb, and Bi) and organometallic ions (such as R₂Tl⁺, R₃Sn⁺, R₂Pb²⁺, RHg⁺), and since, like the reagent itself, these are intensely colored and very sparingly soluble in water though soluble in chloroform, carbon tetrachloride, and other water-immiscible organic solvents, dithizone lends itself to liquid-liquid extraction procedures and the spectrophotometric determination of trace metals at around the microgram level. With the increasing popularity of atomic absorption spectrophotometry, this technique has tended to supplant spectrophotometry as the preferred finish in quantitative trace-metal determinations, but many other physical procedures are in current use. The ability to preconcentrate certain metals by liquid-liquid extraction of their dithizonates plays an increasing role in environmental analysis, and chromatographic techniques now extend from thin layer chromatography to the use of columns for specific separations. The present review summarizes the basic analytical applications of dithizone that have become well established in the past half-century but highlights the more recent developments through a detailed review of papers published during the last 10 years. Particular attention is paid to the applications of dithizone in preconcentration and separation techniques, in electroanalytical procedures, in substoichiometry and in the design of liquid-membrane ion-selective electrodes. 4 figures, 4 tables.

  1. Surface properties of poly(ethylene oxide)-based segmented block copolymers with monodisperse hard segments

    NARCIS (Netherlands)

    Husken, D.; Feijen, Jan; Gaymans, R.J.

    2009-01-01

    The surface properties of segmented block copolymers based on poly(ethylene oxide) (PEO) segments and monodisperse crystallizable tetra-amide segments were studied. The monodisperse crystallizable segments (T6T6T) were based on terephthalate (T) and hexamethylenediamine (6). Due to the crystallinity

  2. Analytical Chemistry Division annual progress report for period ending December 31, 1990

    Energy Technology Data Exchange (ETDEWEB)

    1991-04-01

    The Analytical Chemistry Division has programs in inorganic mass spectrometry, optical spectroscopy, organic mass spectrometry, and secondary ion mass spectrometry. It maintains a transuranium analytical laboratory and an environmental analytical laboratory. It carries out chemical and physical analysis in the fields of inorganic chemistry, organic spectroscopy, separations and synthesis. (WET)

  3. An analytical procedure to assist decision-making in a government research organization

    Science.gov (United States)

    H. Dean Claxton; Giuseppe Rensi

    1972-01-01

    An analytical procedure to help management decision-making in planning government research is described. The objectives, activities, and restrictions of a government research organization are modeled in a consistent analytical framework. Theory and methodology are drawn from economics and mathematical programming. The major analytical aspects distinguishing research...

  4. Local analytic first integrals of planar analytic differential systems

    Energy Technology Data Exchange (ETDEWEB)

    Colak, Ilker E., E-mail: ilkercolak@mat.uab.cat [Departament de Matemàtiques, Universitat Autònoma de Barcelona, 08193 Bellaterra, Barcelona, Catalonia (Spain); Llibre, Jaume, E-mail: jllibre@mat.uab.cat [Departament de Matemàtiques, Universitat Autònoma de Barcelona, 08193 Bellaterra, Barcelona, Catalonia (Spain); Valls, Claudia, E-mail: cvalls@math.ist.utl.pt [Departamento de Matemática, Instituto Superior Técnico, Universidade Técnica de Lisboa, 1049-001 Lisboa (Portugal)

    2013-06-17

    We study the existence of local analytic first integrals of a class of analytic differential systems in the plane, obtained from Chua's system studied in L.O. Chua (1992, 1995), N.V. Kuznetsov et al. (2011), G.A. Leonov et al. (2012) [6,7,11,13]. The method used can be applied to other analytic differential systems.

  5. Older People's Mobility: Segments, Factors, Trends

    DEFF Research Database (Denmark)

    Haustein, Sonja; Siren, Anu

    2015-01-01

    demographic, health-related, or transport-related factors. This paper reviews these studies and compares the segments of older people that different studies have identified. First, as a result of a systematic comparison, we identified four generic segments: (1) an active car-oriented segment; (2) a car...... people’s travel behaviour. Based on this, we proposed a theoretical model on how the different determinants work together to form the four mobility patterns related to the identified segments. Finally, based on current trends and expectations, we assessed which segments are likely to increase or decrease...

  6. MOVING WINDOW SEGMENTATION FRAMEWORK FOR POINT CLOUDS

    Directory of Open Access Journals (Sweden)

    G. Sithole

    2012-07-01

    Full Text Available As lidar point clouds become larger, streamed processing becomes more attractive. This paper presents a framework for the streamed segmentation of point clouds with the intention of segmenting unstructured point clouds in real-time. The framework is composed of two main components. The first component segments points within a window shifting over the point cloud. The second component stitches the segments within the windows together. In this fashion a point cloud can be streamed through these two components in sequence, thus producing a segmentation. The algorithm has been tested on an airborne lidar point cloud and some results on the performance of the framework are presented.
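
    A rough sketch of the two components (window-wise segmentation, then stitching across the overlap); DBSCAN stands in for whatever per-window segmentation method the framework actually uses, and all parameters are illustrative:

    import numpy as np
    from sklearn.cluster import DBSCAN

    def moving_window_segmentation(points, window=10.0, overlap=2.0, eps=0.8):
        """Segment a point cloud streamed in windows along x, stitching segments
        that share points in the overlap region. Returns one label per point."""
        labels = -np.ones(len(points), dtype=int)
        next_label = 0
        x = points[:, 0]
        start = x.min()
        while start < x.max():
            idx = np.where((x >= start) & (x < start + window + overlap))[0]
            if len(idx):
                local = DBSCAN(eps=eps, min_samples=5).fit_predict(points[idx])
                for lab in np.unique(local[local >= 0]):
                    members = idx[local == lab]
                    prev = labels[members]
                    prev = prev[prev >= 0]
                    if len(prev):                  # stitch: reuse an existing label
                        labels[members] = prev[0]
                    else:                          # new segment
                        labels[members] = next_label
                        next_label += 1
            start += window
        return labels

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        cloud = np.vstack([rng.normal([i * 5.0, 0, 0], 0.5, size=(200, 3)) for i in range(6)])
        print(np.unique(moving_window_segmentation(cloud)))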

  7. Automatic segmentation of vertebrae from radiographs

    DEFF Research Database (Denmark)

    Mysling, Peter; Petersen, Peter Kersten; Nielsen, Mads

    2011-01-01

    Segmentation of vertebral contours is an essential task in the design of automatic tools for vertebral fracture assessment. In this paper, we propose a novel segmentation technique which does not require operator interaction. The proposed technique solves the segmentation problem in a hierarchical...... is constrained by a conditional shape model, based on the variability of the coarse spine location estimates. The technique is evaluated on a data set of manually annotated lumbar radiographs. The results compare favorably to the previous work in automatic vertebra segmentation, in terms of both segmentation...

  8. An Interactive Analytical Chemistry Summer Camp for Middle School Girls

    Science.gov (United States)

    Robbins, Mary E.; Schoenfisch, Mark H.

    2005-01-01

    A summer outreach program, which was implemented for the first time in the summer of 2004, that provided middle school girls with an opportunity to conduct college-level analytical chemistry experiments under the guidance of female graduate students is explained. The program proved beneficial to participants at each level.

  9. Efficient graph-cut tattoo segmentation

    Science.gov (United States)

    Kim, Joonsoo; Parra, Albert; Li, He; Delp, Edward J.

    2015-03-01

    Law enforcement is interested in exploiting tattoos as an information source to identify, track and prevent gang-related crimes. Many tattoo image retrieval systems have been described. In a retrieval system tattoo segmentation is an important step for retrieval accuracy since segmentation removes background information in a tattoo image. Existing segmentation methods do not extract the tattoo very well when the background includes textures and color similar to skin tones. In this paper we describe a tattoo segmentation approach by determining skin pixels in regions near the tattoo. In these regions graph-cut segmentation using a skin color model and a visual saliency map is used to find skin pixels. After segmentation we determine which set of skin pixels are connected with each other that form a closed contour including a tattoo. The regions surrounded by the closed contours are considered tattoo regions. Our method segments tattoos well when the background includes textures and color similar to skin.

  10. Patriot Advanced Capability-3 Missile Segment Enhancement (PAC-3 MSE)

    Science.gov (United States)

    2015-12-01

    Selected Acquisition Report (SAR), RCS: DD-A&T(Q&A)823-492, for the Patriot Advanced Capability-3 Missile Segment Enhancement (PAC-3 MSE), as of the FY 2017 President's Budget. December 2015 SAR, Defense Acquisition Management Information Retrieval (DAMIR), March 21, 2016. UNCLASSIFIED.

  11. Analytical Study of Oxalates Coprecipitation

    Directory of Open Access Journals (Sweden)

    Liana MARTA

    2003-03-01

    Full Text Available The paper deals with establishing the oxalate coprecipitation conditions for the synthesis of superconducting systems. A systematic analytical study of the oxalate precipitation conditions has been performed for obtaining superconducting materials in the Bi-Sr-Ca-Cu-O system. For this purpose, formulae for the solubility of the precipitates as a function of pH and oxalate excess were established. The possible formation of hydroxo-complexes and soluble oxalato-complexes was taken into account. A BASIC program was used for tracing the precipitation curves. The curves of solubility versus pH for different oxalate excesses have been plotted for the four oxalates, using a logarithmic scale. The optimal conditions for quantitative oxalate coprecipitation have been deduced from the diagrams. The theoretical curves were confirmed by experimental results. From the precursors obtained by this method, the BSCCO superconducting phases were obtained by an appropriate thermal treatment. The formation of the superconducting phases was identified by X-ray diffraction analysis.
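
    A minimal sketch of how such solubility-versus-pH curves can be traced is given below, assuming a divalent-metal oxalate MC2O4, a solubility product Ksp, the two dissociation constants of oxalic acid, and a fixed analytical excess of oxalate, while neglecting hydroxo- and oxalato-complex formation. It is not the original BASIC program, and the numerical constants are approximate, illustrative values.

```python
# Sketch: log(solubility) versus pH for MC2O4 in the presence of an oxalate excess.
import numpy as np
import matplotlib.pyplot as plt

KA1, KA2 = 5.6e-2, 6.2e-5          # oxalic acid dissociation constants (approximate)

def solubility(pH, Ksp, c_excess):
    h = 10.0 ** (-pH)
    alpha2 = KA1 * KA2 / (h**2 + KA1 * h + KA1 * KA2)    # fraction of free C2O4^2-
    return Ksp / (alpha2 * c_excess)                      # S = Ksp / [C2O4^2-]

pH = np.linspace(1, 7, 200)
for c_excess in (0.01, 0.05, 0.2):                        # mol/L oxalate excess (illustrative)
    plt.semilogy(pH, solubility(pH, Ksp=2.3e-9, c_excess=c_excess),
                 label=f"excess = {c_excess} M")
plt.xlabel("pH"); plt.ylabel("solubility (mol/L)"); plt.legend(); plt.show()
```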

  12. Heterologous Packaging Signals on Segment 4, but Not Segment 6 or Segment 8, Limit Influenza A Virus Reassortment.

    Science.gov (United States)

    White, Maria C; Steel, John; Lowen, Anice C

    2017-06-01

    Influenza A virus (IAV) RNA packaging signals serve to direct the incorporation of IAV gene segments into virus particles, and this process is thought to be mediated by segment-segment interactions. These packaging signals are segment and strain specific, and as such, they have the potential to impact reassortment outcomes between different IAV strains. Our study aimed to quantify the impact of packaging signal mismatch on IAV reassortment using the human seasonal influenza A/Panama/2007/99 (H3N2) and pandemic influenza A/Netherlands/602/2009 (H1N1) viruses. Focusing on the three most divergent segments, we constructed pairs of viruses that encoded identical proteins but differed in the packaging signal regions on a single segment. We then evaluated the frequency with which segments carrying homologous versus heterologous packaging signals were incorporated into reassortant progeny viruses. We found that, when segment 4 (HA) of coinfecting parental viruses was modified, there was a significant preference for the segment containing matched packaging signals relative to the background of the virus. This preference was apparent even when the homologous HA constituted a minority of the HA segment population available in the cell for packaging. Conversely, when segment 6 (NA) or segment 8 (NS) carried modified packaging signals, there was no significant preference for homologous packaging signals. These data suggest that movement of NA and NS segments between the human H3N2 and H1N1 lineages is unlikely to be restricted by packaging signal mismatch, while movement of the HA segment would be more constrained. Our results indicate that the importance of packaging signals in IAV reassortment is segment dependent.IMPORTANCE Influenza A viruses (IAVs) can exchange genes through reassortment. This process contributes to both the highly diverse population of IAVs found in nature and the formation of novel epidemic and pandemic IAV strains. Our study sought to determine the

  13. A contrario line segment detection

    CERN Document Server

    von Gioi, Rafael Grompone

    2014-01-01

    The reliable detection of low-level image structures is an old and still challenging problem in computer vision. This book leads a detailed tour through the LSD algorithm, a line segment detector designed to be fully automatic. Based on the a contrario framework, the algorithm works efficiently without the need of any parameter tuning. The design criteria are thoroughly explained and the algorithm's good and bad results are illustrated on real and synthetic images. The issues involved, as well as the strategies used, are common to many geometrical structure detection problems and some possible

  14. Interferon Induced Focal Segmental Glomerulosclerosis

    Directory of Open Access Journals (Sweden)

    Yusuf Kayar

    2016-01-01

    Full Text Available Behçet’s disease is an inflammatory disease of unknown etiology which involves recurring oral and genital aphthous ulcers and ocular lesions as well as articular, vascular, and nervous system involvement. Focal segmental glomerulosclerosis (FSGS is usually seen in viral infections, immune deficiency syndrome, sickle cell anemia, and hyperfiltration and secondary to interferon therapy. Here, we present a case of FSGS identified with kidney biopsy in a patient who had been diagnosed with Behçet’s disease and received interferon-alpha treatment for uveitis and presented with acute renal failure and nephrotic syndrome associated with interferon.

  15. Analytical learning and term-rewriting systems

    Science.gov (United States)

    Laird, Philip; Gamble, Evan

    1990-01-01

    Analytical learning is a set of machine learning techniques for revising the representation of a theory based on a small set of examples of that theory. When the representation of the theory is correct and complete but perhaps inefficient, an important objective of such analysis is to improve the computational efficiency of the representation. Several algorithms with this purpose have been suggested, most of which are closely tied to a first order logical language and are variants of goal regression, such as the familiar explanation based generalization (EBG) procedure. But because predicate calculus is a poor representation for some domains, these learning algorithms are extended to apply to other computational models. It is shown that the goal regression technique applies to a large family of programming languages, all based on a kind of term rewriting system. Included in this family are three language families of importance to artificial intelligence: logic programming, such as Prolog; lambda calculus, such as LISP; and combinatorial based languages, such as FP. A new analytical learning algorithm, AL-2, is exhibited that learns from success but is otherwise quite different from EBG. These results suggest that term rewriting systems are a good framework for analytical learning research in general, and that further research should be directed toward developing new techniques.

  16. Analytic definition of spin structure

    Science.gov (United States)

    Avetisyan, Zhirayr; Fang, Yan-Long; Saveliev, Nikolai; Vassiliev, Dmitri

    2017-08-01

    We work on a parallelizable time-orientable Lorentzian 4-manifold and prove that in this case, the notion of spin structure can be equivalently defined in a purely analytic fashion. Our analytic definition relies on the use of the concept of a non-degenerate two-by-two formally self-adjoint first order linear differential operator and gauge transformations of such operators. We also give an analytic definition of spin structure for the 3-dimensional Riemannian case.

  17. Rorty, Pragmatism, and Analytic Philosophy

    Directory of Open Access Journals (Sweden)

    Cheryl Misak

    2013-07-01

    Full Text Available One of Richard Rorty's legacies is to have put a Jamesian version of pragmatism on the contemporary philosophical map. Part of his argument has been that pragmatism and analytic philosophy are set against each other, with pragmatism almost having been killed off by the reigning analytic philosophy. The argument of this paper is that there is a better and more interesting reading of both the history of pragmatism and the history of analytic philosophy.

  18. Analytics for managers with Excel

    CERN Document Server

    Bell, Peter C

    2013-01-01

    Analytics is one of a number of terms which are used to describe a data-driven more scientific approach to management. Ability in analytics is an essential management skill: knowledge of data and analytics helps the manager to analyze decision situations, prevent problem situations from arising, identify new opportunities, and often enables many millions of dollars to be added to the bottom line for the organization.The objective of this book is to introduce analytics from the perspective of the general manager of a corporation. Rather than examine the details or attempt an encyclopaedic revie

  19. Multiphase Image Segmentation Using the Deformable Simplicial Complex Method

    DEFF Research Database (Denmark)

    Dahl, Vedrana Andersen; Christiansen, Asger Nyman; Bærentzen, Jakob Andreas

    2014-01-01

    in image segmentation based on deformable models. We show the benefits of using the deformable simplicial complex method for image segmentation by segmenting an image into a known number of segments characterized by distinct mean pixel intensities....

  20. Analytics for metabolic engineering

    Directory of Open Access Journals (Sweden)

    Christopher J Petzold

    2015-09-01

    Full Text Available Realizing the promise of metabolic engineering has been slowed by challenges related to moving beyond proof-of-concept examples to robust and economically viable systems. Key to advancing metabolic engineering beyond trial-and-error research is access to parts with well-defined performance metrics that can be readily applied in vastly different contexts with predictable effects. As the field now stands, research depends greatly on analytical tools that assay target molecules, transcripts, proteins, and metabolites across different hosts and pathways. Screening technologies yield specific information for many thousands of strain variants while deep omics analysis provide a systems-level view of the cell factory. Efforts focused on a combination of these analyses yield quantitative information of dynamic processes between parts and the host chassis that drive the next engineering steps. Overall, the data generated from these types of assays aid better decision-making at the design and strain construction stages to speed progress in metabolic engineering research.

  1. Normality in Analytical Psychology

    Directory of Open Access Journals (Sweden)

    Steve Myers

    2013-11-01

    Full Text Available Although C.G. Jung’s interest in normality wavered throughout his career, it was one of the areas he identified in later life as worthy of further research. He began his career using a definition of normality which would have been the target of Foucault’s criticism, had Foucault chosen to review Jung’s work. However, Jung then evolved his thinking to a standpoint that was more aligned to Foucault’s own. Thereafter, the post Jungian concept of normality has remained relatively undeveloped by comparison with psychoanalysis and mainstream psychology. Jung’s disjecta membra on the subject suggest that, in contemporary analytical psychology, too much focus is placed on the process of individuation to the neglect of applications that consider collective processes. Also, there is potential for useful research and development into the nature of conflict between individuals and societies, and how normal people typically develop in relation to the spectrum between individuation and collectivity.

  2. Hanford analytical sample projections FY 1998--FY 2002

    Energy Technology Data Exchange (ETDEWEB)

    Joyce, S.M.

    1998-02-12

    Analytical Services projections are compiled for the Hanford site based on inputs from the major programs for the years 1998 through 2002. Projections are categorized by radiation level, protocol, sample matrix and program. Analyses requirements are also presented. This document summarizes the Hanford sample projections for fiscal years 1998 to 2002. Sample projections are based on inputs submitted to Analytical Services covering Environmental Restoration, Tank Waste Remediation Systems (TWRS), Solid Waste, Liquid Effluents, Spent Nuclear Fuels, Transition Projects, Site Monitoring, Industrial Hygiene, Analytical Services and miscellaneous Hanford support activities. In addition, details on laboratory scale technology (development) work, Sample Management, and Data Management activities are included. This information will be used by Hanford Analytical Services (HAS) and the Sample Management Working Group (SMWG) to assure that laboratories and resources are available and effectively utilized to meet these documented needs.

  3. Directory of Analytical Methods, Department 1820

    Energy Technology Data Exchange (ETDEWEB)

    Whan, R.E. (ed.)

    1986-01-01

    The Materials Characterization Department performs chemical, physical, and thermophysical analyses in support of programs throughout the Laboratories. The department has a wide variety of techniques and instruments staffed by experienced personnel available for these analyses, and we strive to maintain near state-of-the-art technology by continued updates. We have prepared this Directory of Analytical Methods in order to acquaint you with our capabilities and to help you identify personnel who can assist with your analytical needs. The descriptions of the various capabilities are requester-oriented and have been limited in length and detail. Emphasis has been placed on applications and limitations with notations of estimated analysis time and alternative or related techniques. A short, simplified discussion of underlying principles is also presented along with references if more detail is desired. The contents of this document have been organized in the order: bulk analysis, microanalysis, surface analysis, optical and thermal property measurements.

  4. Multi-product dynamic advertisement planning in a segmented market

    Directory of Open Access Journals (Sweden)

    Aggarwal Sugandha

    2017-01-01

    Full Text Available In this paper, a dynamic multi-objective linear integer programming model is proposed to optimally distribute a firm's advertising budget among multiple products and media in a segmented market. To make the media plan responsive to changes in the market, the distribution is carried out dynamically by dividing the planning horizon into smaller periods. The model incorporates the effect of the previous period's advertising reach on the current period (captured through a retention factor), and it also considers the cross-product effect of simultaneously advertising different products. An application of the model is presented for an insurance firm that markets five different products, using a goal programming approach.
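
    The sketch below shows the underlying allocation idea with an off-the-shelf linear programming solver: spend a fixed budget across (product, medium) pairs to maximise total reach under per-medium caps. It is a single-period, single-objective simplification of the paper's dynamic multi-objective integer model; retention and cross-product effects are omitted, and all numbers are invented.

```python
# Sketch: maximise total advertising reach subject to a budget and media caps.
import numpy as np
from scipy.optimize import linprog

reach_per_dollar = np.array([0.9, 1.4, 0.7, 1.1])   # product A/TV, A/web, B/TV, B/web
budget = 100.0
media_caps = {"TV": 60.0, "web": 70.0}

c = -reach_per_dollar                                 # linprog minimises, so negate reach
A_ub = [np.ones(4),                                   # total spend <= budget
        [1, 0, 1, 0],                                 # TV spend  <= TV cap
        [0, 1, 0, 1]]                                 # web spend <= web cap
b_ub = [budget, media_caps["TV"], media_caps["web"]]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 4, method="highs")
print(res.x, -res.fun)                                # optimal allocation and total reach
```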

  5. Getting started with Greenplum for big data analytics

    CERN Document Server

    Gollapudi, Sunila

    2013-01-01

    Standard tutorial-based approach. "Getting Started with Greenplum for Big Data Analytics" is great for data scientists and data analysts with a basic knowledge of Data Warehousing and Business Intelligence platforms who are new to Big Data and who are looking to get a good grounding in how to use the Greenplum Platform. It's assumed that you will have some experience with database design and programming as well as be familiar with analytics tools like R and Weka.

  6. Organizational and analytical frameworks for diagnostics human capital

    OpenAIRE

    Streltsova, Nadiya

    2015-01-01

    The authors propose an organizational and analytical framework for the diagnosis of human capital, which aims to establish a procedure, that is, a defined order for carrying out the diagnostics. The main goals of organizational and analytical support for the diagnosis of human capital should include the following: carrying out the diagnosis; determining the prospects for the development of human capital on the basis of the diagnosis; and developing programs, strategies, and concepts for the development of human capital based on di...

  7. Analytical Chemistry Laboratory Progress Report for FY 1994

    Energy Technology Data Exchange (ETDEWEB)

    Green, D.W.; Boparai, A.S.; Bowers, D.L. [and others

    1994-12-01

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 1994 (October 1993 through September 1994). This annual report is the eleventh for the ACL and describes continuing effort on projects, work on new projects, and contributions of the ACL staff to various programs at ANL. The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. The ACL also has a research program in analytical chemistry, conducts instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems. Some routine or standard analyses are done, but it is common for the Argonne programs to generate unique problems that require significant development of methods and adaption of techniques to obtain useful analytical data. The ACL has four technical groups -- Chemical Analysis, Instrumental Analysis, Organic Analysis, and Environmental Analysis -- which together include about 45 technical staff members. Talents and interests of staff members cross the group lines, as do many projects within the ACL. The Chemical Analysis Group uses wet- chemical and instrumental methods for elemental, compositional, and isotopic determinations in solid, liquid, and gaseous samples and provides specialized analytical services. Major instruments in this group include an ion chromatograph (IC), an inductively coupled plasma/atomic emission spectrometer (ICP/AES), spectrophotometers, mass spectrometers (including gas-analysis and thermal-ionization mass spectrometers), emission spectrographs, autotitrators, sulfur and carbon determinators, and a kinetic phosphorescence uranium analyzer.

  8. Analytical Chemistry Division's sample transaction system

    Energy Technology Data Exchange (ETDEWEB)

    Stanton, J.S.; Tilson, P.A.

    1980-10-01

    The Analytical Chemistry Division uses the DECsystem-10 computer for a wide range of tasks: sample management, timekeeping, quality assurance, and data calculation. This document describes the features and operating characteristics of many of the computer programs used by the Division. The descriptions are divided into chapters which cover all of the information about one aspect of the Analytical Chemistry Division's computer processing.

  9. Documented Safety Analysis for the B695 Segment

    Energy Technology Data Exchange (ETDEWEB)

    Laycak, D

    2008-09-11

    This Documented Safety Analysis (DSA) was prepared for the Lawrence Livermore National Laboratory (LLNL) Building 695 (B695) Segment of the Decontamination and Waste Treatment Facility (DWTF). The report provides comprehensive information on design and operations, including safety programs and safety structures, systems and components to address the potential process-related hazards, natural phenomena, and external hazards that can affect the public, facility workers, and the environment. Consideration is given to all modes of operation, including the potential for both equipment failure and human error. The facilities known collectively as the DWTF are used by LLNL's Radioactive and Hazardous Waste Management (RHWM) Division to store and treat regulated wastes generated at LLNL. RHWM generally processes low-level radioactive waste with no, or extremely low, concentrations of transuranics (e.g., much less than 100 nCi/g). Wastes processed often contain only depleted uranium and beta- and gamma-emitting nuclides, e.g., {sup 90}Sr, {sup 137}Cs, or {sup 3}H. The mission of the B695 Segment centers on container storage, lab-packing, repacking, overpacking, bulking, sampling, waste transfer, and waste treatment. The B695 Segment is used for storage of radioactive waste (including transuranic and low-level), hazardous, nonhazardous, mixed, and other waste. Storage of hazardous and mixed waste in B695 Segment facilities is in compliance with the Resource Conservation and Recovery Act (RCRA). LLNL is operated by the Lawrence Livermore National Security, LLC, for the Department of Energy (DOE). The B695 Segment is operated by the RHWM Division of LLNL. Many operations in the B695 Segment are performed under a Resource Conservation and Recovery Act (RCRA) operation plan, similar to commercial treatment operations with best demonstrated available technologies. The buildings of the B695 Segment were designed and built considering such operations, using proven building

  10. Hanford analytical sample projections FY 1996 - FY 2001. Revision 4

    Energy Technology Data Exchange (ETDEWEB)

    Joyce, S.M.

    1997-07-02

    This document summarizes the biannual Hanford sample projections for fiscal year 1997-2001. Sample projections are based on inputs submitted to Analytical Services covering Environmental Restoration, Tank Wastes Remediation Systems, Solid Wastes, Liquid Effluents, Spent Nuclear Fuels, Transition Projects, Site Monitoring, Industrial Hygiene, Analytical Services and miscellaneous Hanford support activities. In addition to this revision, details on Laboratory scale technology (development), Sample management, and Data management activities were requested. This information will be used by the Hanford Analytical Services program and the Sample Management Working Group to assure that laboratories and resources are available and effectively utilized to meet these documented needs.

  11. Why Construct Analytical Models Of Laser Welding?

    Science.gov (United States)

    Dowden, John

    2008-09-01

    Much attention is given these days to the computational mathematical modelling of industrial processes in materials science. It is usually referred to, perhaps ambiguously, as mathematical modelling. Its value is obvious—once the initial outlay in terms of the effort of writing or purchasing a flexible, accurate and appropriate computer programme has been made, it is possible to simulate complex experiments whose outlay in terms of man-hours and equipment cost would be many times that of a computer run. Similarly, such a model can be used to find suitable parameters for the setting up of a new commercial process, rather than relying on costly trial and error with the equipment itself. This computational approach has almost entirely replaced the kind of analytical investigation that was adopted thirty or more years ago, at least for development purposes. The reasons are obvious. Analytical modelling is generally incapable of finding solutions of problems in anything but very simple geometries, and the model often has to be approximated drastically to obtain any kind of solution at all. It has little apparent value for the kind of purposes to which much computational modelling is put today. It is often overlooked, however, that analytical modelling has a very valuable role to play in a number of ways. One such is as a check on a computer algorithm. If an analytical solution to a model problem can be found that is in principle soluble computationally by a specific computer program, it can be used to check that the program does in fact give the right answer (to within a specifiable accuracy)—something that is in general very hard to establish by numerical analysis alone. Another use to which analytical models can be put is to investigate the underlying physical theory; nearly all models necessarily use approximations of one sort or another, and it is often simpler to test approximations to the theory, or to test the need to insert finer detail to the model than had

  12. Learning Analytics: Challenges and Limitations

    Science.gov (United States)

    Wilson, Anna; Watson, Cate; Thompson, Terrie Lynn; Drew, Valerie; Doyle, Sarah

    2017-01-01

    Learning analytic implementations are increasingly being included in learning management systems in higher education. We lay out some concerns with the way learning analytics--both data and algorithms--are often presented within an unproblematized Big Data discourse. We describe some potential problems with the often implicit assumptions about…

  13. Analytics for Cyber Network Defense

    Energy Technology Data Exchange (ETDEWEB)

    Plantenga, Todd. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Kolda, Tamara Gibson [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2011-06-01

    This report provides a brief survey of analytics tools considered relevant to cyber network defense (CND). Ideas and tools come from fields such as statistics, data mining, and knowledge discovery. Some analytics are considered standard mathematical or statistical techniques, while others reflect current research directions. In all cases the report attempts to explain the relevance to CND with brief examples.

  14. Large-scale data analytics

    CERN Document Server

    Gkoulalas-Divanis, Aris

    2014-01-01

    Provides cutting-edge research in large-scale data analytics from diverse scientific areas Surveys varied subject areas and reports on individual results of research in the field Shares many tips and insights into large-scale data analytics from authors and editors with long-term experience and specialization in the field

  15. The machine in multimedia analytics

    NARCIS (Netherlands)

    Zahálka, J.

    2017-01-01

    This thesis investigates the role of the machine in multimedia analytics, a discipline that combines visual analytics with multimedia analysis algorithms in order to unlock the potential of multimedia collections as sources of knowledge in scientific and applied domains. Specifically, the central

  16. Experimental and analytical tools for evaluation of Stirling engine rod seal behavior

    Science.gov (United States)

    Krauter, A. I.; Cheng, H. S.

    1979-01-01

    The first year of a two year experimental and analytical program is reported. The program is directed at the elastohydrodynamic behavior of sliding elastomeric rod seals for the Stirling engine. During the year, experimental and analytical tools were developed for evaluating seal leakage, seal friction, and the fluid film thickness at the seal/cylinder interface.

  17. Discriminative parameter estimation for random walks segmentation.

    Science.gov (United States)

    Baudin, Pierre-Yves; Goodman, Danny; Kumar, Puneet; Azzabou, Noura; Carlier, Pierre G; Paragios, Nikos; Kumar, M Pawan

    2013-01-01

    The Random Walks (RW) algorithm is one of the most efficient and easy-to-use probabilistic segmentation methods. By combining contrast terms with prior terms, it provides accurate segmentations of medical images in a fully automated manner. However, one of the main drawbacks of using the RW algorithm is that its parameters have to be hand-tuned. We propose a novel discriminative learning framework that estimates the parameters using a training dataset. The main challenge we face is that the training samples are not fully supervised. Specifically, they provide a hard segmentation of the images, instead of a probabilistic segmentation. We overcome this challenge by treating the optimal probabilistic segmentation that is compatible with the given hard segmentation as a latent variable. This allows us to employ the latent support vector machine formulation for parameter estimation. We show that our approach significantly outperforms the baseline methods on a challenging dataset consisting of real clinical 3D MRI volumes of skeletal muscles.
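
    A minimal sketch of the baseline random-walker segmentation is shown below using scikit-image, with the smoothing parameter beta swept over a few values to illustrate the hand-tuning problem the paper addresses. The synthetic image, seed placement and beta values are assumptions; the latent SVM parameter learning itself is not reproduced.

```python
# Sketch: random-walker segmentation and its sensitivity to the beta parameter.
import numpy as np
from skimage.segmentation import random_walker

rng = np.random.default_rng(0)
image = np.zeros((64, 64)) + 0.2
image[16:48, 16:48] = 0.8                        # bright square = "object"
image += rng.normal(scale=0.15, size=image.shape)

seeds = np.zeros_like(image, dtype=int)          # sparse seeds (0 = unlabeled)
seeds[32, 32] = 1                                # object seed
seeds[2, 2] = 2                                  # background seed

for beta in (10, 130, 1000):                     # the parameter that would be hand-tuned
    labels = random_walker(image, seeds, beta=beta, mode='bf')
    print(f"beta={beta}: object pixels = {(labels == 1).sum()}")
```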

  18. CDIS: Circle Density Based Iris Segmentation

    Science.gov (United States)

    Gupta, Anand; Kumari, Anita; Kundu, Boris; Agarwal, Isha

    Biometrics is an automated approach to measuring and analysing physical and behavioural characteristics for identity verification. The stability of the iris texture makes it a robust biometric tool for security and authentication purposes. Reliable segmentation of the iris is a necessary precondition, as an error at this stage will propagate into later stages, and it requires proper segmentation of non-ideal images containing noise such as eyelashes. Earlier work on iris segmentation exists, but we find it lacking in detecting the iris in low-contrast images and in the removal of specular reflections, eyelids and eyelashes. Hence, this motivates us to improve on these aspects. Thus, we advocate a new approach, CDIS, for iris segmentation, along with new algorithms for the removal of eyelashes, eyelids and specular reflections and for pupil segmentation. The results obtained are presented using GAR vs. FAR graphs at the end and are compared with prior work on iris segmentation.

  19. Intelligent multi-spectral IR image segmentation

    Science.gov (United States)

    Lu, Thomas; Luong, Andrew; Heim, Stephen; Patel, Maharshi; Chen, Kang; Chao, Tien-Hsin; Chow, Edward; Torres, Gilbert

    2017-05-01

    This article presents a neural network based multi-spectral image segmentation method. A neural network is trained on the selected features of both the objects and background in the longwave (LW) Infrared (IR) images. Multiple iterations of training are performed until the accuracy of the segmentation reaches satisfactory level. The segmentation boundary of the LW image is used to segment the midwave (MW) and shortwave (SW) IR images. A second neural network detects the local discontinuities and refines the accuracy of the local boundaries. This article compares the neural network based segmentation method to the Wavelet-threshold and Grab-Cut methods. Test results have shown increased accuracy and robustness of this segmentation scheme for multi-spectral IR images.

  20. Hierarchical image segmentation for learning object priors

    Energy Technology Data Exchange (ETDEWEB)

    Prasad, Lakshman [Los Alamos National Laboratory; Yang, Xingwei [TEMPLE UNIV.; Latecki, Longin J [TEMPLE UNIV.; Li, Nan [TEMPLE UNIV.

    2010-11-10

    The proposed segmentation approach naturally combines experience-based and image-based information. The experience-based information is obtained by training a classifier for each object class. For a given test image, the result of each classifier is represented as a probability map. The final segmentation is obtained with a hierarchical image segmentation algorithm that considers both the probability maps and the image features such as color and edge strength. We also utilize the image region hierarchy to obtain not only local but also semi-global features as input to the classifiers. Moreover, to get robust probability maps, we take into account the region context information by averaging the probability maps over different levels of the hierarchical segmentation algorithm. The obtained segmentation results are superior to the state-of-the-art supervised image segmentation algorithms.

  1. Segmental neurofibromatosis: report of two cases.

    Science.gov (United States)

    Sezer, Engin; Senayli, Atilla; Sezer, Taner; Bicakci, Unal

    2006-09-01

    Neurofibromatosis (NF), or von Recklinghausen's disease is comprised of a heterogeneous group of disorders, primarily affecting the skin, soft tissue, bone and central nervous system. Segmental neurofibromatosis (SN) is a rare form of NF, characterized by "café-au-lait" macules, freckles, and/or neurofibromas limited to a body segment. There are approximately 150 cases reported in the English published work. Bilateral segmental neurofibromatosis is a rare subtype of SN, manifesting with bilateral involvement of the body segments. Herein, we report two patients with SN; one associated with pectus excavatum, and the other case diagnosed as bilateral segmental neurofibromatosis. Asymmetry of the skull and thorax, kyphoscoliosis and segmental bone hypertrophy of the leg are skeletal abnormalities previously reported with SN. To the best of our knowledge, this is the first case of SN in association with pectus excavatum.

  2. Hierarchical photo stream segmentation using context

    Science.gov (United States)

    Gong, Bo; Jain, Ramesh

    2008-01-01

    Photo stream segmentation divides photo streams into groups, each of which corresponds to an event. Photo stream segmentation can be done with or without prior knowledge of the event structure. In this paper, we study the problem by assuming that there is no a priori event model available. Although both context and content information are important for photo stream segmentation, we focus on investigating the usage of context information in this work. We consider different information components of context, such as time, location, and optical settings, for inexpensive segmentation of photo streams from common users of modern digital cameras. As events are hierarchical, we propose to segment photo streams using a hierarchical mixture model. We compare the generated hierarchy with that created by users to see how well results can be obtained without knowing the prior event model. We experimented with about 3000 photos from amateur photographers to study the efficacy of the approach for these context information components.
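
    As a simplified illustration of context-based segmentation, the sketch below uses only timestamps and starts a new event whenever the gap to the previous photo exceeds a threshold. The paper's hierarchical mixture model over several context cues (time, location, optical settings) is not reproduced, and the two-hour threshold is an assumption.

```python
# Sketch: split a photo stream into events at large time gaps.
from datetime import datetime, timedelta

def segment_by_time(timestamps, gap=timedelta(hours=2)):
    timestamps = sorted(timestamps)
    events, current = [], [timestamps[0]]
    for prev, curr in zip(timestamps, timestamps[1:]):
        if curr - prev > gap:              # gap too large: close the current event
            events.append(current)
            current = []
        current.append(curr)
    events.append(current)
    return events

photos = [datetime(2008, 5, 1, 9, 0), datetime(2008, 5, 1, 9, 20),
          datetime(2008, 5, 1, 15, 5), datetime(2008, 5, 2, 11, 30)]
print([len(e) for e in segment_by_time(photos)])   # -> [2, 1, 1]
```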

  3. Classifier combination for wafer segmentation

    Science.gov (United States)

    Bourgeat, Pierrick T.; Meriaudeau, Fabrice

    2005-02-01

    In the last decade, the accessibility of inexpensive and powerful computers has allowed true digital holography to be used for industrial inspection. This technique allows capturing a complex image of a scene (i.e. containing magnitude and phase), and reconstructing the phase and magnitude information. Digital holograms give a new dimension to texture analysis since the topology information can be used as an additional way to extract features. This new technique can be used to extend previous work on image segmentation of patterned wafers for defect detection. This paper presents a combination of features obtained from Gabor filters on different complex images. The combination enables to cope with the intensity variations occurring during the holography and provides final results which are independent from the selected training samples.

  4. Benchmarking of Remote Sensing Segmentation Methods

    Czech Academy of Sciences Publication Activity Database

    Mikeš, Stanislav; Haindl, Michal; Scarpa, G.; Gaetano, R.

    2015-01-01

    Roč. 8, č. 5 (2015), s. 2240-2248 ISSN 1939-1404 R&D Projects: GA ČR(CZ) GA14-10911S Institutional support: RVO:67985556 Keywords : benchmark * remote sensing segmentation * unsupervised segmentation * supervised segmentation Subject RIV: BD - Theory of Information Impact factor: 2.145, year: 2015 http://library.utia.cas.cz/separaty/2015/RO/haindl-0445995.pdf

  5. Automatic segmentation of speech into syllables

    OpenAIRE

    Mertens, Piet

    1987-01-01

    A multiple pass procedure for the automatic segmentation of syllabic units is described which involves (1) a broad segmentation triggered by the dips in the intensity curve of band-pass filtered speech, (2) a further segmentation on the basis of the shape of the curve, and (3) the readjustment of the syllabic nucleus within syllable boundaries, based on the intensity of the unfiltered speech.
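
    A minimal sketch of step (1) of such a procedure is given below: band-pass the speech signal, compute a smoothed intensity envelope, and place candidate syllable boundaries at dips in that envelope. The filter band, frame length and peak-finding parameters are illustrative assumptions, not the values used in the original work.

```python
# Sketch: syllable boundary candidates at dips in the band-passed intensity curve.
import numpy as np
from scipy.signal import butter, sosfiltfilt, find_peaks

def syllable_boundaries(x, fs, band=(300.0, 2500.0), frame_ms=10.0):
    sos = butter(4, band, btype="bandpass", fs=fs, output="sos")
    y = sosfiltfilt(sos, x)
    hop = int(fs * frame_ms / 1000)
    frames = y[: len(y) // hop * hop].reshape(-1, hop)
    intensity = 10 * np.log10(np.mean(frames**2, axis=1) + 1e-12)   # dB per frame
    dips, _ = find_peaks(-intensity, prominence=3.0, distance=5)    # dips = peaks of -intensity
    return dips * hop / fs                                           # boundary times in seconds

fs = 16000
t = np.arange(fs) / fs                             # 1 s toy "speech": two energy bursts
x = np.sin(2 * np.pi * 800 * t) * (np.exp(-((t - 0.3) / 0.07) ** 2) +
                                   np.exp(-((t - 0.7) / 0.07) ** 2))
print(syllable_boundaries(x, fs))                  # one dip near t = 0.5 s
```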

  6. An algorithm for segmenting range imagery

    Energy Technology Data Exchange (ETDEWEB)

    Roberts, R.S.

    1997-03-01

    This report describes the technical accomplishments of the FY96 Cross Cutting and Advanced Technology (CC&AT) project at Los Alamos National Laboratory. The project focused on developing algorithms for segmenting range images. The image segmentation algorithm developed during the project is described here. In addition to segmenting range images, the algorithm can fuse multiple range images thereby providing true 3D scene models. The algorithm has been incorporated into the Rapid World Modelling System at Sandia National Laboratory.

  7. Segmentation in the Brazilian labor market

    OpenAIRE

    Botelho, Fernando; Ponczek, Vladimir Pinheiro

    2007-01-01

    This paper measures the degree of segmentation in the Brazilian labor market. Controlling for observable and unobservable characteristics, workers earn more in the formal sector, which supports the segmentation hypothesis. We break down the degree of segmentation by socio-economic attributes to identify the groups where this phenomenon is more prevalent. We investigate the robustness of our findings to the inclusion of self-employed individuals, and apply a two-stage panel probit model using ...

  8. Segmenting Consumers Based on Luxury Value Perceptions

    OpenAIRE

    Bahar Teimourpour; Kambiz Heidarzadeh Hanzaee; Babak Teimourpour

    2013-01-01

    This study seeks to discover consumer segments by using a multidimensional concept of luxury encompassing functional, individual and social components in the luxury market. Survey data was collected from 1097 consumers in Iran. Eight luxury factors were identified through an exploratory factor analysis. These factors are used for segmenting these consumers with the K-means method. Cluster analysis of the data resulted in four different behavioral style segments namely: non-luxury consumer...

  9. Invariant texture segmentation via circular gabor filter

    OpenAIRE

    ZHANG, Jianguo; Tan, Tieniu

    2002-01-01

    In this paper, we focus on invariant texture segmentation, and propose a new method using circular Gabor filters (CGF) for rotation invariant texture segmentation. The traditional Gabor function is modified into a circular symmetric version. The rotation invariant texture features are achieved via the channel output of the CGF. A new scheme for the selection of Gabor parameters is also proposed for texture segmentation. Experiments show the efficacy of this method.
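
    The sketch below builds a circular Gabor kernel (a Gaussian envelope modulated by a sinusoid of the radial distance, so the filter has no preferred orientation) and uses the magnitude of the filter response as a rotation-invariant texture feature. The frequency and sigma values, and the use of the mean response as a feature, are illustrative assumptions rather than the parameter selection scheme proposed in the paper.

```python
# Sketch: circular Gabor kernel and a rotation-invariant response magnitude.
import numpy as np
from scipy.signal import fftconvolve

def circular_gabor(size=31, sigma=4.0, freq=0.15):
    r = np.arange(size) - size // 2
    x, y = np.meshgrid(r, r)
    rho = np.hypot(x, y)                                      # radial distance
    envelope = np.exp(-(rho**2) / (2 * sigma**2)) / (2 * np.pi * sigma**2)
    return envelope * np.exp(2j * np.pi * freq * rho)         # complex CGF kernel

def texture_feature(image, **kwargs):
    response = fftconvolve(image, circular_gabor(**kwargs), mode="same")
    return np.abs(response)                                   # orientation-independent magnitude

texture = np.cos(0.5 * np.arange(128))[None, :] * np.ones((128, 1))  # striped patch
print(texture_feature(texture).mean(),                        # similar feature value...
      texture_feature(texture.T).mean())                      # ...for the 90-degree rotated patch
```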

  10. Analysing the Methods of Dzongkha Word Segmentation

    OpenAIRE

    Dhungyel Parshu Ram; Grundspeņķis Jānis

    2017-01-01

    In both Chinese and Dzongkha languages, the greatest challenge is to identify the word boundaries because there are no word delimiters as it is in English and other Western languages. Therefore, preprocessing and word segmentation is the first step in Dzongkha language processing, such as translation, spell-checking, and information retrieval. Research on Chinese word segmentation was conducted long time ago. Therefore, it is relatively mature, but the Dzongkha word segmentation has been less...

  11. IFRS 8 Operating Segments - A Closer Look

    OpenAIRE

    Muthupandian, K S

    2008-01-01

    The International Accounting Standards Board issued the International Financial Reporting Standard 8 Operating Segments. Segment information is one of the most vital aspects of financial reporting for investors and other users. The IFRS 8 requires an entity to adopt the ‘management approach’ to reporting on the financial performance of its operating segments. This article presents a closer look of the standard (objective, scope, and disclosures).

  12. LOGISTICS PRINCIPLES SEGMENTING CONSUMER COMPANIES

    OpenAIRE

    Карпунь, О.В.

    2013-01-01

    In this article, traditional methods of segmenting the consumer market are examined, and logistical principles of customer segmentation are proposed for a company with the aim of maximizing profit while minimizing costs.

  13. Bayesian segmentation of brainstem structures in MRI

    DEFF Research Database (Denmark)

    Iglesias, Juan Eugenio; Van Leemput, Koen; Bhatt, Priyanka

    2015-01-01

    In this paper we present a method to segment four brainstem structures (midbrain, pons, medulla oblongata and superior cerebellar peduncle) from 3D brain MRI scans. The segmentation method relies on a probabilistic atlas of the brainstem and its neighboring brain structures. To build the atlas, we...... the brainstem structures in novel scans. Thanks to the generative nature of the scheme, the segmentation method is robust to changes in MRI contrast or acquisition hardware. Using cross validation, we show that the algorithm can segment the structures in previously unseen T1 and FLAIR scans with great accuracy...

  14. Osmotic and Heat Stress Effects on Segmentation

    National Research Council Canada - National Science Library

    Weiss, Julian; Devoto, Stephen H

    2016-01-01

    .... Environmental stresses such as hypoxia or heat shock produce segmentation defects, and significantly increase the penetrance and severity of vertebral defects in genetically susceptible individuals...

  15. Study on segmented distribution for reliability evaluation

    Directory of Open Access Journals (Sweden)

    Huaiyuan Li

    2017-02-01

    Full Text Available In practice, the failure rate of most equipment exhibits different tendencies at different stages, and its failure rate curve may even follow a multimodal trace during the life cycle. As a result, traditionally evaluating the reliability of equipment with a single model may lead to severe errors. However, if the lifetime is divided into several intervals according to the characteristics of the failure rate, piecewise fitting can more accurately approximate the failure rate of the equipment. Therefore, in this paper, the failure rate is regarded as a piecewise function, and two kinds of segmented distribution are put forward to evaluate reliability. In order to estimate the parameters in the segmented reliability function, Bayesian estimation and maximum likelihood estimation (MLE) of the segmented distribution are discussed in this paper. Since traditional information criteria are not suitable for the segmented distribution, an improved information criterion is proposed to test and evaluate the segmented reliability model. After a great deal of testing and verification, the segmented reliability model and its estimation methods presented in this paper prove more efficient and accurate than the traditional non-segmented single model, especially when the change of the failure rate is time-phased or multimodal. The significant performance of the segmented reliability model in evaluating the reliability of proximity sensors of the leading-edge flap in civil aircraft indicates that the segmented distribution and its estimation method could be useful and accurate.
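
    As a simplified illustration of a segmented reliability model, the sketch below fits a piecewise-constant failure rate, for which the maximum-likelihood estimate in each interval is the number of failures divided by the total time at risk in that interval. The breakpoints and sample data are invented, and the paper's specific segmented distributions, Bayesian estimation and improved information criterion are not reproduced.

```python
# Sketch: MLE of a piecewise-constant hazard (failures / exposure per interval).
import numpy as np

def piecewise_hazard_mle(failure_times, breakpoints):
    """failure_times: observed lifetimes; breakpoints: interval edges, e.g. [0, t1, inf]."""
    edges = np.asarray(breakpoints, dtype=float)
    hazards = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        deaths = np.sum((failure_times >= lo) & (failure_times < hi))
        # Exposure: every unit contributes the time it spent inside [lo, hi).
        exposure = np.sum(np.clip(failure_times, lo, hi) - lo)
        hazards.append(deaths / exposure if exposure > 0 else np.nan)
    return np.array(hazards)

rng = np.random.default_rng(1)
# Simulate a two-phase hazard: 0.1 before t = 10, then 0.01 afterwards.
t = rng.exponential(scale=10.0, size=200)
late = t > 10.0
t[late] = 10.0 + rng.exponential(scale=100.0, size=late.sum())
print(piecewise_hazard_mle(t, breakpoints=[0.0, 10.0, np.inf]))   # approx. [0.1, 0.01]
```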

  16. The First Active Segmented Mirror at ESO

    Science.gov (United States)

    Gonté, Frédéric; Dupuy, Christophe; Frank, Christoph; Araujo, Constanza; Brast, Roland; Frahm, Robert; Karban, Robert; Andolfato, Luigi; Esteves, Regina; Nylund, Matty; Sedghi, Babak; Fischer, Gerhard; Noethe, Lothar; Derie, Frédéric

    2007-06-01

    The Active Phasing Experiment (APE) is part of the Extremely Large Telescope Design Study which is supported by the European Framework Programme 6. This experiment, which is conducted in collaboration with several partners is a demonstrator to test and qualify newly-developed phasing sensors for the alignment of segmented mirrors and test the phasing software within a telescope control system to be developed for a future European Extremely Large Telescope. The segmentation of a primary mirror is simulated by a scaled-down Active Segmented Mirror of 61 segments which has been developed in-house.

  17. Interactive segmentation techniques algorithms and performance evaluation

    CERN Document Server

    He, Jia; Kuo, C-C Jay

    2013-01-01

    This book focuses on interactive segmentation techniques, which have been extensively studied in recent decades. Interactive segmentation emphasizes clear extraction of objects of interest, whose locations are roughly indicated by human interactions based on high level perception. This book will first introduce classic graph-cut segmentation algorithms and then discuss state-of-the-art techniques, including graph matching methods, region merging and label propagation, clustering methods, and segmentation methods based on edge detection. A comparative analysis of these methods will be provided

  18. Analysing the Methods of Dzongkha Word Segmentation

    Directory of Open Access Journals (Sweden)

    Dhungyel Parshu Ram

    2017-05-01

    Full Text Available In both the Chinese and Dzongkha languages, the greatest challenge is to identify word boundaries, because there are no word delimiters as there are in English and other Western languages. Therefore, preprocessing and word segmentation is the first step in Dzongkha language processing tasks such as translation, spell-checking, and information retrieval. Research on Chinese word segmentation began a long time ago and is therefore relatively mature, but Dzongkha word segmentation has been studied less. In this paper, we investigate this major problem in Dzongkha language processing using a probabilistic approach for selecting valid segments, with probabilities computed on the basis of a corpus.
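
    A generic illustration of probabilistic word segmentation is sketched below: a unigram model with corpus-derived word probabilities and dynamic programming chooses the most probable split of the input into dictionary words. The toy English corpus counts are invented, and the authors' actual model and Dzongkha data are not reproduced.

```python
# Sketch: maximum-probability (unigram) word segmentation via dynamic programming.
import math

counts = {"the": 50, "there": 10, "here": 12, "cat": 8, "at": 20, "sat": 6}
total = sum(counts.values())
logp = {w: math.log(c / total) for w, c in counts.items()}   # corpus-derived log-probabilities

def segment(text, max_len=6):
    best = [0.0] + [-math.inf] * len(text)      # best log-prob of a segmentation of text[:i]
    back = [0] * (len(text) + 1)                # split point that achieved best[i]
    for i in range(1, len(text) + 1):
        for j in range(max(0, i - max_len), i):
            w = text[j:i]
            if w in logp and best[j] + logp[w] > best[i]:
                best[i], back[i] = best[j] + logp[w], j
    words, i = [], len(text)
    while i > 0:                                # follow back-pointers to recover the split
        words.append(text[back[i]:i])
        i = back[i]
    return words[::-1]

print(segment("thecatsatthere"))                # -> ['the', 'cat', 'sat', 'there']
```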

  19. Review of segmentation process in consumer markets

    Directory of Open Access Journals (Sweden)

    Veronika Jadczaková

    2013-01-01

    Full Text Available Although there has been a considerable debate on market segmentation over five decades, attention was merely devoted to single stages of the segmentation process. In doing so, stages as segmentation base selection or segments profiling have been heavily covered in the extant literature, whereas stages as implementation of the marketing strategy or market definition were of a comparably lower interest. Capitalizing on this shortcoming, this paper strives to close the gap and provide each step of the segmentation process with equal treatment. Hence, the objective of this paper is two-fold. First, a snapshot of the segmentation process in a step-by-step fashion will be provided. Second, each step (where possible will be evaluated on chosen criteria by means of description, comparison, analysis and synthesis of 32 academic papers and 13 commercial typology systems. Ultimately, the segmentation stages will be discussed with empirical findings prevalent in the segmentation studies and last but not least suggestions calling for further investigation will be presented. This seven-step-framework may assist when segmenting in practice allowing for more confidential targeting which in turn might prepare grounds for creating of a differential advantage.

  20. Polarimetric Segmentation Using Wishart Test Statistic

    DEFF Research Database (Denmark)

    Skriver, Henning; Schou, Jesper; Nielsen, Allan Aasbjerg

    2002-01-01

    A newly developed test statistic for equality of two complex covariance matrices following the complex Wishart distribution, and an associated asymptotic probability for the test statistic, has been used in a segmentation algorithm. The segmentation algorithm is based on the MUM (merge using moments) approach, which is a merging algorithm for single channel SAR images. The polarimetric version described in this paper uses the above-mentioned test statistic for merging. The segmentation algorithm has been applied to polarimetric SAR data from the Danish dual-frequency, airborne polarimetric SAR, EMISAR. The results show clearly an improved segmentation performance for the full polarimetric algorithm compared to single channel approaches.
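
    The sketch below gives one form of likelihood-ratio test statistic for equality of two complex-Wishart-distributed sample covariance matrices, of the kind used to decide whether two polarimetric segments should be merged; -2 ln Q is compared against a chi-square threshold with p*p degrees of freedom. This is a generic sketch of the idea rather than necessarily the exact statistic or merge rule of the paper, and the random test data are invented.

```python
# Sketch: Wishart likelihood-ratio test used as a segment-merging criterion.
import numpy as np
from scipy.stats import chi2

def ln_q(X, Y, n, m):
    """X, Y: p x p sums of n and m pixel covariance observations."""
    p = X.shape[0]
    logdet = lambda A: np.linalg.slogdet(A)[1]        # log|A| for a Hermitian PD matrix
    return (p * (n + m) * np.log(n + m) - p * n * np.log(n) - p * m * np.log(m)
            + n * logdet(X) + m * logdet(Y) - (n + m) * logdet(X + Y))

def should_merge(X, Y, n, m, alpha=0.05):
    p = X.shape[0]
    return -2.0 * ln_q(X, Y, n, m) < chi2.ppf(1 - alpha, p * p)

rng = np.random.default_rng(0)
def wishart_sum(sigma, n):                            # sum of n complex rank-1 outer products
    v = rng.normal(size=(n, 3)) + 1j * rng.normal(size=(n, 3))
    z = v @ np.linalg.cholesky(sigma).T.conj()
    return z.conj().T @ z

sigma = np.eye(3, dtype=complex)
print(should_merge(wishart_sum(sigma, 64), wishart_sum(sigma, 64), 64, 64))   # likely True
```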

  1. Open-source algorithm for automatic choroid segmentation of OCT volume reconstructions

    Science.gov (United States)

    Mazzaferri, Javier; Beaton, Luke; Hounye, Gisèle; Sayah, Diane N.; Costantino, Santiago

    2017-02-01

    The use of optical coherence tomography (OCT) to study ocular diseases associated with choroidal physiology is sharply limited by the lack of available automated segmentation tools. Current research largely relies on hand-traced, single B-Scan segmentations because commercially available programs require high quality images, and the existing implementations are closed, scarce and not freely available. We developed and implemented a robust algorithm for segmenting and quantifying the choroidal layer from 3-dimensional OCT reconstructions. Here, we describe the algorithm, validate and benchmark the results, and provide an open-source implementation under the General Public License for any researcher to use (https://www.mathworks.com/matlabcentral/fileexchange/61275-choroidsegmentation).

  2. Semi-automatic tool for segmentation and volumetric analysis of medical images.

    Science.gov (United States)

    Heinonen, T; Dastidar, P; Kauppinen, P; Malmivuo, J; Eskola, H

    1998-05-01

    Segmentation software is described, developed for medical image processing and run on Windows. The software applies basic image processing techniques through a graphical user interface. For particular applications, such as brain lesion segmentation, the software enables the combination of different segmentation techniques to improve its efficiency. The program is applied for magnetic resonance imaging, computed tomography and optical images of cryosections. The software can be utilised in numerous applications, including pre-processing for three-dimensional presentations, volumetric analysis and construction of volume conductor models.

  3. Use of market segmentation to identify untapped consumer needs in vision correction surgery for future growth.

    Science.gov (United States)

    Loarie, Thomas M; Applegate, David; Kuenne, Christopher B; Choi, Lawrence J; Horowitz, Diane P

    2003-01-01

    Market segmentation analysis identifies discrete segments of the population whose beliefs are consistent with exhibited behaviors such as purchase choice. This study applies market segmentation analysis to low myopes (-1 to -3 D with less than 1 D cylinder) in their consideration and choice of a refractive surgery procedure to discover opportunities within the market. A quantitative survey based on focus group research was sent to a demographically balanced sample of myopes using contact lenses and/or glasses. A variable reduction process followed by a clustering analysis was used to discover discrete belief-based segments. The resulting segments were validated both analytically and through in-market testing. Discontented individuals who wear contact lenses are the primary target for vision correction surgery. However, 81% of the target group is apprehensive about laser in situ keratomileusis (LASIK). They are nervous about the procedure and strongly desire reversibility and exchangeability. There exists a large untapped opportunity for vision correction surgery within the low myope population. Market segmentation analysis helped determine how to best meet this opportunity through repositioning existing procedures or developing new vision correction technology, and could also be applied to identify opportunities in other vision correction populations.

  4. Segmental stabilization and muscular strengthening in chronic low back pain: a comparative study

    OpenAIRE

    Fábio Renovato França; Thomaz Nogueira Burke; Erica Sato Hanada; Amélia Pasqual Marques

    2010-01-01

    OBJECTIVE: To contrast the efficacy of two exercise programs, segmental stabilization and strengthening of abdominal and trunk muscles, on pain, functional disability, and activation of the transversus abdominis muscle (TrA), in individuals with chronic low back pain. DESIGN: Our sample consisted of 30 individuals, randomly assigned to one of two treatment groups: segmental stabilization, where exercises focused on the TrA and lumbar multifidus muscles, and superficial strengthening, where ex...

  5. Nucleic Acid i-Motif Structures in Analytical Chemistry.

    Science.gov (United States)

    Alba, Joan Josep; Sadurní, Anna; Gargallo, Raimundo

    2016-09-02

    Under the appropriate experimental conditions of pH and temperature, cytosine-rich segments in DNA or RNA sequences may produce a characteristic folded structure known as an i-motif. Besides its potential role in vivo, which is still under investigation, this structure has attracted increasing interest in other fields due to its sharp, fast and reversible pH-driven conformational changes. This "on/off" switch at molecular level is being used in nanotechnology and analytical chemistry to develop nanomachines and sensors, respectively. This paper presents a review of the latest applications of this structure in the field of chemical analysis.

  6. Evaluation of use of a very short polar microbore column segment in high-speed gas chromatography analysis.

    Science.gov (United States)

    Tranchida, Peter Quinto; Mondello, Monica; Sciarrone, Danilo; Dugo, Paola; Dugo, Giovanni; Mondello, Luigi

    2008-08-01

    Very fast GC analyses are commonly carried out by using 10 m x 0.1 mm id capillaries. In order to achieve rapid elution times (1-3 min), the latter are operated under suboptimum conditions. The present research is focused on the evaluation of use of a 0.1 mm id polar column segment (2 m), operated under near-to-optimum conditions, in very fast GC analysis. The results attained are compared with those derived from using a 10 m microbore column in very fast GC experiments. Prior to method development, the effects of gas velocity, temperature program rate, and sample amounts on analytical performance were evaluated. Following these preliminary applications, a complex lipidic sample, cod liver oil, was subjected to rapid separation (approximately 2.1 min) on the 10 m capillary through the application of a 50 degrees C/min temperature rate and a 130 cm/s gas velocity. The same matrix was analyzed on the 2 m capillary using the same temperature program rate and range, but with a close-to-ideal linear velocity. The results observed were of interest, as the separation was achieved in less time (1.45 min) with improved peak resolution. Finally, both methods were validated in terms of retention time and peak area repeatability, LOQ, and linearity.

  7. Second International Workshop on Teaching Analytics

    DEFF Research Database (Denmark)

    Vatrapu, Ravi; Reimann, Peter; Halb, Wolfgang

    2013-01-01

    Teaching Analytics is conceived as a subfield of learning analytics that focuses on the design, development, evaluation, and education of visual analytics methods and tools for teachers in primary, secondary, and tertiary educational settings. The Second International Workshop on Teaching Analytics...... (IWTA) 2013 seeks to bring together researchers and practitioners in the fields of education, learning sciences, learning analytics, and visual analytics to investigate the design, development, use, evaluation, and impact of visual analytical methods and tools for teachers’ dynamic diagnostic decision...

  8. Near-Tubular Fiber Bundle Segmentation for Diffusion Weighted Imaging: Segmentation Through Frame Reorientation

    Science.gov (United States)

    2010-03-01

    Marc Niethammer, Christopher Zach, John Melonakos, and Allen Tannenbaum. A modification of a recent segmentation approach by Bresson et al. allows for a convex optimization formulation of the segmentation problem, combining

  9. Nuclear techniques in analytical chemistry

    CERN Document Server

    Moses, Alfred J; Gordon, L

    1964-01-01

    Nuclear Techniques in Analytical Chemistry discusses highly sensitive nuclear techniques that determine the micro- and macro-amounts or trace elements of materials. With the increasingly frequent demand for the chemical determination of trace amounts of elements in materials, the analytical chemist had to search for more sensitive methods of analysis. This book accustoms analytical chemists with nuclear techniques that possess the desired sensitivity and applicability at trace levels. The topics covered include safe handling of radioactivity; measurement of natural radioactivity; and neutron a

  10. Banach spaces of analytic functions

    CERN Document Server

    Hoffman, Kenneth

    2007-01-01

    A classic of pure mathematics, this advanced graduate-level text explores the intersection of functional analysis and analytic function theory. Close in spirit to abstract harmonic analysis, it is confined to Banach spaces of analytic functions in the unit disc.The author devotes the first four chapters to proofs of classical theorems on boundary values and boundary integral representations of analytic functions in the unit disc, including generalizations to Dirichlet algebras. The fifth chapter contains the factorization theory of Hp functions, a discussion of some partial extensions of the f

  11. The feasibility of using manual segmentation in a multifeature computer-aided diagnosis system for classification of skin lesions: a retrospective comparative study

    National Research Council Canada - National Science Library

    Chang, Wen-Yu; Huang, Adam; Chen, Yin-Chun; Lin, Chi-Wei; Tsai, John; Yang, Chung-Kai; Huang, Yin-Tseng; Wu, Yi-Fan; Chen, Gwo-Shing

    2015-01-01

    ... practitioners, as well as by an automated segmentation software program, JSEG. The performance of CADx based on inputs from these two groups of physicians and that of the JSEG program was compared using feature agreement analysis...

  12. Prioritizing pesticide compounds for analytical methods development

    Science.gov (United States)

    Norman, Julia E.; Kuivila, Kathryn; Nowell, Lisa H.

    2012-01-01

    compounds are high priority as new analytes. The objective for analytical methods development is to design an integrated analytical strategy that includes as many of the Tier 1 pesticide compounds as possible in a relatively few, cost-effective methods. More than 60 percent of the Tier 1 compounds are high priority because they are anticipated to be present at concentrations approaching levels that could be of concern to human health or aquatic life in surface water or groundwater. An additional 17 percent of Tier 1 compounds were frequently detected in monitoring studies, but either were not measured at levels potentially relevant to humans or aquatic organisms, or do not have benchmarks available with which to compare concentrations. The remaining 21 percent are pesticide degradates that were included because their parent pesticides were in Tier 1. Tier 1 pesticide compounds for water span all major pesticide use groups and a diverse range of chemical classes, with herbicides and their degradates composing half of compounds. Many of the high priority pesticide compounds also are in several national regulatory programs for water, including those that are regulated in drinking water by the U.S. Environmental Protection Agency under the Safe Drinking Water Act and those that are on the latest Contaminant Candidate List. For sediment, a total of 175 pesticide compounds were classified as Tier 1 and, thus, are high priority for inclusion in analytical methods available for monitoring and studies. More than 60 percent of these compounds are included in some USGS analytical method; however, some are spread across several research methods that are expensive to perform, and monitoring data are not extensive for many compounds. The remaining Tier 1 compounds for sediment are high priority as new analytes. The objective for analytical methods development for sediment is to enhance an existing analytical method that currently includes nearly half of the pesticide compounds in Tier 1

  13. Segmented block copolymers based on poly(propylene oxide) and monodisperse polyamide-6,T segments

    NARCIS (Netherlands)

    van der Schuur, J.M.; Gaymans, R.J.

    2006-01-01

    Polyether(ester amide)s with poly(propylene oxide) (PPO) and monodisperse poly(hexamethylene terephthalamide) segments were synthesized, and their structure-property relations were investigated. The length of the amide segments was varied from diamide to tetraamide to hexaamide segments, and

  14. A Multiatlas Segmentation Using Graph Cuts with Applications to Liver Segmentation in CT Scans

    Science.gov (United States)

    2014-01-01

    An atlas-based segmentation approach is presented that combines low-level operations, an affine probabilistic atlas, and a multiatlas-based segmentation. The proposed combination provides highly accurate segmentation due to registrations and atlas selections based on the regions of interest (ROIs) and coarse segmentations. Our approach shares the following common elements between the probabilistic atlas and multiatlas segmentation: (a) the spatial normalisation and (b) the segmentation method, which is based on minimising a discrete energy function using graph cuts. The method is evaluated for the segmentation of the liver in computed tomography (CT) images. Low-level operations define a ROI around the liver from an abdominal CT. We generate a probabilistic atlas using an affine registration based on geometry moments from manually labelled data. Next, a coarse segmentation of the liver is obtained from the probabilistic atlas with low computational effort. Then, a multiatlas segmentation approach improves the accuracy of the segmentation. Both the atlas selections and the nonrigid registrations of the multiatlas approach use a binary mask defined by coarse segmentation. We experimentally demonstrate that this approach performs better than atlas selections and nonrigid registrations in the entire ROI. The segmentation results are comparable to those obtained by human experts and to other recently published results. PMID:25276219
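
    The graph-cut step described above can be illustrated with a minimal, hypothetical sketch: a binary foreground/background labelling of a 1-D intensity profile posed as a minimum s-t cut. The networkx graph construction and the unary/pairwise costs below are invented for illustration and are not the authors' implementation.

```python
import networkx as nx
import numpy as np

# Toy 1-D "image": bright foreground in the middle, dark background elsewhere.
intensity = np.array([0.1, 0.2, 0.15, 0.8, 0.9, 0.85, 0.2, 0.1])

# Unary costs: distance to assumed foreground/background means (0.9 / 0.1).
fg_cost = np.abs(intensity - 0.9)   # cost of labelling a pixel foreground
bg_cost = np.abs(intensity - 0.1)   # cost of labelling a pixel background
smooth = 0.3                        # pairwise smoothness weight (made up)

G = nx.DiGraph()
src, sink = "s", "t"
for i in range(len(intensity)):
    # t-links: a pixel pays fg_cost if it ends up on the source (foreground) side
    # and bg_cost if it ends up on the sink (background) side of the cut.
    G.add_edge(src, i, capacity=float(bg_cost[i]))
    G.add_edge(i, sink, capacity=float(fg_cost[i]))
for i in range(len(intensity) - 1):
    # n-links penalise label changes between neighbouring pixels.
    G.add_edge(i, i + 1, capacity=smooth)
    G.add_edge(i + 1, i, capacity=smooth)

energy, (source_side, _) = nx.minimum_cut(G, src, sink)
labels = [1 if i in source_side else 0 for i in range(len(intensity))]
print("energy:", round(energy, 2), "labels:", labels)   # expected: [0, 0, 0, 1, 1, 1, 0, 0]
```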

  15. Mild toxic anterior segment syndrome mimicking delayed onset toxic anterior segment syndrome after cataract surgery

    Directory of Open Access Journals (Sweden)

    Su-Na Lee

    2014-01-01

    Full Text Available Toxic anterior segment syndrome (TASS) is an acute sterile postoperative anterior segment inflammation that may occur after anterior segment surgery. I report herein a case of mild TASS that developed in one eye after bilateral uneventful cataract surgery; it was masked during the early postoperative period by steroid eye drops and mimicked delayed-onset TASS after the switch to a weaker steroid eye drop.

  16. GeoSegmenter: A statistically learned Chinese word segmenter for the geoscience domain

    Science.gov (United States)

    Huang, Lan; Du, Youfu; Chen, Gongyang

    2015-03-01

    Unlike English, the Chinese language has no spaces between words. Segmenting texts into words, known as the Chinese word segmentation (CWS) problem, thus becomes a fundamental issue for processing Chinese documents and the first step in many text mining applications, including information retrieval, machine translation and knowledge acquisition. However, for the geoscience subject domain, the CWS problem remains unsolved. Although a generic segmenter can be applied to process geoscience documents, it lacks the domain-specific knowledge and consequently its segmentation accuracy drops dramatically. This motivated us to develop a segmenter specifically for the geoscience subject domain: the GeoSegmenter. We first proposed a generic two-step framework for domain-specific CWS. Following this framework, we built GeoSegmenter using conditional random fields, a principled statistical framework for sequence learning. Specifically, GeoSegmenter first identifies general terms by using a generic baseline segmenter. Then it recognises geoscience terms by learning and applying a model that can transform the initial segmentation into the goal segmentation. Empirical experimental results on geoscience documents and benchmark datasets showed that GeoSegmenter could effectively recognise both geoscience terms and general terms.
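
    The character-tagging reduction behind CRF-based segmenters of this kind is easy to sketch. The snippet below converts a gold segmentation into BMES tags and builds simple context features per character; the feature template, the example sentence, and the tagging scheme are illustrative assumptions, not GeoSegmenter's actual templates or training data.

```python
# Character-based BMES tagging: the standard reduction of Chinese word
# segmentation to sequence labelling with a CRF (illustrative sketch only).

def words_to_bmes(words):
    """Convert a gold segmentation (list of words) into per-character BMES tags."""
    tagged = []
    for w in words:
        if len(w) == 1:
            tagged.append((w, "S"))
        else:
            tagged.append((w[0], "B"))
            tagged.extend((c, "M") for c in w[1:-1])
            tagged.append((w[-1], "E"))
    return tagged

def char_features(chars, i):
    """Unigram/bigram context features around position i (a minimal template)."""
    get = lambda j: chars[j] if 0 <= j < len(chars) else "<PAD>"
    return {
        "c0": get(i),
        "c-1": get(i - 1),
        "c+1": get(i + 1),
        "c-1c0": get(i - 1) + get(i),
        "c0c+1": get(i) + get(i + 1),
    }

# Hypothetical geoscience sentence, pre-segmented into words for training.
gold = ["花岗岩", "分布", "于", "研究区", "北部"]
tagged = words_to_bmes(gold)
chars = [c for c, _ in tagged]
X = [char_features(chars, i) for i in range(len(chars))]
y = [t for _, t in tagged]
print(list(zip(chars, y)))
# A CRF toolkit (e.g. CRF++ or sklearn-crfsuite) would be trained on many such
# (X, y) sequences and decoded with Viterbi at segmentation time.
```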

  17. Flexible diamond-like carbon films on rubber : On the origin of self-acting segmentation and film flexibility

    NARCIS (Netherlands)

    Pei, Y.T.; Bui, X.L.; Pal, J.P. van der; Martinez-Martinez, D.; Zhou, X.B.; Hosson, J.Th.M. De

    This paper reports an experimental approach to deposit flexible diamond-like carbon (DLC) films on hydrogenated nitrile butadiene rubber (HNBR) with plasma-assisted chemical vapor deposition and an analytical model to describe the self-segmentation mechanism of the DLC films. By making use of the

  18. Analytic boosted boson discrimination

    Energy Technology Data Exchange (ETDEWEB)

    Larkoski, Andrew J.; Moult, Ian; Neill, Duff [Center for Theoretical Physics, Massachusetts Institute of Technology,Cambridge, MA 02139 (United States)

    2016-05-20

    Observables which discriminate boosted topologies from massive QCD jets are of great importance for the success of the jet substructure program at the Large Hadron Collider. Such observables, while both widely and successfully used, have been studied almost exclusively with Monte Carlo simulations. In this paper we present the first all-orders factorization theorem for a two-prong discriminant based on a jet shape variable, D{sub 2}, valid for both signal and background jets. Our factorization theorem simultaneously describes the production of both collinear and soft subjets, and we introduce a novel zero-bin procedure to correctly describe the transition region between these limits. By proving an all orders factorization theorem, we enable a systematically improvable description, and allow for precision comparisons between data, Monte Carlo, and first principles QCD calculations for jet substructure observables. Using our factorization theorem, we present numerical results for the discrimination of a boosted Z boson from massive QCD background jets. We compare our results with Monte Carlo predictions which allows for a detailed understanding of the extent to which these generators accurately describe the formation of two-prong QCD jets, and informs their usage in substructure analyses. Our calculation also provides considerable insight into the discrimination power and calculability of jet substructure observables in general.

  19. Handwriting segmentation of unconstrained Oriya text

    Indian Academy of Sciences (India)

    Segmentation of handwritten text into lines, words and characters is one of the important steps in the handwritten text recognition process. In this paper we propose a water reservoir concept-based scheme for segmentation of unconstrained Oriya handwritten text into individual characters. Here, at first, the text image is ...

  20. Market Segmentation Using Bayesian Model Based Clustering

    NARCIS (Netherlands)

    Van Hattum, P.|info:eu-repo/dai/nl/304848646

    2009-01-01

    This dissertation deals with two basic problems in marketing, that are market segmentation, which is the grouping of persons who share common aspects, and market targeting, which is focusing your marketing efforts on one or more attractive market segments. For the grouping of persons who share

  1. FISICO: Fast Image SegmentatIon COrrection.

    Directory of Open Access Journals (Sweden)

    Waldo Valenzuela

    Full Text Available In clinical diagnosis, medical image segmentation plays a key role in the analysis of pathological regions. Despite advances in automatic and semi-automatic segmentation techniques, time-effective correction tools are commonly needed to improve segmentation results. Therefore, these tools must provide faster corrections with a lower number of interactions, and a user-independent solution to reduce the time frame between image acquisition and diagnosis. We present a new interactive method for correcting image segmentations. Our method provides 3D shape corrections through 2D interactions. This approach enables intuitive and natural correction of 3D segmentation results. The developed method has been implemented in a software tool and has been evaluated for the tasks of lumbar muscle and knee joint segmentation from MR images. Experimental results show that full segmentation corrections could be performed within an average correction time of 5.5±3.3 minutes and an average of 56.5±33.1 user interactions, while maintaining the quality of the final segmentation result within an average Dice coefficient of 0.92±0.02 for both anatomies. In addition, for users with different levels of expertise, our method yields a decrease in correction time from 38±19.2 to 6.4±4.3 minutes and in the number of interactions from 339±157.1 to 67.7±39.6.
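
    The Dice coefficient used above as the quality measure is straightforward to compute from two binary masks; a minimal numpy sketch with toy masks (the array shapes and regions are invented for illustration):

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice similarity coefficient between two binary segmentation masks."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    total = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / total if total > 0 else 1.0

# Toy 2-D masks standing in for a reference and a corrected segmentation.
reference = np.zeros((8, 8), dtype=bool); reference[2:6, 2:6] = True
corrected = np.zeros((8, 8), dtype=bool); corrected[3:6, 2:7] = True
print(round(dice(reference, corrected), 3))   # 0.774
```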

  2. Convolutional Neural Networks for SAR Image Segmentation

    DEFF Research Database (Denmark)

    Malmgren-Hansen, David; Nobel-Jørgensen, Morten

    2015-01-01

    Segmentation of Synthetic Aperture Radar (SAR) images has several uses, but it is a difficult task due to a number of properties related to SAR images. In this article we show how Convolutional Neural Networks (CNNs) can easily be trained for SAR image segmentation with good results. Besides...

  3. Storing tooth segments for optimal esthetics

    NARCIS (Netherlands)

    Tuzuner, T.; Turgut, S.; Özen, B.; Kılınç, H.; Bagis, B.

    2016-01-01

    Objective: A fractured whole crown segment can be reattached to its remnant; crowns from extracted teeth may be used as pontics in splinting techniques. We aimed to evaluate the effect of different storage solutions on tooth segment optical properties after different durations. Study design: Sixty

  4. Segmental blood pressure after total hip replacement

    DEFF Research Database (Denmark)

    Gebuhr, Peter Henrik; Soelberg, M; Henriksen, Jens Henrik Sahl

    1992-01-01

    Twenty-nine patients due to have a total hip replacement had their systemic systolic and segmental blood pressures measured prior to operation and 1 and 6 weeks postoperatively. No patients had signs of ischemia. The segmental blood pressure was measured at the ankle and at the toes. A significan...

  5. ESL batteries and multi-segment applications

    Science.gov (United States)

    Hay, J. L.; Pearce, J. G.; Turnbull, L.; Owen, J. R.

    1987-06-01

    A simulation model for nickel cadmium battery cells operating in the environment of a low Earth orbit satellite; simulation applications to investigate the performance of the ESA Simulation Language (ESL) segment or multiprocessor emulation features; a double-precision version of ESL; and development of ESL in equation sorting, segmentation, character handling, and file input/output are described.

  6. Bilateral segmental neurofibromatosis diagnosed during pregnancy.

    Science.gov (United States)

    Maldonado Cid, P; Sendagorta Cudós, E; Noguera Morel, L; Beato Merino, M J

    2011-05-15

    Bilateral segmental neurofibromatosis is a rare subtype of neurofibromatosis type 1 defined by lesions affecting a single segment of the body and crossing the midline, with no systemic involvement. We present a case diagnosed during pregnancy because of the characteristic increase in size of the lesions during this period.

  7. Benefit segmentation of the fitness market.

    Science.gov (United States)

    Brown, J D

    1992-01-01

    While considerable attention is being paid to the fitness and wellness needs of people by healthcare and related marketing organizations, little research attention has been directed to identifying the market segments for fitness based upon consumers' perceived benefits of fitness. This article describes three distinct segments of fitness consumers comprising an estimated 50 percent of households. Implications for marketing strategies are also presented.

  8. 47 CFR 101.1505 - Segmentation plan.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 5 2010-10-01 2010-10-01 false Segmentation plan. 101.1505 Section 101.1505 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) SAFETY AND SPECIAL RADIO SERVICES FIXED MICROWAVE SERVICES Service and Technical Rules for the 70/80/90 GHz Bands § 101.1505 Segmentation plan. (a) An entity...

  9. Spinal cord grey matter segmentation challenge.

    Science.gov (United States)

    Prados, Ferran; Ashburner, John; Blaiotta, Claudia; Brosch, Tom; Carballido-Gamio, Julio; Cardoso, Manuel Jorge; Conrad, Benjamin N; Datta, Esha; Dávid, Gergely; Leener, Benjamin De; Dupont, Sara M; Freund, Patrick; Wheeler-Kingshott, Claudia A M Gandini; Grussu, Francesco; Henry, Roland; Landman, Bennett A; Ljungberg, Emil; Lyttle, Bailey; Ourselin, Sebastien; Papinutto, Nico; Saporito, Salvatore; Schlaeger, Regina; Smith, Seth A; Summers, Paul; Tam, Roger; Yiannakas, Marios C; Zhu, Alyssa; Cohen-Adad, Julien

    2017-05-15

    An important image processing step in spinal cord magnetic resonance imaging is the ability to reliably and accurately segment grey and white matter for tissue-specific analysis. There are several semi- or fully-automated segmentation methods for cervical cord cross-sectional area measurement with excellent performance close or equal to that of manual segmentation. However, grey matter segmentation is still challenging due to the small cross-sectional size and shape, and active research is being conducted by several groups around the world in this field. Therefore a grey matter spinal cord segmentation challenge was organised to test different capabilities of various methods using the same multi-centre and multi-vendor dataset acquired with distinct 3D gradient-echo sequences. This challenge aimed to characterize the state-of-the-art in the field as well as to identify new opportunities for future improvements. Six different spinal cord grey matter segmentation methods, developed independently by various research groups across the world, were compared to manual segmentation outcomes, the present gold standard. All algorithms provided good overall results for detecting the grey matter butterfly, albeit with variable performance in certain quality-of-segmentation metrics. The data have been made publicly available and the challenge web site remains open to new submissions. No modifications were introduced to any of the presented methods as a result of this challenge for the purposes of this publication. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  10. Fingerprint segmentation based on hidden Markov models

    NARCIS (Netherlands)

    Klein, S.; Bazen, A.M.; Veldhuis, Raymond N.J.

    An important step in fingerprint recognition is segmentation. During segmentation the fingerprint image is decomposed into foreground, background and low-quality regions. The foreground is used in the recognition process, the background is ignored. The low-quality regions may or may not be used,

  11. Infants Segment Continuous Events Using Transitional Probabilities

    Science.gov (United States)

    Stahl, Aimee E.; Romberg, Alexa R.; Roseberry, Sarah; Golinkoff, Roberta Michnick; Hirsh-Pasek, Kathryn

    2014-01-01

    Throughout their 1st year, infants adeptly detect statistical structure in their environment. However, little is known about whether statistical learning is a primary mechanism for event segmentation. This study directly tests whether statistical learning alone is sufficient to segment continuous events. Twenty-eight 7- to 9-month-old infants…
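
    The transitional-probability statistic that underlies this kind of statistical segmentation is simple to compute. The sketch below uses a made-up syllable stream built from three nonsense words (not the study's actual stimuli): within-word transitions are fully predictable, while transitions across word boundaries are not.

```python
import random
from collections import Counter

random.seed(0)

# Three made-up two-syllable "words" concatenated in random order into an
# unbroken stream, in the spirit of classic statistical-learning stimuli.
words = ["bida", "kupa", "dogo"]
stream = "".join(random.choice(words) for _ in range(300))
syllables = [stream[i:i + 2] for i in range(0, len(stream), 2)]

pair_counts = Counter(zip(syllables, syllables[1:]))
syll_counts = Counter(syllables)

def transitional_probability(x, y):
    """P(y | x): how predictable syllable y is given the preceding syllable x."""
    return pair_counts[(x, y)] / syll_counts[x]

print(round(transitional_probability("bi", "da"), 2))   # within a word: 1.0
print(round(transitional_probability("da", "ku"), 2))   # across a boundary: ~0.33
```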

  12. Segmenting Michigan tourists based on distance traveled

    Science.gov (United States)

    Xiamei Xu; Tsao-Fang Yuan; Edwin Gomez; Joseph D. Fridgen

    1998-01-01

    The purpose of this study was to segment Michigan travelers into short, medium and long distance traveler groups by distance that they traveled from home to a primary destination in Michigan, and to compare travel behavior, trip characteristics and sociodemographics among these segments. Significant differences were identified in past trip experiences in Michigan,...

  13. Speech Segmentation Using Bayesian Autoregressive Changepoint Detector

    Directory of Open Access Journals (Sweden)

    P. Sovka

    1998-12-01

    Full Text Available This submission is devoted to the study of the Bayesian autoregressive changepoint detector (BCD) and its use for speech segmentation. Results of the detector application to autoregressive signals as well as to real speech are given. BCD basic properties are described and discussed. The novel two-step algorithm consisting of cepstral analysis and BCD for automatic speech segmentation is suggested.
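
    As a rough, non-Bayesian stand-in for the detector studied here, a changepoint in a piecewise AR(1) signal can be located by scanning candidate split points and comparing the residual variance of separate AR(1) fits on each side against a single global fit. The sketch below is a simplification for illustration, not the paper's BCD algorithm, and the signal parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

def ar1(n, phi, noise_std=1.0):
    """Generate an AR(1) series x[t] = phi * x[t-1] + noise."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal(scale=noise_std)
    return x

def ar1_residual_var(x):
    """Least-squares AR(1) fit; return the variance of the fit residuals."""
    phi = np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1])
    return np.var(x[1:] - phi * x[:-1])

# Two concatenated AR(1) regimes; the true changepoint is at t = 200.
signal = np.concatenate([ar1(200, 0.3), ar1(200, 0.95)])
n = len(signal)
v_global = ar1_residual_var(signal)

scores = []
for k in range(20, n - 20):                     # candidate changepoints
    v_split = (k * ar1_residual_var(signal[:k]) +
               (n - k) * ar1_residual_var(signal[k:])) / n
    # Gain from modelling the two sides with separate AR(1) models.
    scores.append((v_global - v_split, k))

print("estimated changepoint:", max(scores)[1])  # typically close to 200
```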

  14. Translocations used to generate chromosome segment duplications ...

    Indian Academy of Sciences (India)

    Supplementary figure 1. (a–i) Putative novel genes created by the breakpoints. Translocation chromosomes are shown with the translocated segment indicated in red and the untranslocated segments in black or blue. Purple arrows indicate whether the chromosome is a donor (arrow pointing up) or a recipient (arrow ...

  15. Translocations used to generate chromosome segment duplications ...

    Indian Academy of Sciences (India)

    progeny bearing a duplication (Dp) of the translocated chromosome segment. Here, 30 ... [Singh P K, Iyer V S, Sowjanya T N, Raj B K and Kasbekar D P 2010 Translocations used to generate chromosome segment duplications in Neurospora can ... of this work, namely, the definition of breakpoint junction sequences of 12 ...

  16. Text Segmentation for Chinese Spell Checking.

    Science.gov (United States)

    Lee, Kin Hong; Lu, Qin; Ng, Mau Kit Michael

    1999-01-01

    Discussion of spell checking for Chinese words proposes a Block-of-Combinations (BOC) text-segmentation method based on frequency of word usage to reduce the word combinations from exponential growth to linear growth. Suggests user interaction to make the segmentation more suitable for spell checking. (Author/LRW)

  17. Moving window segmentation framework for point clouds

    NARCIS (Netherlands)

    Sithole, G.; Gorte, B.G.H.

    2012-01-01

    As lidar point clouds become larger streamed processing becomes more attractive. This paper presents a framework for the streamed segmentation of point clouds with the intention of segmenting unstructured point clouds in real-time. The framework is composed of two main components. The first

  18. Webpage Segments Classification with Incremental Knowledge Acquisition

    Science.gov (United States)

    Guo, Wei; Kim, Yang Sok; Kang, Byeong Ho

    This paper suggests an incremental information extraction method for social network analysis of web publications. For this purpose, we employed an incremental knowledge acquisition method, called MCRDR (Multiple Classification Ripple-Down Rules), to classify web page segments. Our experimental results show that our MCRDR-based web page segments classification system successfully supports easy acquisition and maintenance of information extraction rules.

  19. Limb-segment selection in drawing behaviour

    NARCIS (Netherlands)

    Meulenbroek, R G; Rosenbaum, D A; Thomassen, A.J.W.M.; Schomaker, L R

    How do we select combinations of limb segments to carry out physical tasks? Three possible determinants of limb-segment selection are hypothesized here: (1) optimal amplitudes and frequencies of motion for the effectors; (2) preferred movement axes for the effectors; and (3) a tendency to continue

  20. LIMB-SEGMENT SELECTION IN DRAWING BEHAVIOR

    NARCIS (Netherlands)

    MEULENBROEK, RGJ; ROSENBAUM, DA; THOMASSEN, AJWM; SCHOMAKER, LRB; Schomaker, Lambertus

    How do we select combinations of limb segments to carry out physical tasks? Three possible determinants of limb-segment selection are hypothesized here: (1) optimal amplitudes and frequencies of motion for the effectors; (2) preferred movement axes for the effectors; and (3) a tendency to continue

  1. CMS Program Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — The CMS Office of Enterprise Data and Analytics has developed CMS Program Statistics, which includes detailed summary statistics on national health care, Medicare...

  2. Protein-segment universe exhibiting transitions at intermediate segment length in conformational subspaces.

    Science.gov (United States)

    Ikeda, Kazuyoshi; Hirokawa, Takatsugu; Higo, Junichi; Tomii, Kentaro

    2008-08-13

    Many studies have examined rules governing two aspects of protein structures: short segments and proteins' structural domains. Nevertheless, the organization and nature of the conformational space of segments with intermediate length between short segments and domains remain unclear. Conformational spaces of intermediate length segments probably differ from those of short segments. We investigated the identification and characterization of the boundary(s) between peptide-like (short segment) and protein-like (long segment) distributions. We generated ensembles embedded in globular proteins comprising segments 10-50 residues long. We explored the relationships between the conformational distribution of segments and their lengths, and also protein structural classes using principal component analysis based on the intra-segment Calpha-Calpha atomic distances. Our statistical analyses of segment conformations and length revealed critical dual transitions in their conformational distribution with segments derived from all four structural classes. Dual transitions were identified with the intermediate phase between the short segments and domains. Consequently, protein segment universes were categorized. i) Short segments (10-22 residues) showed a distribution with a high frequency of secondary structure clusters. ii) Medium segments (23-26 residues) showed a distribution corresponding to an intermediate state of transitions. iii) Long segments (27-50 residues) showed a distribution converging on one huge cluster containing compact conformations with a smaller radius of gyration. This distribution reflects the protein structures' organization and protein domains' origin. Three major conformational components (radius of gyration, structural symmetry with respect to the N-terminal and C-terminal halves, and single-turn/two-turn structure) well define most of the segment universes. Furthermore, we identified several conformational components that were unique to each

  3. Protein-segment universe exhibiting transitions at intermediate segment length in conformational subspaces

    Directory of Open Access Journals (Sweden)

    Hirokawa Takatsugu

    2008-08-01

    Full Text Available Abstract Background Many studies have examined rules governing two aspects of protein structures: short segments and proteins' structural domains. Nevertheless, the organization and nature of the conformational space of segments with intermediate length between short segments and domains remain unclear. Conformational spaces of intermediate length segments probably differ from those of short segments. We investigated the identification and characterization of the boundary(s) between peptide-like (short segment) and protein-like (long segment) distributions. We generated ensembles embedded in globular proteins comprising segments 10–50 residues long. We explored the relationships between the conformational distribution of segments and their lengths, and also protein structural classes using principal component analysis based on the intra-segment Cα-Cα atomic distances. Results Our statistical analyses of segment conformations and length revealed critical dual transitions in their conformational distribution with segments derived from all four structural classes. Dual transitions were identified with the intermediate phase between the short segments and domains. Consequently, protein segment universes were categorized. i) Short segments (10–22 residues) showed a distribution with a high frequency of secondary structure clusters. ii) Medium segments (23–26 residues) showed a distribution corresponding to an intermediate state of transitions. iii) Long segments (27–50 residues) showed a distribution converging on one huge cluster containing compact conformations with a smaller radius of gyration. This distribution reflects the protein structures' organization and protein domains' origin. Three major conformational components (radius of gyration, structural symmetry with respect to the N-terminal and C-terminal halves, and single-turn/two-turn structure) well define most of the segment universes. Furthermore, we identified several
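
    The feature construction described in both records above, PCA over the vector of intra-segment Cα-Cα distances, can be sketched as follows. Random-walk coordinates stand in for real Cα traces, and the segment length and counts are arbitrary, so this only illustrates the mechanics, not the paper's dataset or results.

```python
import numpy as np
from itertools import combinations
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

def intra_segment_distances(coords):
    """Upper triangle of the pairwise Ca-Ca distance matrix, flattened."""
    return np.array([np.linalg.norm(coords[i] - coords[j])
                     for i, j in combinations(range(len(coords)), 2)])

# Random-walk "backbones" stand in for real Ca traces of 25-residue segments.
n_segments, segment_length = 500, 25
segments = np.cumsum(rng.normal(scale=1.5, size=(n_segments, segment_length, 3)), axis=1)

X = np.stack([intra_segment_distances(seg) for seg in segments])
pca = PCA(n_components=3)
scores = pca.fit_transform(X)

# Radius of gyration, one of the interpretable components mentioned above.
centred = segments - segments.mean(axis=1, keepdims=True)
rg = np.sqrt((centred ** 2).sum(axis=2).mean(axis=1))

print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))
print("corr(PC1, radius of gyration):", round(abs(np.corrcoef(scores[:, 0], rg)[0, 1]), 2))
```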

  4. General Staining and Segmentation Procedures for High Content Imaging and Analysis.

    Science.gov (United States)

    Chambers, Kevin M; Mandavilli, Bhaskar S; Dolman, Nick J; Janes, Michael S

    2018-01-01

    Automated quantitative fluorescence microscopy, also known as high content imaging (HCI), is a rapidly growing analytical approach in cell biology. Because automated image analysis relies heavily on robust demarcation of cells and subcellular regions, reliable methods for labeling cells are a critical component of the HCI workflow. Labeling of cells for image segmentation is typically performed with fluorescent probes that bind DNA for nuclear-based cell demarcation or with those which react with proteins for image analysis based on whole cell staining. These reagents, along with instrument and software settings, play an important role in the successful segmentation of cells in a population for automated and quantitative image analysis. In this chapter, we describe standard procedures for labeling and image segmentation in both live and fixed cell samples. The chapter will also provide troubleshooting guidelines for some of the common problems associated with these aspects of HCI.
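
    A minimal version of the nuclear-stain-based demarcation described above can be prototyped with scikit-image: threshold the DNA channel, remove debris, and label connected components as nuclei. The synthetic image and parameter values below are placeholders, not the chapter's protocol.

```python
import numpy as np
from skimage import filters, measure, morphology

rng = np.random.default_rng(2)

# Synthetic "DNA channel": three bright nucleus-like blobs on a noisy background.
image = rng.normal(loc=0.1, scale=0.02, size=(128, 128))
yy, xx = np.mgrid[0:128, 0:128]
for cy, cx in [(30, 40), (70, 90), (100, 30)]:
    image += 0.8 * np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * 8.0 ** 2))

# Otsu threshold on the nuclear stain, then remove small debris and label nuclei.
mask = image > filters.threshold_otsu(image)
mask = morphology.remove_small_objects(mask, min_size=50)
labels = measure.label(mask)

print("nuclei found:", labels.max())
print("areas:", [region.area for region in measure.regionprops(labels)])
```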

  5. "Analytical" vector-functions I

    Science.gov (United States)

    Todorov, Vladimir Todorov

    2017-12-01

    In this note we try to give a new (or different) approach to the investigation of analytical vector functions. More precisely, a notion of a power x^n, n ∈ ℕ+, of a vector x ∈ ℝ^3 is introduced, which allows us to define an "analytical" function f : ℝ^3 → ℝ^3. Let furthermore f(ξ) = ∑_{n=0}^{∞} a_n ξ^n be an analytical function of the real variable ξ. Here we replace the power ξ^n of the number ξ with the power of a vector x ∈ ℝ^3 to obtain a vector "power series" f(x) = ∑_{n=0}^{∞} a_n x^n. We research some properties of the vector series as well as some applications of this idea. Note that an "analytical" vector function does not depend on any basis, which may be used in research into some problems in physics.

  6. Strong nonlinear oscillators analytical solutions

    CERN Document Server

    Cveticanin, Livija

    2017-01-01

    This book outlines an analytical solution procedure of the pure nonlinear oscillator system, offering a solution for free and forced vibrations of the one-degree-of-freedom strong nonlinear system with constant and time variable parameter. Includes exercises.

  7. Streamlining Smart Meter Data Analytics

    DEFF Research Database (Denmark)

    Liu, Xiufeng; Nielsen, Per Sieverts

    2015-01-01

    -economic metrics such as the geographic information of meters, the information about users and their property, geographic location and others, which make the data management very complex. On the other hand, data-mining and the emerging cloud computing technologies make the collection, management, and analysis...... of the so-called big data possible. This can improve energy management, e.g., help utilities improve the management of energy and services, and help customers save money. In this regard, the paper focuses on building an innovative software solution to streamline smart meter data analytics, aiming at dealing...... with the complexity of data processing and data analytics. The system offers an information integration pipeline to ingest smart meter data; a scalable data processing and analytics platform for pre-processing and mining big smart meter data sets; and a web-based portal for visualizing data analytics results. The system...

  8. Labour Market Driven Learning Analytics

    Science.gov (United States)

    Kobayashi, Vladimer; Mol, Stefan T.; Kismihók, Gábor

    2014-01-01

    This paper briefly outlines a project about integrating labour market information in a learning analytics goal-setting application that provides guidance to students in their transition from education to employment.

  9. Analytical Chemistry Laboratory. Progress report for FY 1996

    Energy Technology Data Exchange (ETDEWEB)

    Green, D.W.; Boparai, A.S.; Bowers, D.L.

    1996-12-01

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 1996. This annual report is the thirteenth for the ACL. It describes effort on continuing and new projects and contributions of the ACL staff to various programs at ANL. The ACL operates in the ANL system as a full-cost-recovery service center, but has a mission that includes a complementary research and development component: The Analytical Chemistry Laboratory will provide high-quality, cost-effective chemical analysis and related technical support to solve research problems of our clients -- Argonne National Laboratory, the Department of Energy, and others -- and will conduct world-class research and development in analytical chemistry and its applications. Because of the diversity of research and development work at ANL, the ACL handles a wide range of analytical chemistry problems. Some routine or standard analyses are done, but the ACL usually works with commercial laboratories if our clients require high-volume, production-type analyses. It is common for ANL programs to generate unique problems that require significant development of methods and adaption of techniques to obtain useful analytical data. Thus, much of the support work done by the ACL is very similar to our applied analytical chemistry research.

  10. Improving image segmentation by learning region affinities

    Energy Technology Data Exchange (ETDEWEB)

    Prasad, Lakshman [Los Alamos National Laboratory; Yang, Xingwei [TEMPLE UNIV.; Latecki, Longin J [TEMPLE UNIV.

    2010-11-03

    We utilize the context information of other regions in hierarchical image segmentation to learn new region affinities. It is well known that a single choice of quantization of an image space is highly unlikely to be a common optimal quantization level for all categories. Each level of quantization has its own benefits. Therefore, we utilize the hierarchical information among different quantizations as well as spatial proximity of their regions. The proposed affinity learning takes into account higher-order relations among image regions, both local and long-range, making it robust to instabilities and errors of the original, pairwise region affinities. Once the learnt affinities are obtained, we use a standard image segmentation algorithm to get the final segmentation. Moreover, the learnt affinities can be naturally utilized in interactive segmentation. Experimental results on the Berkeley Segmentation Dataset and the MSRC Object Recognition Dataset are comparable to, and in some aspects better than, those of state-of-the-art methods.

  11. Anterior Segment Imaging in Combat Ocular Trauma

    Directory of Open Access Journals (Sweden)

    Denise S. Ryan

    2013-01-01

    Full Text Available Purpose. To evaluate the use of ocular imaging to enhance management and diagnosis of war-related anterior segment ocular injuries. Methods. This study was a prospective observational case series from an ongoing IRB-approved combat ocular trauma tracking study. Subjects with anterior segment ocular injury were imaged, when possible, using anterior segment optical coherence tomography (AS-OCT), confocal microscopy (CM), and slit lamp biomicroscopy. Results. Images captured from participants with combat ocular trauma on different systems provided comprehensive and alternate views of anterior segment injury to investigators. Conclusion. In combat-related trauma of the anterior segment, adjunct image acquisition enhances slit lamp examination and enables real-time in vivo observation of the cornea, facilitating injury characterization, monitoring of progression, and management.

  12. A Hybrid Technique for Medical Image Segmentation

    Directory of Open Access Journals (Sweden)

    Alamgir Nyma

    2012-01-01

    Full Text Available Medical image segmentation is an essential and challenging aspect in computer-aided diagnosis and also in pattern recognition research. This paper proposes a hybrid method for magnetic resonance (MR image segmentation. We first remove impulsive noise inherent in MR images by utilizing a vector median filter. Subsequently, Otsu thresholding is used as an initial coarse segmentation method that finds the homogeneous regions of the input image. Finally, an enhanced suppressed fuzzy c-means is used to partition brain MR images into multiple segments, which employs an optimal suppression factor for the perfect clustering in the given data set. To evaluate the robustness of the proposed approach in noisy environment, we add different types of noise and different amount of noise to T1-weighted brain MR images. Experimental results show that the proposed algorithm outperforms other FCM based algorithms in terms of segmentation accuracy for both noise-free and noise-inserted MR images.
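
    A simplified, single-channel version of the pipeline described above (a scalar median filter in place of the vector median filter, plain fuzzy c-means in place of suppressed FCM) might look like the following; the synthetic image and all parameters are invented for illustration.

```python
import numpy as np
from scipy.ndimage import median_filter
from skimage.filters import threshold_otsu

rng = np.random.default_rng(3)

# Synthetic "MR slice": three intensity classes plus impulsive noise.
image = np.concatenate([np.full((40, 120), v) for v in (0.2, 0.5, 0.8)], axis=0)
image += rng.normal(scale=0.03, size=image.shape)
image[rng.random(image.shape) < 0.02] = 1.0           # impulse noise

denoised = median_filter(image, size=3)               # step 1: impulse-noise removal
coarse = denoised > threshold_otsu(denoised)          # step 2: coarse Otsu split
                                                      # (in the paper this seeds the clustering)

def fcm(x, c=3, m=2.0, iters=50):
    """Plain fuzzy c-means on 1-D intensities (suppressed FCM simplified away)."""
    centers = np.linspace(x.min(), x.max(), c)
    for _ in range(iters):
        d = np.abs(x[:, None] - centers[None, :]) + 1e-12
        u = 1.0 / (d ** (2 / (m - 1)) * np.sum(d ** (-2 / (m - 1)), axis=1, keepdims=True))
        centers = (u ** m * x[:, None]).sum(axis=0) / (u ** m).sum(axis=0)
    return centers, u

centers, u = fcm(denoised.ravel())                    # step 3: fuzzy clustering
labels = u.argmax(axis=1).reshape(image.shape)
print("class centres:", np.round(np.sort(centers), 2))   # close to 0.2, 0.5, 0.8
```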

  13. Efficient segmentation by sparse pixel classification

    DEFF Research Database (Denmark)

    Dam, Erik B; Loog, Marco

    2008-01-01

    Segmentation methods based on pixel classification are powerful but often slow. We introduce two general algorithms, based on sparse classification, for optimizing the computation while still obtaining accurate segmentations. The computational costs of the algorithms are derived, and they are demonstrated on real 3-D magnetic resonance imaging and 2-D radiograph data. We show that each algorithm is optimal for specific tasks, and that both algorithms allow a speedup of one or more orders of magnitude on typical segmentation tasks.

  14. Unfolding Implementation in Industrial Market Segmentation

    DEFF Research Database (Denmark)

    Bøjgaard, John; Ellegaard, Chris

    2011-01-01

    Market segmentation is an important method of strategic marketing and constitutes a cornerstone of the marketing literature. It has undergone extensive scientific inquiry during the past 50 years. Reporting on an extensive review of the market segmentation literature, the challenging task...... of implementing industrial market segmentation is discussed and unfolded in this article. Extant literature has identified segmentation implementation as a core challenge for marketers, but also one, which has received limited empirical attention. Future research opportunities are formulated in this article...... to pave the way towards closing this gap. The extent of implementation coverage is assessed and various notions of implementation are identified. Implementation as the task of converting segmentation plans into action (referred to as execution) is identified as a particularly beneficial focus area...

  15. Integrated active contours for texture segmentation.

    Science.gov (United States)

    Sagiv, Chen; Sochen, Nir A; Zeevi, Yehoshua Y

    2006-06-01

    We address the issue of textured image segmentation in the context of the Gabor feature space of images. Gabor filters tuned to a set of orientations, scales and frequencies are applied to the images to create the Gabor feature space. A two-dimensional Riemannian manifold of local features is extracted via the Beltrami framework. The metric of this surface provides a good indicator of texture changes and is used, therefore, in a Beltrami-based diffusion mechanism and in a geodesic active contours algorithm for texture segmentation. The performance of the proposed algorithm is compared with that of the edgeless active contours algorithm applied for texture segmentation. Moreover, an integrated approach, extending the geodesic and edgeless active contours approaches to texture segmentation, is presented. We show that combining boundary and region information yields more robust and accurate texture segmentation results.
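
    The Gabor feature-space construction described above can be sketched compactly; here the geodesic active-contour step is replaced by simple k-means clustering of the per-pixel feature vectors, so this is only a rough stand-in for the paper's method. The synthetic two-texture image and the filter-bank parameters are assumptions.

```python
import numpy as np
from skimage.filters import gabor
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)

# Synthetic two-texture image: horizontal stripes on the left, vertical on the right.
yy, xx = np.mgrid[0:96, 0:96]
image = np.where(xx < 48, np.sin(0.8 * yy), np.sin(0.8 * xx))
image = image + rng.normal(scale=0.1, size=image.shape)

# Gabor feature space: magnitude responses over a small bank of orientations/frequencies.
bank = []
for theta in (0, np.pi / 4, np.pi / 2, 3 * np.pi / 4):
    for frequency in (0.1, 0.2):
        real, imag = gabor(image, frequency=frequency, theta=theta)
        bank.append(np.hypot(real, imag))
features = np.stack(bank, axis=-1).reshape(-1, len(bank))

# Stand-in for the active-contour step: cluster pixels in the Gabor feature space.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
segmentation = labels.reshape(image.shape)
print("pixels per segment:", np.bincount(labels))
```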

  16. Organizational Models for Big Data and Analytics

    Directory of Open Access Journals (Sweden)

    Robert L. Grossman

    2014-04-01

    Full Text Available In this article, we introduce a framework for determining how analytics capability should be distributed within an organization. Our framework stresses the importance of building a critical mass of analytics staff, centralizing or decentralizing the analytics staff to support business processes, and establishing an analytics governance structure to ensure that analytics processes are supported by the organization as a whole.

  17. Organizational Models for Big Data and Analytics

    OpenAIRE

    Robert L. Grossman; Kevin P. Siegel

    2014-01-01

    In this article, we introduce a framework for determining how analytics capability should be distributed within an organization. Our framework stresses the importance of building a critical mass of analytics staff, centralizing or decentralizing the analytics staff to support business processes, and establishing an analytics governance structure to ensure that analytics processes are supported by the organization as a whole.

  18. Bootstrapping white matter segmentation, Eve++

    Science.gov (United States)

    Plassard, Andrew; Hinton, Kendra E.; Venkatraman, Vijay; Gonzalez, Christopher; Resnick, Susan M.; Landman, Bennett A.

    2015-03-01

    Multi-atlas labeling has come into widespread use for whole brain labeling on magnetic resonance imaging. Recent challenges have shown that leading techniques are near (or at) human expert reproducibility for cortical gray matter labels. However, these approaches tend to treat white matter as essentially homogeneous (as white matter exhibits isointense signal on structural MRI). The state of the art for white matter atlases is the single-subject Johns Hopkins Eve atlas. Numerous approaches have attempted to use tractography and/or orientation information to identify homologous white matter structures across subjects. Despite success with large tracts, these approaches have been plagued by difficulties with subtle differences in course, low signal-to-noise ratio, and complex structural relationships for smaller tracts. Here, we investigate the use of atlas-based labeling to propagate the Eve atlas to unlabeled datasets. We evaluate single-atlas labeling and multi-atlas labeling using synthetic atlases derived from the single manually labeled atlas. On 5 representative tracts for 10 subjects, we demonstrate that (1) single-atlas labeling generally provides segmentations within 2 mm mean surface distance, (2) morphologically constraining DTI labels within structural MRI white matter reduces variability, and (3) multi-atlas labeling did not improve accuracy. These efforts present a preliminary indication that single-atlas labeling with correction is reasonable, but caution should be applied. To pursue multi-atlas labeling and more fully characterize overall performance, more labeled datasets would be necessary.
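
    In its simplest form, the multi-atlas fusion step evaluated above reduces to a per-voxel majority vote over registered atlas labels, optionally constrained to a structural white-matter mask (point (2) above). The numpy sketch below uses random label volumes in place of actually registered Eve-atlas labels, purely to show the mechanics.

```python
import numpy as np

rng = np.random.default_rng(5)

# Five "registered atlas" label volumes (labels 0-3) standing in for Eve-atlas
# labels propagated to the target image by registration.
atlas_labels = rng.integers(0, 4, size=(5, 16, 16, 16))

# Per-voxel majority vote: count votes for each label value, take the argmax.
n_labels = atlas_labels.max() + 1
votes = np.stack([(atlas_labels == k).sum(axis=0) for k in range(n_labels)])
fused = votes.argmax(axis=0)

# Optionally constrain the labels to a structural white-matter mask.
wm_mask = rng.random((16, 16, 16)) > 0.3
fused = np.where(wm_mask, fused, 0)
print(fused.shape)   # (16, 16, 16)
```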

  19. Vessel segmentation in screening mammograms

    Science.gov (United States)

    Mordang, J. J.; Karssemeijer, N.

    2015-03-01

    Blood vessels are a major cause of false positives in computer-aided detection systems for the detection of breast cancer. Therefore, the purpose of this study is to construct a framework for the segmentation of blood vessels in screening mammograms. The proposed framework is based on supervised learning using a cascade classifier. This cascade classifier consists of several stages, where in each stage a GentleBoost classifier is trained on Haar-like features. A total of 30 cases were included in this study. In each image, vessel pixels were annotated by selecting pixels on the centerline of the vessel; control samples were taken by annotating a region without any visible vascular structures. This resulted in a total of 31,000 pixels marked as vascular and over 4 million control pixels. After training, the classifier assigns a vesselness likelihood to the pixels. The proposed framework was compared to three other vessel enhancing methods: i) a vesselness filter, ii) a Gaussian derivative filter, and iii) a tubeness filter. The methods were compared in terms of the area under the receiver operating characteristic curve (Az). The Az value of the cascade approach is 0.85. This is superior to the vesselness, Gaussian, and tubeness methods, with Az values of 0.77, 0.81, and 0.78, respectively. From these results, it can be concluded that our proposed framework is a promising method for the detection of vessels in screening mammograms.
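
    The cascade idea, where each stage keeps nearly all positives and passes only its surviving detections to the next stage, can be sketched with scikit-learn boosting as a stand-in for GentleBoost, and random feature vectors as stand-ins for Haar-like features. Everything below (data, thresholds, stage count) is an assumption for illustration, not the study's trained detector.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

rng = np.random.default_rng(6)

# Synthetic stand-ins for Haar-like features: background vs. "vessel" pixels.
X_bg = rng.normal(0.0, 1.0, size=(2000, 10))
X_vs = rng.normal(0.4, 1.0, size=(200, 10))
X = np.vstack([X_bg, X_vs])
y = np.concatenate([np.zeros(2000), np.ones(200)])

def train_cascade(X, y, n_stages=3, keep_recall=0.99):
    """Each stage keeps ~all positives and forwards only its survivors."""
    stages, idx = [], np.arange(len(y))
    for _ in range(n_stages):
        if len(np.unique(y[idx])) < 2:          # nothing left to reject
            break
        clf = AdaBoostClassifier(n_estimators=50, random_state=0).fit(X[idx], y[idx])
        scores = clf.predict_proba(X[idx])[:, 1]
        thr = np.quantile(scores[y[idx] == 1], 1 - keep_recall)
        stages.append((clf, thr))
        idx = idx[scores >= thr]                # survivors reach the next stage
    return stages

def vesselness(stages, X):
    """0 for samples rejected at any stage, else the last stage's score."""
    out, alive = np.zeros(len(X)), np.ones(len(X), dtype=bool)
    for clf, thr in stages:
        s = clf.predict_proba(X)[:, 1]
        alive &= s >= thr
        out[alive] = s[alive]
    out[~alive] = 0.0
    return out

stages = train_cascade(X, y)
print("mean vesselness, vessel vs background:",
      round(vesselness(stages, X_vs).mean(), 2),
      round(vesselness(stages, X_bg).mean(), 2))
```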

  20. Risk Assessment Update: Russian Segment

    Science.gov (United States)

    Christiansen, Eric; Lear, Dana; Hyde, James; Bjorkman, Michael; Hoffman, Kevin

    2012-01-01

    BUMPER-II version 1.95j source code was provided to RSC-E and Khrunichev at the January 2012 MMOD TIM in Moscow. MEMCxP and ORDEM 3.0 environments are implemented as external data files. NASA provided a sample ORDEM 3.0 ".key" & ".daf" environment file set for demonstrating and benchmarking the BUMPER-II v1.95j installation at the Jan-12 TIM. ORDEM 3.0 has been completed and is currently in beta testing. NASA will provide a preliminary set of ORDEM 3.0 ".key" & ".daf" environment files for the years 2012 through 2028. Bumper output files produced using the new ORDEM 3.0 data files are intended for internal use only, not for requirements verification. Output files will contain the words "ORDEM FILE DESCRIPTION = PRELIMINARY VERSION: not for production". The projectile density term in many BUMPER-II ballistic limit equations will need to be updated. Cube demo scripts and output files delivered at the Jan-12 TIM have been updated for the new ORDEM 3.0 data files. Risk assessment results based on ORDEM 3.0 and MEM will be presented for the Russian Segment (RS) of ISS.