WorldWideScience

Sample records for point process analysis

  1. Scattering analysis of point processes and random measures

    International Nuclear Information System (INIS)

    Hanisch, K.H.

    1984-01-01

    In the present paper, scattering analysis of point processes and random measures is studied. Known formulae which connect the scattering intensity with the pair distribution function of the studied structures are proved in a rigorous manner with tools of the theory of point processes and random measures. For some special fibre processes the scattering intensity is computed. For a class of random measures, namely for 'grain-germ models', a new formula is proved which yields the pair distribution function of the 'grain-germ model' in terms of the pair distribution function of the underlying point process (the 'germs') and of the mean structure factor and the mean squared structure factor of the particles (the 'grains'). (author)
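
    For an isotropic structure in three dimensions, the kind of formula the record refers to takes the textbook form S(q) = 1 + 4πρ ∫ r²(g(r) − 1) sin(qr)/(qr) dr, linking the structure factor S to the pair distribution function g and the point density ρ. The sketch below evaluates this standard relation numerically; it illustrates the generic formula, not Hanisch's fibre-process or grain-germ results, and the hard-core g(r) and density are assumed toy inputs.

```python
import numpy as np

def structure_factor(r, g, rho, q):
    """Isotropic 3-D relation S(q) = 1 + 4*pi*rho * int r^2 (g(r)-1) sin(qr)/(qr) dr,
    evaluated with a simple Riemann sum (np.sinc(x) = sin(pi x)/(pi x))."""
    dr = r[1] - r[0]
    integrand = r**2 * (g - 1.0) * np.sinc(np.outer(q, r) / np.pi)
    return 1.0 + 4.0 * np.pi * rho * integrand.sum(axis=1) * dr

# Illustrative input: a hard-core pair distribution (g = 0 inside the contact distance)
r = np.linspace(0.0, 20.0, 4000)
g = (r > 1.0).astype(float)
q = np.linspace(0.1, 10.0, 100)
S = structure_factor(r, g, rho=0.1, q=q)
```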

  2. Bayesian analysis of Markov point processes

    DEFF Research Database (Denmark)

    Berthelsen, Kasper Klitgaard; Møller, Jesper

    2006-01-01

    Recently, Møller, Pettitt, Berthelsen and Reeves introduced a new MCMC methodology for drawing samples from a posterior distribution when the likelihood function is only specified up to a normalising constant. We illustrate the method in the setting of Bayesian inference for Markov point processes ... a partially ordered Markov point process as the auxiliary variable. As the method requires simulation from the "unknown" likelihood, perfect simulation algorithms for spatial point processes become useful ...
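
    The record is truncated, but the mechanism it describes — sidestepping the intractable normalising constant by drawing an auxiliary realisation from the model itself — can be illustrated with a toy version of the closely related exchange algorithm of Murray, Ghahramani and MacKay. Everything below is an illustrative assumption: a one-parameter exponential model whose constant we pretend not to know, a flat prior on θ > 0, and exact sampling standing in for the perfect simulation a Markov point process would require.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: y_i ~ f(y | theta) = exp(-theta * y) / Z(theta); we pretend Z is unknown.
y = rng.exponential(scale=1.0 / 2.5, size=50)            # data, true theta = 2.5

def log_f_un(x, theta):
    """Unnormalised log-likelihood: the normalising constant is deliberately omitted."""
    return -theta * np.sum(x)

def exchange_sampler(y, n_iter=5000, step=0.3):
    theta, chain = 1.0, np.empty(n_iter)
    for i in range(n_iter):
        prop = theta + step * rng.standard_normal()       # symmetric random-walk proposal
        if prop > 0:                                      # flat prior on theta > 0
            # auxiliary data drawn exactly from the model at the proposed parameter;
            # this draw makes the unknown normalising constants cancel below
            w = rng.exponential(scale=1.0 / prop, size=y.size)
            log_a = (log_f_un(y, prop) + log_f_un(w, theta)
                     - log_f_un(y, theta) - log_f_un(w, prop))
            if np.log(rng.uniform()) < log_a:
                theta = prop
        chain[i] = theta
    return chain

print(exchange_sampler(y)[1000:].mean())                  # posterior mean, near 2.5
```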

  3. Residual analysis for spatial point processes

    DEFF Research Database (Denmark)

    Baddeley, A.; Turner, R.; Møller, Jesper

    We define residuals for point process models fitted to spatial point pattern data, and propose diagnostic plots based on these residuals. The techniques apply to any Gibbs point process model, which may exhibit spatial heterogeneity, interpoint interaction and dependence on spatial covariates. Our ... or covariate effects. Q-Q plots of the residuals are effective in diagnosing interpoint interaction. Some existing ad hoc statistics of point patterns (quadrat counts, scan statistic, kernel smoothed intensity, Berman's diagnostic) are recovered as special cases ...
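
    The basic object here is the raw residual R(B) = n(X ∩ B) − ∫_B λ̂(u) du for a region B, which should fluctuate around zero under a well-fitting model. A minimal sketch computing quadrat raw residuals, assuming the fitted intensity λ̂ is available as a function; the homogeneous example is illustrative, and this is not the authors' implementation (which covers general Gibbs models and smoothed, Pearson and inverse-λ residuals).

```python
import numpy as np

rng = np.random.default_rng(1)

def raw_residuals(points, lam_hat, window=(0, 1, 0, 1), nx=4, ny=4, ngrid=50):
    """Raw residual R(B) = n(X in B) - integral_B lam_hat(u) du on each quadrat B;
    the integral is approximated by the mean of lam_hat on a grid times |B|."""
    x0, x1, y0, y1 = window
    xs = np.linspace(x0, x1, nx + 1)
    ys = np.linspace(y0, y1, ny + 1)
    res = np.empty((ny, nx))
    for i in range(ny):
        for j in range(nx):
            inside = ((points[:, 0] >= xs[j]) & (points[:, 0] < xs[j + 1]) &
                      (points[:, 1] >= ys[i]) & (points[:, 1] < ys[i + 1]))
            gx, gy = np.meshgrid(np.linspace(xs[j], xs[j + 1], ngrid),
                                 np.linspace(ys[i], ys[i + 1], ngrid))
            area = (xs[j + 1] - xs[j]) * (ys[i + 1] - ys[i])
            res[i, j] = inside.sum() - lam_hat(gx, gy).mean() * area
    return res

# Homogeneous example: 200 uniform points checked against the fitted constant intensity
pts = rng.uniform(size=(200, 2))
print(raw_residuals(pts, lam_hat=lambda x, y: 200.0 * np.ones_like(x)))
```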

  4. Definition of distance for nonlinear time series analysis of marked point process data

    Energy Technology Data Exchange (ETDEWEB)

    Iwayama, Koji, E-mail: koji@sat.t.u-tokyo.ac.jp [Research Institute for Food and Agriculture, Ryukoku University, 1-5 Yokotani, Seta Oe-cho, Otsu-Shi, Shiga 520-2194 (Japan); Hirata, Yoshito; Aihara, Kazuyuki [Institute of Industrial Science, The University of Tokyo, 4-6-1 Komaba, Meguro-ku, Tokyo 153-8505 (Japan)

    2017-01-30

    Marked point process data are time series of discrete events accompanied with some values, such as economic trades, earthquakes, and lightning strikes. A distance for marked point process data allows us to apply nonlinear time series analysis to such data. We propose a distance for marked point process data which can be calculated much faster than the existing distance when the number of marks is small. Furthermore, under some assumptions, the Kullback–Leibler divergences between posterior distributions for neighbors defined by this distance are small. We performed some numerical simulations showing that analysis based on the proposed distance is effective. - Highlights: • A new distance for marked point process data is proposed. • The distance can be computed fast enough for a small number of marks. • A method to optimize parameter values of the distance is also proposed. • Numerical simulations indicate that the analysis based on the distance is effective.

  5. Second-order analysis of structured inhomogeneous spatio-temporal point processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Ghorbani, Mohammad

    Statistical methodology for spatio-temporal point processes is in its infancy. We consider second-order analysis based on pair correlation functions and K-functions for, first, general inhomogeneous spatio-temporal point processes and, second, inhomogeneous spatio-temporal Cox processes. Assuming spatio-temporal separability of the intensity function, we clarify different meanings of second-order spatio-temporal separability. One is second-order spatio-temporal independence and relates e.g. to log-Gaussian Cox processes with an additive covariance structure of the underlying spatio-temporal Gaussian process. Another concerns shot-noise Cox processes with a separable spatio-temporal covariance density. We propose diagnostic procedures for checking hypotheses of second-order spatio-temporal separability, which we apply on simulated and real data (the UK 2001 epidemic foot and mouth disease data).

  6. Modeling fixation locations using spatial point processes.

    Science.gov (United States)

    Barthelmé, Simon; Trukenbrod, Hans; Engbert, Ralf; Wichmann, Felix

    2013-10-01

    Whenever eye movements are measured, a central part of the analysis has to do with where subjects fixate and why they fixate where they do. To a first approximation, a set of fixations can be viewed as a set of points in space; this implies that fixations are spatial data and that the analysis of fixation locations can be beneficially thought of as a spatial statistics problem. We argue that thinking of fixation locations as arising from point processes is a very fruitful framework for eye-movement data, helping turn qualitative questions into quantitative ones. We provide a tutorial introduction to some of the main ideas of the field of spatial statistics, focusing especially on spatial Poisson processes. We show how point processes help relate image properties to fixation locations. In particular we show how point processes naturally express the idea that image features' predictability for fixations may vary from one image to another. We review other methods of analysis used in the literature, show how they relate to point process theory, and argue that thinking in terms of point processes substantially extends the range of analyses that can be performed and clarifies their interpretation.
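
    A concrete way to connect an intensity surface to simulated fixations is Lewis–Shedler thinning for an inhomogeneous Poisson process: simulate at a dominating constant rate and keep each point with probability λ(u)/λ_max. The Gaussian "saliency" intensity below is a made-up stand-in for image-based predictors.

```python
import numpy as np

rng = np.random.default_rng(2)

def sample_inhom_poisson(lam, lam_max, window=(0, 1, 0, 1)):
    """Lewis-Shedler thinning: simulate a homogeneous Poisson process at rate
    lam_max over the window and keep each point with probability lam(x, y) / lam_max."""
    x0, x1, y0, y1 = window
    n = rng.poisson(lam_max * (x1 - x0) * (y1 - y0))
    pts = np.column_stack([rng.uniform(x0, x1, n), rng.uniform(y0, y1, n)])
    keep = rng.uniform(size=n) < lam(pts[:, 0], pts[:, 1]) / lam_max
    return pts[keep]

# Made-up "saliency" intensity: fixations concentrated near the image centre
lam = lambda x, y: 500.0 * np.exp(-((x - 0.5) ** 2 + (y - 0.5) ** 2) / 0.02)
fixations = sample_inhom_poisson(lam, lam_max=500.0)
```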

  7. Fingerprint Analysis with Marked Point Processes

    DEFF Research Database (Denmark)

    Forbes, Peter G. M.; Lauritzen, Steffen; Møller, Jesper

    We present a framework for fingerprint matching based on marked point process models. An efficient Monte Carlo algorithm is developed to calculate the marginal likelihood ratio for the hypothesis that two observed prints originate from the same finger against the hypothesis that they originate from different fingers. Our model achieves good performance on an NIST-FBI fingerprint database of 258 matched fingerprint pairs ...

  8. Hazard analysis and critical control point (HACCP) for an ultrasound food processing operation.

    Science.gov (United States)

    Chemat, Farid; Hoarau, Nicolas

    2004-05-01

    Emerging technologies, such as ultrasound (US), used for food and drink production often introduce hazards to product safety. Classical quality control methods are inadequate to control these hazards. Hazard analysis of critical control points (HACCP) is the most secure and cost-effective method for controlling possible product contamination or cross-contamination, due to physical or chemical hazards during production. The following case study on the application of HACCP to an US food-processing operation demonstrates how the hazards at the critical control points of the process are effectively controlled through the implementation of HACCP.

  9. Hygienic-sanitary working practices and implementation of a Hazard Analysis and Critical Control Point (HACCP) plan in lobster processing industries

    Directory of Open Access Journals (Sweden)

    Cristina Farias da Fonseca

    2013-03-01

    This study aimed to verify the hygienic-sanitary working practices and to create and implement a Hazard Analysis Critical Control Point (HACCP) plan in two lobster processing industries in Pernambuco State, Brazil. The industries studied process frozen whole lobsters, frozen whole cooked lobsters, and frozen lobster tails for exportation. The application of the hygienic-sanitary checklist in the industries analyzed achieved conformity rates over 96% for the aspects evaluated. The use of the Hazard Analysis Critical Control Point (HACCP) plan resulted in the detection of two critical control points (CCPs), at the receiving and classification steps in the processing of frozen lobster and frozen lobster tails; an additional critical control point (CCP) was detected at the cooking step in the processing of the whole frozen cooked lobster. The proper implementation of the Hazard Analysis Critical Control Point (HACCP) plan in the lobster processing industries studied proved to be the safest and most cost-effective method for monitoring the hazards at each critical control point (CCP).

  10. Poisson branching point processes

    International Nuclear Information System (INIS)

    Matsuo, K.; Teich, M.C.; Saleh, B.E.A.

    1984-01-01

    We investigate the statistical properties of a special branching point process. The initial process is assumed to be a homogeneous Poisson point process (HPP). The initiating events at each branching stage are carried forward to the following stage. In addition, each initiating event independently contributes a nonstationary Poisson point process (whose rate is a specified function) located at that point. The additional contributions from all points of a given stage constitute a doubly stochastic Poisson point process (DSPP) whose rate is a filtered version of the initiating point process at that stage. The process studied is a generalization of a Poisson branching process in which random time delays are permitted in the generation of events. Particular attention is given to the limit in which the number of branching stages is infinite while the average number of added events per event of the previous stage is infinitesimal. In the special case when the branching is instantaneous this limit of continuous branching corresponds to the well-known Yule–Furry process with an initial Poisson population. The Poisson branching point process provides a useful description for many problems in various scientific disciplines, such as the behavior of electron multipliers, neutron chain reactions, and cosmic ray showers
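
    The cascade described above is easy to simulate directly: start from a homogeneous Poisson process, let every event of a stage spawn a Poisson number of delayed descendants, and carry all events forward. The exponential delay kernel and parameter values below are illustrative assumptions (the model allows an arbitrary rate function for the descendant processes).

```python
import numpy as np

rng = np.random.default_rng(3)

def branching_poisson(rate0=5.0, T=100.0, mu=0.3, tau=2.0, stages=6):
    """Each event of a stage contributes a Poisson number (mean mu) of descendants,
    delayed by an exponential kernel with scale tau; initiating events are carried
    forward, and only the newly added events branch at the next stage."""
    events = list(np.sort(rng.uniform(0.0, T, rng.poisson(rate0 * T))))  # initial HPP
    new = list(events)
    for _ in range(stages):
        offspring = []
        for s in new:
            for _ in range(rng.poisson(mu)):
                t = s + rng.exponential(tau)      # random time delay of the descendant
                if t < T:
                    offspring.append(t)
        events.extend(offspring)
        new = offspring
    return np.sort(np.array(events))

times = branching_poisson()
```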

  11. Thinning spatial point processes into Poisson processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Schoenberg, Frederic Paik

    This paper describes methods for randomly thinning certain classes of spatial point processes. In the case of a Markov point process, the proposed method involves a dependent thinning of a spatial birth-and-death process, where clans of ancestors associated with the original points are identified, and where one simulates backwards and forwards in order to obtain the thinned process. In the case of a Cox process, a simple independent thinning technique is proposed. In both cases, the thinning results in a Poisson process if and only if the true Papangelou conditional intensity is used, and thus can be used as a diagnostic for assessing the goodness-of-fit of a spatial point process model. Several examples, including clustered and inhibitive point processes, are considered.

  12. Thinning spatial point processes into Poisson processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Schoenberg, Frederic Paik

    2010-01-01

    In this paper we describe methods for randomly thinning certain classes of spatial point processes. In the case of a Markov point process, the proposed method involves a dependent thinning of a spatial birth-and-death process, where clans of ancestors associated with the original points are identified, and where we simulate backwards and forwards in order to obtain the thinned process. In the case of a Cox process, a simple independent thinning technique is proposed. In both cases, the thinning results in a Poisson process if and only if the true Papangelou conditional intensity is used, and, thus, can be used as a graphical exploratory tool for inspecting the goodness-of-fit of a spatial point process model. Several examples, including clustered and inhibitive point processes, are considered.
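
    For the Cox case, the proposed independent thinning retains a point u with probability β/Λ(u), which requires β ≤ Λ everywhere and yields a Poisson process with rate β exactly when Λ is the true driving intensity. A minimal sketch with a fixed realisation of the driving field on the unit square; all intensity values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

def thin_to_poisson(points, Lambda, beta):
    """Independent thinning: retain each point u with probability beta / Lambda(u)
    (requires beta <= Lambda everywhere). With the true driving intensity the
    retained points form a Poisson process with rate beta."""
    p = beta / Lambda(points[:, 0], points[:, 1])
    assert np.all(p <= 1.0), "beta must not exceed the driving intensity"
    return points[rng.uniform(size=len(points)) < p]

# Fixed realisation of a driving intensity on the unit square (illustrative values)
Lambda = lambda x, y: 100.0 + 400.0 * np.exp(-((x - 0.3) ** 2 + (y - 0.7) ** 2) / 0.05)

# Simulate the conditional point pattern itself by thinning a dominating HPP
n = rng.poisson(500.0)
cand = rng.uniform(size=(n, 2))
pattern = cand[rng.uniform(size=n) < Lambda(cand[:, 0], cand[:, 1]) / 500.0]

thinned = thin_to_poisson(pattern, Lambda, beta=100.0)    # ~ Poisson(100) pattern
```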

  13. Change point analysis and assessment

    DEFF Research Database (Denmark)

    Müller, Sabine; Neergaard, Helle; Ulhøi, John Parm

    2011-01-01

    The aim of this article is to develop an analytical framework for studying processes such as continuous innovation and business development in high-tech SME clusters that transcends the traditional qualitative-quantitative divide. It integrates four existing and well-recognized approaches to studying events, processes and change, namely change-point analysis, event-history analysis, critical-incident technique and sequence analysis.

  14. Implementation of hazard analysis and critical control point (HACCP) in dried anchovy production process

    Science.gov (United States)

    Citraresmi, A. D. P.; Wahyuni, E. E.

    2018-03-01

    The aim of this study was to inspect the implementation of Hazard Analysis and Critical Control Point (HACCP) for identification and prevention of potential hazards in the production process of dried anchovy at PT. Kelola Mina Laut (KML), Lobuk unit, Sumenep. Cold storage is needed at each anchovy processing step in order to maintain its physical and chemical condition. In addition, a quality assurance system should be implemented to maintain product quality. The research was conducted using a survey method, following the whole process of making anchovy from the receipt of raw materials to the packaging of the final product. The method of data analysis used was descriptive analysis. Implementation of HACCP at PT. KML, Lobuk unit, Sumenep was conducted by applying Prerequisite Programs (PRP) and a preparation stage consisting of 5 initial stages and 7 principles of HACCP. The results showed that a CCP was found in the boiling process flow, with a significant hazard from Listeria monocytogenes, and in the final sorting process, with a significant hazard of foreign material contamination in the product. Actions taken were controlling the boiling temperature at 100–105 °C for 3–5 minutes and training the sorting process employees.

  15. Critical Control Points in the Processing of Cassava Tuber for Ighu ...

    African Journals Online (AJOL)

    Determination of the critical control points in the processing of cassava tuber into Ighu was carried out. The critical control points were determined according to the Codex guidelines for the application of the HACCP system by conducting hazard analysis. Hazard analysis involved proper examination of each processing step ...

  16. Detecting determinism from point processes.

    Science.gov (United States)

    Andrzejak, Ralph G; Mormann, Florian; Kreuz, Thomas

    2014-12-01

    The detection of a nonrandom structure from experimental data can be crucial for the classification, understanding, and interpretation of the generating process. We here introduce a rank-based nonlinear predictability score to detect determinism from point process data. Thanks to its modular nature, this approach can be adapted to whatever signature in the data one considers indicative of deterministic structure. After validating our approach using point process signals from deterministic and stochastic model dynamics, we show an application to neuronal spike trains recorded in the brain of an epilepsy patient. While we illustrate our approach in the context of temporal point processes, it can be readily applied to spatial point processes as well.
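
    One concrete instantiation of such a modular score — not the authors' exact statistic — is to delay-embed the inter-event intervals, predict each interval from the successors of its nearest neighbours, and rank the true prediction error against shuffled surrogates. A sketch under those assumptions, with the logistic map providing a deterministic test signal:

```python
import numpy as np

rng = np.random.default_rng(5)

def prediction_error(isi, dim=3, k=5):
    """Delay-embed the inter-event intervals and predict each interval from the
    successors of its k nearest neighbours; return the mean squared error."""
    N = len(isi)
    emb = np.column_stack([isi[t:N - dim + t] for t in range(dim)])   # delay vectors
    target = isi[dim:]                                                # their successors
    err = np.empty(len(emb))
    for t in range(len(emb)):
        d = np.linalg.norm(emb - emb[t], axis=1)
        d[t] = np.inf                                                 # exclude self-match
        nn = np.argsort(d)[:k]
        err[t] = (target[nn].mean() - target[t]) ** 2
    return err.mean()

def predictability_score(isi, n_surr=99):
    """Fraction of shuffled surrogates whose prediction error exceeds the true one;
    values near 1 indicate deterministic structure."""
    e0 = prediction_error(isi)
    return np.mean([e0 < prediction_error(rng.permutation(isi)) for _ in range(n_surr)])

# Deterministic test signal: intervals driven by the logistic map
x, isi = 0.4, []
for _ in range(500):
    x = 4.0 * x * (1.0 - x)
    isi.append(0.1 + x)
print(predictability_score(np.array(isi)))   # close to 1 for deterministic dynamics
```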

  17. Spatial Mixture Modelling for Unobserved Point Processes: Examples in Immunofluorescence Histology.

    Science.gov (United States)

    Ji, Chunlin; Merl, Daniel; Kepler, Thomas B; West, Mike

    2009-12-04

    We discuss Bayesian modelling and computational methods in analysis of indirectly observed spatial point processes. The context involves noisy measurements on an underlying point process that provide indirect and noisy data on locations of point outcomes. We are interested in problems in which the spatial intensity function may be highly heterogeneous, and so is modelled via flexible nonparametric Bayesian mixture models. Analysis aims to estimate the underlying intensity function and the abundance of realized but unobserved points. Our motivating applications involve immunological studies of multiple fluorescent intensity images in sections of lymphatic tissue where the point processes represent geographical configurations of cells. We are interested in estimating intensity functions and cell abundance for each of a series of such data sets to facilitate comparisons of outcomes at different times and with respect to differing experimental conditions. The analysis is heavily computational, utilizing recently introduced MCMC approaches for spatial point process mixtures and extending them to the broader new context here of unobserved outcomes. Further, our example applications are problems in which the individual objects of interest are not simply points, but rather small groups of pixels; this implies a need to work at an aggregate pixel region level and we develop the resulting novel methodology for this. Two examples with immunofluorescence histology data demonstrate the models and computational methodology.

  18. Parametric methods for spatial point processes

    DEFF Research Database (Denmark)

    Møller, Jesper

    (This text is submitted for the volume 'A Handbook of Spatial Statistics' edited by A.E. Gelfand, P. Diggle, M. Fuentes, and P. Guttorp, to be published by Chapman and Hall/CRC Press, and planned to appear as Chapter 4.4 with the title 'Parametric methods'.) This chapter considers inference procedures for parametric spatial point process models. The widespread use of sensible but ad hoc methods based on summary statistics of the kind studied in Chapter 4.3 has through the last two decades been supplemented by likelihood-based methods for parametric spatial point process models ... is studied in Section 4, and Bayesian inference in Section 5. On one hand, as the development in computer technology and computational statistics continues, computationally-intensive simulation-based methods for likelihood inference probably will play an increasing role for statistical analysis of spatial ...

  19. Some properties of point processes in statistical optics

    International Nuclear Information System (INIS)

    Picinbono, B.; Bendjaballah, C.

    2010-01-01

    The analysis of the statistical properties of the point process (PP) of photon detection times can be used to determine whether or not an optical field is classical, in the sense that its statistical description does not require the methods of quantum optics. This determination is, however, more difficult than ordinarily admitted and the first aim of this paper is to illustrate this point by using some results of the PP theory. For example, it is well known that the analysis of the photodetection of classical fields exhibits the so-called bunching effect. But this property alone cannot be used to decide the nature of a given optical field. Indeed, we have presented examples of point processes for which a bunching effect appears and yet they cannot be obtained from a classical field. These examples are illustrated by computer simulations. Similarly, it is often admitted that for fields with very low light intensity the bunching or antibunching can be described by using the statistical properties of the distance between successive events of the point process, which simplifies the experimental procedure. We have shown that, while this property is valid for classical PPs, it has no reason to be true for nonclassical PPs, and we have presented some examples of this situation also illustrated by computer simulations.

  20. Post-Processing in the Material-Point Method

    DEFF Research Database (Denmark)

    Andersen, Søren; Andersen, Lars Vabbersgaard

    The material-point method (MPM) is a numerical method for dynamic or static analysis of solids using a discretization in time and space. The method has been shown to be successful in modelling physical problems involving large deformations, which are difficult to model with traditional numerical tools such as the finite element method. In the material-point method, a set of material points is utilized to track the problem in time and space, while a computational background grid is utilized to obtain spatial derivatives relevant to the physical problem. Currently, the research within the material-point method ... The first idea involves associating a volume with each material point and displaying the deformation of this volume. In the discretization process, the physical domain is divided into a number of smaller volumes each represented by a simple shape; here quadrilaterals are chosen for the presented ...

  1. Framework for adaptive multiscale analysis of nonhomogeneous point processes.

    Science.gov (United States)

    Helgason, Hannes; Bartroff, Jay; Abry, Patrice

    2011-01-01

    We develop the methodology for hypothesis testing and model selection in nonhomogeneous Poisson processes, with an eye toward the application of modeling and variability detection in heart beat data. Modeling the process' non-constant rate function using templates of simple basis functions, we develop the generalized likelihood ratio statistic for a given template and a multiple testing scheme to model-select from a family of templates. A dynamic programming algorithm inspired by network flows is used to compute the maximum likelihood template in a multiscale manner. In a numerical example, the proposed procedure is nearly as powerful as the super-optimal procedures that know the true template size and true partition, respectively. Extensions to general history-dependent point processes are discussed.
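
    For a piecewise-constant template, the profile log-likelihood of a nonhomogeneous Poisson process is available in closed form (the rate MLE in each bin is count/width), so the generalised likelihood ratio against a constant-rate null takes a few lines. The two-bin template and synthetic rates below are illustrative assumptions:

```python
import numpy as np

def poisson_loglik(counts, widths):
    """Profile log-likelihood of a piecewise-constant Poisson rate: each bin's MLE
    is counts/width (terms not depending on the rate are dropped)."""
    lam = counts / widths
    with np.errstate(divide="ignore", invalid="ignore"):
        ll = counts * np.log(lam) - lam * widths
    return np.nansum(ll)   # empty bins contribute 0 (the correct limit)

def glr_statistic(events, edges, T):
    """Generalised likelihood ratio of a given template (bin edges) against a
    constant-rate null on [0, T]."""
    counts, _ = np.histogram(events, bins=edges)
    ll_alt = poisson_loglik(counts, np.diff(edges))
    ll_null = poisson_loglik(np.array([len(events)]), np.array([T]))
    return 2.0 * (ll_alt - ll_null)

rng = np.random.default_rng(6)
# Rate steps from 5 to 15 at t = 60; a template with a breakpoint at 60 is favoured
ev = np.concatenate([rng.uniform(0, 60, rng.poisson(5 * 60)),
                     rng.uniform(60, 100, rng.poisson(15 * 40))])
print(glr_statistic(ev, np.array([0.0, 60.0, 100.0]), T=100.0))
```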

  2. Inhomogeneous Markov point processes by transformation

    DEFF Research Database (Denmark)

    Jensen, Eva B. Vedel; Nielsen, Linda Stougaard

    2000-01-01

    We construct parametrized models for point processes, allowing for both inhomogeneity and interaction. The inhomogeneity is obtained by applying parametrized transformations to homogeneous Markov point processes. An interesting model class, which can be constructed by this transformation approach, is that of exponential inhomogeneous Markov point processes. Statistical inference for such processes is discussed in some detail.

  3. Process for structural geologic analysis of topography and point data

    Science.gov (United States)

    Eliason, Jay R.; Eliason, Valerie L. C.

    1987-01-01

    A quantitative method of geologic structural analysis of digital terrain data is described for implementation on a computer. Assuming selected valley segments are controlled by the underlying geologic structure, topographic lows in the terrain data, defining valley bottoms, are detected, filtered and accumulated into a series of line segments defining contiguous valleys. The line segments are then vectorized to produce vector segments, defining valley segments, which may be indicative of the underlying geologic structure. Coplanar analysis is performed on vector segment pairs to determine which vectors produce planes which represent underlying geologic structure. Point data such as fracture phenomena which can be related to fracture planes in 3-dimensional space can be analyzed to define common plane orientation and locations. The vectors, points, and planes are displayed in various formats for interpretation.

  4. Seeking a fingerprint: analysis of point processes in actigraphy recording

    Science.gov (United States)

    Gudowska-Nowak, Ewa; Ochab, Jeremi K.; Oleś, Katarzyna; Beldzik, Ewa; Chialvo, Dante R.; Domagalik, Aleksandra; Fąfrowicz, Magdalena; Marek, Tadeusz; Nowak, Maciej A.; Ogińska, Halszka; Szwed, Jerzy; Tyburczyk, Jacek

    2016-05-01

    Motor activity of humans displays complex temporal fluctuations which can be characterised by scale-invariant statistics, thus demonstrating that structure and fluctuations of such kinetics remain similar over a broad range of time scales. Previous studies on humans regularly deprived of sleep or suffering from sleep disorders predicted a change in the invariant scale parameters with respect to those for healthy subjects. In this study we investigate the signal patterns from actigraphy recordings by means of characteristic measures of fractional point processes. We analyse spontaneous locomotor activity of healthy individuals recorded during a week of regular sleep and a week of chronic partial sleep deprivation. Behavioural symptoms of lack of sleep can be evaluated by analysing statistics of duration times during active and resting states, and alteration of behavioural organisation can be assessed by analysis of power laws detected in the event count distribution, distribution of waiting times between consecutive movements and detrended fluctuation analysis of recorded time series. We claim that among different measures characterising complexity of the actigraphy recordings and their variations implied by chronic sleep distress, the exponents characterising slopes of survival functions in resting states are the most effective biomarkers distinguishing between healthy and sleep-deprived groups.
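
    The survival-function slopes proposed as biomarkers can be estimated, crudely, by a least-squares fit to the empirical survival curve on log-log axes. A sketch on synthetic Pareto-tailed bout durations (the tail exponent, cut-off and fitting range are illustrative; the study itself works with recorded actigraphy data):

```python
import numpy as np

def survival_exponent(durations, t_min=1.0):
    """Slope of the empirical survival function on log-log axes above t_min:
    a crude estimate of the power-law tail exponent of bout durations."""
    d = np.sort(durations[durations >= t_min])
    surv = 1.0 - np.arange(1, len(d) + 1) / len(d)
    keep = surv > 0                       # drop the final point, where log(0) fails
    slope, _ = np.polyfit(np.log(d[keep]), np.log(surv[keep]), 1)
    return -slope

rng = np.random.default_rng(7)
bouts = rng.pareto(1.5, size=2000) + 1.0  # synthetic rest bouts, tail exponent 1.5
print(survival_exponent(bouts))           # approximately 1.5
```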

  5. Hierarchical spatial point process analysis for a plant community with high biodiversity

    DEFF Research Database (Denmark)

    Illian, Janine B.; Møller, Jesper; Waagepetersen, Rasmus

    2009-01-01

    A complex multivariate spatial point pattern of a plant community with high biodiversity is modelled using a hierarchical multivariate point process model. In the model, interactions between plants with different post-fire regeneration strategies are of key interest. We consider initially a maximum likelihood approach ...

  6. Lévy based Cox point processes

    DEFF Research Database (Denmark)

    Hellmund, Gunnar; Prokesová, Michaela; Jensen, Eva Bjørn Vedel

    2008-01-01

    In this paper we introduce Lévy-driven Cox point processes (LCPs) as Cox point processes with driving intensity function Λ defined by a kernel smoothing of a Lévy basis (an independently scattered, infinitely divisible random measure). We also consider log Lévy-driven Cox point processes (LLCPs) with Λ equal to the exponential of such a kernel smoothing. Special cases are shot noise Cox processes, log Gaussian Cox processes, and log shot noise Cox processes. We study the theoretical properties of Lévy-based Cox processes, including moment properties described by nth-order product densities ...

  7. Analysis of multi-species point patterns using multivariate log Gaussian Cox processes

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus; Guan, Yongtao; Jalilian, Abdollah

    Multivariate log Gaussian Cox processes are flexible models for multivariate point patterns. However, they have so far only been applied in bivariate cases. In this paper we move beyond the bivariate case in order to model multi-species point patterns of tree locations. In particular we address the problems of identifying parsimonious models and of extracting biologically relevant information from the fitted models. The latent multivariate Gaussian field is decomposed into components given in terms of random fields common to all species and components which are species specific. This allows ... The selected number of common latent fields provides an index of complexity of the multivariate covariance structure. Hierarchical clustering is used to identify groups of species with similar patterns of dependence on the common latent fields.

  8. Parametric statistical change point analysis

    CERN Document Server

    Chen, Jie

    2000-01-01

    This work is an in-depth study of the change point problem from a general point of view and a further examination of change point analysis of the most commonly used statistical models. Change point problems are encountered in such disciplines as economics, finance, medicine, psychology, signal processing, and geology, to mention only several. The exposition is clear and systematic, with a great deal of introductory material included. Different models are presented in each chapter, including gamma and exponential models, rarely examined thus far in the literature. Other models covered in detail are the multivariate normal, univariate normal, regression, and discrete models. Extensive examples throughout the text emphasize key concepts and different methodologies are used, namely the likelihood ratio criterion, and the Bayesian and information criterion approaches. A comprehensive bibliography and two indices complete the study.
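
    For the simplest setting covered by such texts — a single shift in the mean of normal observations with a common variance — the likelihood ratio criterion reduces to a scan over candidate split points, maximising the between-segment sum of squares. A sketch under those assumptions:

```python
import numpy as np

def mean_shift_changepoint(x):
    """Likelihood-ratio scan for a single change in the mean of normal data with a
    common variance: the statistic is the between-segment sum of squares."""
    n = len(x)
    best_k, best_stat = None, -np.inf
    for k in range(2, n - 1):
        a, b = x[:k], x[k:]
        stat = k * (a.mean() - x.mean()) ** 2 + (n - k) * (b.mean() - x.mean()) ** 2
        if stat > best_stat:
            best_k, best_stat = k, stat
    return best_k, best_stat / x.var()    # crude scaling by the overall variance

rng = np.random.default_rng(8)
data = np.concatenate([rng.normal(0.0, 1.0, 100), rng.normal(1.2, 1.0, 100)])
print(mean_shift_changepoint(data))       # change located near index 100
```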

  9. State estimation for temporal point processes

    NARCIS (Netherlands)

    van Lieshout, Maria Nicolette Margaretha

    2015-01-01

    This paper is concerned with combined inference for point processes on the real line observed in a broken interval. For such processes, the classic history-based approach cannot be used. Instead, we adapt tools from sequential spatial point processes. For a range of models, the marginal and ...

  10. Spatial point process analysis for a plant community with high biodiversity

    DEFF Research Database (Denmark)

    Illian, Janine; Møller, Jesper; Waagepetersen, Rasmus Plenge

    A complex multivariate spatial point pattern for a plant community with high biodiversity is modelled using a hierarchical multivariate point process model. In the model, interactions between plants with different post-fire regeneration strategies are of key interest. We consider initially a maximum likelihood approach to inference where problems arise due to unknown interaction radii for the plants. We next demonstrate that a Bayesian approach provides a flexible framework for incorporating prior information concerning the interaction radii. From an ecological perspective, we are able both ...

  11. Point Information Gain and Multidimensional Data Analysis

    Directory of Open Access Journals (Sweden)

    Renata Rychtáriková

    2016-10-01

    We generalize the point information gain (PIG) and derived quantities, i.e., point information gain entropy (PIE) and point information gain entropy density (PIED), for the case of the Rényi entropy and simulate the behavior of PIG for typical distributions. We also use these methods for the analysis of multidimensional datasets. We demonstrate the main properties of PIE/PIED spectra for the real data with the examples of several images and discuss further possible utilizations in other fields of data processing.

  12. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Science.gov (United States)

    2010-04-01

    ... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF... Provisions § 123.6 Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan. (a) Hazard... fish or fishery product being processed in the absence of those controls. (b) The HACCP plan. Every...

  13. Mechanistic spatio-temporal point process models for marked point processes, with a view to forest stand data

    DEFF Research Database (Denmark)

    Møller, Jesper; Ghorbani, Mohammad; Rubak, Ege Holger

    We show how a spatial point process, where to each point there is associated a random quantitative mark, can be identified with a spatio-temporal point process specified by a conditional intensity function. For instance, the points can be tree locations, the marks can express the size of trees, and the conditional intensity function can describe the distribution of a tree (i.e., its location and size) conditionally on the larger trees. This enables us to construct parametric statistical models which are easily interpretable and where likelihood-based inference is tractable. In particular, we consider maximum ...

  14. Hazard analysis and critical control point (HACCP) history and conceptual overview.

    Science.gov (United States)

    Hulebak, Karen L; Schlosser, Wayne

    2002-06-01

    Hazard Analysis and Critical Control Point (HACCP) is a system that enables the production of safe meat and poultry products through the thorough analysis of production processes; the identification of all hazards that are likely to occur in the production establishment; the identification of the critical points in the process at which these hazards may be introduced into product and therefore should be controlled; the establishment of critical limits for control at those points; the verification of these prescribed steps; and the methods by which the processing establishment and the regulatory authority can monitor how well process control through the HACCP plan is working. The history of the development of HACCP is reviewed, and examples of practical applications of HACCP are described.

  15. Testing Local Independence between Two Point Processes

    DEFF Research Database (Denmark)

    Allard, Denis; Brix, Anders; Chadæuf, Joël

    2001-01-01

    Independence test, Inhomogeneous point processes, Local test, Monte Carlo, Nonstationary, Rotations, Spatial pattern, Tiger bush.

  16. Dew point vs bubble point: a misunderstood constraint on gravity drainage processes

    Energy Technology Data Exchange (ETDEWEB)

    Nenninger, J. [N-Solv Corp., Calgary, AB (Canada); Gunnewiek, L. [Hatch Ltd., Mississauga, ON (Canada)

    2009-07-01

    This study demonstrated that gravity drainage processes that use blended fluids such as solvents have an inherently unstable material balance due to differences between dew point and bubble point compositions. The instability can lead to the accumulation of volatile components within the chamber, and impair mass and heat transfer processes. Case studies were used to demonstrate the large temperature gradients within the vapour chamber caused by temperature differences between the bubble point and dew point for blended fluids. A review of published data showed that many experiments on in-situ processes do not account for unstable material balances caused by a lack of steam trap control. An examination of temperature profiles during steam assisted gravity drainage (SAGD) operations showed significant temperature depressions caused by methane accumulations at the outside perimeter of the steam chamber. It was demonstrated that the condensation of large volumes of purified solvents provided an efficient mechanism for the removal of methane from the chamber. It was concluded that gravity drainage processes can be optimized by using pure propane during the injection process. 22 refs., 1 tab., 18 figs.

  17. Modelling estimation and analysis of dynamic processes from image sequences using temporal random closed sets and point processes with application to the cell exocytosis and endocytosis

    OpenAIRE

    Díaz Fernández, Ester

    2010-01-01

    In this thesis, new models and methodologies are introduced for the analysis of dynamic processes characterized by image sequences with spatial-temporal overlapping. Spatial-temporal overlapping exists in many natural phenomena and should be addressed properly in several scientific disciplines such as Microscopy, Material Sciences, Biology, Geostatistics or Communication Networks. This work is related to the Point Process and Random Closed Set theories, within Stochastic Geometry ...

  18. Fixed-point signal processing

    CERN Document Server

    Padgett, Wayne T

    2009-01-01

    This book is intended to fill the gap between the "ideal precision" digital signal processing (DSP) that is widely taught, and the limited-precision implementation skills that are commonly required in fixed-point processors and field programmable gate arrays (FPGAs). These skills are often neglected at the university level, particularly for undergraduates. We have attempted to create a resource both for a DSP elective course and for the practicing engineer with a need to understand fixed-point implementation. Although we assume a background in DSP, Chapter 2 contains a review of basic theory.

  19. Comparison of second-generation processes for the conversion of sugarcane bagasse to liquid biofuels in terms of energy efficiency, pinch point analysis and Life Cycle Analysis

    International Nuclear Information System (INIS)

    Petersen, A.M.; Melamu, Rethabi; Knoetze, J.H.; Görgens, J.F.

    2015-01-01

    Highlights: • Process evaluation of thermochemical and biological routes for bagasse to fuels. • Pinch point analysis increases overall efficiencies by reducing utility consumption. • Advanced biological route increased efficiency and local environmental impacts. • Thermochemical routes have the highest efficiencies and low life cycle impacts. - Abstract: Three alternative processes for the production of liquid transportation biofuels from sugar cane bagasse were compared from the perspective of energy efficiency, using process modelling, Process Environmental Assessments and Life Cycle Assessment. Bio-ethanol via two biological processes was considered, i.e. Separate Hydrolysis and Fermentation (Process 1) and Simultaneous Saccharification and Fermentation (Process 2), in comparison to Gasification and Fischer-Tropsch synthesis for the production of synthetic fuels (Process 3). The energy efficiency of each process scenario was maximised by pinch point analysis for heat integration. The more advanced bio-ethanol process, Process 2, had the higher energy efficiency at 42.3%. Heat integration was critical for Process 3, whereby the energy efficiency was increased from 51.6% to 55.7%. For both the Process Environmental Assessment and Life Cycle Assessment, Process 3 had the least potential for detrimental environmental impacts, due to its relatively high energy efficiency. Process 2 had the greatest Process Environmental Impact due to the intensive use of processing chemicals. Regarding the Life Cycle Assessments, Process 1 was the most severe due to its low energy efficiency

  20. Shot-noise-weighted processes: a new family of spatial point processes

    NARCIS (Netherlands)

    M.N.M. van Lieshout (Marie-Colette); I.S. Molchanov (Ilya)

    1995-01-01

    The paper suggests a new family of spatial point process distributions. They are defined by means of densities with respect to the Poisson point process within a bounded set. These densities are given in terms of a functional of the shot-noise process with a given influence ...

  1. Self-exciting point process in modeling earthquake occurrences

    International Nuclear Information System (INIS)

    Pratiwi, H.; Slamet, I.; Respatiwulan; Saputro, D. R. S.

    2017-01-01

    In this paper, we present a procedure for modeling earthquakes based on a spatial-temporal point process. The magnitude distribution is expressed as a truncated exponential, and the event frequency is modeled with a spatial-temporal point process that is characterized uniquely by its associated conditional intensity process. Earthquakes can be regarded as point patterns with a temporal clustering feature, so we use a self-exciting point process for modeling the conditional intensity function. The selection of main shocks is conducted via the window algorithm of Gardner and Knopoff, and the model can be fitted by the maximum likelihood method for three random variables. (paper)
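
    A generic self-exciting (Hawkes) model with exponential excitation can be simulated with Ogata's thinning algorithm, and magnitudes can be attached from a truncated exponential as described in the record. This is a hedged, generic sketch with illustrative parameters, not the authors' fitted earthquake model:

```python
import numpy as np

rng = np.random.default_rng(9)

def simulate_hawkes(mu=0.5, alpha=0.8, beta=1.2, T=200.0):
    """Ogata thinning for a Hawkes process with conditional intensity
    lam(t) = mu + alpha * sum_i exp(-beta * (t - t_i)); the intensity decays
    between events, so its current value is a valid dominating rate."""
    t, events = 0.0, []
    while True:
        lam_bar = mu + alpha * np.sum(np.exp(-beta * (t - np.array(events))))
        t += rng.exponential(1.0 / lam_bar)
        if t >= T:
            break
        lam_t = mu + alpha * np.sum(np.exp(-beta * (t - np.array(events))))
        if rng.uniform() < lam_t / lam_bar:   # accept with probability lam(t)/lam_bar
            events.append(t)
    return np.array(events)

times = simulate_hawkes()

# Truncated exponential magnitudes on [m0, m_max], drawn by inverse-CDF sampling
m0, m_max, b = 3.0, 8.0, np.log(10.0)
u = rng.uniform(size=len(times))
mags = m0 - np.log1p(u * (np.exp(-b * (m_max - m0)) - 1.0)) / b
```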

  2. Multiscale change-point analysis of inhomogeneous Poisson processes using unbalanced wavelet decompositions

    NARCIS (Netherlands)

    Jansen, M.H.; Di Bucchianico, A.; Mattheij, R.M.M.; Peletier, M.A.

    2006-01-01

    We present a continuous wavelet analysis of count data with time-varying intensities. The objective is to extract intervals with significant intensities from background intervals. This includes the precise starting point of the significant interval, its exact duration and the (average) level of ...

  3. Estimating Function Approaches for Spatial Point Processes

    Science.gov (United States)

    Deng, Chong

    Spatial point pattern data consist of locations of events that are often of interest in biological and ecological studies. Such data are commonly viewed as a realization from a stochastic process called a spatial point process. To fit a parametric spatial point process model to such data, likelihood-based methods have been widely studied. However, while maximum likelihood estimation is often too computationally intensive for Cox and cluster processes, pairwise likelihood methods such as composite likelihood and Palm likelihood usually suffer from a loss of information because they ignore the correlation among pairs. For many types of correlated data other than spatial point processes, when likelihood-based approaches are not desirable, estimating functions have been widely used for model fitting. In this dissertation, we explore estimating function approaches for fitting spatial point process models. These approaches, which are based on asymptotically optimal estimating function theory, can be used to incorporate the correlation among data and yield more efficient estimators. We conducted a series of studies to demonstrate that these estimating function approaches are good alternatives that balance the trade-off between computational complexity and estimation efficiency. First, we propose a new estimating procedure that improves the efficiency of the pairwise composite likelihood method in estimating clustering parameters. Our approach combines estimating functions derived from pairwise composite likelihood estimation and estimating functions that account for correlations among the pairwise contributions. Our method can be used to fit a variety of parametric spatial point process models and can yield more efficient estimators for the clustering parameters than pairwise composite likelihood estimation. We demonstrate its efficacy through a simulation study and an application to the longleaf pine data. Second, we further explore the quasi-likelihood approach to fitting ...

  4. Effect of processing conditions on oil point pressure of Moringa oleifera seed.

    Science.gov (United States)

    Aviara, N A; Musa, W B; Owolarafe, O K; Ogunsina, B S; Oluwole, F A

    2015-07-01

    Seed oil expression is an important economic venture in rural Nigeria. The traditional techniques for carrying out the operation are not only energy-sapping and time-consuming but also wasteful. In order to reduce the tedium involved in the expression of oil from Moringa oleifera seed and develop efficient equipment for carrying out the operation, the oil point pressure of the seed was determined under different processing conditions using a laboratory press. The processing conditions employed were moisture content (4.78, 6.00, 8.00 and 10.00 % wet basis), heating temperature (50, 70, 85 and 100 °C) and heating time (15, 20, 25 and 30 min). Results showed that the oil point pressure increased with increase in seed moisture content, but decreased with increase in heating temperature and heating time within the above ranges. The highest oil point pressure of 1.1239 MPa was obtained at the processing conditions of 10.00 % moisture content, 50 °C heating temperature and 15 min heating time. The lowest oil point pressure obtained was 0.3164 MPa, and it occurred at the moisture content of 4.78 %, heating temperature of 100 °C and heating time of 30 min. Analysis of Variance (ANOVA) showed that all the processing variables and their interactions had a significant effect on the oil point pressure of Moringa oleifera seed at the 1% level of significance. This was further demonstrated using Response Surface Methodology (RSM). Tukey's test and Duncan's Multiple Range Analysis successfully separated the means, and a multiple regression equation was used to express the relationship existing between the oil point pressure of Moringa oleifera seed and its moisture content, processing temperature, heating time and their interactions. The model yielded coefficients that enabled the oil point pressure of the seed to be predicted with a very high coefficient of determination.

  5. Fourier analysis and stochastic processes

    CERN Document Server

    Brémaud, Pierre

    2014-01-01

    This work is unique as it provides a uniform treatment of the Fourier theories of functions (Fourier transforms and series, z-transforms), finite measures (characteristic functions, convergence in distribution), and stochastic processes (including ARMA series and point processes). It emphasises the links between these three themes. The chapter on the Fourier theory of point processes and signals structured by point processes is a novel addition to the literature on Fourier analysis of stochastic processes. It also connects the theory with recent lines of research such as biological spike signals and ultrawide-band communications. Although the treatment is mathematically rigorous, the convivial style makes the book accessible to a large audience. In particular, it will be interesting to anyone working in electrical engineering and communications, biology (point process signals) and econometrics (ARMA models). A careful review of the prerequisites (integration and probability theory in the appendix, Hilbert spaces ...

  6. Shape from texture using locally scaled point processes

    Directory of Open Access Journals (Sweden)

    Eva-Maria Didden

    2015-09-01

    Shape from texture refers to the extraction of 3D information from 2D images with irregular texture. This paper introduces a statistical framework to learn shape from texture where convex texture elements in a 2D image are represented through a point process. In a first step, the 2D image is preprocessed to generate a probability map corresponding to an estimate of the unnormalized intensity of the latent point process underlying the texture elements. The latent point process is subsequently inferred from the probability map in a non-parametric, model-free manner. Finally, the 3D information is extracted from the point pattern by applying a locally scaled point process model where the local scaling function represents the deformation caused by the projection of a 3D surface onto a 2D image.

  7. Processing Terrain Point Cloud Data

    KAUST Repository

    DeVore, Ronald

    2013-01-10

    Terrain point cloud data are typically acquired through some form of Light Detection And Ranging sensing. They form a rich resource that is important in a variety of applications including navigation, line of sight, and terrain visualization. Processing terrain data has not received the attention given to other forms of surface reconstruction or to image processing. The goal of terrain data processing is to convert the point cloud into a succinct representation system that is amenable to the various application demands. The present paper presents a platform for terrain processing built on the following principles: (i) measuring distortion in the Hausdorff metric, which we argue is a good match for the application demands, (ii) a multiscale representation based on tree approximation using local polynomial fitting. The basic elements held in the nodes of the tree can be efficiently encoded, transmitted, visualized, and utilized for the various target applications. Several challenges emerge because of the variable resolution of the data, missing data, occlusions, and noise. Techniques for identifying and handling these challenges are developed. © 2013 Society for Industrial and Applied Mathematics.

  8. Transforming spatial point processes into Poisson processes using random superposition

    DEFF Research Database (Denmark)

    Møller, Jesper; Berthelsen, Kasper Klitgaaard

    ... with a complementary spatial point process Y to obtain a Poisson process X∪Y with intensity function β. Underlying this is a bivariate spatial birth-death process (Xt, Yt) which converges towards the distribution of (X, Y). We study the joint distribution of X and Y, and their marginal and conditional distributions. In particular, we introduce a fast and easy simulation procedure for Y conditional on X. This may be used for model checking: given a model for the Papangelou intensity of the original spatial point process, this model is used to generate the complementary process, and the resulting superposition is a Poisson process with intensity function β if and only if the true Papangelou intensity is used. Whether the superposition is actually such a Poisson process can easily be examined using well known results and fast simulation procedures for Poisson processes. We illustrate this approach to model checking ...

  9. Digital analyzer for point processes based on first-in-first-out memories

    Science.gov (United States)

    Basano, Lorenzo; Ottonello, Pasquale; Schiavi, Enore

    1992-06-01

    We present an entirely new version of a multipurpose instrument designed for the statistical analysis of point processes, especially those characterized by high bunching. A long sequence of pulses can be recorded in the RAM bank of a personal computer via a suitably designed front end which employs a pair of first-in-first-out (FIFO) memories; these allow one to build an analyzer that, besides being simpler from the electronic point of view, is capable of sustaining much higher intensity fluctuations of the point process. The overflow risk of the device is evaluated by treating the FIFO pair as a queueing system. The apparatus was tested using both a deterministic signal and a sequence of photoelectrons obtained from laser light scattered by random surfaces.

  10. Point cloud processing for smart systems

    Directory of Open Access Journals (Sweden)

    Jaromír Landa

    2013-01-01

    High population as well as economic pressure emphasises the necessity of effective city management – from land use planning to urban green maintenance. Management effectiveness is based on precise knowledge of the city environment. Point clouds generated by mobile and terrestrial laser scanners provide precise data about objects in the scanner vicinity. From these data the state of the roads, buildings, trees and other objects important for this decision-making process can be obtained. Generally, they can support the idea of “smart” or at least “smarter” cities. Unfortunately, the point clouds do not provide this type of information automatically. It has to be extracted. This extraction is done by expert personnel or by object recognition software. As the point clouds can represent large areas (streets or even cities), usage of expert personnel to identify the required objects can be very time-consuming and therefore cost-ineffective. Object recognition software allows us to detect and identify required objects semi-automatically or automatically. The first part of the article reviews and analyses the state of the art in point cloud object recognition techniques. The following part presents common formats used for point cloud storage and frequently used software tools for point cloud processing. Further, a method for extraction of geospatial information about detected objects is proposed. Therefore, the method can be used not only to recognize the existence and shape of certain objects, but also to retrieve their geospatial properties. These objects can be later directly used in various GIS systems for further analyses.

  11. Digital image processing and analysis human and computer vision applications with CVIPtools

    CERN Document Server

    Umbaugh, Scott E

    2010-01-01

    Section I, Introduction to Digital Image Processing and Analysis: Digital Image Processing and Analysis (Overview; Image Analysis and Computer Vision; Image Processing and Human Vision; Key Points; Exercises; References; Further Reading); Computer Imaging Systems (Imaging Systems Overview; Image Formation and Sensing; CVIPtools Software; Image Representation; Key Points; Exercises; Supplementary Exercises; References; Further Reading). Section II, Digital Image Analysis and Computer Vision: Introduction to Digital Image Analysis (Introduction; Preprocessing; Binary Image Analysis; Key Points; Exercises; Supplementary Exercises; References; Further Read ...)

  12. A logistic regression estimating function for spatial Gibbs point processes

    DEFF Research Database (Denmark)

    Baddeley, Adrian; Coeurjolly, Jean-François; Rubak, Ege

    We propose a computationally efficient logistic regression estimating function for spatial Gibbs point processes. The sample points for the logistic regression consist of the observed point pattern together with a random pattern of dummy points. The estimating function is closely related to the p ...
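
    The construction can be made concrete as follows: label observed points 1 and dummy points 0, and fit a logistic regression with offset −log ρ, so that P(label = 1 | u) = λ(u)/(λ(u) + ρ) when λ(u) = exp(θ′z(u)). The sketch below does this for a purely log-linear (Poisson) intensity with a binomial dummy pattern; the Gibbs-interaction terms and the paper's exact dummy construction are not reproduced here.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(10)

def fit_loglinear_intensity(points, dummy, covar, rho):
    """Logistic-regression estimating function: observed points get label 1, dummy
    points label 0, and P(label = 1 | u) = lam(u) / (lam(u) + rho) with
    lam(u) = exp(theta' z(u)), i.e. a logistic regression with offset -log(rho)."""
    z = np.vstack([covar(points), covar(dummy)])
    y = np.concatenate([np.ones(len(points)), np.zeros(len(dummy))])

    def negloglik(theta):
        eta = z @ theta - np.log(rho)
        return np.sum(np.logaddexp(0.0, eta)) - y @ eta   # convex logistic loss

    return minimize(negloglik, x0=np.zeros(z.shape[1]), method="BFGS").x

covar = lambda p: np.column_stack([np.ones(len(p)), p[:, 0]])   # z(u) = (1, x)
lam = lambda x: np.exp(4.0 + 1.5 * x)                           # true intensity

# Simulate the pattern on the unit square by thinning, then fit
lam_max = float(np.exp(5.5))
cand = rng.uniform(size=(rng.poisson(lam_max), 2))
pts = cand[rng.uniform(size=len(cand)) < lam(cand[:, 0]) / lam_max]
dummy, rho = rng.uniform(size=(500, 2)), 500.0                  # binomial dummy pattern
print(fit_loglinear_intensity(pts, dummy, covar, rho))          # approx (4.0, 1.5)
```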

  13. Simple computation of reaction–diffusion processes on point clouds

    KAUST Repository

    Macdonald, Colin B.

    2013-05-20

    The study of reaction-diffusion processes is much more complicated on general curved surfaces than on standard Cartesian coordinate spaces. Here we show how to formulate and solve systems of reaction-diffusion equations on surfaces in an extremely simple way, using only the standard Cartesian form of differential operators, and a discrete unorganized point set to represent the surface. Our method decouples surface geometry from the underlying differential operators. As a consequence, it becomes possible to formulate and solve rather general reaction-diffusion equations on general surfaces without having to consider the complexities of differential geometry or sophisticated numerical analysis. To illustrate the generality of the method, computations for surface diffusion, pattern formation, excitable media, and bulk-surface coupling are provided for a variety of complex point cloud surfaces.

  14. Simple computation of reaction–diffusion processes on point clouds

    KAUST Repository

    Macdonald, Colin B.; Merriman, Barry; Ruuth, Steven J.

    2013-01-01

    The study of reaction-diffusion processes is much more complicated on general curved surfaces than on standard Cartesian coordinate spaces. Here we show how to formulate and solve systems of reaction-diffusion equations on surfaces in an extremely simple way, using only the standard Cartesian form of differential operators, and a discrete unorganized point set to represent the surface. Our method decouples surface geometry from the underlying differential operators. As a consequence, it becomes possible to formulate and solve rather general reaction-diffusion equations on general surfaces without having to consider the complexities of differential geometry or sophisticated numerical analysis. To illustrate the generality of the method, computations for surface diffusion, pattern formation, excitable media, and bulk-surface coupling are provided for a variety of complex point cloud surfaces.
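
    The flavour of the approach can be conveyed on the simplest surface, a circle, where the closest point map cp(x) = x/|x| is available analytically: alternate an ordinary Cartesian heat-equation step with a closest-point extension that re-imposes constancy along normals. Plain diffusion, the analytic closest point map and all grid parameters below are illustrative simplifications of the method (the paper handles full reaction-diffusion systems on genuine point-cloud surfaces).

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Heat equation on the unit circle, solved with plain Cartesian operators on a
# 2-D grid plus a closest-point extension (cp(x) = x/|x| for the circle).
h = 0.05
g = np.arange(-2.0, 2.0 + h / 2, h)
X, Y = np.meshgrid(g, g, indexing="ij")
R = np.sqrt(X**2 + Y**2)
R = np.where(R < 1e-9, 1.0, R)        # guard the node at the origin (far from the circle)
CP = np.stack([(X / R).ravel(), (Y / R).ravel()], axis=1)

u = np.cos(np.arctan2(Y, X))          # initial data, already constant along normals

dt = 0.1 * h**2
for _ in range(400):                   # integrate to t = 0.1
    lap = np.zeros_like(u)             # standard 5-point Laplacian in the embedding space
    lap[1:-1, 1:-1] = (u[2:, 1:-1] + u[:-2, 1:-1] + u[1:-1, 2:] + u[1:-1, :-2]
                       - 4.0 * u[1:-1, 1:-1]) / h**2
    u = u + dt * lap
    # closest-point extension: re-impose constancy along the normal directions
    u = RegularGridInterpolator((g, g), u)(CP).reshape(u.shape)

# cos(theta) is an eigenfunction: the value near (1, 0) should be about exp(-0.1)
print(u[np.argmin(np.abs(g - 1.0)), np.argmin(np.abs(g))])
```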

  15. Weak convergence of marked point processes generated by crossings of multivariate jump processes

    DEFF Research Database (Denmark)

    Tamborrino, Massimiliano; Sacerdote, Laura; Jacobsen, Martin

    2014-01-01

    We consider the multivariate point process determined by the crossing times of the components of a multivariate jump process through a multivariate boundary, assuming to reset each component to an initial value after its boundary crossing. We prove that this point process converges weakly ... process converging to a multivariate Ornstein–Uhlenbeck process is discussed as a guideline for applying diffusion limits for jump processes. We apply our theoretical findings to neural network modeling. The proposed model gives a mathematical foundation to the generalization of the class of Leaky ...

  16. Food safety and nutritional quality for the prevention of non communicable diseases: the Nutrient, hazard Analysis and Critical Control Point process (NACCP).

    Science.gov (United States)

    Di Renzo, Laura; Colica, Carmen; Carraro, Alberto; Cenci Goga, Beniamino; Marsella, Luigi Tonino; Botta, Roberto; Colombo, Maria Laura; Gratteri, Santo; Chang, Ting Fa Margherita; Droli, Maurizio; Sarlo, Francesca; De Lorenzo, Antonino

    2015-04-23

    The important role of food and nutrition in public health is being increasingly recognized as crucial for its potential impact on health-related quality of life and the economy, both at the societal and individual levels. The prevalence of non-communicable diseases calls for a reformulation of our view of food. The Hazard Analysis and Critical Control Point (HACCP) system, first implemented in the EU with Directive 93/43/CEE, later replaced by Regulation CE 178/2002 and Regulation CE 852/2004, is the internationally agreed approach for food safety control. Our aim is to develop a new procedure for the assessment of the Nutrient, hazard Analysis and Critical Control Point (NACCP) process, for total quality management (TQM), and to optimize nutritional levels. NACCP was based on four general principles: i) guarantee health maintenance; ii) evaluate and assure the nutritional quality of food and TQM; iii) give correct information to consumers; iv) ensure an ethical profit. There are three stages for the application of the NACCP process: 1) application of NACCP for quality principles; 2) application of NACCP for health principles; 3) implementation of the NACCP process. The actions are: 1) identification of nutritional markers, which must remain intact throughout the food supply chain; 2) identification of critical control points which must be monitored in order to minimize the likelihood of a reduction in quality; 3) establishment of critical limits to maintain adequate levels of nutrients; 4) establishment and implementation of effective monitoring procedures of critical control points; 5) establishment of corrective actions; 6) identification of metabolic biomarkers; 7) evaluation of the effects of food intake, through the application of specific clinical trials; 8) establishment of procedures for consumer information; 9) implementation of the Health Claim Regulation EU 1924/2006; 10) starting a training program. We calculate the risk assessment as follows

  17. Integrated modeling and analysis methodology for precision pointing applications

    Science.gov (United States)

    Gutierrez, Homero L.

    2002-07-01

    Space-based optical systems that perform tasks such as laser communications, Earth imaging, and astronomical observations require precise line-of-sight (LOS) pointing. A general approach is described for integrated modeling and analysis of these types of systems within the MATLAB/Simulink environment. The approach can be applied during all stages of program development, from early conceptual design studies to hardware implementation phases. The main objective is to predict the dynamic pointing performance subject to anticipated disturbances and noise sources. Secondary objectives include assessing the control stability, levying subsystem requirements, supporting pointing error budgets, and performing trade studies. The integrated model resides in Simulink, and several MATLAB graphical user interfaces (GUIs) allow the user to configure the model, select analysis options, run analyses, and process the results. A convenient parameter naming and storage scheme, as well as model conditioning and reduction tools and run-time enhancements, are incorporated into the framework. This enables the proposed architecture to accommodate models of realistic complexity.

  18. A tutorial on Palm distributions for spatial point processes

    DEFF Research Database (Denmark)

    Coeurjolly, Jean-Francois; Møller, Jesper; Waagepetersen, Rasmus Plenge

    2017-01-01

    This tutorial provides an introduction to Palm distributions for spatial point processes. Initially, in the context of finite point processes, we give an explicit definition of Palm distributions in terms of their density functions. Then we review Palm distributions in the general case. Finally, we...

  19. Efficient point cloud data processing in shipbuilding: Reformative component extraction method and registration method

    Directory of Open Access Journals (Sweden)

    Jingyu Sun

    2014-07-01

    To survive in the current shipbuilding industry, it is of vital importance for shipyards to have the accuracy of ship components evaluated efficiently during most of the manufacturing steps. Evaluating component accuracy by comparing each component's point cloud data, scanned by laser scanners, with the ship's design data formatted in CAD cannot be done efficiently when (1) components extracted from the point cloud data include irregular obstacles endogenously, or when (2) the registration of the two data sets has no clear direction setting. This paper presents reformative point cloud data processing methods to solve these problems. K-d tree construction on the point cloud data speeds up neighbor searching for each point. A region growing method performed on the neighbor points of a seed point extracts the continuous part of the component, while curved surface fitting and B-spline curve fitting at the edge of the continuous part recognize neighboring domains of the same component divided by obstacles' shadows. The ICP (Iterative Closest Point) algorithm conducts a registration of the two sets of data after the proper registration direction is decided by principal component analysis. In experiments conducted at the shipyard, 200 curved shell plates were extracted from the scanned point cloud data, and registrations were conducted between them and the designed CAD data using the proposed methods for an accuracy evaluation. Results show that the proposed methods support accuracy-evaluation-targeted point cloud data processing efficiently in practice.
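
    A minimal version of the ICP loop named in the abstract (k-d tree correspondences plus an SVD/Kabsch rigid step) might look as follows; the synthetic "CAD" points and the applied transform are assumptions, and the paper's PCA-based direction setting and extraction steps are omitted.

    ```python
    # Minimal rigid ICP sketch: k-d tree nearest neighbours + Kabsch step.
    # Sizes and the synthetic transform are assumptions for the demo.
    import numpy as np
    from scipy.spatial import cKDTree

    def best_rigid(src, dst):
        """Least-squares rotation R and translation t mapping src onto dst."""
        cs, cd = src.mean(axis=0), dst.mean(axis=0)
        H = (src - cs).T @ (dst - cd)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T
        return R, cd - R @ cs

    def icp(src, dst, iters=30):
        tree = cKDTree(dst)
        cur = src.copy()
        for _ in range(iters):
            _, nn = tree.query(cur)          # closest point correspondences
            R, t = best_rigid(cur, dst[nn])
            cur = cur @ R.T + t
        return cur

    rng = np.random.default_rng(2)
    design = rng.uniform(size=(500, 3))      # stand-in "CAD" points
    a = 0.2                                  # rotation angle (assumed)
    Rz = np.array([[np.cos(a), -np.sin(a), 0],
                   [np.sin(a),  np.cos(a), 0],
                   [0, 0, 1]])
    scan = design @ Rz.T + np.array([0.1, -0.05, 0.02])   # simulated scan
    aligned = icp(scan, design)
    print("RMS after ICP:", np.sqrt(((aligned - design)**2).sum(axis=1)).mean())
    ```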

  20. Determinantal point process models on the sphere

    DEFF Research Database (Denmark)

    Møller, Jesper; Nielsen, Morten; Porcu, Emilio

    We consider determinantal point processes on the d-dimensional unit sphere Sd. These are finite point processes exhibiting repulsiveness and with moment properties determined by a certain determinant whose entries are specified by a so-called kernel which we assume is a complex covariance function defined on Sd × Sd. We review the appealing properties of such processes, including their specific moment properties, density expressions and simulation procedures. Particularly, we characterize and construct isotropic DPP models on Sd, where it becomes essential to specify the eigenvalues and eigenfunctions in a spectral representation for the kernel, and we figure out how repulsive isotropic DPPs can be. Moreover, we discuss the shortcomings of adapting existing models for isotropic covariance functions and consider strategies for developing new models, including a useful spectral approach.

  1. Linear and quadratic models of point process systems: contributions of patterned input to output.

    Science.gov (United States)

    Lindsay, K A; Rosenberg, J R

    2012-08-01

    In the 1880's Volterra characterised a nonlinear system using a functional series connecting continuous input and continuous output. Norbert Wiener, in the 1940's, circumvented problems associated with the application of Volterra series to physical problems by deriving from it a new series of terms that are mutually uncorrelated with respect to Gaussian processes. Subsequently, Brillinger, in the 1970's, introduced a point-process analogue of Volterra's series connecting point-process inputs to the instantaneous rate of point-process output. We derive here a new series from this analogue in which its terms are mutually uncorrelated with respect to Poisson processes. This new series expresses how patterned input in a spike train, represented by third-order cross-cumulants, is converted into the instantaneous rate of an output point-process. Given experimental records of suitable duration, the contribution of arbitrary patterned input to an output process can, in principle, be determined. Solutions for linear and quadratic point-process models with one and two inputs and a single output are investigated. Our theoretical results are applied to isolated muscle spindle data in which the spike trains from the primary and secondary endings from the same muscle spindle are recorded in response to stimulation of one and then two static fusimotor axons in the absence and presence of a random length change imposed on the parent muscle. For a fixed mean rate of input spikes, the analysis of the experimental data makes explicit which patterns of two input spikes contribute to an output spike. Copyright © 2012 Elsevier Ltd. All rights reserved.

  2. Non-parametric Bayesian inference for inhomogeneous Markov point processes

    DEFF Research Database (Denmark)

    Berthelsen, Kasper Klitgaard; Møller, Jesper; Johansen, Per Michael

    is a shot noise process, and the interaction function for a pair of points depends only on the distance between the two points and is a piecewise linear function modelled by a marked Poisson process. Simulation of the resulting posterior using a Metropolis-Hastings algorithm in the "conventional" way...

  3. Statistical aspects of determinantal point processes

    DEFF Research Database (Denmark)

    Lavancier, Frédéric; Møller, Jesper; Rubak, Ege

    The statistical aspects of determinantal point processes (DPPs) seem largely unexplored. We review the appealing properties of DPPs, demonstrate that they are useful models for repulsiveness, detail a simulation procedure, and provide freely available software for simulation and statistical infer...

  4. Point processes and the position distribution of infinite boson systems

    International Nuclear Information System (INIS)

    Fichtner, K.H.; Freudenberg, W.

    1987-01-01

    It is shown that to each locally normal state of a boson system one can associate a point process that can be interpreted as the position distribution of the state. The point process contains all information one can get by position measurements and is determined by the latter. On the other hand, to each so-called Σ^c-point process Q they relate a locally normal state with position distribution Q

  5. PROCESSING UAV AND LIDAR POINT CLOUDS IN GRASS GIS

    Directory of Open Access Journals (Sweden)

    V. Petras

    2016-06-01

    Today's methods of acquiring Earth surface data, namely lidar and unmanned aerial vehicle (UAV) imagery, non-selectively collect or generate large amounts of points. Point clouds from different sources vary in their properties such as number of returns, density, or quality. We present a set of tools with applications for different types of point clouds obtained by a lidar scanner, the structure from motion technique (SfM), and a low-cost 3D scanner. To take advantage of the vertical structure of multiple-return lidar point clouds, we demonstrate tools to process them using 3D raster techniques which allow, for example, the development of custom vegetation classification methods. Dense point clouds obtained from UAV imagery, often containing redundant points, can be decimated using various techniques before further processing. We implemented and compared several decimation techniques in regard to their performance and the final digital surface model (DSM). Finally, we describe the processing of a point cloud from a low-cost 3D scanner, namely Microsoft Kinect, and its application for interaction with physical models. All the presented tools are open source and integrated in GRASS GIS, a multi-purpose open source GIS with remote sensing capabilities. The tools integrate with other open source projects, specifically the Point Data Abstraction Library (PDAL), the Point Cloud Library (PCL), and the OpenKinect libfreenect2 library, to benefit from the open source point cloud ecosystem. The implementation in GRASS GIS ensures long-term maintenance and reproducibility by the scientific community but also by the original authors themselves.
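
    One simple decimation technique of the kind compared in this record is grid (bin) decimation. The sketch below keeps the highest point per cell, akin to a pseudo-grid DSM; the cell size and the synthetic cloud are assumptions, and this is only the idea, not the GRASS GIS implementation.

    ```python
    # Grid decimation sketch: keep the highest point in each (x, y) cell.
    import numpy as np

    def grid_decimate(points, cell=1.0):
        """Keep the highest point in each (x, y) cell of size `cell`."""
        keys = np.floor(points[:, :2] / cell).astype(np.int64)
        # sort by cell, then by z, so the last point per cell is the highest
        order = np.lexsort((points[:, 2], keys[:, 1], keys[:, 0]))
        keys, pts = keys[order], points[order]
        last = np.r_[keys[1:] != keys[:-1], [[True, True]]].any(axis=1)
        return pts[last]

    rng = np.random.default_rng(3)
    cloud = rng.uniform(0, 100, size=(100_000, 3))   # synthetic dense cloud
    thin = grid_decimate(cloud, cell=2.0)
    print(len(cloud), "->", len(thin), "points")
    ```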

  6. Modern Statistics for Spatial Point Processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Waagepetersen, Rasmus

    2007-01-01

    We summarize and discuss the current state of spatial point process theory and directions for future research, making an analogy with generalized linear models and random effect models, and illustrating the theory with various examples of applications. In particular, we consider Poisson, Gibbs...

  7. Modern statistics for spatial point processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Waagepetersen, Rasmus

    We summarize and discuss the current state of spatial point process theory and directions for future research, making an analogy with generalized linear models and random effect models, and illustrating the theory with various examples of applications. In particular, we consider Poisson, Gibbs...

  8. Assessment of Peer Mediation Process from Conflicting Students’ Point of Views

    Directory of Open Access Journals (Sweden)

    Fulya TÜRK

    2016-12-01

    The purpose of this study was to analyze the peer mediation process applied in a high school from the conflicting students' points of view. The research was carried out in a high school in Denizli. After ten sessions of training in peer mediation, peer mediators mediated their peers' real conflicts. In the research, 41 students (28 girls, 13 boys) who had received help at least once were interviewed as parties to a conflict. Through semi-structured interviews with the conflicting students, the mediation process was evaluated from the students' points of view. Eight questions were asked of the conflicting parties. The verbal data obtained from the interviews were analyzed using content analysis. The analysis of the conflicting students' opinions and experiences of peer mediation shows that they were satisfied with the process, that they resolved their conflicts in a constructive and peaceful way, and that their friendships continued as before. All of these results also indicate that peer mediation is an effective method of resolving student conflicts constructively

  9. Quality control for electron beam processing of polymeric materials by end-point analysis

    International Nuclear Information System (INIS)

    DeGraff, E.; McLaughlin, W.L.

    1981-01-01

    Properties of certain plastics, e.g. polytetrafluoroethylene, polyethylene, ethylene vinyl acetate copolymer, can be modified selectively by ionizing radiation. One of the advantages of this treatment over chemical methods is better control of the process and the end-product properties. The most convenient method of dosimetry for monitoring quality control is post-irradiation evaluation of the plastic itself, e.g., melt index and melt point determination. It is shown that by proper calibration in terms of total dose and sufficiently reproducible radiation effects, such product test methods provide convenient and meaningful analyses. Other appropriate standardized analytical methods include stress-crack resistance, stress-strain-to-fracture testing and solubility determination. Standard routine dosimetry over the dose and dose rate ranges of interest confirm that measured product end points can be correlated with calibrated values of absorbed dose in the product within uncertainty limits of the measurements. (author)

  10. Statistical properties of several models of fractional random point processes

    Science.gov (United States)

    Bendjaballah, C.

    2011-08-01

    Statistical properties of several models of fractional random point processes have been analyzed from the counting and time interval statistics points of view. Based on the criterion of the reduced variance, it is seen that such processes exhibit nonclassical properties. The conditions for these processes to be treated as conditional Poisson processes are examined. Numerical simulations illustrate part of the theoretical calculations.

  11. A Marked Point Process Framework for Extracellular Electrical Potentials

    Directory of Open Access Journals (Sweden)

    Carlos A. Loza

    2017-12-01

    Neuromodulations are an important component of extracellular electrical potentials (EEP), such as the Electroencephalogram (EEG), Electrocorticogram (ECoG) and Local Field Potentials (LFP). This spatially and temporally organized multi-frequency transient (phasic) activity reflects the multiscale spatiotemporal synchronization of neuronal populations in response to external stimuli or internal physiological processes. We propose a novel generative statistical model of a single EEP channel, where the collected signal is regarded as the noisy addition of reoccurring, multi-frequency phasic events over time. One of the main advantages of the proposed framework is the exceptional temporal resolution in the time location of the EEP phasic events, e.g., up to the sampling period utilized in the data collection. Therefore, this allows for the first time a description of neuromodulation in EEPs as a Marked Point Process (MPP), represented by amplitude, center frequency, duration, and time of occurrence. The generative model for the multi-frequency phasic events exploits sparseness and involves a shift-invariant implementation of the clustering technique known as k-means. The cost function incorporates a robust estimation component based on correntropy to mitigate the outliers caused by the inherent noise in the EEP. Lastly, the background EEP activity is explicitly modeled as the non-sparse component of the collected signal to further improve the delineation of the multi-frequency phasic events in time. The framework is validated using two publicly available datasets: the DREAMS sleep spindles database and one of the Brain-Computer Interface (BCI) competition datasets. The results achieve benchmark performance and provide novel quantitative descriptions based on power, event rates and timing in order to assess behavioral correlates beyond the classical power spectrum-based analysis. This opens the possibility for a unifying point process framework of

  12. Extreme values, regular variation and point processes

    CERN Document Server

    Resnick, Sidney I

    1987-01-01

    Extreme Values, Regular Variation and Point Processes is a readable and efficient account of the fundamental mathematical and stochastic process techniques needed to study the behavior of extreme values of phenomena based on independent and identically distributed random variables and vectors. It presents a coherent treatment of the distributional and sample path fundamental properties of extremes and records. It emphasizes the core primacy of three topics necessary for understanding extremes: the analytical theory of regularly varying functions; the probabilistic theory of point processes and random measures; and the link to asymptotic distribution approximations provided by the theory of weak convergence of probability measures in metric spaces. The book is self-contained and requires an introductory measure-theoretic course in probability as a prerequisite. Almost all sections have an extensive list of exercises which extend developments in the text, offer alternate approaches, test mastery and provide for enj...

  13. On statistical analysis of compound point process

    Czech Academy of Sciences Publication Activity Database

    Volf, Petr

    2006-01-01

    Roč. 35, 2-3 (2006), s. 389-396 ISSN 1026-597X R&D Projects: GA ČR(CZ) GA402/04/1294 Institutional research plan: CEZ:AV0Z10750506 Keywords: counting process * compound process * hazard function * Cox model Subject RIV: BB - Applied Statistics, Operational Research

  14. A case study on point process modelling in disease mapping

    DEFF Research Database (Denmark)

    Møller, Jesper; Waagepetersen, Rasmus Plenge; Benes, Viktor

    2005-01-01

    of the risk on the covariates. Instead of using the common areal level approaches we base the analysis on a Bayesian approach for a log Gaussian Cox point process with covariates. Posterior characteristics for a discretized version of the log Gaussian Cox process are computed using Markov chain Monte Carlo...... methods. A particular problem which is thoroughly discussed is to determine a model for the background population density. The risk map shows a clear dependency with the population intensity models and the basic model which is adopted for the population intensity determines what covariates influence...... the risk of TBE. Model validation is based on the posterior predictive distribution of various summary statistics....

  15. Analysis of the stochastic channel model by Saleh & Valenzuela via the theory of point processes

    DEFF Research Database (Denmark)

    Jakobsen, Morten Lomholt; Pedersen, Troels; Fleury, Bernard Henri

    2012-01-01

    and underlying features, like the intensity function of the component delays and the delaypower intensity. The flexibility and clarity of the mathematical instruments utilized to obtain these results lead us to conjecture that the theory of spatial point processes provides a unifying mathematical framework...

  16. Critical point analysis of phase envelope diagram

    International Nuclear Information System (INIS)

    Soetikno, Darmadi; Siagian, Ucok W. R.; Kusdiantara, Rudy; Puspita, Dila; Sidarto, Kuntjoro A.; Soewono, Edy; Gunawan, Agus Y.

    2014-01-01

    Phase diagram or phase envelope is a relation between temperature and pressure that shows the condition of equilibria between the different phases of chemical compounds, mixtures of compounds, and solutions. The phase diagram is an important issue in chemical thermodynamics and hydrocarbon reservoir engineering. It is very useful for process simulation, hydrocarbon reactor design, and petroleum engineering studies. It is constructed from the bubble line, the dew line, and the critical point. The bubble line and dew line are composed of bubble points and dew points, respectively. A bubble point is the first point at which gas is formed when a liquid is heated, while a dew point is the first point where liquid is formed when a gas is cooled. The critical point is the point where all of the properties of the gas and liquid phases become equal, such as temperature, pressure, amount of substance, and others. The critical point is very useful in fuel processing and the dissolution of certain chemicals. Here, we derive the critical point analytically and compare it with numerical calculations of the Peng-Robinson equation using the Newton-Raphson method. As case studies, several hydrocarbon mixtures are simulated using Matlab.
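
    The numerical side of the abstract is a multivariate Newton-Raphson iteration. A generic sketch with a forward-difference Jacobian, using a toy 2x2 system standing in for the actual Peng-Robinson residual equations (the toy system, step size and tolerances are assumptions), might look like this:

    ```python
    # Generic multivariate Newton-Raphson with a numerical Jacobian.
    # The toy residual `f` stands in for the Peng-Robinson equations.
    import numpy as np

    def newton(f, x0, tol=1e-10, max_iter=50, h=1e-7):
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            fx = f(x)
            if np.linalg.norm(fx) < tol:
                break
            J = np.empty((len(fx), len(x)))      # forward-difference Jacobian
            for j in range(len(x)):
                xh = x.copy()
                xh[j] += h
                J[:, j] = (f(xh) - fx) / h
            x = x - np.linalg.solve(J, fx)       # Newton step
        return x

    # toy system with root (T, P) = (2, 3)
    f = lambda v: np.array([v[0]**2 + v[1] - 7.0, v[0] + v[1]**2 - 11.0])
    print(newton(f, [1.0, 1.0]))   # -> approximately [2., 3.]
    ```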

  17. Critical point analysis of phase envelope diagram

    Energy Technology Data Exchange (ETDEWEB)

    Soetikno, Darmadi; Siagian, Ucok W. R. [Department of Petroleum Engineering, Institut Teknologi Bandung, Jl. Ganesha 10, Bandung 40132 (Indonesia); Kusdiantara, Rudy, E-mail: rkusdiantara@s.itb.ac.id; Puspita, Dila, E-mail: rkusdiantara@s.itb.ac.id; Sidarto, Kuntjoro A., E-mail: rkusdiantara@s.itb.ac.id; Soewono, Edy; Gunawan, Agus Y. [Department of Mathematics, Institut Teknologi Bandung, Jl. Ganesha 10, Bandung 40132 (Indonesia)

    2014-03-24

    Phase diagram or phase envelope is a relation between temperature and pressure that shows the condition of equilibria between the different phases of chemical compounds, mixtures of compounds, and solutions. The phase diagram is an important issue in chemical thermodynamics and hydrocarbon reservoir engineering. It is very useful for process simulation, hydrocarbon reactor design, and petroleum engineering studies. It is constructed from the bubble line, the dew line, and the critical point. The bubble line and dew line are composed of bubble points and dew points, respectively. A bubble point is the first point at which gas is formed when a liquid is heated, while a dew point is the first point where liquid is formed when a gas is cooled. The critical point is the point where all of the properties of the gas and liquid phases become equal, such as temperature, pressure, amount of substance, and others. The critical point is very useful in fuel processing and the dissolution of certain chemicals. Here, we derive the critical point analytically and compare it with numerical calculations of the Peng-Robinson equation using the Newton-Raphson method. As case studies, several hydrocarbon mixtures are simulated using Matlab.

  18. Point process modeling and estimation: Advances in the analysis of dynamic neural spiking data

    Science.gov (United States)

    Deng, Xinyi

    2016-08-01

    A common interest of scientists in many fields is to understand the relationship between the dynamics of a physical system and the occurrences of discrete events within such physical system. Seismologists study the connection between mechanical vibrations of the Earth and the occurrences of earthquakes so that future earthquakes can be better predicted. Astrophysicists study the association between the oscillating energy of celestial regions and the emission of photons to learn the Universe's various objects and their interactions. Neuroscientists study the link between behavior and the millisecond-timescale spike patterns of neurons to understand higher brain functions. Such relationships can often be formulated within the framework of state-space models with point process observations. The basic idea is that the dynamics of the physical systems are driven by the dynamics of some stochastic state variables and the discrete events we observe in an interval are noisy observations with distributions determined by the state variables. This thesis proposes several new methodological developments that advance the framework of state-space models with point process observations at the intersection of statistics and neuroscience. In particular, we develop new methods 1) to characterize the rhythmic spiking activity using history-dependent structure, 2) to model population spike activity using marked point process models, 3) to allow for real-time decision making, and 4) to take into account the need for dimensionality reduction for high-dimensional state and observation processes. We applied these methods to a novel problem of tracking rhythmic dynamics in the spiking of neurons in the subthalamic nucleus of Parkinson's patients with the goal of optimizing placement of deep brain stimulation electrodes. We developed a decoding algorithm that can make decision in real-time (for example, to stimulate the neurons or not) based on various sources of information present in
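
    One standard building block from this literature is the point-process analogue of the Kalman filter from the adaptive filtering literature; the sketch below is in that spirit, not necessarily the thesis' exact algorithm. The random-walk state, log-link rate and all parameter values are assumptions for the demo.

    ```python
    # Point-process filter sketch: random-walk state x_k drives a spike
    # rate lambda_k = exp(mu + x_k); the filter updates on spike counts.
    import numpy as np

    rng = np.random.default_rng(4)
    T, dt = 2000, 0.001              # bins and bin width (assumed)
    sig2 = 1e-4                      # state noise variance (assumed)
    mu = np.log(20.0)                # baseline log-rate, ~20 Hz (assumed)

    # simulate a latent random walk and Bernoulli-thinned spikes
    x = np.cumsum(rng.normal(0, np.sqrt(sig2), T))
    spikes = rng.random(T) < np.exp(mu + x) * dt

    xf, W = 0.0, 1e-3                # filter mean and variance
    est = np.empty(T)
    for k in range(T):
        Wp = W + sig2                        # predict variance
        lam = np.exp(mu + xf) * dt           # conditional intensity in the bin
        W = 1.0 / (1.0 / Wp + lam)           # update variance (log-link case)
        xf = xf + W * (spikes[k] - lam)      # update mean with spike innovation
        est[k] = xf

    print("correlation(filtered, true):", np.corrcoef(est, x)[0, 1].round(3))
    ```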

  19. A MARKED POINT PROCESS MODEL FOR VEHICLE DETECTION IN AERIAL LIDAR POINT CLOUDS

    Directory of Open Access Journals (Sweden)

    A. Börcs

    2012-07-01

    In this paper we present an automated method for vehicle detection in LiDAR point clouds of crowded urban areas collected from an aerial platform. We assume that the input cloud is unordered, but that it contains additional intensity and return number information, which are jointly exploited by the proposed solution. Firstly, the 3-D point set is segmented into ground, vehicle, building roof, vegetation and clutter classes. Then the points with the corresponding class labels and intensity values are projected to the ground plane, where the optimal vehicle configuration is described by a Marked Point Process (MPP) model of 2-D rectangles. Finally, the Multiple Birth and Death algorithm is utilized to find the configuration with the highest confidence.

  20. The application of hazard analysis and critical control points and risk management in the preparation of anti-cancer drugs.

    Science.gov (United States)

    Bonan, Brigitte; Martelli, Nicolas; Berhoune, Malik; Maestroni, Marie-Laure; Havard, Laurent; Prognon, Patrice

    2009-02-01

    To apply the Hazard analysis and Critical Control Points method to the preparation of anti-cancer drugs. To identify critical control points in our cancer chemotherapy process and to propose control measures and corrective actions to manage these processes. The Hazard Analysis and Critical Control Points application began in January 2004 in our centralized chemotherapy compounding unit. From October 2004 to August 2005, monitoring of the process nonconformities was performed to assess the method. According to the Hazard Analysis and Critical Control Points method, a multidisciplinary team was formed to describe and assess the cancer chemotherapy process. This team listed all of the critical points and calculated their risk indexes according to their frequency of occurrence, their severity and their detectability. The team defined monitoring, control measures and corrective actions for each identified risk. Finally, over a 10-month period, pharmacists reported each non-conformity of the process in a follow-up document. Our team described 11 steps in the cancer chemotherapy process. The team identified 39 critical control points, including 11 of higher importance with a high-risk index. Over 10 months, 16,647 preparations were performed; 1225 nonconformities were reported during this same period. The Hazard Analysis and Critical Control Points method is relevant when it is used to target a specific process such as the preparation of anti-cancer drugs. This method helped us to focus on the production steps, which can have a critical influence on product quality, and led us to improve our process.

  1. STRUCTURE LINE DETECTION FROM LIDAR POINT CLOUDS USING TOPOLOGICAL ELEVATION ANALYSIS

    Directory of Open Access Journals (Sweden)

    C. Y. Lo

    2012-07-01

    Airborne LIDAR point clouds, which have considerable numbers of points on object surfaces, are essential to building modeling. In the last two decades, studies have developed different ways to identify structure lines, following two main approaches: data-driven and model-driven. These studies have shown that automatic modeling processes depend on certain considerations, such as the thresholds used, initial values, designed formulas, and predefined cues. Following the development of laser scanning systems, scanning rates have increased and can provide point clouds with higher point density. Therefore, this study proposes using topological elevation analysis (TEA) to detect structure lines instead of threshold-dependent concepts and predefined constraints. This analysis contains two parts: data pre-processing and structure line detection. To preserve the original elevation information, a pseudo-grid for generating digital surface models is produced during the first part. The highest point in each grid cell is set as the elevation value, and its original three-dimensional position is preserved. In the second part, using TEA, the structure lines are identified based on the topology of local elevation changes in two directions. Because structure lines have certain geometric properties, their locations show small reliefs in the radial direction and steep elevation changes in the circular direction. Following the proposed approach, TEA can be used to determine 3D line information without selecting thresholds. For validation, the TEA results are compared with those of the region growing approach. The results indicate that the proposed method can produce structure lines using dense point clouds.

  2. Pointo - a Low Cost Solution to Point Cloud Processing

    Science.gov (United States)

    Houshiar, H.; Winkler, S.

    2017-11-01

    With advances in technology, access to data, especially 3D point cloud data, becomes more and more an everyday task. 3D point clouds are usually captured with very expensive tools such as 3D laser scanners or with very time-consuming methods such as photogrammetry. Most of the available software for 3D point cloud processing is designed for experts and specialists in this field and usually comes in large packages containing a variety of methods and tools. This results in software that is expensive to acquire and difficult to use, the difficulty stemming from the complicated user interfaces required to accommodate long feature lists. The aim of these complex packages is to provide a powerful tool for a specific group of specialists, yet they are not required by the majority of the upcoming average users of point clouds. In addition to their complexity and high cost, they generally rely on expensive, modern hardware and are often compatible with only one specific operating system. Many point cloud customers are not point cloud processing experts and are not willing to pay the high acquisition costs of such software and hardware. In this paper we introduce a solution for low-cost point cloud processing. Our approach is designed to accommodate the needs of the average point cloud user. To reduce cost and complexity, our approach focuses on one functionality at a time, in contrast with most available software and tools, which aim to solve as many problems as possible at the same time. This simple, user-oriented design improves the user experience and enables us to optimize our methods for the creation of efficient software. In this paper we introduce the Pointo family as a series of connected programs providing easy-to-use tools with a simple design for different point cloud processing requirements. PointoVIEWER and PointoCAD are introduced as the first components of the Pointo family to provide a

  3. ALTERNATIVE METHODOLOGIES FOR THE ESTIMATION OF LOCAL POINT DENSITY INDEX: MOVING TOWARDS ADAPTIVE LIDAR DATA PROCESSING

    Directory of Open Access Journals (Sweden)

    Z. Lari

    2012-07-01

    Over the past few years, LiDAR systems have been established as a leading technology for the acquisition of high-density point clouds over physical surfaces. These point clouds are then processed for the extraction of geo-spatial information. Local point density is one of the most important properties of the point cloud and highly affects the performance of data processing techniques and the quality of the information extracted from these data. Therefore, it is necessary to define a standard methodology for the estimation of local point density indices to be considered for the precise processing of LiDAR data. Current definitions of local point density indices, which only consider the 2D neighbourhood of individual points, are not appropriate for 3D LiDAR data and cannot be applied to laser scans from different platforms. In order to resolve the drawbacks of these methods, this paper proposes several approaches for the estimation of the local point density index which take into account the 3D relationship among the points and the physical properties of the surfaces they belong to. In the simplest approach, an approximate value of the local point density for each point is defined while considering the 3D relationship among the points. In the other approaches, the local point density is estimated by considering the 3D neighbourhood of the point in question and the physical properties of the surface which encloses this point. The physical properties of the surfaces enclosing the LiDAR points are assessed through eigenvalue analysis of the 3D neighbourhood of individual points and adaptive cylinder methods. This paper discusses these approaches and highlights their impact on various LiDAR data processing activities (i.e., neighbourhood definition, region growing, segmentation, boundary detection, and classification). Experimental results from airborne and terrestrial LiDAR data verify the efficacy of considering local point density variation for
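
    The "simplest approach" mentioned in the abstract assigns each point an approximate local density from its 3D neighbourhood. One common variant, assumed here purely for illustration, divides k by the volume of the sphere reaching the k-th nearest neighbour:

    ```python
    # k-NN local point density sketch: points per unit volume of the
    # sphere that reaches the k-th true neighbour of each point.
    import numpy as np
    from scipy.spatial import cKDTree

    def local_density(points, k=10):
        tree = cKDTree(points)
        dist, _ = tree.query(points, k=k + 1)    # k+1: includes the point itself
        r = dist[:, -1]                          # radius to k-th true neighbour
        return k / ((4.0 / 3.0) * np.pi * r**3)

    rng = np.random.default_rng(5)
    cloud = rng.uniform(0, 10, size=(5000, 3))   # expected density: 5 pts/unit^3
    dens = local_density(cloud)
    print(f"density: mean={dens.mean():.2f}, min={dens.min():.2f}, max={dens.max():.2f}")
    ```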

  4. A scalable and multi-purpose point cloud server (PCS) for easier and faster point cloud data management and processing

    Science.gov (United States)

    Cura, Rémi; Perret, Julien; Paparoditis, Nicolas

    2017-05-01

    In addition to more traditional geographical data such as images (rasters) and vectors, point cloud data are becoming increasingly available. Such data are appreciated for their precision and true three-dimensional (3D) nature. However, managing point clouds can be difficult due to scaling problems and the specificities of this data type. Several methods exist but are usually fairly specialised and solve only one aspect of the management problem. In this work, we propose a comprehensive and efficient point cloud management system based on a database server that works on groups of points (patches) rather than individual points. This system is specifically designed to cover the basic needs of point cloud users: fast loading, compressed storage, powerful patch and point filtering, easy data access and exporting, and integrated processing. Moreover, the proposed system fully integrates metadata (like sensor position) and can conjointly use point clouds with other geospatial data, such as images, vectors, topology and other point clouds. Point cloud (parallel) processing can be done in-base with fast prototyping capabilities. Lastly, the system is built on open source technologies; therefore it can be easily extended and customised. We test the proposed system with several billion points obtained from Lidar (aerial and terrestrial) and stereo-vision. We demonstrate loading speeds in the ~50 million pts/h per process range, transparent-for-user compression with ratios greater than 2:1 to 4:1, patch filtering in the 0.1 to 1 s range, and output in the 0.1 million pts/s per process range, along with classical processing methods such as object detection.

  5. Poisson point processes imaging, tracking, and sensing

    CERN Document Server

    Streit, Roy L

    2010-01-01

    This overview of non-homogeneous and multidimensional Poisson point processes and their applications features mathematical tools and applications from emission- and transmission-computed tomography to multiple target tracking and distributed sensor detection.

  6. Development and evaluation of spatial point process models for epidermal nerve fibers.

    Science.gov (United States)

    Olsbo, Viktor; Myllymäki, Mari; Waller, Lance A; Särkkä, Aila

    2013-06-01

    We propose two spatial point process models for the spatial structure of epidermal nerve fibers (ENFs) across human skin. The models derive from two point processes, Φb and Φe, describing the locations of the base and end points of the fibers. Each point of Φe (the end point process) is connected to a unique point in Φb (the base point process). In the first model, both Φe and Φb are Poisson processes, yielding a null model of uniform coverage of the skin by end points and general baseline results and reference values for moments of key physiologic indicators. The second model provides a mechanistic model to generate end points for each base, and we model the branching structure more directly by defining Φe as a cluster process conditioned on the realization of Φb as its parent points. In both cases, we derive distributional properties for observable quantities of direct interest to neurologists such as the number of fibers per base, and the direction and range of fibers on the skin. We contrast both models by fitting them to data from skin blister biopsy images of ENFs and provide inference regarding physiological properties of ENFs. Copyright © 2013 Elsevier Inc. All rights reserved.
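
    A toy simulation in the spirit of the second (cluster) model, with base points from a Poisson process and a Poisson number of end points scattered around each base, could look as follows; all rates and the Gaussian scatter are assumptions rather than the fitted ENF values.

    ```python
    # Cluster-model sketch: Poisson base points (Phi_b), each spawning a
    # Poisson number of end points (Phi_e) scattered around it.
    import numpy as np

    rng = np.random.default_rng(6)
    side = 100.0                   # square observation window (assumed)
    base_rate = 0.01               # bases per unit area (assumed)
    mean_ends = 2.0                # mean end points per base (assumed)
    scatter = 3.0                  # end-point displacement sd (assumed)

    n_base = rng.poisson(base_rate * side**2)
    bases = rng.uniform(0, side, size=(n_base, 2))     # Phi_b
    n_ends = rng.poisson(mean_ends, size=n_base)       # fibres per base
    ends = np.vstack([
        b + rng.normal(0, scatter, size=(m, 2))        # Phi_e clustered on b
        for b, m in zip(bases, n_ends) if m > 0
    ])
    print(n_base, "bases,", len(ends), "end points,",
          f"mean fibres/base = {n_ends.mean():.2f}")
    ```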

  7. Distinguishing different types of inhomogeneity in Neyman-Scott point processes

    Czech Academy of Sciences Publication Activity Database

    Mrkvička, Tomáš

    2014-01-01

    Roč. 16, č. 2 (2014), s. 385-395 ISSN 1387-5841 Institutional support: RVO:60077344 Keywords: clustering * growing clusters * inhomogeneous cluster centers * inhomogeneous point process * location dependent scaling * Neyman-Scott point process Subject RIV: BA - General Mathematics Impact factor: 0.913, year: 2014

  8. Marked point process for modelling seismic activity (case study in Sumatra and Java)

    Science.gov (United States)

    Pratiwi, Hasih; Sulistya Rini, Lia; Wayan Mangku, I.

    2018-05-01

    Earthquake is a natural phenomenon that is random and irregular in space and time. Until now, forecasting earthquake occurrence at a location has remained difficult, so the development of earthquake forecast methodology is still carried out both from the seismological aspect and the stochastic aspect. To explain such random natural phenomena in both space and time, a point process approach can be used. There are two types of point processes: temporal point processes and spatial point processes. The temporal point process relates to events observed over time as a sequence of times, whereas the spatial point process describes the location of objects in two- or three-dimensional space. The points of a point process can be labelled with additional information called marks. A marked point process can be considered as a pair (x, m) where x is the location of a point and m is the mark attached to the point at that location. This study aims to model a marked point process indexed by time on earthquake data from Sumatra Island and Java Island. This model can be used to analyse seismic activity through its intensity function by considering the history of the process up to time t. Based on data obtained from the U.S. Geological Survey from 1973 to 2017 with magnitude threshold 5, we obtained maximum likelihood estimates for the parameters of the intensity function. The estimation of model parameters shows that the seismic activity in Sumatra Island is greater than in Java Island.
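
    For a much simpler marked temporal point process than the paper's history-dependent model, the maximum likelihood step has closed form; the sketch below fits a homogeneous Poisson rate and an exponential magnitude law to synthetic data. The rate, threshold and decay values are assumptions; only the 1973-2017 span comes from the abstract.

    ```python
    # Closed-form MLEs for a simple marked temporal point process:
    # homogeneous Poisson event times plus exponential magnitude excesses.
    import numpy as np

    rng = np.random.default_rng(7)
    years = 44.0                                # 1973-2017 span
    true_rate, m0, true_beta = 12.0, 5.0, 2.0   # assumed parameters

    n = rng.poisson(true_rate * years)
    mags = m0 + rng.exponential(1.0 / true_beta, size=n)

    rate_hat = n / years                        # MLE of the Poisson rate
    beta_hat = 1.0 / (mags - m0).mean()         # MLE of the magnitude decay
    print(f"events/year: {rate_hat:.2f}  (true {true_rate})")
    print(f"beta: {beta_hat:.2f}  (true {true_beta})")
    ```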

  9. Point Cloud Analysis for Conservation and Enhancement of Modernist Architecture

    Science.gov (United States)

    Balzani, M.; Maietti, F.; Mugayar Kühl, B.

    2017-02-01

    Documentation of cultural assets through improved acquisition processes for advanced 3D modelling is one of the main challenges to be faced in order to address, through digital representation, advanced analysis of the shape, appearance and conservation condition of cultural heritage. 3D modelling can open new avenues in the way tangible cultural heritage is studied, visualized, curated, displayed and monitored, improving key features such as the analysis and visualization of material degradation and state of conservation. Applied research focused on the analysis of surface specifications and material properties by means of a 3D laser scanner survey has been developed within the Digital Preservation project of the FAUUSP building, Faculdade de Arquitetura e Urbanismo da Universidade de São Paulo, Brazil. The integrated 3D survey has been performed by the DIAPReM Center of the Department of Architecture of the University of Ferrara in cooperation with the FAUUSP. The 3D survey has allowed the realization of a point cloud model of the external surfaces, as the basis for investigating in detail the formal characteristics, geometric textures and surface features. The digital geometric model was also the basis for processing the intensity values acquired by the laser scanning instrument; this method of analysis was an essential complement to the macroscopic investigations in order to manage additional information related to surface characteristics displayable on the point cloud.

  10. A Bayesian MCMC method for point process models with intractable normalising constants

    DEFF Research Database (Denmark)

    Berthelsen, Kasper Klitgaard; Møller, Jesper

    2004-01-01

    to simulate from the "unknown distribution", perfect simulation algorithms become useful. We illustrate the method in cases where the likelihood is given by a Markov point process model. Particularly, we consider semi-parametric Bayesian inference in connection with both inhomogeneous Markov point process models...... and pairwise interaction point processes....

  11. Farmer cooperatives in the food economy of Western Europe: an analysis from the Marketing point of view

    NARCIS (Netherlands)

    Meulenberg, M.T.G.

    1979-01-01

    This paper is concerned with an analysis of farmer cooperatives in Western Europe from the marketing point of view. The analysis is restricted to marketing and processing cooperatives. First some basic characteristics of farmer cooperatives are discussed from a systems point of view. Afterwards

  12. A J–function for inhomogeneous point processes

    NARCIS (Netherlands)

    M.N.M. van Lieshout (Marie-Colette)

    2010-01-01

    We propose new summary statistics for intensity-reweighted moment stationary point processes that generalise the well known J-, empty space, and nearest-neighbour distance distribution functions, represent them in terms of generating functionals and conditional intensities, and relate

  13. The importance of topographically corrected null models for analyzing ecological point processes.

    Science.gov (United States)

    McDowall, Philip; Lynch, Heather J

    2017-07-01

    Analyses of point process patterns and related techniques (e.g., MaxEnt) make use of the expected number of occurrences per unit area and second-order statistics based on the distance between occurrences. Ecologists working with point process data often assume that points exist on a two-dimensional x-y plane or within a three-dimensional volume, when in fact many observed point patterns are generated on a two-dimensional surface existing within three-dimensional space. For many surfaces, however, such as the topography of landscapes, the projection from the surface to the x-y plane preserves neither area nor distance. As such, when these point patterns are implicitly projected to and analyzed in the x-y plane, our expectations of the point pattern's statistical properties may not be met. When used in hypothesis testing, we find that the failure to account for the topography of the generating surface may bias statistical tests that incorrectly identify clustering and, furthermore, may bias coefficients in inhomogeneous point process models that incorporate slope as a covariate. We demonstrate the circumstances under which this bias is significant, and present simple methods that allow point processes to be simulated with corrections for topography. These point patterns can then be used to generate "topographically corrected" null models against which observed point processes can be compared. © 2017 by the Ecological Society of America.
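
    A sketch of the kind of correction the paper argues for: to simulate a homogeneous point process on a gridded surface rather than on its planar projection, weight each DEM cell by its true area, sqrt(1 + |grad z|^2) times the cell area, so steeper cells receive proportionally more points. The synthetic DEM and all sizes below are assumptions.

    ```python
    # Topographically corrected null model sketch: area-weighted sampling
    # of points on a synthetic DEM surface.
    import numpy as np

    rng = np.random.default_rng(8)
    n_cells, cell = 100, 1.0
    xs = np.arange(n_cells) * cell
    z = 20 * np.sin(np.outer(xs, np.ones(n_cells)) / 15.0)   # synthetic DEM

    gy, gx = np.gradient(z, cell)
    area = np.sqrt(1.0 + gx**2 + gy**2) * cell**2            # true cell area

    n_points = 2000
    w = area.ravel() / area.sum()
    cells = rng.choice(area.size, size=n_points, p=w)        # area-weighted cells
    iy, ix = np.unravel_index(cells, z.shape)
    pts = np.column_stack([ix + rng.random(n_points),        # jitter within cell
                           iy + rng.random(n_points)]) * cell
    print(f"planar area: {n_cells**2 * cell**2:.1f}  surface area: {area.sum():.1f}")
    ```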

  14. Analysis of residual stress state in sheet metal parts processed by single point incremental forming

    Science.gov (United States)

    Maaß, F.; Gies, S.; Dobecki, M.; Brömmelhoff, K.; Tekkaya, A. E.; Reimers, W.

    2018-05-01

    The mechanical properties of formed metal components are highly affected by the prevailing residual stress state. A selective induction of residual compressive stresses in the component can improve product properties such as fatigue strength. By means of single point incremental forming (SPIF), the residual stress state can be influenced by adjusting the process parameters during the manufacturing process. To achieve a fundamental understanding of the residual stress formation caused by the SPIF process, a valid numerical process model is essential. Within the scope of this paper, the significance of kinematic hardening effects on the determined residual stress state is presented based on numerical simulations. The effect of the unclamping step after the manufacturing process is also analyzed. An average deviation of 18 % between the residual stress amplitudes in the clamped and unclamped conditions reveals that the unclamping step needs to be considered to reach a high numerical prediction quality.

  15. Implementation of 5S tools as a starting point in business process reengineering

    Directory of Open Access Journals (Sweden)

    Vorkapić Miloš

    2017-01-01

    The paper deals with the analysis of the elements which represent a starting point in the implementation of business process reengineering. In our research we have used Lean tools through the analysis of the 5S model. On the example of the finalization of a finished transmitter in IHMT-CMT production, 5S tools were implemented with a focus on quality elements, although theory shows that BPR and TQM are two opposite activities in an enterprise. We wanted to highlight the significance of employees' self-discipline, which helps the process of product finalization to proceed on time and without waste and losses. In addition, the employees keep their workplace clean, tidy and functional.

  16. EFFICIENT LIDAR POINT CLOUD DATA MANAGING AND PROCESSING IN A HADOOP-BASED DISTRIBUTED FRAMEWORK

    Directory of Open Access Journals (Sweden)

    C. Wang

    2017-10-01

    Light Detection and Ranging (LiDAR) is one of the most promising technologies in surveying and mapping, city management, forestry, object recognition, computer vision engineering and others. However, it is challenging to efficiently store, query and analyze the high-resolution 3D LiDAR data due to its volume and complexity. In order to improve the productivity of LiDAR data processing, this study proposes a Hadoop-based framework to efficiently manage and process LiDAR data in a distributed and parallel manner, which takes advantage of Hadoop's storage and computing ability. At the same time, the Point Cloud Library (PCL), an open-source project for 2D/3D image and point cloud processing, is integrated with HDFS and MapReduce to conduct the LiDAR data analysis algorithms provided by PCL in a parallel fashion. The experiment results show that the proposed framework can efficiently manage and process big LiDAR data.
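
    The core pattern of the framework, partitioning the cloud into tiles and processing them in parallel before merging results, can be imitated without Hadoop; the sketch below uses Python's multiprocessing in place of MapReduce, with the tile size and per-tile statistic as assumptions.

    ```python
    # Tile-and-parallelize sketch: a stand-in for the map phase over tiles.
    import numpy as np
    from multiprocessing import Pool

    def tile_stats(args):
        key, pts = args
        # stand-in per-tile "analysis": point count and mean elevation
        return key, len(pts), float(pts[:, 2].mean())

    def split_into_tiles(points, tile=25.0):
        keys = np.floor(points[:, :2] / tile).astype(int)
        order = np.lexsort((keys[:, 1], keys[:, 0]))
        keys, pts = keys[order], points[order]
        bounds = np.r_[0, np.flatnonzero((keys[1:] != keys[:-1]).any(axis=1)) + 1,
                       len(pts)]
        return [(tuple(keys[a]), pts[a:b]) for a, b in zip(bounds[:-1], bounds[1:])]

    if __name__ == "__main__":
        rng = np.random.default_rng(9)
        cloud = rng.uniform(0, 100, size=(200_000, 3))
        tiles = split_into_tiles(cloud)
        with Pool() as pool:                   # parallel "map" over tiles
            results = pool.map(tile_stats, tiles)
        print(len(results), "tiles; first:", results[0])
    ```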

  17. Efficient LIDAR Point Cloud Data Managing and Processing in a Hadoop-Based Distributed Framework

    Science.gov (United States)

    Wang, C.; Hu, F.; Sha, D.; Han, X.

    2017-10-01

    Light Detection and Ranging (LiDAR) is one of the most promising technologies in surveying and mapping, city management, forestry, object recognition, computer vision engineering and others. However, it is challenging to efficiently store, query and analyze the high-resolution 3D LiDAR data due to its volume and complexity. In order to improve the productivity of LiDAR data processing, this study proposes a Hadoop-based framework to efficiently manage and process LiDAR data in a distributed and parallel manner, which takes advantage of Hadoop's storage and computing ability. At the same time, the Point Cloud Library (PCL), an open-source project for 2D/3D image and point cloud processing, is integrated with HDFS and MapReduce to conduct the LiDAR data analysis algorithms provided by PCL in a parallel fashion. The experiment results show that the proposed framework can efficiently manage and process big LiDAR data.

  18. Washing and chilling as critical control points in pork slaughter hazard analysis and critical control point (HACCP) systems.

    Science.gov (United States)

    Bolton, D J; Pearce, R A; Sheridan, J J; Blair, I S; McDowell, D A; Harrington, D

    2002-01-01

    The aim of this research was to examine the effects of preslaughter washing, pre-evisceration washing, final carcass washing and chilling on final carcass quality and to evaluate these operations as possible critical control points (CCPs) within a pork slaughter hazard analysis and critical control point (HACCP) system. This study estimated bacterial numbers (total viable counts) and the incidence of Salmonella at three surface locations (ham, belly and neck) on 60 animals/carcasses processed through a small commercial pork abattoir (80 pigs d(-1)). Significant reductions (P HACCP in pork slaughter plants. This research will provide a sound scientific basis on which to develop and implement effective HACCP in pork abattoirs.

  19. Multivariate Product-Shot-noise Cox Point Process Models

    DEFF Research Database (Denmark)

    Jalilian, Abdollah; Guan, Yongtao; Mateu, Jorge

    We introduce a new multivariate product-shot-noise Cox process which is useful for modeling multi-species spatial point patterns with clustering intra-specific interactions and neutral, negative or positive inter-specific interactions. The auto and cross pair correlation functions of the process...... can be obtained in closed analytical forms and approximate simulation of the process is straightforward. We use the proposed process to model interactions within and among five tree species in the Barro Colorado Island plot.

  20. Statistical representation of a spray as a point process

    International Nuclear Information System (INIS)

    Subramaniam, S.

    2000-01-01

    The statistical representation of a spray as a finite point process is investigated. One objective is to develop a better understanding of how single-point statistical information contained in descriptions such as the droplet distribution function (ddf), relates to the probability density functions (pdfs) associated with the droplets themselves. Single-point statistical information contained in the droplet distribution function (ddf) is shown to be related to a sequence of single surrogate-droplet pdfs, which are in general different from the physical single-droplet pdfs. It is shown that the ddf contains less information than the fundamental single-point statistical representation of the spray, which is also described. The analysis shows which events associated with the ensemble of spray droplets can be characterized by the ddf, and which cannot. The implications of these findings for the ddf approach to spray modeling are discussed. The results of this study also have important consequences for the initialization and evolution of direct numerical simulations (DNS) of multiphase flows, which are usually initialized on the basis of single-point statistics such as the droplet number density in physical space. If multiphase DNS are initialized in this way, this implies that even the initial representation contains certain implicit assumptions concerning the complete ensemble of realizations, which are invalid for general multiphase flows. Also the evolution of a DNS initialized in this manner is shown to be valid only if an as yet unproven commutation hypothesis holds true. Therefore, it is questionable to what extent DNS that are initialized in this manner constitute a direct simulation of the physical droplets. Implications of these findings for large eddy simulations of multiphase flows are also discussed. (c) 2000 American Institute of Physics

  1. Discrete Approximations of Determinantal Point Processes on Continuous Spaces: Tree Representations and Tail Triviality

    Science.gov (United States)

    Osada, Hirofumi; Osada, Shota

    2018-01-01

    We prove tail triviality of determinantal point processes μ on continuous spaces. Tail triviality has been proved for such processes only on discrete spaces, and hence we have generalized the result to continuous spaces. To do this, we construct tree representations, that is, discrete approximations of determinantal point processes enjoying a determinantal structure. There are many interesting examples of determinantal point processes on continuous spaces such as zero points of the hyperbolic Gaussian analytic function with Bergman kernel, and the thermodynamic limit of eigenvalues of Gaussian random matrices for the Sine_2, Airy_2, Bessel_2, and Ginibre point processes. Our main theorem proves all these point processes are tail trivial.

  2. System implementation of hazard analysis and critical control points (HACCP) in a nitrogen production plant

    International Nuclear Information System (INIS)

    Barrantes Salazar, Alexandra

    2014-01-01

    A hazard analysis and critical control points (HACCP) system is deployed in a liquid nitrogen production plant. The main motivation is that nitrogen has become a complement to food packaging, used to increase shelf life or to provide a surface that protects the product from manipulation. The methodology adopted was the analysis of critical control points for the nitrogen production plant, which required knowledge of both the standard and the production process, as well as on-site verification. In addition, all materials and/or processing units found in contact with the raw material or the product under study were evaluated, so that the intrinsic risks of each were detected from the physical, chemical and biological points of view according to the origin or source of contamination. For each identified risk, the probability of occurrence was evaluated according to its frequency and severity; with these variables determined, the type of risk was defined. Cases presenting a higher or critical risk were subjected to a decision tree, from which it was concluded that no critical control points need be designated. Nevertheless, maximum permitted limits were established for each of them. Each result is supported by reliable literature or scientific references that properly document the matter evaluated. In general, neither the material matrix nor the process matrix contains critical control points, so the project concludes with the analysis, without the need to generate a monitoring and verification system. Extending the project to cover the gaseous nitrogen packaging system is suggested, since this work was limited to liquid nitrogen. Furthermore, liquid nitrogen production is a fully automated, closed process, so the introduction of contaminants is very limited, unlike the gaseous nitrogen process. (author) [es]

  3. Point process analyses of variations in smoking rate by setting, mood, gender, and dependence

    Science.gov (United States)

    Shiffman, Saul; Rathbun, Stephen L.

    2010-01-01

    The immediate emotional and situational antecedents of ad libitum smoking are still not well understood. We re-analyzed data from Ecological Momentary Assessment using novel point-process analyses, to assess how craving, mood, and social setting influence smoking rate, as well as assessing the moderating effects of gender and nicotine dependence. 304 smokers recorded craving, mood, and social setting using electronic diaries when smoking and at random nonsmoking times over 16 days of smoking. Point-process analysis, which makes use of the known random sampling scheme for momentary variables, examined main effects of setting and interactions with gender and dependence. Increased craving was associated with higher rates of smoking, particularly among women. Negative affect was not associated with smoking rate, even in interaction with arousal, but restlessness was associated with substantially higher smoking rates. Women's smoking tended to be less affected by negative affect. Nicotine dependence had little moderating effect on situational influences. Smoking rates were higher when smokers were alone or with others smoking, and smoking restrictions reduced smoking rates. However, the presence of others smoking undermined the effects of restrictions. The more sensitive point-process analyses confirmed earlier findings, including the surprising conclusion that negative affect by itself was not related to smoking rates. Contrary to hypothesis, men's and not women's smoking was influenced by negative affect. Both smoking restrictions and the presence of others who are not smoking suppress smoking, but others’ smoking undermines the effects of restrictions. Point-process analyses of EMA data can bring out even small influences on smoking rate. PMID:21480683

  4. Microbial profile and critical control points during processing of 'robo ...

    African Journals Online (AJOL)

    Microbial profile and critical control points during processing of 'robo' snack from ... the relevant critical control points especially in relation to raw materials and ... to the quality of the various raw ingredients used were the roasting using earthen

  5. Benchmarking of radiological departments. Starting point for successful process optimization

    International Nuclear Information System (INIS)

    Busch, Hans-Peter

    2010-01-01

    Continuous optimization of the process of organization and medical treatment is part of the successful management of radiological departments. The focus of this optimization can be cost units such as CT and MRI or the radiological parts of total patient treatment. Key performance indicators for process optimization are cost-effectiveness, service quality and quality of medical treatment. The potential for improvements can be seen by comparison (benchmark) with other hospitals and radiological departments. Clear definitions of key data and criteria are absolutely necessary for comparability. There is currently little information in the literature regarding the methodology and application of benchmarks especially from the perspective of radiological departments and case-based lump sums, even though benchmarking has frequently been applied to radiological departments by hospital management. The aim of this article is to describe and discuss systematic benchmarking as an effective starting point for successful process optimization. This includes the description of the methodology, recommendation of key parameters and discussion of the potential for cost-effectiveness analysis. The main focus of this article is cost-effectiveness (efficiency and effectiveness) with respect to cost units and treatment processes. (orig.)

  6. Statistical aspects of determinantal point processes

    DEFF Research Database (Denmark)

    Lavancier, Frédéric; Møller, Jesper; Rubak, Ege Holger

    The statistical aspects of determinantal point processes (DPPs) seem largely unexplored. We review the appealing properties of DPPs, demonstrate that they are useful models for repulsiveness, detail a simulation procedure, and provide freely available software for simulation and statistical inference. We pay special attention to stationary DPPs, where we give a simple condition ensuring their existence, construct parametric models, describe how they can be well approximated so that the likelihood can be evaluated and realizations can be simulated, and discuss how statistical inference...

  7. Equivalence of functional limit theorems for stationary point processes and their Palm distributions

    NARCIS (Netherlands)

    Nieuwenhuis, G.

    1989-01-01

    Let P be the distribution of a stationary point process on the real line and let P0 be its Palm distribution. In this paper we consider two types of functional limit theorems, those in terms of the number of points of the point process in (0, t] and those in terms of the location of the nth point

  8. Analysis method of beam pointing stability based on optical transmission matrix

    Science.gov (United States)

    Wang, Chuanchuan; Huang, PingXian; Li, Xiaotong; Cen, Zhaofen

    2016-10-01

    Many factors affect the beam pointing stability of an optical system; among them, element tolerance is one of the most important and most common. In some large laser systems it can make the final micro beam spots deviate noticeably on the image plane, so an effective and accurate theoretical analysis of element tolerance is essential. To make the analysis of beam pointing stability convenient and theoretically tractable, we approximate the whole spot deviation by the transmission of a single chief ray rather than the full beam. Using optical matrices, we also reduce the complex process of light transmission to the multiplication of a sequence of matrices. On this basis we set up an element tolerance model, i.e., a mathematical expression for the spot deviation of an optical system with element tolerances, which enables a quantitative theoretical analysis of beam pointing stability. In the second half of the paper we design an experiment to measure the spot deviation caused by element tolerance in a multipass optical system; we then adjust the tolerance step by step, compare the results with the data obtained from the tolerance model, and finally verify the correctness of the model.
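
    The abstract's reduction of transmission to matrix multiplication can be illustrated as follows. This is our sketch, not the authors' code: a chief ray (height, angle) is propagated through paraxial ABCD matrices, and an element tilt tolerance is modelled as a small angular kick; all numerical values are hypothetical.

        import numpy as np

        def free_space(d):
            return np.array([[1.0, d], [0.0, 1.0]])

        def thin_lens(f):
            return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

        def propagate(ray, elements):
            # Propagate a chief ray through (matrix, tilt) pairs; a tilt in
            # radians models a misaligned element kicking the ray angle.
            y, u = ray
            for M, tilt in elements:
                y, u = M @ np.array([y, u])
                u += tilt
            return np.array([y, u])

        system = lambda tilt: [(thin_lens(0.5), tilt), (free_space(1.0), 0.0),
                               (thin_lens(0.5), 0.0), (free_space(1.0), 0.0)]
        nominal = propagate((0.0, 0.0), system(0.0))
        perturbed = propagate((0.0, 0.0), system(1e-4))   # 100 urad tilt
        print("spot deviation on image plane:", perturbed[0] - nominal[0])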

  9. A CASE STUDY ON POINT PROCESS MODELLING IN DISEASE MAPPING

    Directory of Open Access Journals (Sweden)

    Viktor Beneš

    2011-05-01

    Full Text Available We consider a data set of locations where people in Central Bohemia have been infected by tick-borne encephalitis (TBE), and where population census data and covariates concerning vegetation and altitude are available. The aims are to estimate the risk map of the disease and to study the dependence of the risk on the covariates. Instead of using the common area-level approaches, we base the analysis on a Bayesian approach for a log Gaussian Cox point process with covariates. Posterior characteristics for a discretized version of the log Gaussian Cox process are computed using Markov chain Monte Carlo methods. A particular problem which is thoroughly discussed is the determination of a model for the background population density. The risk map depends clearly on the population intensity model, and the basic model adopted for the population intensity determines which covariates influence the risk of TBE. Model validation is based on the posterior predictive distribution of various summary statistics.
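
    For orientation, a discretized log Gaussian Cox process of the kind analysed above can be simulated in a few lines. The sketch below is our illustration, not the authors' code; the exponential covariance and all parameter values are arbitrary assumptions.

        import numpy as np

        rng = np.random.default_rng(0)

        # Discretized log Gaussian Cox process on an n x n grid of unit cells.
        n = 32
        xx, yy = np.meshgrid(np.arange(n), np.arange(n))
        coords = np.column_stack([xx.ravel(), yy.ravel()])

        # Gaussian random field with covariance c(r) = s2 * exp(-r / phi).
        s2, phi, mu = 1.0, 8.0, -2.0
        r = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
        cov = s2 * np.exp(-r / phi) + 1e-8 * np.eye(n * n)  # jitter for stability
        field = rng.multivariate_normal(np.full(n * n, mu), cov)

        # Cox step: given the field, cell counts are Poisson(exp(field) * area).
        counts = rng.poisson(np.exp(field)).reshape(n, n)
        print("total points:", counts.sum())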

  10. The cylindrical K-function and Poisson line cluster point processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Safavimanesh, Farzaneh; Rasmussen, Jakob G.

    Poisson line cluster point processes, is also introduced. Parameter estimation based on moment methods or Bayesian inference for this model is discussed when the underlying Poisson line process and the cluster memberships are treated as hidden processes. To illustrate the methodologies, we analyze two...

  11. Pasteurised milk and implementation of HACCP (Hazard Analysis Critical Control Point

    Directory of Open Access Journals (Sweden)

    T.B Murdiati

    2004-10-01

    Full Text Available The purpose of pasteurisation is to destroy pathogenic bacteria without affecting the taste, flavour, and nutritional value of milk. A study on the implementation of HACCP (Hazard Analysis Critical Control Point) in producing pasteurised milk was carried out in four pasteurised-milk processing units, one in Jakarta, two in Bandung and one in Bogor. The critical control points in the production line were identified. Milk samples were collected from the critical points and analysed for the total number of microbes, and raw milk was screened for antibiotic residues. The study indicated that one unit in Bandung and one unit in Jakarta produced pasteurised milk with a lower number of microbes than the other units, owing to better management and control applied along the production chain. Penicillin residues were detected in the raw milk used by the unit in Bogor. Six critical points, the hazards that might arise at those points, and measures to prevent those hazards were identified. A quality assurance system such as HACCP can ensure high-quality and safe pasteurised milk, and should be implemented gradually.

  12. Point-point and point-line moving-window correlation spectroscopy and its applications

    Science.gov (United States)

    Zhou, Qun; Sun, Suqin; Zhan, Daqi; Yu, Zhiwu

    2008-07-01

    In this paper, we present a new extension of generalized two-dimensional (2D) correlation spectroscopy. Two new algorithms, namely point-point (P-P) correlation and point-line (P-L) correlation, have been introduced to perform moving-window 2D correlation (MW2D) analysis. The new method has been applied to a spectral model consisting of two different processes. The results indicate that P-P correlation spectroscopy can unveil the details and reconstitute the entire process, whilst P-L correlation provides the general features of the processes concerned. The phase transition behaviour of dimyristoylphosphatidylethanolamine (DMPE) has been studied using MW2D correlation spectroscopy. The newly proposed method confirms a phase transition temperature of 56 °C, the same as the result obtained with a differential scanning calorimeter. To illustrate the new method further, a lysine and lactose mixture has been studied under thermal perturbation. Using P-P MW2D, the Maillard reaction of the mixture was clearly monitored, which has been very difficult with conventional display of FTIR spectra.
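
    The moving-window construction can be sketched compactly. The code below is our illustrative reading of MW2D (one synchronous correlation map per window position), not the authors' implementation, and the window handling is an assumption.

        import numpy as np

        def mw2d_sync(spectra, window):
            # spectra: rows are perturbation steps, columns are wavenumbers.
            # Returns one synchronous 2D correlation map per window position.
            m, n = spectra.shape
            maps = []
            for start in range(m - window + 1):
                block = spectra[start:start + window]
                dyn = block - block.mean(axis=0)      # dynamic spectra
                maps.append(dyn.T @ dyn / (window - 1))
            return np.array(maps)                     # (positions, n, n)

        def pp_trace(maps, k):
            # Point-point trace: autocorrelation of wavenumber k per window.
            return maps[:, k, k]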

  13. INHOMOGENEITY IN SPATIAL COX POINT PROCESSES – LOCATION DEPENDENT THINNING IS NOT THE ONLY OPTION

    Directory of Open Access Journals (Sweden)

    Michaela Prokešová

    2010-11-01

    Full Text Available In the literature on point processes, by far the most popular option for introducing inhomogeneity into a point process model is location dependent thinning (resulting in a second-order intensity-reweighted stationary point process). This produces a very tractable model, and several fast estimation procedures are available. Nevertheless, this model dilutes the interaction (or the geometrical structure) of the original homogeneous model in a particular way. For Markov point processes, several alternative inhomogeneous models have been suggested and investigated in the literature, but this is not the case for Cox point processes, the canonical models for clustered point patterns. In this contribution we discuss several other options for defining inhomogeneous Cox point process models that result in point patterns with different types of geometric structure, and we further investigate possible parameter estimation procedures for such models.

  14. From point process observations to collective neural dynamics: Nonlinear Hawkes process GLMs, low-dimensional dynamics and coarse graining.

    Science.gov (United States)

    Truccolo, Wilson

    2016-11-01

    This review presents a perspective on capturing collective dynamics in recorded neuronal ensembles based on multivariate point process models, inference of low-dimensional dynamics and coarse graining of spatiotemporal measurements. A general probabilistic framework for continuous time point processes is reviewed, with an emphasis on multivariate nonlinear Hawkes processes with exogenous inputs. A point process generalized linear model (PP-GLM) framework for the estimation of discrete time multivariate nonlinear Hawkes processes is described. The approach is illustrated with the modeling of collective dynamics in neocortical neuronal ensembles recorded in human and non-human primates, and prediction of single-neuron spiking. A complementary approach to capture collective dynamics based on low-dimensional dynamics ("order parameters") inferred via latent state-space models with point process observations is presented. The approach is illustrated by inferring and decoding low-dimensional dynamics in primate motor cortex during naturalistic reach and grasp movements. Finally, we briefly review hypothesis tests based on conditional inference and spatiotemporal coarse graining for assessing collective dynamics in recorded neuronal ensembles. Published by Elsevier Ltd.
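
    A minimal sketch of the discrete-time nonlinear Hawkes / PP-GLM idea described above; this is our illustration, with an assumed exponential link and an arbitrary inhibitory self-history kernel, not the review's code.

        import numpy as np

        rng = np.random.default_rng(1)

        # lambda_t = exp(b + sum_k h_k * spikes_{t-k}); spikes are drawn as
        # Bernoulli(lambda_t * dt) in small time bins of width dt.
        dt, T = 0.001, 5000               # 1 ms bins, 5 s of activity
        b = np.log(20.0)                  # baseline rate of 20 Hz
        lags = np.arange(1, 51)
        h = -2.0 * np.exp(-lags / 5.0)    # refractory self-inhibition

        spikes = np.zeros(T)
        for t in range(T):
            hist = spikes[max(0, t - 50):t][::-1]   # most recent bin first
            lam = np.exp(b + h[:len(hist)] @ hist)
            spikes[t] = rng.random() < lam * dt

        print("mean rate (Hz):", spikes.mean() / dt)

    Fitting such a model then amounts to Poisson or Bernoulli regression of the spike indicator on lagged spike-history (and exogenous) covariates.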

  15. Geometric anisotropic spatial point pattern analysis and Cox processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Toftaker, Håkon

    . In particular we study Cox process models with an elliptical pair correlation function, including shot noise Cox processes and log Gaussian Cox processes, and we develop estimation procedures using summary statistics and Bayesian methods. Our methodology is illustrated on real and synthetic datasets of spatial...

  16. Corner-point criterion for assessing nonlinear image processing imagers

    Science.gov (United States)

    Landeau, Stéphane; Pigois, Laurent; Foing, Jean-Paul; Deshors, Gilles; Swiathy, Greggory

    2017-10-01

    Range performance modeling of optronics imagers attempts to characterize the ability to resolve details in the image. Today, digital image processing is systematically used in conjunction with the optoelectronic system to correct its defects or to exploit tiny detection signals to increase performance. In order to characterize such processing, which has adaptive and non-linear properties, it becomes necessary to stimulate the imagers with test patterns whose properties are similar to those of actual scene images, in terms of dynamic range, contours, texture and singular points. This paper presents an approach based on a Corner-Point (CP) resolution criterion, derived from the Probability of Correct Resolution (PCR) of binary fractal patterns. The fundamental principle lies in the faithful perception of the direction of a CP, i.e., of the one minority-value pixel among the majority-value pixels of a 2×2 pixel block. The evaluation procedure treats the multi-resolution CP transformation of the actual image as the Ground Truth (GT). After a spatial registration between the degraded image and the original one, the degradation is statistically measured by comparing the GT with the CP transformation of the degraded image, in terms of localized PCR at the region of interest. The paper defines this CP criterion and presents the developed evaluation techniques, such as the measurement of the number of CPs resolved on the target, and the CP transformation and its inverse transform, which make it possible to reconstruct an image of the perceived CPs. The criterion is then compared with the standard Johnson criterion for the case of a linear blur and noise degradation. The evaluation of an imaging system integrating an image display and visual perception is considered, by proposing an analysis scheme combining two methods: a CP measurement for the highly non-linear part (imaging) with a real-signature test target, and conventional methods for the more linear part (displaying). The application to

  17. Application of statistical process control and process capability analysis procedures in orbiter processing activities at the Kennedy Space Center

    Science.gov (United States)

    Safford, Robert R.; Jackson, Andrew E.; Swart, William W.; Barth, Timothy S.

    1994-01-01

    Successful ground processing at KSC requires that flight hardware and ground support equipment conform to specifications at tens of thousands of checkpoints. Knowledge of conformance is an essential requirement for launch. That knowledge of conformance at every requisite point does not, however, enable identification of past problems with equipment, or potential problem areas. This paper describes how the introduction of Statistical Process Control and Process Capability Analysis identification procedures into existing shuttle processing procedures can enable identification of potential problem areas and candidates for improvements to increase processing performance measures. Results of a case study describing application of the analysis procedures to Thermal Protection System processing are used to illustrate the benefits of the approaches described in the paper.
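
    As a reminder of the quantities involved, the standard process capability indices can be computed as below; the specification limits and measurements are hypothetical, not data from the KSC study.

        import numpy as np

        def process_capability(x, lsl, usl):
            # Cp and Cpk for measurements x against lower/upper spec limits.
            mu, sigma = np.mean(x), np.std(x, ddof=1)
            cp = (usl - lsl) / (6 * sigma)
            cpk = min(usl - mu, mu - lsl) / (3 * sigma)
            return cp, cpk

        # Hypothetical checkpoint data: a torque spec of 10 +/- 0.5 N*m.
        x = np.random.default_rng(2).normal(10.05, 0.12, size=200)
        print(process_capability(x, 9.5, 10.5))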

  18. Risk analysis of hematopoietic stem cell transplant process: failure mode, effect, and criticality analysis and hazard analysis critical control point methods integration based on guidelines to good manufacturing practice for medicinal product ANNEX 20 (February 2008).

    Science.gov (United States)

    Gianassi, S; Bisin, S; Bindi, B; Spitaleri, I; Bambi, F

    2010-01-01

    The collection and handling of hematopoietic stem cells (HSCs) must meet high quality requirements. An integrated Quality Risk Management can help to identify and contain potential risks related to HSC production. Risk analysis techniques allow one to "weigh" identified hazards, considering the seriousness of their effects, frequency, and detectability, seeking to prevent the most harmful hazards. The Hazard Analysis Critical Control Point method, recognized as the most appropriate technique to identify risks associated with physical, chemical, and biological hazards for cellular products, consists of classifying finished product specifications and limits of acceptability, identifying all off-specifications, defining activities that can cause them, and finally establishing both a monitoring system for each Critical Control Point and corrective actions for deviations. The severity of possible effects on patients, as well as the occurrence and detectability of critical parameters, are measured on quantitative scales (Risk Priority Number [RPN]). Risk analysis was performed with this technique on the HPC manipulation process performed at our blood center. The analysis showed that the hazards with the highest RPN values and the greatest impact on the process were loss of dose and loss of tracking; the technical skills of operators and manual transcription of data were the most critical parameters. Problems related to operator skills are handled by defining targeted training programs, while other critical parameters can be mitigated with the use of continuous control systems. The blood center management software was completed by a labeling system with forms designed to be in compliance with the standards in force, and by starting implementation of a cryopreservation management module. Copyright 2010 Elsevier Inc. All rights reserved.
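
    A minimal sketch of the RPN bookkeeping described above. The hazard names echo the abstract, but the severity, occurrence and detectability scores are invented for illustration.

        # RPN = severity x occurrence x detectability (scales 1-10);
        # higher values indicate hazards to address first.
        hazards = [
            ("loss of dose",              9, 4, 7),
            ("loss of tracking",          8, 4, 8),
            ("manual data transcription", 6, 6, 6),
        ]

        for name, s, o, d in sorted(hazards, key=lambda h: -(h[1] * h[2] * h[3])):
            print(f"{name:28s} RPN = {s * o * d}")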

  19. Tipping point analysis of a large ocean ambient sound record

    Science.gov (United States)

    Livina, Valerie N.; Harris, Peter; Brower, Albert; Wang, Lian; Sotirakopoulos, Kostas; Robinson, Stephen

    2017-04-01

    We study a long (2003-2015), high-resolution (250 Hz) sound pressure record provided by the Comprehensive Nuclear-Test-Ban Treaty Organisation (CTBTO) from the hydro-acoustic station Cape Leeuwin (Australia). We transform the hydrophone waveforms into five bands of 10-min-average sound pressure levels (including the third-octave band) and apply tipping point analysis techniques [1-3]. We report the results of the analysis of fluctuations and trends in the data and discuss the Big Data challenges in processing this record, including handling data segments of large size and possible HPC solutions. References: [1] Livina et al, GRL 2007, [2] Livina et al, Climate of the Past 2010, [3] Livina et al, Chaos 2015.

  20. Interior point algorithms theory and analysis

    CERN Document Server

    Ye, Yinyu

    2011-01-01

    The first comprehensive review of the theory and practice of one of today's most powerful optimization techniques. The explosive growth of research into and development of interior point algorithms over the past two decades has significantly improved the complexity of linear programming and yielded some of today's most sophisticated computing techniques. This book offers a comprehensive and thorough treatment of the theory, analysis, and implementation of this powerful computational tool. Interior Point Algorithms provides detailed coverage of all basic and advanced aspects of the subject.
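
    For a concrete entry point, a linear program can be handed to an interior point solver in a few lines; the toy problem below uses SciPy, where method="highs-ipm" selects the HiGHS interior point method in recent releases.

        from scipy.optimize import linprog

        # min c^T x  s.t.  A_ub x <= b_ub, x >= 0  (here: maximize x1 + 2*x2).
        c = [-1.0, -2.0]
        A_ub = [[1.0, 1.0], [1.0, 3.0]]
        b_ub = [4.0, 6.0]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, method="highs-ipm")
        print(res.x, res.fun)   # optimum x = (3, 1), objective value -5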

  1. Process and results of analytical framework and typology development for POINT

    DEFF Research Database (Denmark)

    Gudmundsson, Henrik; Lehtonen, Markku; Bauler, Tom

    2009-01-01

    POINT is a project about how indicators are used in practice; to what extent and in what way indicators actually influence, support, or hinder policy and decision making processes, and what could be done to enhance the positive role of indicators in such processes. The project needs an analytical......, a set of core concepts and associated typologies, a series of analytic schemes proposed, and a number of research propositions and questions for the subsequent empirical work in POINT....

  2. Implementation of the Hazard Analysis Critical Control Point (HACCP) System in the Tempe Chip Production Process

    Directory of Open Access Journals (Sweden)

    Rahmi Yuniarti

    2015-06-01

    Full Text Available Malang is one of the industrial centers of tempe chip production. To maintain quality and food safety, an analysis is required to identify hazards during the production process. This study was conducted to identify the hazards during the production process of tempe chips and to provide recommendations for developing a HACCP system. The production process of tempe chips proceeds from slicing the tempe, moving it to the kitchen, and coating it with flour dough, through frying it in the pan, draining, and packaging, to storage. Three types of potential hazards (biological, physical, and chemical) exist during the production process. CCP identification shows that three processes have critical control points: slicing the tempe, immersing the tempe in the flour mixture, and draining. Recommendations for the development of the HACCP system relate to employee hygiene, supporting equipment, 5-S analysis, and the production layout.

  3. Single Point Vulnerability Analysis of Automatic Seismic Trip System

    International Nuclear Information System (INIS)

    Oh, Seo Bin; Chung, Soon Il; Lee, Yong Suk; Choi, Byung Pil

    2016-01-01

    Single Point Vulnerability (SPV) analysis is a process used to identify individual equipment whose failure alone will result in a reactor trip, turbine generator failure, or a power reduction of more than 50%. The Automatic Seismic Trip System (ASTS) is a newly installed system to ensure the safety of the plant when an earthquake occurs. Since this system directly shuts down the reactor, the failure or malfunction of its components can cause a reactor trip more frequently than other systems. Therefore, an SPV analysis of ASTS is necessary to maintain its essential performance. To analyze the SPV of ASTS, failure mode and effect analysis (FMEA) and fault tree analysis (FTA) were performed to select SPV equipment. The D/O, D/I, and A/I cards, the seismic sensor, and the trip relay affect the reactor trip, but a single failure of any of them will not cause a reactor trip. In conclusion, ASTS contains no SPV equipment. These results can be utilized as basis data for ways to enhance facility reliability, such as design modification and improvement of preventive maintenance procedures.

  4. Single Point Vulnerability Analysis of Automatic Seismic Trip System

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Seo Bin; Chung, Soon Il; Lee, Yong Suk [FNC Technology Co., Yongin (Korea, Republic of); Choi, Byung Pil [KHNP CRI, Daejeon (Korea, Republic of)

    2016-10-15

    Single Point Vulnerability (SPV) analysis is a process used to identify individual equipment whose failure alone will result in a reactor trip, turbine generator failure, or a power reduction of more than 50%. The Automatic Seismic Trip System (ASTS) is a newly installed system to ensure the safety of the plant when an earthquake occurs. Since this system directly shuts down the reactor, the failure or malfunction of its components can cause a reactor trip more frequently than other systems. Therefore, an SPV analysis of ASTS is necessary to maintain its essential performance. To analyze the SPV of ASTS, failure mode and effect analysis (FMEA) and fault tree analysis (FTA) were performed to select SPV equipment. The D/O, D/I, and A/I cards, the seismic sensor, and the trip relay affect the reactor trip, but a single failure of any of them will not cause a reactor trip. In conclusion, ASTS contains no SPV equipment. These results can be utilized as basis data for ways to enhance facility reliability, such as design modification and improvement of preventive maintenance procedures.

  5. Genetic interaction analysis of point mutations enables interrogation of gene function at a residue-level resolution

    Science.gov (United States)

    Braberg, Hannes; Moehle, Erica A.; Shales, Michael; Guthrie, Christine; Krogan, Nevan J.

    2014-01-01

    We have achieved a residue-level resolution of genetic interaction mapping – a technique that measures how the function of one gene is affected by the alteration of a second gene – by analyzing point mutations. Here, we describe how to interpret point mutant genetic interactions, and outline key applications for the approach, including interrogation of protein interaction interfaces and active sites, and examination of post-translational modifications. Genetic interaction analysis has proven effective for characterizing cellular processes; however, to date, systematic high-throughput genetic interaction screens have relied on gene deletions or knockdowns, which limits the resolution of gene function analysis and poses problems for multifunctional genes. Our point mutant approach addresses these issues, and further provides a tool for in vivo structure-function analysis that complements traditional biophysical methods. We also discuss the potential for genetic interaction mapping of point mutations in human cells and its application to personalized medicine. PMID:24842270

  6. Parallel Processing of Big Point Clouds Using Z-Order Partitioning

    Science.gov (United States)

    Alis, C.; Boehm, J.; Liu, K.

    2016-06-01

    As laser scanning technology improves and costs are coming down, the amount of point cloud data being generated can be prohibitively difficult and expensive to process on a single machine. This data explosion is not only limited to point cloud data. Voluminous amounts of high-dimensionality and quickly accumulating data, collectively known as Big Data, such as those generated by social media, Internet of Things devices and commercial transactions, are becoming more prevalent as well. New computing paradigms and frameworks are being developed to efficiently handle the processing of Big Data, many of which utilize a compute cluster composed of several commodity grade machines to process chunks of data in parallel. A central concept in many of these frameworks is data locality. By its nature, Big Data is large enough that the entire dataset would not fit on the memory and hard drives of a single node hence replicating the entire dataset to each worker node is impractical. The data must then be partitioned across worker nodes in a manner that minimises data transfer across the network. This is a challenge for point cloud data because there exist different ways to partition data and they may require data transfer. We propose a partitioning based on Z-order which is a form of locality-sensitive hashing. The Z-order or Morton code is computed by dividing each dimension to form a grid then interleaving the binary representation of each dimension. For example, the Z-order code for the grid square with coordinates (x = 1 = 01₂, y = 3 = 11₂) is 1011₂ = 11. The number of points in each partition is controlled by the number of bits per dimension: the more bits, the fewer the points. The number of bits per dimension also controls the level of detail with more bits yielding finer partitioning. We present this partitioning method by implementing it on Apache Spark and investigating how different parameters affect the accuracy and running time of the k nearest neighbour algorithm
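
    The bit interleaving described above is easy to implement. A minimal sketch (not the paper's Spark code) that reproduces the worked example from the abstract:

        def morton2d(x, y, bits=16):
            # Interleave the bits of x and y into a 2D Z-order (Morton) code;
            # x and y must each fit in `bits` bits.
            code = 0
            for i in range(bits):
                code |= ((x >> i) & 1) << (2 * i)        # x bits -> even slots
                code |= ((y >> i) & 1) << (2 * i + 1)    # y bits -> odd slots
            return code

        # Example from the abstract: x = 1 = 01_2, y = 3 = 11_2 -> 1011_2 = 11.
        assert morton2d(1, 3, bits=2) == 11
        # A partition key is the code truncated to the desired number of bits
        # per dimension: fewer bits give coarser partitions with more points.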

  7. Edit distance for marked point processes revisited: An implementation by binary integer programming

    Energy Technology Data Exchange (ETDEWEB)

    Hirata, Yoshito; Aihara, Kazuyuki [Institute of Industrial Science, The University of Tokyo, 4-6-1 Komaba, Meguro-ku, Tokyo 153-8505 (Japan)

    2015-12-15

    We implement the edit distance for marked point processes [Suzuki et al., Int. J. Bifurcation Chaos 20, 3699–3708 (2010)] as a binary integer program. Compared with the previous implementation using minimum cost perfect matching, the proposed implementation has two advantages: first, by using the proposed implementation, we can apply a wide variety of software and hardware, even spin glasses and coherent Ising machines, to calculate the edit distance for marked point processes; second, the proposed implementation runs faster than the previous implementation when the difference between the numbers of events in two time windows for a marked point process is large.

  8. Strong approximations and sequential change-point analysis for diffusion processes

    DEFF Research Database (Denmark)

    Mihalache, Stefan-Radu

    2012-01-01

    In this paper ergodic diffusion processes depending on a parameter in the drift are considered under the assumption that the processes can be observed continuously. Strong approximations by Wiener processes for a stochastic integral and for the estimator process constructed by the one...

  9. Tipping point analysis of ocean acoustic noise

    Science.gov (United States)

    Livina, Valerie N.; Brouwer, Albert; Harris, Peter; Wang, Lian; Sotirakopoulos, Kostas; Robinson, Stephen

    2018-02-01

    We apply tipping point analysis to a large record of ocean acoustic data to identify the main components of the acoustic dynamical system and study possible bifurcations and transitions of the system. The analysis is based on a statistical physics framework with stochastic modelling, where we represent the observed data as a composition of deterministic and stochastic components estimated from the data using time-series techniques. We analyse long-term and seasonal trends, system states and acoustic fluctuations to reconstruct a one-dimensional stochastic equation to approximate the acoustic dynamical system. We apply potential analysis to acoustic fluctuations and detect several changes in the system states in the past 14 years. These are most likely caused by climatic phenomena. We analyse trends in sound pressure level within different frequency bands and hypothesize a possible anthropogenic impact on the acoustic environment. The tipping point analysis framework provides insight into the structure of the acoustic data and helps identify its dynamic phenomena, correctly reproducing the probability distribution and scaling properties (power-law correlations) of the time series.
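
    A minimal sketch of the sliding-window early-warning indicators (variance and lag-1 autocorrelation) commonly used in tipping point analysis; this is our illustration on synthetic data, not the authors' code.

        import numpy as np

        def early_warnings(x, window):
            # Sliding-window variance and lag-1 autocorrelation; rising
            # trends in both are classic signs of critical slowing down.
            ac1, var = [], []
            for i in range(len(x) - window + 1):
                w = x[i:i + window] - np.mean(x[i:i + window])
                var.append(np.var(w))
                ac1.append(np.corrcoef(w[:-1], w[1:])[0, 1])
            return np.array(ac1), np.array(var)

        rng = np.random.default_rng(3)
        x = np.cumsum(rng.normal(size=2000)) * 0.01 + rng.normal(size=2000)
        ac1, var = early_warnings(x, window=200)
        print(ac1[-1], var[-1])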

  10. Tipping point analysis of ocean acoustic noise

    Directory of Open Access Journals (Sweden)

    V. N. Livina

    2018-02-01

    Full Text Available We apply tipping point analysis to a large record of ocean acoustic data to identify the main components of the acoustic dynamical system and study possible bifurcations and transitions of the system. The analysis is based on a statistical physics framework with stochastic modelling, where we represent the observed data as a composition of deterministic and stochastic components estimated from the data using time-series techniques. We analyse long-term and seasonal trends, system states and acoustic fluctuations to reconstruct a one-dimensional stochastic equation to approximate the acoustic dynamical system. We apply potential analysis to acoustic fluctuations and detect several changes in the system states in the past 14 years. These are most likely caused by climatic phenomena. We analyse trends in sound pressure level within different frequency bands and hypothesize a possible anthropogenic impact on the acoustic environment. The tipping point analysis framework provides insight into the structure of the acoustic data and helps identify its dynamic phenomena, correctly reproducing the probability distribution and scaling properties (power-law correlations) of the time series.

  11. Can the Hazard Assessment and Critical Control Points (HACCP) system be used to design process-based hygiene concepts?

    Science.gov (United States)

    Hübner, N-O; Fleßa, S; Haak, J; Wilke, F; Hübner, C; Dahms, C; Hoffmann, W; Kramer, A

    2011-01-01

    Recently, the HACCP (Hazard Analysis and Critical Control Points) concept was proposed as possible way to implement process-based hygiene concepts in clinical practice, but the extent to which this food safety concept can be transferred into the health care setting is unclear. We therefore discuss possible ways for a translation of the principles of the HACCP for health care settings. While a direct implementation of food processing concepts into health care is not very likely to be feasible and will probably not readily yield the intended results, the underlying principles of process-orientation, in-process safety control and hazard analysis based counter measures are transferable to clinical settings. In model projects the proposed concepts should be implemented, monitored, and evaluated under real world conditions.

  12. Energy risk management through self-exciting marked point process

    International Nuclear Information System (INIS)

    Herrera, Rodrigo

    2013-01-01

    Crude oil is a dynamically traded commodity that affects many economies. We propose a collection of marked self-exciting point processes with dependent arrival rates for extreme events in oil markets and related risk measures. The models treat the time between extreme events in oil markets as a stochastic process. The main advantage of this approach is its capability to capture the short, medium and long-term behavior of extremes without involving an arbitrary stochastic volatility model or a prefiltration of the data, as is common in extreme value theory applications. We make use of the proposed model in order to obtain an improved estimate for the Value at Risk in oil markets. Empirical findings suggest that the reliability and stability of Value at Risk estimates improve as a result of the finer modeling approach. This is supported by an empirical application in the representative West Texas Intermediate (WTI) and Brent crude oil markets. - Highlights: • We propose marked self-exciting point processes for extreme events in oil markets. • This approach captures the short and long-term behavior of extremes. • We improve the estimates for the VaR in the WTI and Brent crude oil markets

  13. Aspects of second-order analysis of structured inhomogeneous spatio-temporal processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Ghorbani, Mohammad

    2012-01-01

    Statistical methodology for spatio-temporal point processes is in its infancy. We consider second-order analysis based on pair correlation functions and K-functions for general inhomogeneous spatio-temporal point processes and for inhomogeneous spatio-temporal Cox processes. Assuming spatio-temporal separability of the intensity function, we clarify different meanings of second-order spatio-temporal separability. One is second-order spatio-temporal independence and relates to log-Gaussian Cox processes with an additive covariance structure of the underlying spatio-temporal Gaussian process. Another concerns shot-noise Cox processes with a separable spatio-temporal covariance density. We propose diagnostic procedures for checking hypotheses of second-order spatio-temporal separability, which we apply on simulated and real data....

  14. Aftershock identification problem via the nearest-neighbor analysis for marked point processes

    Science.gov (United States)

    Gabrielov, A.; Zaliapin, I.; Wong, H.; Keilis-Borok, V.

    2007-12-01

    The centennial observations on the world seismicity have revealed a wide variety of clustering phenomena that unfold in the space-time-energy domain and provide most reliable information about the earthquake dynamics. However, there is neither a unifying theory nor a convenient statistical apparatus that would naturally account for the different types of seismic clustering. In this talk we present a theoretical framework for nearest-neighbor analysis of marked processes and obtain new results on hierarchical approach to studying seismic clustering introduced by Baiesi and Paczuski (2004). Recall that under this approach one defines an asymmetric distance D in space-time-energy domain such that the nearest-neighbor spanning graph with respect to D becomes a time- oriented tree. We demonstrate how this approach can be used to detect earthquake clustering. We apply our analysis to the observed seismicity of California and synthetic catalogs from ETAS model and show that the earthquake clustering part is statistically different from the homogeneous part. This finding may serve as a basis for an objective aftershock identification procedure.

  15. Mass measurement on the rp-process waiting point ⁷²Kr

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez, D. [Gesellschaft fuer Schwerionenforschung mbH, Darmstadt (Germany); Kolhinen, V.S. [Jyvaeskylae Univ. (Finland); Audi, G. [CSNSM-IN2P3-Centre National de la Recherche Scientifique (CNRS), 91 - Orsay (FR)] [and others

    2004-06-01

    The mass of one of the three major waiting points in the astrophysical rp-process, ⁷²Kr, was measured for the first time with the Penning trap mass spectrometer ISOLTRAP. The measurement yielded a relative mass uncertainty of δm/m = 1.2 × 10⁻⁷ (δm = 8 keV). Other Kr isotopes, also needed for astrophysical calculations, were measured with more than one order of magnitude improved accuracy. We use the ISOLTRAP masses of ⁷²⁻⁷⁴Kr to reanalyze the role of the ⁷²Kr waiting point in the rp-process during X-ray bursts. (orig.)

  16. Point Cluster Analysis Using a 3D Voronoi Diagram with Applications in Point Cloud Segmentation

    Directory of Open Access Journals (Sweden)

    Shen Ying

    2015-08-01

    Full Text Available Three-dimensional (3D) point analysis and visualization is one of the most effective methods of point cluster detection and segmentation in geospatial datasets. However, serious scattering and clotting characteristics interfere with the visual detection of 3D point clusters. To overcome this problem, this study proposes the use of 3D Voronoi diagrams to analyze and visualize 3D points instead of the original data items. The proposed algorithm computes clusters of 3D points by applying a set of 3D Voronoi cells to describe and quantify them. The decomposition of the point clouds of 3D models is guided by the 3D Voronoi cell parameters. The parameter values are mapped from the Voronoi cells to the 3D points to show the spatial pattern and relationships; thus, a 3D point cluster pattern can be highlighted and easily recognized. To capture different cluster patterns, continuous progressive clusters and segmentations are tested. The 3D spatial relationship is shown to facilitate cluster detection. Furthermore, the generated segmentations of real 3D data cases are exploited to demonstrate the feasibility of our approach in detecting different spatial clusters for continuous point cloud segmentation.
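
    One way to quantify points by their Voronoi cells, as described above, is to use cell volume as a density proxy. The sketch below is our illustration, not the authors' algorithm: it flags points whose bounded 3D Voronoi cells are small, i.e. candidate cluster cores.

        import numpy as np
        from scipy.spatial import ConvexHull, Voronoi

        rng = np.random.default_rng(4)
        pts = rng.random((200, 3))            # toy 3D point cloud
        vor = Voronoi(pts)

        def cell_volume(vor, i):
            # Volume of the Voronoi cell of point i; unbounded cells -> inf.
            region = vor.regions[vor.point_region[i]]
            if -1 in region or len(region) < 4:
                return np.inf
            return ConvexHull(vor.vertices[region]).volume

        vols = np.array([cell_volume(vor, i) for i in range(len(pts))])
        finite = np.isfinite(vols)
        dense = finite & (vols < np.percentile(vols[finite], 25))
        print("candidate cluster points:", int(dense.sum()))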

  17. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Hazard Analysis and Critical Control Point (HACCP... SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS General Provisions § 120.8 Hazard Analysis and Critical Control Point (HACCP) plan. (a) HACCP plan. Each...

  18. Profitability Analysis of Soybean Oil Processes.

    Science.gov (United States)

    Cheng, Ming-Hsun; Rosentrater, Kurt A

    2017-10-07

    Soybean oil production is the basic process for soybean applications. Cash flow analysis is used to estimate the profitability of a manufacturing venture. Besides capital investments, operating costs, and revenues, the interest rate is a key factor in estimating the net present value (NPV), break-even points, and payback time, which are benchmarks for profitability evaluation. A positive NPV and a reasonable payback time indicate a profitable process and provide an acceptable projection for real operation. Additionally, the capacity of the process is another critical factor. The extruding-expelling process and hexane extraction are the two approaches typically used in industry; they are profitable when annual oil production capacities are larger than 12 and 173 million kg, respectively. The solvent-free approach, known as the enzyme assisted aqueous extraction process (EAEP), is profitable when the capacity is larger than 17 million kg of annual oil production.
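
    The NPV and payback benchmarks mentioned above are straightforward to compute; the cashflows in this sketch are hypothetical, not the soybean figures from the study.

        def npv(rate, cashflows):
            # Net present value; cashflows[0] occurs at year 0 (typically
            # the negative capital investment).
            return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

        def payback_time(cashflows):
            # First year the cumulative (undiscounted) cashflow turns
            # non-negative; None if the venture never pays back.
            total = 0.0
            for t, cf in enumerate(cashflows):
                total += cf
                if total >= 0:
                    return t
            return None

        # Hypothetical venture: 10 M$ capital cost, then 1.8 M$/yr net revenue.
        flows = [-10e6] + [1.8e6] * 15
        print(npv(0.08, flows), payback_time(flows))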

  19. Profitability Analysis of Soybean Oil Processes

    Directory of Open Access Journals (Sweden)

    Ming-Hsun Cheng

    2017-10-01

    Full Text Available Soybean oil production is the basic process for soybean applications. Cash flow analysis is used to estimate the profitability of a manufacturing venture. Besides capital investments, operating costs, and revenues, the interest rate is a key factor in estimating the net present value (NPV), break-even points, and payback time, which are benchmarks for profitability evaluation. A positive NPV and a reasonable payback time indicate a profitable process and provide an acceptable projection for real operation. Additionally, the capacity of the process is another critical factor. The extruding-expelling process and hexane extraction are the two approaches typically used in industry; they are profitable when annual oil production capacities are larger than 12 and 173 million kg, respectively. The solvent-free approach, known as the enzyme assisted aqueous extraction process (EAEP), is profitable when the capacity is larger than 17 million kg of annual oil production.

  20. Spatio-temporal point process filtering methods with an application

    Czech Academy of Sciences Publication Activity Database

    Frcalová, B.; Beneš, V.; Klement, Daniel

    2010-01-01

    Roč. 21, 3-4 (2010), s. 240-252 ISSN 1180-4009 R&D Projects: GA AV ČR(CZ) IAA101120604 Institutional research plan: CEZ:AV0Z50110509 Keywords : cox point process * filtering * spatio-temporal modelling * spike Subject RIV: BA - General Mathematics Impact factor: 0.750, year: 2010

  1. Investigation of Random Switching Driven by a Poisson Point Process

    DEFF Research Database (Denmark)

    Simonsen, Maria; Schiøler, Henrik; Leth, John-Josef

    2015-01-01

    This paper investigates the switching mechanism of a two-dimensional switched system when the switching events are generated by a Poisson point process. A model, in the shape of a stochastic process, for such a system is derived, and the distribution of the trajectory's position is developed together with marginal density functions for the coordinate functions. Furthermore, the joint probability distribution is given explicitly....

  2. Design of glass-ceramic complex microstructure with using onset point of crystallization in differential thermal analysis

    International Nuclear Information System (INIS)

    Hwang, Seongjin; Kim, Jinho; Shin, Hyo-Soon; Kim, Jong-Hee; Kim, Hyungsun

    2008-01-01

    Two types of frits with different compositions were used to develop a high strength substrate for electronic packaging using a low temperature co-fired ceramic process. In order to reveal the crystallization stages during heating to approximately 900 °C, a glass-ceramic consisting of the two types of frits, which crystallized to diopside and anorthite after firing, was tested at different mixing ratios of the frits. The exothermal peaks in the differential thermal analysis curves, deconvolved with Gaussian functions, were used to determine the onset point of crystallization of diopside or anorthite. The onset points of crystallization were affected by the mixing ratio of the frits, and the microstructure of the glass-ceramic depended on the onset point of crystallization. It was found that when multicrystalline phases appear in the microstructure, the resulting complex microstructure could be predicted from the onset point of crystallization obtained by differential thermal analysis.

  3. Exergy analysis of the LFC process

    International Nuclear Information System (INIS)

    Li, Qingsong; Lin, Yuankui

    2016-01-01

    Highlights: • Mengdong lignite was upgraded by the liquids from coal (LFC) process at laboratory scale. • True boiling point distillation of tar was performed. • Based on experimental data, the LFC process was simulated in Aspen Plus. • Amounts of exergy destruction and efficiencies of blocks were calculated. • Potential measures for improving the LFC process are suggested. - Abstract: Liquids from coal (LFC) is a pyrolysis technology for upgrading lignite. LFC is close to viability as a large-scale commercial technology and is strongly promoted by the Chinese government. This paper presents an exergy analysis of the LFC process producing semicoke and tar, simulated in Aspen Plus. The simulation included the drying unit, pyrolysis unit, tar recovery unit and combustion unit. To obtain the data required for the simulation, Mengdong lignite was upgraded using a laboratory-scale experimental facility based on LFC technology. True boiling point distillation of tar was performed. Based on thermodynamic data obtained from the simulation, chemical exergy and physical exergy were determined for process streams and exergy destruction was calculated. The exergy budget of the LFC process is presented as a Grassmann flow diagram. The overall exergy efficiency was 76.81%, with the combustion unit causing the highest exergy destruction. The study found that overall exergy efficiency can be increased by reducing the moisture in lignite and making full use of the physical exergy of pyrolysates. A feasible method for making full use of the physical exergy of semicoke was suggested.

  4. CLINSULF sub-dew-point process for sulphur recovery

    Energy Technology Data Exchange (ETDEWEB)

    Heisel, M.; Marold, F.

    1988-01-01

    In a 2-reactor system, the CLINSULF process allows very high sulphur recovery rates. When operated at 100 °C at the outlet, i.e. below the sulphur solidification point, a sulphur recovery rate of more than 99.2% was achieved in a 2-reactor series. Assuming a 70% sulphur recovery in an upstream Claus furnace plus sulphur condenser, an overall sulphur recovery of more than 99.8% results for the 2-reactor system. This is approximately 2% higher than in conventional Claus plus SDP units, which mostly consist of 4 reactors or more. This means that the CLINSULF SDP process promises to be an improvement in terms of both efficiency and investment cost.

  5. Monte Carlo point process estimation of electromyographic envelopes from motor cortical spikes for brain-machine interfaces

    Science.gov (United States)

    Liao, Yuxi; She, Xiwei; Wang, Yiwen; Zhang, Shaomin; Zhang, Qiaosheng; Zheng, Xiaoxiang; Principe, Jose C.

    2015-12-01

    Objective. Representation of movement in the motor cortex (M1) has been widely studied in brain-machine interfaces (BMIs). The electromyogram (EMG) has greater bandwidth than conventional kinematic variables (such as position and velocity) and is functionally related to the discharge of cortical neurons. As the stochastic information of the EMG is derived from the explicit spike time structure, point process (PP) methods are a good solution for decoding EMG directly from neural spike trains. Previous studies usually assume linear or exponential tuning curves between neural firing and EMG, which may not be true. Approach. In our analysis, we estimate the tuning curves in a data-driven way and find both the traditional functional-excitatory and functional-inhibitory neurons, which are widely found across a rat's motor cortex. To accurately decode EMG envelopes from M1 neural spike trains, the Monte Carlo point process (MCPP) method is implemented based on such nonlinear tuning properties. Main results. Better reconstruction of EMG signals is shown at baseline and at extreme high peaks, as our method better preserves the nonlinearity of the neural tuning during decoding. The MCPP improves the prediction accuracy (the normalized mean squared error) by 57% and 66% on average compared with the adaptive point process filter using linear and exponential tuning curves respectively, for all 112 data segments across six rats. Compared to a Wiener filter using spike rates with an optimal window size of 50 ms, MCPP decoding of EMG from a point process improves the normalized mean squared error (NMSE) by 59% on average. Significance. These results suggest that neural tuning is constantly changing during task execution and, therefore, spike timing methodologies and estimation of appropriate tuning curves need to be undertaken for better EMG decoding in motor BMIs.

  6. PARALLEL PROCESSING OF BIG POINT CLOUDS USING Z-ORDER-BASED PARTITIONING

    Directory of Open Access Journals (Sweden)

    C. Alis

    2016-06-01

    Full Text Available As laser scanning technology improves and costs are coming down, the amount of point cloud data being generated can be prohibitively difficult and expensive to process on a single machine. This data explosion is not only limited to point cloud data. Voluminous amounts of high-dimensionality and quickly accumulating data, collectively known as Big Data, such as those generated by social media, Internet of Things devices and commercial transactions, are becoming more prevalent as well. New computing paradigms and frameworks are being developed to efficiently handle the processing of Big Data, many of which utilize a compute cluster composed of several commodity grade machines to process chunks of data in parallel. A central concept in many of these frameworks is data locality. By its nature, Big Data is large enough that the entire dataset would not fit on the memory and hard drives of a single node hence replicating the entire dataset to each worker node is impractical. The data must then be partitioned across worker nodes in a manner that minimises data transfer across the network. This is a challenge for point cloud data because there exist different ways to partition data and they may require data transfer. We propose a partitioning based on Z-order which is a form of locality-sensitive hashing. The Z-order or Morton code is computed by dividing each dimension to form a grid then interleaving the binary representation of each dimension. For example, the Z-order code for the grid square with coordinates (x = 1 = 01₂, y = 3 = 11₂) is 1011₂ = 11. The number of points in each partition is controlled by the number of bits per dimension: the more bits, the fewer the points. The number of bits per dimension also controls the level of detail with more bits yielding finer partitioning. We present this partitioning method by implementing it on Apache Spark and investigating how different parameters affect the accuracy and running time of the k nearest

  7. Thermal analysis of LOFT waste gas processing system nitrogen supply for process line purge and blower seal

    International Nuclear Information System (INIS)

    Tatar, G.A.

    1979-01-01

    The LOFT Waste Gas Processing System uses gaseous nitrogen (GN₂) to purge the main process line and to supply pressure on the blower labyrinth seal. The purpose of this analysis was to determine the temperature of the GN₂ at the blower seals and the main process line. Since these temperatures were below 32 °F, the heat rate necessary to raise them was calculated. This report shows that the GN₂ temperatures at the points mentioned above were below 10 °F. A heat rate into the GN₂ of 389 W, added at the point where the supply line enters the vault, would raise the GN₂ temperature above 32 °F.
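
    The underlying heater sizing is a sensible-heat balance, Q = mdot * cp * (T_out - T_in). The sketch below uses illustrative values, not the LOFT flow rates (the report's own figure is 389 W).

        # Heat rate needed to raise a nitrogen purge stream from T_in to T_out.
        mdot = 0.015              # kg/s of gaseous nitrogen (assumed)
        cp = 1040.0               # J/(kg K) for N2 near room temperature

        def f_to_k(tf):           # Fahrenheit -> Kelvin
            return (tf - 32.0) * 5.0 / 9.0 + 273.15

        q = mdot * cp * (f_to_k(32.0) - f_to_k(10.0))
        print(f"required heat rate: {q:.0f} W")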

  8. Automated extraction and analysis of rock discontinuity characteristics from 3D point clouds

    Science.gov (United States)

    Bianchetti, Matteo; Villa, Alberto; Agliardi, Federico; Crosta, Giovanni B.

    2016-04-01

    A reliable characterization of fractured rock masses requires an exhaustive geometrical description of discontinuities, including orientation, spacing, and size. These are required to describe discontinuum rock mass structure, perform Discrete Fracture Network and DEM modelling, or provide input for rock mass classification or equivalent continuum estimate of rock mass properties. Although several advanced methodologies have been developed in the last decades, a complete characterization of discontinuity geometry in practice is still challenging, due to scale-dependent variability of fracture patterns and difficult accessibility to large outcrops. Recent advances in remote survey techniques, such as terrestrial laser scanning and digital photogrammetry, allow a fast and accurate acquisition of dense 3D point clouds, which promoted the development of several semi-automatic approaches to extract discontinuity features. Nevertheless, these often need user supervision on algorithm parameters which can be difficult to assess. To overcome this problem, we developed an original Matlab tool, allowing fast, fully automatic extraction and analysis of discontinuity features with no requirements on point cloud accuracy, density and homogeneity. The tool consists of a set of algorithms which: (i) process raw 3D point clouds, (ii) automatically characterize discontinuity sets, (iii) identify individual discontinuity surfaces, and (iv) analyse their spacing and persistence. The tool operates in either a supervised or unsupervised mode, starting from an automatic preliminary exploration data analysis. The identification and geometrical characterization of discontinuity features is divided in steps. First, coplanar surfaces are identified in the whole point cloud using K-Nearest Neighbor and Principal Component Analysis algorithms optimized on point cloud accuracy and specified typical facet size. Then, discontinuity set orientation is calculated using Kernel Density Estimation and
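
    The KNN + PCA step mentioned above can be sketched compactly: the eigenvector of the local covariance matrix with the smallest eigenvalue approximates the surface normal at each point. This is our illustration, not the Matlab tool itself.

        import numpy as np
        from scipy.spatial import cKDTree

        def estimate_normals(points, k=20):
            # Unit normal per point from PCA of its k nearest neighbours.
            tree = cKDTree(points)
            _, idx = tree.query(points, k=k)
            normals = np.empty_like(points)
            for i, nb in enumerate(idx):
                nbrs = points[nb] - points[nb].mean(axis=0)
                _, vecs = np.linalg.eigh(nbrs.T @ nbrs)
                normals[i] = vecs[:, 0]        # smallest-eigenvalue direction
            return normals

        # Demo on a noisy plane: normals should be close to (0, 0, 1).
        rng = np.random.default_rng(5)
        pts = np.column_stack([rng.random((500, 2)), 0.01 * rng.normal(size=500)])
        print(np.abs(estimate_normals(pts)[:, 2]).mean())

    Clustering the resulting normals (for example on the unit sphere) then yields the discontinuity sets; spacing and persistence follow from the geometry of each set.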

  9. Analysis of Stress Updates in the Material-point Method

    DEFF Research Database (Denmark)

    Andersen, Søren; Andersen, Lars

    2009-01-01

    The material-point method (MPM) is a new numerical method for analysis of large strain engineering problems. The MPM applies a dual formulation, where the state of the problem (mass, stress, strain, velocity etc.) is tracked using a finite set of material points while the governing equations...... are solved on a background computational grid. Several references state, that one of the main advantages of the material-point method is the easy application of complicated material behaviour as the constitutive response is updated individually for each material point. However, as discussed here, the MPM way...

  10. Two-step estimation for inhomogeneous spatial point processes

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus; Guan, Yongtao

    This paper is concerned with parameter estimation for inhomogeneous spatial point processes with a regression model for the intensity function and tractable second order properties (K-function). Regression parameters are estimated using a Poisson likelihood score estimating function and in a second...... step minimum contrast estimation is applied for the residual clustering parameters. Asymptotic normality of parameter estimates is established under certain mixing conditions and we exemplify how the results may be applied in ecological studies of rain forests....

  11. Bridging the gap between a stationary point process and its Palm distribution

    NARCIS (Netherlands)

    Nieuwenhuis, G.

    1994-01-01

    In the context of stationary point processes measurements are usually made from a time point chosen at random or from an occurrence chosen at random. That is, either the stationary distribution P or its Palm distribution P° is the ruling probability measure. In this paper an approach is presented to

  12. Change Analysis in Structural Laser Scanning Point Clouds: The Baseline Method.

    Science.gov (United States)

    Shen, Yueqian; Lindenbergh, Roderik; Wang, Jinhu

    2016-12-24

    A method is introduced for detecting changes from point clouds that avoids registration. For many applications, changes are detected between two scans of the same scene obtained at different times. Traditionally, these scans are aligned to a common coordinate system having the disadvantage that this registration step introduces additional errors. In addition, registration requires stable targets or features. To avoid these issues, we propose a change detection method based on so-called baselines. Baselines connect feature points within one scan. To analyze changes, baselines connecting corresponding points in two scans are compared. As feature points either targets or virtual points corresponding to some reconstructable feature in the scene are used. The new method is implemented on two scans sampling a masonry laboratory building before and after seismic testing, that resulted in damages in the order of several centimeters. The centres of the bricks of the laboratory building are automatically extracted to serve as virtual points. Baselines connecting virtual points and/or target points are extracted and compared with respect to a suitable structural coordinate system. Changes detected from the baseline analysis are compared to a traditional cloud to cloud change analysis demonstrating the potential of the new method for structural analysis.
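
    The core of the baseline method is simple to sketch: a baseline length between two feature points is invariant under rigid motion of a scan, so lengths can be compared across epochs without registration. A minimal sketch follows, assuming corresponding feature points (e.g. brick centres) have already been extracted; the function name and toy data are illustrative.

    ```python
    import numpy as np
    from itertools import combinations

    def baseline_changes(pts_before, pts_after):
        """Compare lengths of all baselines (segments between corresponding feature
        points) in two epochs; no registration is needed because a length is
        invariant under rigid motion of each scan."""
        changes = {}
        for i, j in combinations(range(len(pts_before)), 2):
            d0 = np.linalg.norm(pts_before[i] - pts_before[j])
            d1 = np.linalg.norm(pts_after[i] - pts_after[j])
            changes[(i, j)] = d1 - d0
        return changes

    # Hypothetical feature points before/after seismic testing (cm-level damage).
    before = np.random.rand(6, 3)
    after = before + 0.02 * np.random.randn(6, 3)
    print(max(baseline_changes(before, after).items(), key=lambda kv: abs(kv[1])))
    ```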

  13. Change Analysis in Structural Laser Scanning Point Clouds: The Baseline Method

    Directory of Open Access Journals (Sweden)

    Yueqian Shen

    2016-12-01

    Full Text Available A method is introduced for detecting changes from point clouds that avoids registration. For many applications, changes are detected between two scans of the same scene obtained at different times. Traditionally, these scans are aligned to a common coordinate system having the disadvantage that this registration step introduces additional errors. In addition, registration requires stable targets or features. To avoid these issues, we propose a change detection method based on so-called baselines. Baselines connect feature points within one scan. To analyze changes, baselines connecting corresponding points in two scans are compared. As feature points either targets or virtual points corresponding to some reconstructable feature in the scene are used. The new method is implemented on two scans sampling a masonry laboratory building before and after seismic testing, that resulted in damages in the order of several centimeters. The centres of the bricks of the laboratory building are automatically extracted to serve as virtual points. Baselines connecting virtual points and/or target points are extracted and compared with respect to a suitable structural coordinate system. Changes detected from the baseline analysis are compared to a traditional cloud to cloud change analysis demonstrating the potential of the new method for structural analysis.

  14. Simultaneous reconstruction of multiple depth images without off-focus points in integral imaging using a graphics processing unit.

    Science.gov (United States)

    Yi, Faliu; Lee, Jieun; Moon, Inkyu

    2014-05-01

    The reconstruction of multiple depth images with a ray back-propagation algorithm in three-dimensional (3D) computational integral imaging is computationally burdensome. Further, a reconstructed depth image consists of a focus and an off-focus area. Focus areas are 3D points on the surface of an object that are located at the reconstructed depth, while off-focus areas include 3D points in free-space that do not belong to any object surface in 3D space. Generally, without being removed, the presence of an off-focus area would adversely affect the high-level analysis of a 3D object, including its classification, recognition, and tracking. Here, we use a graphics processing unit (GPU) that supports parallel processing with multiple processors to simultaneously reconstruct multiple depth images using a lookup table containing the shifted values along the x and y directions for each elemental image in a given depth range. Moreover, each 3D point on a depth image can be measured by analyzing its statistical variance with its corresponding samples, which are captured by the two-dimensional (2D) elemental images. These statistical variances can be used to classify depth image pixels as either focus or off-focus points. At this stage, the measurement of focus and off-focus points in multiple depth images is also implemented in parallel on a GPU. Our proposed method is conducted based on the assumption that there is no occlusion of the 3D object during the capture stage of the integral imaging process. Experimental results have demonstrated that this method is capable of removing off-focus points in the reconstructed depth image. The results also showed that using a GPU to remove the off-focus points could greatly improve the overall computational speed compared with using a CPU.
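
    The variance-based classification of focus versus off-focus points can be sketched on a CPU with NumPy (the paper's implementation is GPU-parallel); the threshold and array shapes below are illustrative assumptions.

    ```python
    import numpy as np

    def classify_focus(samples, threshold):
        """samples: array (K, H, W) of the intensities that K elemental images
        contribute to one reconstructed depth plane. A 3D point on a real surface
        is sampled consistently (low variance across elemental images); an
        off-focus point mixes unrelated rays (high variance)."""
        var = samples.var(axis=0)
        return var < threshold        # True = focus point, False = off-focus

    # Toy stack: two consistent ("in focus") rows among otherwise random pixels.
    K, H, W = 9, 4, 4
    stack = np.random.rand(K, H, W)
    stack[:, :2, :] = 0.5 + 0.01 * np.random.randn(K, 2, W)
    print(classify_focus(stack, threshold=0.01))
    ```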

  15. Stability Analysis of Periodic Systems by Truncated Point Mappings

    Science.gov (United States)

    Guttalu, R. S.; Flashner, H.

    1996-01-01

    An approach is presented for deriving analytical stability and bifurcation conditions for systems with periodically varying coefficients. The method is based on a point mapping (period-to-period mapping) representation of the system's dynamics. An algorithm is employed to obtain an analytical expression for the point mapping and its dependence on the system's parameters. The algorithm is devised to derive the coefficients of a multinomial expansion of the point mapping up to an arbitrary order in terms of the state variables and of the parameters. Analytical stability and bifurcation conditions are then formulated and expressed as functional relations between the parameters. To demonstrate the application of the method, the parametric stability of Mathieu's equation and of a two-degree-of-freedom system is investigated. The results obtained by the proposed approach are compared to those obtained by perturbation analysis and by direct integration, which we consider to be the "exact solution". It is shown that, unlike perturbation analysis, the proposed method provides a very accurate solution even for large values of the parameters. If an expansion of the point mapping in terms of a small parameter is performed, the method is equivalent to perturbation analysis. Moreover, it is demonstrated that the method can be easily applied to multiple-degree-of-freedom systems using the same framework. This feature is an important advantage since most of the existing analysis methods apply mainly to single-degree-of-freedom systems and their extension to higher dimensions is difficult and computationally cumbersome.
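
    A numerical counterpart of the point-mapping idea (not the authors' truncated multinomial expansion) is to integrate the system over one period to obtain the period-to-period map and test its Floquet multipliers; for Mathieu's equation stability holds when the monodromy matrix has |trace| <= 2. Parameter values below are illustrative.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    def mathieu_stable(delta, eps):
        """Numerical point mapping for x'' + (delta + eps*cos t) x = 0: integrate
        the fundamental matrix over one period T = 2*pi and check the Floquet
        multipliers (stable iff |trace(M)| <= 2)."""
        def rhs(t, y):
            return [y[1], -(delta + eps * np.cos(t)) * y[0]]
        T = 2 * np.pi
        cols = []
        for y0 in ([1.0, 0.0], [0.0, 1.0]):
            sol = solve_ivp(rhs, (0.0, T), y0, rtol=1e-10, atol=1e-12)
            cols.append(sol.y[:, -1])
        M = np.column_stack(cols)          # monodromy (period-to-period) matrix
        return abs(np.trace(M)) <= 2.0

    print(mathieu_stable(0.5, 0.1))    # between instability tongues: stable
    print(mathieu_stable(0.25, 0.1))   # inside the principal tongue: unstable
    ```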

  16. Multiplicative point process as a model of trading activity

    Science.gov (United States)

    Gontis, V.; Kaulakys, B.

    2004-11-01

    Signals consisting of a sequence of pulses show that the inherent origin of the 1/f noise is a Brownian fluctuation of the average interevent time between subsequent pulses of the pulse sequence. In this paper, we generalize the model of interevent time to reproduce a variety of self-affine time series exhibiting power spectral density S(f) scaling as a power of the frequency f. Furthermore, we analyze the relation between the power-law correlations and the origin of the power-law probability distribution of the signal intensity. We introduce a stochastic multiplicative model for the time intervals between point events and analyze the statistical properties of the signal analytically and numerically. Such a model system exhibits power-law spectral density S(f) ∼ 1/f^β for various values of β, including β = 1/2, 1 and 3/2. Explicit expressions for the power spectra in the low-frequency limit and for the distribution density of the interevent time are obtained. The counting statistics of the events is analyzed analytically and numerically as well. The specific interest of our analysis is related to the financial markets, where long-range correlations of price fluctuations largely depend on the number of transactions. We analyze the spectral density and counting statistics of the number of transactions. The model reproduces spectral properties of the real markets and explains the mechanism of the power-law distribution of trading activity. The study provides evidence that the statistical properties of the financial markets are enclosed in the statistics of the time interval between trades. A multiplicative point process serves as a consistent model generating these statistics.
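
    A simulation sketch of such a process, under the assumption that the multiplicative update takes the form tau_{k+1} = tau_k + gamma*tau_k^(2mu-1) + sigma*tau_k^mu * eps_k (one common form of this model family; the exact rule follows the paper, and the parameter values here are illustrative):

    ```python
    import numpy as np

    def simulate_interevent(n, gamma=1e-4, mu=0.5, sigma=0.02, tau0=1.0):
        """Multiplicative stochastic model for interevent times (assumed form)."""
        tau = np.empty(n)
        tau[0] = tau0
        eps = np.random.randn(n)
        for k in range(n - 1):
            t = tau[k] + gamma * tau[k]**(2*mu - 1) + sigma * tau[k]**mu * eps[k]
            tau[k + 1] = min(max(t, 1e-6), 1e3)   # bounds keep tau positive
        return tau

    tau = simulate_interevent(100_000)
    events = np.cumsum(tau)                            # event times of the point process
    signal, _ = np.histogram(events, bins=int(events[-1]))   # counts per unit time
    spec = np.abs(np.fft.rfft(signal - signal.mean()))**2    # inspect S(f) ~ 1/f**beta
    ```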

  17. Two-step estimation for inhomogeneous spatial point processes

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus; Guan, Yongtao

    2009-01-01

    The paper is concerned with parameter estimation for inhomogeneous spatial point processes with a regression model for the intensity function and tractable second-order properties (K-function). Regression parameters are estimated by using a Poisson likelihood score estimating function and in the ...... and in the second step minimum contrast estimation is applied for the residual clustering parameters. Asymptotic normality of parameter estimates is established under certain mixing conditions and we exemplify how the results may be applied in ecological studies of rainforests....

  18. On estimation of the intensity function of a point process

    NARCIS (Netherlands)

    Lieshout, van M.N.M.

    2010-01-01

    Abstract. Estimation of the intensity function of spatial point processes is a fundamental problem. In this paper, we interpret the Delaunay tessellation field estimator recently introduced by Schaap and Van de Weygaert as an adaptive kernel estimator and give explicit expressions for the mean and

  19. The application of quality risk management to the bacterial endotoxins test: use of hazard analysis and critical control points.

    Science.gov (United States)

    Annalaura, Carducci; Giulia, Davini; Stefano, Ceccanti

    2013-01-01

    Risk analysis is widely used in the pharmaceutical industry to manage production processes, validation activities, training, and other activities. Several methods of risk analysis are available (for example, failure mode and effects analysis, fault tree analysis), and one or more should be chosen and adapted to the specific field where they will be applied. Among the methods available, hazard analysis and critical control points (HACCP) is a methodology that has been applied since the 1960s, and whose areas of application have expanded over time from food to the pharmaceutical industry. It can be easily and successfully applied to several processes because its main feature is the identification, assessment, and control of hazards. It can be also integrated with other tools, such as fishbone diagram and flowcharting. The aim of this article is to show how HACCP can be used to manage an analytical process, propose how to conduct the necessary steps, and provide data templates necessary to document and useful to follow current good manufacturing practices. In the quality control process, risk analysis is a useful tool for enhancing the uniformity of technical choices and their documented rationale. Accordingly, it allows for more effective and economical laboratory management, is capable of increasing the reliability of analytical results, and enables auditors and authorities to better understand choices that have been made. The aim of this article is to show how hazard analysis and critical control points can be used to manage bacterial endotoxins testing and other analytical processes in a formal, clear, and detailed manner.

  20. Nonlinear consider covariance analysis using a sigma-point filter formulation

    Science.gov (United States)

    Lisano, Michael E.

    2006-01-01

    The research reported here extends the mathematical formulation of nonlinear, sigma-point estimators to enable consider covariance analysis for dynamical systems. This paper presents a novel sigma-point consider filter algorithm, for consider-parameterized nonlinear estimation, following the unscented Kalman filter (UKF) variation on the sigma-point filter formulation, which requires no partial derivatives of dynamics models or measurement models with respect to the parameter list. It is shown that, consistent with the attributes of sigma-point estimators, a consider-parameterized sigma-point estimator can be developed entirely without requiring the derivation of any partial-derivative matrices related to the dynamical system, the measurements, or the considered parameters, which appears to be an advantage over the formulation of a linear-theory sequential consider estimator. It is also demonstrated that a consider covariance analysis performed with this 'partial-derivative-free' formulation yields equivalent results to the linear-theory consider filter, for purely linear problems.

  1. Fast Change Point Detection for Electricity Market Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Berkeley, UC; Gu, William; Choi, Jaesik; Gu, Ming; Simon, Horst; Wu, Kesheng

    2013-08-25

    Electricity is a vital part of our daily life; therefore it is important to avoid irregularities such as the California Electricity Crisis of 2000 and 2001. In this work, we seek to predict anomalies using advanced machine learning algorithms. These algorithms are effective, but computationally expensive, especially if we plan to apply them on hourly electricity market data covering a number of years. To address this challenge, we significantly accelerate the computation of the Gaussian Process (GP) for time series data. In the context of a Change Point Detection (CPD) algorithm, we reduce its computational complexity from O($n^{5}$) to O($n^{2}$). Our efficient algorithm makes it possible to compute the Change Points using the hourly price data from the California Electricity Crisis. By comparing the detected Change Points with known events, we show that the Change Point Detection algorithm is indeed effective in detecting signals preceding major events.
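
    As a lightweight illustration of change point detection on a price series (explicitly not the paper's accelerated Gaussian-process method), a single change point can be located by scanning all split points for the largest Gaussian log-likelihood improvement over a no-change model:

    ```python
    import numpy as np

    def best_change_point(x):
        """Return the split index that maximally improves the Gaussian
        log-likelihood versus a single-segment model (simple stand-in for a
        GP-based detector; O(n^2) as written)."""
        n = len(x)
        full = n * np.log(x.var() + 1e-12)
        best, best_gain = None, -np.inf
        for k in range(2, n - 2):
            split = k * np.log(x[:k].var() + 1e-12) + (n - k) * np.log(x[k:].var() + 1e-12)
            gain = full - split
            if gain > best_gain:
                best, best_gain = k, gain
        return best, best_gain

    # Synthetic "prices" with a regime shift at index 300.
    prices = np.concatenate([np.random.randn(300), 5 + 3 * np.random.randn(300)])
    print(best_change_point(prices))
    ```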

  2. Application of random-point processes to the detection of radiation sources

    International Nuclear Information System (INIS)

    Woods, J.W.

    1978-01-01

    In this report the mathematical theory of random-point processes is reviewed and it is shown how use of the theory can obtain optimal solutions to the problem of detecting radiation sources. As noted, the theory also applies to image processing in low-light-level or low-count-rate situations. Paralleling Snyder's work, the theory is extended to the multichannel case of a continuous, two-dimensional (2-D), energy-time space. This extension essentially involves showing that the data are doubly stochastic Poisson (DSP) point processes in energy as well as time. Further, a new 2-D recursive formulation is presented for the radiation-detection problem with large computational savings over nonrecursive techniques when the number of channels is large (greater than or equal to 30). Finally, some adaptive strategies for on-line "learning" of unknown, time-varying signal and background-intensity parameters and statistics are presented and discussed. These adaptive procedures apply when a complete statistical description is not available a priori.

  3. Kaplan-Meier estimators of distance distributions for spatial point processes

    NARCIS (Netherlands)

    Baddeley, A.J.; Gill, R.D.

    1997-01-01

    When a spatial point process is observed through a bounded window, edge effects hamper the estimation of characteristics such as the empty space function $F$, the nearest neighbour distance distribution $G$, and the reduced second order moment function $K$. Here we propose and study product-limit

  4. A case study on point process modelling in disease mapping

    Czech Academy of Sciences Publication Activity Database

    Beneš, Viktor; Bodlák, M.; Moller, J.; Waagepetersen, R.

    2005-01-01

    Roč. 24, č. 3 (2005), s. 159-168 ISSN 1580-3139 R&D Projects: GA MŠk 0021620839; GA ČR GA201/03/0946 Institutional research plan: CEZ:AV0Z10750506 Keywords : log Gaussian Cox point process * Bayesian estimation Subject RIV: BB - Applied Statistics, Operational Research

  5. Mean-field inference of Hawkes point processes

    International Nuclear Information System (INIS)

    Bacry, Emmanuel; Gaïffas, Stéphane; Mastromatteo, Iacopo; Muzy, Jean-François

    2016-01-01

    We propose a fast and efficient estimation method that is able to accurately recover the parameters of a d-dimensional Hawkes point-process from a set of observations. We exploit a mean-field approximation that is valid when the fluctuations of the stochastic intensity are small. We show that this is notably the case in situations when interactions are sufficiently weak, when the dimension of the system is high or when the fluctuations are self-averaging due to the large number of past events they involve. In such a regime the estimation of a Hawkes process can be mapped on a least-squares problem for which we provide an analytic solution. Though this estimator is biased, we show that its precision can be comparable to the one of the maximum likelihood estimator while its computation speed is shown to be improved considerably. We give a theoretical control on the accuracy of our new approach and illustrate its efficiency using synthetic datasets, in order to assess the statistical estimation error of the parameters. (paper)
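
    The Hawkes process whose parameters the paper estimates can be simulated with Ogata's thinning algorithm; a univariate sketch with an exponential kernel follows (parameter values are illustrative, and this shows the model, not the paper's mean-field estimator):

    ```python
    import numpy as np

    def simulate_hawkes(mu, alpha, beta, t_max, seed=0):
        """Ogata thinning for a univariate Hawkes process with kernel
        phi(t) = alpha*beta*exp(-beta*t); stationary when alpha < 1."""
        rng = np.random.default_rng(seed)
        events, t, lam_star = [], 0.0, mu
        while True:
            t += rng.exponential(1.0 / lam_star)
            if t > t_max:
                break
            lam_t = mu + sum(alpha * beta * np.exp(-beta * (t - s)) for s in events)
            if rng.uniform() <= lam_t / lam_star:
                events.append(t)
                lam_star = lam_t + alpha * beta    # valid bound just after a jump
            else:
                lam_star = lam_t                   # intensity only decays: tighten bound
        return np.array(events)

    ev = simulate_hawkes(mu=0.5, alpha=0.6, beta=2.0, t_max=1000)
    print(len(ev) / 1000, 0.5 / (1 - 0.6))   # empirical vs theoretical rate mu/(1-alpha)
    ```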

  6. Variational approach for spatial point process intensity estimation

    DEFF Research Database (Denmark)

    Coeurjolly, Jean-Francois; Møller, Jesper

    is assumed to be of log-linear form β + θ⊤z(u) where z is a spatial covariate function and the focus is on estimating θ. The variational estimator is very simple to implement and quicker than alternative estimation procedures. We establish its strong consistency and asymptotic normality. We also discuss its...... finite-sample properties in comparison with the maximum first-order composite likelihood estimator when considering various inhomogeneous spatial point process models and dimensions, as well as settings where z is completely or only partially known.

  7. Image processing and analysis software development

    International Nuclear Information System (INIS)

    Shahnaz, R.

    1999-01-01

    The work presented in this project is aimed at developing a software 'IMAGE GALLERY' to investigate various image processing and analysis techniques. The work was divided into two parts, namely image processing techniques and pattern recognition, the latter comprising character and face recognition. Various image enhancement techniques including negative imaging, contrast stretching, dynamic range compression, neon, diffuse, emboss etc. have been studied. Segmentation techniques including point detection, line detection and edge detection have been studied. Also some of the smoothing and sharpening filters have been investigated. All these imaging techniques have been implemented in a window-based computer program written in Visual Basic. Neural network techniques based on the Perceptron model have been applied for face and character recognition. (author)
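
    A few of the named operations are compact enough to sketch in NumPy (this is an illustration of the techniques, not the Visual Basic program described above):

    ```python
    import numpy as np

    def negative(img):                      # negative imaging on an 8-bit image
        return 255 - img

    def contrast_stretch(img):              # map [min, max] onto [0, 255]
        lo, hi = int(img.min()), int(img.max())
        return ((img.astype(float) - lo) * 255.0 / max(hi - lo, 1)).astype(np.uint8)

    def sobel_edges(img):                   # simple edge detection
        kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
        gx = np.zeros(img.shape); gy = np.zeros(img.shape)
        pad = np.pad(img.astype(float), 1, mode="edge")
        for i in range(img.shape[0]):
            for j in range(img.shape[1]):
                win = pad[i:i + 3, j:j + 3]
                gx[i, j] = (win * kx).sum()
                gy[i, j] = (win * kx.T).sum()
        return np.hypot(gx, gy)

    img = (np.random.rand(64, 64) * 255).astype(np.uint8)
    edges = sobel_edges(contrast_stretch(negative(img)))
    ```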

  8. Analysis of web-based online services for GPS relative and precise point positioning techniques

    Directory of Open Access Journals (Sweden)

    Taylan Ocalan

    Nowadays, the Global Positioning System (GPS) has been used effectively in several engineering applications for survey purposes by multiple disciplines. Web-based online services developed by several organizations, which are user friendly, unlimited and mostly free, have become a significant alternative to high-cost scientific and commercial software for post-processing and analyzing GPS data. When centimeter (cm) or decimeter (dm) level accuracies are desired, such accuracies can be obtained easily for different engineering applications through these services. In this paper, a test study was conducted at the ISKI-CORS network (Istanbul, Turkey) in order to carry out an accuracy analysis of the most used web-based online services around the world (namely OPUS, AUSPOS, SCOUT, CSRS-PPP, GAPS, APPS, magicGNSS). These services use relative and precise point positioning (PPP) solution approaches. In this test study, the coordinates of eight stations were estimated by using both the online services and the Bernese 5.0 scientific GPS processing software from a 24-hour GPS data set, and then the coordinate differences between the online services and the Bernese processing software were computed. From the evaluations, it was seen that the results for each individual difference were less than 10 mm for the relative online services, and less than 20 mm for the precise point positioning services. The accuracy analysis was gathered from these coordinate differences and the standard deviations of the coordinates obtained from the different techniques, and the online services were then compared to each other. The results show that the position accuracies obtained by the associated online services provide highly accurate solutions that may be used in many engineering applications and geodetic analyses.

  9. End point detection in ion milling processes by sputter-induced optical emission spectroscopy

    International Nuclear Information System (INIS)

    Lu, C.; Dorian, M.; Tabei, M.; Elsea, A.

    1984-01-01

    The characteristic optical emission from the sputtered material during ion milling processes can provide an unambiguous indication of the presence of the specific etched species. By monitoring the intensity of a representative emission line, the etching process can be precisely terminated at an interface. Enhancement of the etching end point is possible by using a dual-channel photodetection system operating in a ratio or difference mode. The installation of the optical detection system to an existing etching chamber has been greatly facilitated by the use of optical fibers. Using a commercial ion milling system, experimental data for a number of etching processes have been obtained. The result demonstrates that sputter-induced optical emission spectroscopy offers many advantages over other techniques in detecting the etching end point of ion milling processes
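
    The dual-channel ratio-mode endpoint detection described above can be sketched as a simple time-series rule: monitor the ratio of a film emission line to a reference line and declare the end point when the smoothed ratio drops below a fraction of its initial value. Thresholds and window sizes below are illustrative assumptions.

    ```python
    import numpy as np

    def detect_endpoint(film_line, ref_line, ratio_drop=0.5, win=5):
        """Return the first sample index at which the smoothed film/reference
        emission ratio falls below ratio_drop * its initial baseline."""
        ratio = film_line / np.maximum(ref_line, 1e-9)
        smooth = np.convolve(ratio, np.ones(win) / win, mode="same")
        baseline = smooth[:20].mean()
        below = np.nonzero(smooth < ratio_drop * baseline)[0]
        return below[0] if below.size else None

    # Synthetic trace: the film line drops as the etch reaches the interface (~250).
    t = np.arange(400)
    film = 100 / (1 + np.exp((t - 250) / 10)) + np.random.randn(400)
    ref = 50 + np.random.randn(400)
    print(detect_endpoint(film, ref))
    ```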

  10. Bayesian inference for multivariate point processes observed at sparsely distributed times

    DEFF Research Database (Denmark)

    Rasmussen, Jakob Gulddahl; Møller, Jesper; Aukema, B.H.

    We consider statistical and computational aspects of simulation-based Bayesian inference for a multivariate point process which is only observed at sparsely distributed times. For specificity we consider a particular data set which has earlier been analyzed by a discrete time model involving unknown...... normalizing constants. We discuss the advantages and disadvantages of using continuous time processes compared to discrete time processes in the setting of the present paper as well as other spatial-temporal situations. Keywords: Bark beetle, conditional intensity, forest entomology, Markov chain Monte Carlo...

  11. Theory of Single Point Incremental Forming

    DEFF Research Database (Denmark)

    Martins, P.A.F.; Bay, Niels; Skjødt, Martin

    2008-01-01

    This paper presents a closed-form theoretical analysis modelling the fundamentals of single point incremental forming and explaining the experimental and numerical results available in the literature for the past couple of years. The model is based on membrane analysis with bi-directional in-plane contact friction and is focused on the extreme modes of deformation that are likely to be found in single point incremental forming processes. The overall investigation is supported by experimental work performed by the authors and data retrieved from the literature.

  12. Microbial profile and critical control points during processing of 'robo ...

    African Journals Online (AJOL)

    STORAGESEVER

    2009-05-18

    ... frying, surface fat draining, open-air cooling, and holding/packaging in polyethylene films during sales and distribution. The product was, however, classified under category III with respect to risk and the significance of monitoring and evaluation of quality using the hazard analysis critical control point.

  13. Novel evaluation metrics for sparse spatio-temporal point process hotspot predictions - a crime case study

    OpenAIRE

    Adepeju, M.; Rosser, G.; Cheng, T.

    2016-01-01

    Many physical and sociological processes are represented as discrete events in time and space. These spatio-temporal point processes are often sparse, meaning that they cannot be aggregated and treated with conventional regression models. Models based on the point process framework may be employed instead for prediction purposes. Evaluating the predictive performance of these models poses a unique challenge, as the same sparseness prevents the use of popular measures such as the root mean squ...

  14. Fine-tuning the production process of iodine-131 by dry distillation (Preoperational tests)

    International Nuclear Information System (INIS)

    Alanis M, J.

    2002-12-01

    With the purpose of fine-tuning the iodine-131 production process, one objective of the operational tests was to verify the operation of each of the following components: the heating systems, the vacuum system, the mechanical system and the peripheral equipment that form part of the iodine-131 production process. Another objective was to establish the optimal parameters to be applied in each process during the production of iodine-131. It is necessary to point out that this objective is very important, since the components of the equipment are new and their behavior during the process differs from that of the equipment on which the experimental studies were carried out. (Author)

  15. Analysis of the resolution processes of three modeling tasks

    Directory of Open Access Journals (Sweden)

    Cèsar Gallart Palau

    2017-08-01

    In this paper we present a comparative analysis of the resolution processes of three modeling tasks performed by secondary education students (13-14 years), designed from three different points of view: Modelling-Eliciting Activities, the LEMA project, and Realistic Mathematical Problems. The purpose of this analysis is to obtain a methodological characterization of them in order to provide secondary education teachers with a proper selection and sequencing of tasks for their implementation in the classroom.

  16. Critical Points in Distance Learning System

    Directory of Open Access Journals (Sweden)

    Airina Savickaitė

    2013-08-01

    Purpose – This article presents the results of a distance learning system analysis, i.e. the critical elements of the distance learning system. The critical points of distance learning are part of a distance education online environment interactivity/community process model. Most important is the fact that the critical points are associated with the distance learning participants. Design/methodology/approach – Comparative review of articles and analysis of a distance learning module. Findings – A modern person is a lifelong learner, and distance learning is a way to be one. The focus on the learner and on feedback is the most important element of a distance learning system. Attention should also be paid to the lecturer's appropriate knowledge and ability to convey information. Adaptation of the distance system is the way to improve learners' learning outcomes. Research limitations/implications – Different learning disciplines and learning methods may have different critical points. Practical implications – The information from the analysis could be important for both lecturers and students who study distance education systems. There are familiar critical points which may deteriorate the quality of learning. Originality/value – The study sought to develop remote systems for applications in order to improve the quality of knowledge. Keywords: distance learning, process model, critical points. Research type: review of literature and general overview.

  17. Ergonomic analysis of radiopharmaceuticals samples preparation process

    International Nuclear Information System (INIS)

    Gomes, Luciene Betzler C.; Santos, Isaac Luquetti dos; Fonseca, Antonio Carlos C. da; Pellini, Marcos Pinto; Rebelo, Ana Maria

    2005-01-01

    The doses of radioisotopes to be administered to patients for diagnostic or therapeutic purposes are prepared in the radiopharmacy sector. The preparation process adopts techniques aimed at reducing the exposure time of the professionals and the absorption of excessive doses by patients. The ergonomic analysis of this process contributes to the prevention of occupational illnesses and of accident risks during routine work, providing well-being and security to the users involved and conferring an adequate working standard on the process. In this context, studies that analyze the factors pointing toward the solution of problems, and that establish proposals to minimize risks in the exercise of the activities, are clearly relevant. Through a methodology based on the concepts of ergonomics, improvements in effectiveness and quality, and a reduction of the difficulties experienced by the workers, are sought. The prescribed work, established through codified norms and procedures, is compared with the work effectively carried out, the real work, in an appreciation focused on the activities. The objective of this work is to discuss an ergonomic analysis of the radioisotope sample preparation process in the Setor de Radiofarmacia do Hospital Universitario Clementino Fraga Filho da Universidade Federal do Rio de Janeiro (UFRJ). (author)

  18. Economic analysis of locust beans processing and marketing in ilorin, kwara state, Nigeria

    Directory of Open Access Journals (Sweden)

    C.O. Farayola

    2012-12-01

    This study was designed to carry out an economic analysis of locust bean processing and marketing in Ilorin, Kwara State, Nigeria. Primary data were used, and a purposive sampling technique was adopted to select the respondents; a total of 60 respondents were interviewed. The data collected were analyzed using inferential statistical tools such as regression analysis, and a budgetary analysis technique was used to assess the profitability of locust bean processing and marketing in the study area. The majority of the processors and marketers are making profits: 68.3% operate above the breakeven point, 26.7% operate at the breakeven point, neither making a profit nor a loss, and the remaining 5% operate below it. The regression analysis shows that quantity processed, family size and years of processing experience are significant at 1%, 5% and 10% respectively, while education level and stall rent are negative and significant at 1% and 5% respectively. The F-test also shows that the independent variables are jointly significant at the 1% probability level, with an adjusted R2 of 78.9%. The overall rate of return on investment indicates that the average rate of return is 0.5 (50%), which is positive. It is therefore concluded that the profit made by the processors and marketers can be improved by increasing the quantity of locust bean processed, through adoption of newly discovered processing methods and improved methods of preservation, packaging and marketing of the product to international standards, reducing the odour of the product without loss of essential nutrients and palatability in order to generate foreign exchange. Also, rules and regulations against the cutting of economic trees for alternative uses should be enforced to maximize their value.

  19. On the stability and dynamics of stochastic spiking neuron models: Nonlinear Hawkes process and point process GLMs.

    Science.gov (United States)

    Gerhard, Felipe; Deger, Moritz; Truccolo, Wilson

    2017-02-01

    Point process generalized linear models (PP-GLMs) provide an important statistical framework for modeling spiking activity in single neurons and neuronal networks. Stochastic stability is essential when sampling from these models, as done in computational neuroscience to analyze statistical properties of neuronal dynamics and in neuro-engineering to implement closed-loop applications. Here we show, however, that despite passing common goodness-of-fit tests, PP-GLMs estimated from data are often unstable, leading to divergent firing rates. The inclusion of absolute refractory periods is not a satisfactory solution since the activity then typically settles into unphysiological rates. To address these issues, we derive a framework for determining the existence and stability of fixed points of the expected conditional intensity function (CIF) for general PP-GLMs. Specifically, in nonlinear Hawkes PP-GLMs, the CIF is expressed as a function of the previous spike history and exogenous inputs. We use a mean-field quasi-renewal (QR) approximation that decomposes spike history effects into the contribution of the last spike and an average of the CIF over all spike histories prior to the last spike. Fixed points for stationary rates are derived as self-consistent solutions of integral equations. Bifurcation analysis and the number of fixed points predict that the original models can show stable, divergent, and metastable (fragile) dynamics. For fragile models, fluctuations of the single-neuron dynamics predict expected divergence times after which rates approach unphysiologically high values. This metric can be used to estimate the probability that rates remain physiological for given time periods, e.g., for simulation purposes. We demonstrate the use of the stability framework using simulated single-neuron examples and neurophysiological recordings. Finally, we show how to adapt PP-GLM estimation procedures to guarantee model stability. Overall, our results provide a
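
    A drastically simplified version of the fixed-point idea (ignoring the refractory structure that the full quasi-renewal treatment captures) is the self-consistency condition r = f(b + J r), where J is the integral of the history kernel; the sketch below, with assumed names and parameter values, solves it by iteration and checks local stability:

    ```python
    import numpy as np

    def stationary_rate(f, J, b, r0=1.0, iters=200):
        """Iterate the self-consistency condition r = f(b + J*r) for the
        stationary rate of a nonlinear Hawkes-type model with net history
        coupling J (simplified stand-in for the QR fixed-point equations)."""
        r = r0
        for _ in range(iters):
            r = f(b + J * r)
        return r

    f = np.exp                   # exponential link, common in PP-GLMs
    b, J = np.log(5.0), 0.05     # baseline log-rate and net excitation (assumed)
    r = stationary_rate(f, J, b)
    # Local stability of the fixed point requires |f'(b + J*r) * J| < 1.
    print(r, abs(np.exp(b + J * r) * J) < 1)
    ```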

  20. SINGLE TREE DETECTION FROM AIRBORNE LASER SCANNING DATA USING A MARKED POINT PROCESS BASED METHOD

    Directory of Open Access Journals (Sweden)

    J. Zhang

    2013-05-01

    Tree detection and reconstruction is of great interest in large-scale city modelling. In this paper, we present a marked point process model to detect single trees from airborne laser scanning (ALS) data. We consider single trees in an ALS-recovered canopy height model (CHM) as a realization of a point process of circles. Unlike the traditional marked point process, we sample the model in a constrained configuration space by making use of image processing techniques. A Gibbs energy is defined on the model, containing a data term which judges the fitness of the model with respect to the data, and a prior term which incorporates prior knowledge of object layouts. We search for the optimal configuration through a steepest gradient descent algorithm. The presented hybrid framework was tested on three forest plots, and experiments show the effectiveness of the proposed method.

  1. Miniaturization for Point-of-Care Analysis: Platform Technology for Almost Every Biomedical Assay.

    Science.gov (United States)

    Schumacher, Soeren; Sartorius, Dorian; Ehrentreich-Förster, Eva; Bier, Frank F

    2012-10-01

    Platform technologies for the changing needs of diagnostics are one of the main challenges in medical device technology. From one point of view, the demand for new and more versatile diagnostics is increasing due to deeper knowledge of biomarkers and their association with diseases. From another point of view, a decentralization of diagnostics will occur, since decisions can be made faster, resulting in higher success of therapy. Hence, new types of technologies have to be established which enable multiparameter analysis at the point-of-care. In this review-like article, a system called the Fraunhofer ivD-platform is introduced. It consists of a credit-card sized cartridge with integrated reagents, sensors and pumps, and a read-out/processing unit. Within the cartridge, the assay runs fully automated within 15-20 minutes. Due to the open design of the platform, different analyses such as antibody, serological or DNA assays can be performed. Specific examples of these three different assay types are given to show the broad applicability of the system.

  2. Pseudo-dynamic source modelling with 1-point and 2-point statistics of earthquake source parameters

    KAUST Repository

    Song, S. G.

    2013-12-24

    Ground motion prediction is an essential element in seismic hazard and risk analysis. Empirical ground motion prediction approaches have been widely used in the community, but efficient simulation-based ground motion prediction methods are needed to complement empirical approaches, especially in the regions with limited data constraints. Recently, dynamic rupture modelling has been successfully adopted in physics-based source and ground motion modelling, but it is still computationally demanding and many input parameters are not well constrained by observational data. Pseudo-dynamic source modelling keeps the form of kinematic modelling with its computational efficiency, but also tries to emulate the physics of source process. In this paper, we develop a statistical framework that governs the finite-fault rupture process with 1-point and 2-point statistics of source parameters in order to quantify the variability of finite source models for future scenario events. We test this method by extracting 1-point and 2-point statistics from dynamically derived source models and simulating a number of rupture scenarios, given target 1-point and 2-point statistics. We propose a new rupture model generator for stochastic source modelling with the covariance matrix constructed from target 2-point statistics, that is, auto- and cross-correlations. Our sensitivity analysis of near-source ground motions to 1-point and 2-point statistics of source parameters provides insights into relations between statistical rupture properties and ground motions. We observe that larger standard deviation and stronger correlation produce stronger peak ground motions in general. The proposed new source modelling approach will contribute to understanding the effect of earthquake source on near-source ground motion characteristics in a more quantitative and systematic way.
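
    The covariance-matrix construction at the heart of the proposed rupture generator can be sketched in one dimension: build a covariance from a target auto-correlation and draw a Gaussian field via Cholesky factorisation (the paper works on 2-D fault planes and includes cross-correlations between parameters; names and values below are illustrative).

    ```python
    import numpy as np

    def correlated_field(n, dx, corr_len, sigma, seed=0):
        """Draw a 1-D Gaussian random field with exponential auto-correlation by
        Cholesky factorisation of the target covariance matrix."""
        x = np.arange(n) * dx
        C = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
        L = np.linalg.cholesky(C + 1e-10 * np.eye(n))   # jitter for stability
        return L @ np.random.default_rng(seed).standard_normal(n)

    # Hypothetical along-strike slip perturbation with 5 km correlation length.
    slip = 1.0 + correlated_field(n=200, dx=0.5, corr_len=5.0, sigma=0.3)
    ```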

  3. MODELLING AND SIMULATION OF A NEUROPHYSIOLOGICAL EXPERIMENT BY SPATIO-TEMPORAL POINT PROCESSES

    Directory of Open Access Journals (Sweden)

    Viktor Beneš

    2011-05-01

    We present a stochastic model of an experiment monitoring the spiking activity of a place cell of the hippocampus of an experimental animal moving in an arena. A doubly stochastic spatio-temporal point process is used to model and quantify overdispersion. The stochastic intensity is modelled by a Lévy-based random field, while the animal path is simplified to a discrete random walk. In a simulation study, a method suggested previously is used first. Then it is shown that a solution of the filtering problem yields the desired inference for the random intensity. Two approaches are suggested, and the new one, based on a finite point process density, is applied. Using Markov chain Monte Carlo we obtain numerical results from the simulated model. The methodology is discussed.

  4. Dense range images from sparse point clouds using multi-scale processing

    NARCIS (Netherlands)

    Do, Q.L.; Ma, L.; With, de P.H.N.

    2013-01-01

    Multi-modal data processing based on visual and depth/range images has become relevant in computer vision for 3D reconstruction applications such as city modeling, robot navigation etc. In this paper, we generate high-accuracy dense range images from sparse point clouds to facilitate such

  5. Visibility Analysis in a Point Cloud Based on the Medial Axis Transform

    NARCIS (Netherlands)

    Peters, R.; Ledoux, H.; Biljecki, F.

    2015-01-01

    Visibility analysis is an important application of 3D GIS data. Current approaches require 3D city models that are often derived from detailed aerial point clouds. We present an approach to visibility analysis that does not require a city model but works directly on the point cloud. Our approach is

  6. Microbiological performance of Hazard Analysis Critical Control Point (HACCP)-based food safety management systems: A case of Nile perch processing company

    NARCIS (Netherlands)

    Kussaga, J.B.; Luning, P.A.; Tiisekwa, B.P.M.; Jacxsens, L.

    2017-01-01

    This study aimed at giving insight into the microbiological safety output of a Hazard Analysis Critical Control Point (HACCP)-based Food Safety Management System (FSMS) of a Nile perch exporting company by using a combined assessment.

  7. Growth Curve Analysis and Change-Points Detection in Extremes

    KAUST Repository

    Meng, Rui

    2016-05-15

    The thesis consists of two coherent projects. The first project presents the results of evaluating salinity tolerance in barley using growth curve analysis, where different growth trajectories are observed within barley families. The study of salinity tolerance in plants is crucial to understanding plant growth and productivity. Because fully-automated smarthouses with conveyor systems allow non-destructive and high-throughput phenotyping of large numbers of plants, it is now possible to apply advanced statistical tools to analyze daily measurements and to study salinity tolerance. To compare different growth patterns of barley varieties, we use functional data analysis techniques to analyze the daily projected shoot areas. In particular, we apply the curve registration method to align all the curves from the same barley family in order to summarize the family-wise features. We also illustrate how to use statistical modeling to account for spatial variation in microclimate in smarthouses and for temporal variation across runs, which is crucial for identifying traits of the barley varieties. In our analysis, we show that the concentrations of sodium and potassium in leaves are negatively correlated, and their interactions are associated with the degree of salinity tolerance. The second project studies change-point detection methods in extremes when multiple time series data are available. Motivated by the scientific question of whether the chances of experiencing extreme weather differ across the seasons of a year, we develop a change-point detection model to study changes in extremes or in the tail of a distribution. Most existing models identify seasons from multiple yearly time series assuming a season or a change-point location remains exactly the same across years. In this work, we propose a random effect model that allows the change-point to vary from year to year, following a given distribution. Both parametric and nonparametric methods are developed

  8. Second-order analysis of inhomogeneous spatial point processes with proportional intensity functions

    DEFF Research Database (Denmark)

    Guan, Yongtao; Waagepetersen, Rasmus; Beale, Colin M.

    2008-01-01

    of the intensity functions. The first approach is based on nonparametric kernel-smoothing, whereas the second approach uses a conditional likelihood estimation approach to fit a parametric model for the pair correlation function. A great advantage of the proposed methods is that they do not require the often...... to two spatial point patterns regarding the spatial distributions of birds in the U.K.'s Peak District in 1990 and 2004....

  9. Spectrography analysis of stainless steel by the point to point technique

    International Nuclear Information System (INIS)

    Bona, A.

    1986-01-01

    A method for the determination of the elements Ni, Cr, Mn, Si, Mo, Nb, Cu, Co and V in stainless steel by emission spectrographic analysis using high-voltage spark sources is presented. The 'point-to-point' technique is employed. The experimental parameters were optimized taking into account a compromise between the detection sensitivity and the precision of the measurement. The parameters investigated were the high-voltage capacitance, the inductance, the analytical and auxiliary gaps, the pre-burn spark period and the exposure time. The edge shape of the counter electrodes and the type of polishing and diameter variation of the stainless steel electrodes were evaluated in preliminary assays. In addition, the degradation of the chemical power of the developer was also investigated. Counter electrodes of graphite, copper, aluminium and iron were employed, and the counter electrode itself was used as an internal standard; in the case of graphite counter electrodes, the iron lines were employed as the internal standard. The relative errors were the criteria for evaluation of these experiments. The National Bureau of Standards certified reference stainless steel standards and the Eletrometal Acos Finos S.A. samples (certified by the supplier) were employed for constructing the calibration systems and analytical curves. The best results were obtained using the conventional graphite counter electrodes. The inaccuracy and the imprecision of the proposed method varied from 2% to 15% and from 1% to 9% respectively. The present technique was compared to other instrumental techniques such as inductively coupled plasma, X-ray fluorescence and neutron activation analysis. The advantages and disadvantages for each case were discussed. (author)

  10. Using a Virtual Experiment to Analyze Infiltration Process from Point to Grid-cell Size Scale

    Science.gov (United States)

    Barrios, M. I.

    2013-12-01

    Hydrological science requires the emergence of a consistent theoretical corpus driving the relationships between dominant physical processes at different spatial and temporal scales. However, the strong spatial heterogeneities and non-linearities of these processes make the development of multiscale conceptualizations difficult. Therefore, understanding scaling is a key issue in advancing this science. This work is focused on the use of virtual experiments to address the scaling of vertical infiltration from a physically based model at the point scale to a simplified physically meaningful modeling approach at the grid-cell scale. Numerical simulations have the advantage over field experimentation of dealing with a wide range of boundary and initial conditions. The aim of the work was to show the utility of numerical simulations in discovering relationships between the hydrological parameters at both scales, and to use this synthetic experience as a medium to teach the complex nature of this hydrological process. The Green-Ampt model was used to represent vertical infiltration at the point scale, and a conceptual storage model was employed to simulate the infiltration process at the grid-cell scale. Lognormal and beta probability distribution functions were assumed to represent the heterogeneity of soil hydraulic parameters at the point scale. The linkages between point scale parameters and grid-cell scale parameters were established by inverse simulations based on the mass balance equation and the averaging of the flow at the point scale. Results have shown numerical stability issues under particular conditions, have revealed the complex nature of the non-linear relationships between the models' parameters at both scales, and indicate that the parameterization of point scale processes at the coarser scale is governed by the amplification of non-linear effects. The findings of these simulations have been used by students to identify potential research questions on scale issues
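
    The point-scale Green-Ampt model used above follows the implicit relation F = K t + ψ Δθ ln(1 + F/(ψ Δθ)) for cumulative infiltration F, with rate f = K (1 + ψ Δθ / F). A minimal fixed-point solver is sketched below; the soil parameter values are illustrative textbook-style numbers, not from the paper.

    ```python
    import numpy as np

    def green_ampt_F(t, K, psi, dtheta, tol=1e-10):
        """Cumulative infiltration F(t) from the implicit Green-Ampt relation,
        solved by fixed-point iteration (contractive since S/(S+F) < 1)."""
        S = psi * dtheta
        F = max(K * t, tol)
        for _ in range(200):
            F_new = K * t + S * np.log(1.0 + F / S)
            if abs(F_new - F) < tol:
                break
            F = F_new
        return F

    # Illustrative sandy-loam-like values: K [cm/h], psi [cm], dtheta [-].
    K, psi, dtheta = 1.09, 11.01, 0.247
    for t in (0.5, 1.0, 2.0):
        F = green_ampt_F(t, K, psi, dtheta)
        print(t, F, K * (1 + psi * dtheta / F))   # time, F(t), infiltration rate
    ```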

  11. The implementation of a Hazard Analysis and Critical Control Point management system in a peanut butter ice cream plant

    Directory of Open Access Journals (Sweden)

    Yu-Ting Hung

    2015-09-01

    To ensure the safety of peanut butter ice cream manufacture, a Hazard Analysis and Critical Control Point (HACCP) plan has been designed and applied to the production process. Potential biological, chemical, and physical hazards in each manufacturing procedure were identified. The critical control points for the peanut butter ice cream were then determined to be the pasteurization and freezing processes. The establishment of a monitoring system, corrective actions, verification procedures, and documentation and record keeping followed, completing the HACCP program. The results of this study indicate that implementing the HACCP system in food industries can effectively enhance food safety and quality while improving production management.

  12. The neutron capture cross section of the ${s}$-process branch point isotope $^{63}$Ni

    CERN Multimedia

    Neutron capture nucleosynthesis in massive stars plays an important role in Galactic chemical evolution as well as for the analysis of abundance patterns in very old metal-poor halo stars. The so-called weak ${s}$-process component, which is responsible for most of the ${s}$ abundances between Fe and Sr, turned out to be very sensitive to the stellar neutron capture cross sections in this mass region and, in particular, of isotopes near the seed distribution around Fe. In this context, the unstable isotope $^{63}$Ni is of particular interest because it represents the first branching point in the reaction path of the ${s}$-process. We propose to measure this cross section at n_TOF from thermal energies up to 500 keV, covering the entire range of astrophysical interest. These data are needed to replace uncertain theoretical predictions by first experimental information to understand the consequences of the $^{63}$Ni branching for the abundance pattern of the subsequent isotopes, especially for $^{63}$Cu and $^{...

  13. An application of image processing techniques in computed tomography image analysis

    DEFF Research Database (Denmark)

    McEvoy, Fintan

    2007-01-01

    number of animals and image slices, automation of the process was desirable. The open-source and free image analysis program ImageJ was used. A macro procedure was created that provided the required functionality. The macro performs a number of basic image processing procedures. These include an initial...... process designed to remove the scanning table from the image and to center the animal in the image. This is followed by placement of a vertical line segment from the mid point of the upper border of the image to the image center. Measurements are made between automatically detected outer and inner...... boundaries of subcutaneous adipose tissue along this line segment. This process was repeated as the image was rotated (with the line position remaining unchanged) so that measurements around the complete circumference were obtained. Additionally, an image was created showing all detected boundary points so...

  14. Super-Relaxed (η)-Proximal Point Algorithms, Relaxed (η)-Proximal Point Algorithms, Linear Convergence Analysis, and Nonlinear Variational Inclusions

    Directory of Open Access Journals (Sweden)

    Agarwal RaviP

    2009-01-01

    We glance at recent advances in the general theory of maximal (set-valued) monotone mappings and their role in examining convex programming and the closely related field of nonlinear variational inequalities. We focus mostly on applications of the super-relaxed (η)-proximal point algorithm to the context of solving a class of nonlinear variational inclusion problems, based on the notion of maximal (η)-monotonicity. Investigations highlighted in this communication are greatly influenced by the celebrated work of Rockafellar (1976), while others have played a significant part as well in generalizing the proximal point algorithm considered by Rockafellar (1976) to the case of the relaxed proximal point algorithm by Eckstein and Bertsekas (1992). Even for the linear convergence analysis of the overrelaxed (or super-relaxed) (η)-proximal point algorithm, the fundamental model for Rockafellar's case does the job. Furthermore, we attempt to explore possibilities of generalizing the Yosida regularization/approximation in light of maximal (η)-monotonicity, and then apply it to first-order evolution equations/inclusions.
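
    The relaxed proximal point iteration at the core of this line of work is x+ = (1 − ρ) x + ρ (I + c T)^(−1) x with ρ in (0, 2). A toy sketch for a linear monotone operator T(x) = A x − b, where the resolvent reduces to a linear solve (the paper treats far more general maximal (η)-monotone operators):

    ```python
    import numpy as np

    def relaxed_proximal_point(A, b, x0, c=1.0, rho=1.5, iters=100):
        """Relaxed/over-relaxed proximal point iteration for 0 in T(x),
        T(x) = A x - b with A positive definite; resolvent J_c = (I + c T)^(-1)."""
        n = len(x0)
        I = np.eye(n)
        x = x0.astype(float)
        for _ in range(iters):
            resolvent = np.linalg.solve(I + c * A, x + c * b)   # J_c(x)
            x = (1 - rho) * x + rho * resolvent
        return x

    A = np.array([[2.0, 0.5], [0.5, 1.0]])
    b = np.array([1.0, -1.0])
    print(relaxed_proximal_point(A, b, x0=np.zeros(2)))   # converges to A^{-1} b
    print(np.linalg.solve(A, b))
    ```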

  15. APPLICABILITY ANALYSIS OF CLOTH SIMULATION FILTERING ALGORITHM FOR MOBILE LIDAR POINT CLOUD

    Directory of Open Access Journals (Sweden)

    S. Cai

    2018-04-01

    Classifying original point clouds into ground and non-ground points is a key step in LiDAR (light detection and ranging) data post-processing. The cloth simulation filtering (CSF) algorithm, which is based on a physical process, has been validated as an accurate, automatic and easy-to-use algorithm for airborne LiDAR point clouds. As a new technique for three-dimensional data collection, mobile laser scanning (MLS) has been gradually applied in various fields, such as reconstruction of digital terrain models (DTM), 3D building modeling, and forest inventory and management. Compared with airborne LiDAR point clouds, mobile LiDAR point clouds have some different features (such as point density, distribution and complexity). Some filtering algorithms for airborne LiDAR data have been directly used on mobile LiDAR point clouds, but they did not give satisfactory results. In this paper, we explore the ability of the CSF algorithm for mobile LiDAR point clouds. Three samples with different terrain shapes are selected to test the performance of this algorithm, which respectively yield total errors of 0.44 %, 0.77 % and 1.20 %. Additionally, a large-area dataset is also tested to further validate the effectiveness of this algorithm, and the results show that it can quickly and accurately separate point clouds into ground and non-ground points. In summary, this algorithm is efficient and reliable for mobile LiDAR point clouds.
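
    A much-simplified, cloth-inspired sketch of the filtering idea (illustration only; the real CSF algorithm models cloth particles with internal constraints and movable/unmovable states): invert the cloud, let a grid "cloth" sink onto it under gravity, smooth neighbouring cloth nodes to mimic rigidity, and label points near the settled cloth as ground. All parameters below are illustrative assumptions.

    ```python
    import numpy as np

    def csf_like_ground(points, cell=1.0, iters=50, threshold=0.5):
        xy, z = points[:, :2], points[:, 2]
        ij = np.floor((xy - xy.min(0)) / cell).astype(int)
        nx, ny = ij.max(0) + 1
        inv = np.full((nx, ny), -np.inf)
        for (i, j), zi in zip(ij, -z):            # rasterize the inverted cloud
            inv[i, j] = max(inv[i, j], zi)
        inv[np.isinf(inv)] = (-z).min()
        cloth = np.full((nx, ny), (-z).max() + 1.0)
        for _ in range(iters):
            cloth = np.maximum(inv, cloth - 0.1)  # gravity, blocked by the surface
            pad = np.pad(cloth, 1, mode="edge")   # rigidity: relax toward neighbours
            nbr = (pad[:-2, 1:-1] + pad[2:, 1:-1] + pad[1:-1, :-2] + pad[1:-1, 2:]) / 4
            cloth = np.maximum(inv, 0.5 * cloth + 0.5 * nbr)
        return np.abs(-z - cloth[ij[:, 0], ij[:, 1]]) < threshold

    # Toy scene: flat ground with some elevated "vegetation" points.
    pts = np.random.rand(1000, 3); pts[:, 2] *= 0.1
    pts[:100, 2] += 2.0
    print(csf_like_ground(pts, cell=0.1).sum())   # roughly the 900 ground points
    ```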

  16. Hazard rate model and statistical analysis of a compound point process

    Czech Academy of Sciences Publication Activity Database

    Volf, Petr

    2005-01-01

    Roč. 41, č. 6 (2005), s. 773-786 ISSN 0023-5954 R&D Projects: GA ČR(CZ) GA402/04/1294 Institutional research plan: CEZ:AV0Z10750506 Keywords : counting process * compound process * Cox regression model * intensity Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.343, year: 2005

  17. Processing of pulse oximeter data using discrete wavelet analysis.

    Science.gov (United States)

    Lee, Seungjoon; Ibey, Bennett L; Xu, Weijian; Wilson, Mark A; Ericson, M Nance; Coté, Gerard L

    2005-07-01

    A wavelet-based signal processing technique was employed to improve an implantable blood perfusion monitoring system. Data were acquired from both in vitro and in vivo sources: a perfusion model and the proximal jejunum of an adult pig. Results showed that wavelet analysis could isolate perfusion signals from raw, periodic, in vitro data as well as fast Fourier transform (FFT) methods could. However, for the quasi-periodic in vivo data segments, wavelet analysis provided more consistent results than FFT analysis for data segments of 50, 10, and 5 s in length. Wavelet analysis has thus been shown to require fewer data points for quasi-periodic data than FFT analysis, making it a good choice for an indwelling perfusion monitor where power consumption and reaction time are paramount.
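
    A minimal discrete-wavelet sketch of this kind of band isolation, using PyWavelets; the wavelet family, decomposition depth and kept levels are assumptions, not the authors' settings.

    ```python
    import numpy as np
    import pywt

    def perfusion_band(signal, wavelet="db4", level=6, keep=2):
        """Decompose with the DWT, keep the approximation plus the coarsest
        detail level (about 0-1.6 Hz at fs = 100 Hz, level = 6), zero the rest,
        and reconstruct the quasi-periodic component."""
        coeffs = pywt.wavedec(signal, wavelet, level=level)
        kept = [c if i < keep else np.zeros_like(c) for i, c in enumerate(coeffs)]
        return pywt.waverec(kept, wavelet)

    fs = 100.0
    t = np.arange(0, 5, 1 / fs)
    raw = np.sin(2 * np.pi * 1.2 * t) + 0.5 * np.random.randn(len(t))  # ~1.2 Hz tone + noise
    clean = perfusion_band(raw)
    ```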

  18. [Introduction of hazard analysis and critical control points (HACCP) principles at the flight catering food production plant].

    Science.gov (United States)

    Popova, A Yu; Trukhina, G M; Mikailova, O M

    In the article, the quality control and safety system implemented in one of the largest flight catering food production plants for airline passengers and flight crews is considered. The control system was based on the Hazard Analysis and Critical Control Points (HACCP) principles and on the hygienic and anti-epidemic measures developed. The identification of hazard factors at the stages of the technological process is considered. Results of the analysis of monitoring data for 6 critical control points over a five-year period are presented. The quality control and safety system permits a reduction of the food contamination risk during acceptance, preparation and supplying of in-flight meals. The efficiency of the implemented system was demonstrated. Further ways of harmonizing and implementing HACCP principles at the plant are outlined.

  19. Maintenance Process Strategic Analysis

    Science.gov (United States)

    Jasiulewicz-Kaczmarek, M.; Stachowiak, A.

    2016-08-01

    The performance and competitiveness of manufacturing companies depend on the availability, reliability and productivity of their production facilities. Low productivity, downtime, and poor machine performance are often linked to inadequate plant maintenance, which in turn can lead to reduced production levels, increasing costs, lost market opportunities, and lower profits. These pressures have given firms worldwide the motivation to explore and embrace proactive maintenance strategies over the traditional reactive firefighting methods. The traditional view of maintenance has shifted into an overall view that encompasses Overall Equipment Efficiency, Stakeholders Management and Life Cycle Assessment. From a practical point of view this requires changes in the approach to maintenance taken by managers and changes in the actions performed within the maintenance area. Managers have to understand that maintenance is not only about repairs and conservation of machines and devices, but also about actions striving for more efficient resource management and care for the safety and health of employees. The purpose of the work is to present a strategic analysis based on SWOT analysis to identify the opportunities and strengths of the maintenance process, so as to benefit from them as much as possible, as well as to identify weaknesses and threats, so that they can be eliminated or minimized.

  20. Numerical analysis on pump turbine runaway points

    International Nuclear Information System (INIS)

    Guo, L; Liu, J T; Wang, L Q; Jiao, L; Li, Z F

    2012-01-01

    To research the characteristics of pump turbine runaway points with different guide vane openings, a hydraulic model was established based on a pumped storage power station. The RNG k-ε model and the SIMPLEC algorithm were used to simulate the internal flow fields. The results of the simulation were compared with test data, and good agreement was obtained between the experimental data and the CFD results. Based on this model, an internal flow analysis was carried out. The results show that when the pump turbine ran at the runaway speed, many vortexes appeared in the flow passage of the runner. These vortexes could always be observed even when the guide vane opening changed; they are an important source of energy loss in the runaway condition. Pressures on the two sides of the runner blades were almost the same, so the runner power was very low. The high speed induced a large centrifugal force, and a small guide vane opening gave the water velocity a large tangential component; an obvious water ring could then be observed between the runner blades and guide vanes in small guide vane opening conditions. That ring disappeared when the opening was larger than 20°. These conclusions can provide a theoretical basis for the analysis and simulation of pump turbine runaway points.

  1. The Purification Method of Matching Points Based on Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    DONG Yang

    2017-02-01

    Full Text Available The traditional purification method of matching points usually uses a small number of the points as the initial input. Though it can meet most point-constraint requirements, the iterative purification solution easily falls into local extrema, which results in the loss of correct matching points. To solve this problem, we introduce the principal component analysis method to use the whole point set as the initial input. Through stepwise elimination of mismatched points and robust solving, a more accurate global optimal solution can be obtained, which reduces the omission rate of correct matching points and thus achieves a better purification effect. Experimental results show that this method can obtain the global optimal solution under a certain original false matching rate, and can decrease or avoid the omission of correct matching points.
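
    As a loose illustration of the general purification idea (not the paper's algorithm), the sketch below starts from the whole correspondence set, fits a global affine model by least squares, and iteratively trims the worst-fitting matches. The affine model, round count, and keep fraction are all hypothetical choices:

```python
import numpy as np

def purify_matches(src, dst, n_rounds=5, keep=0.9):
    """src, dst: (n, 2) arrays of putative matching point coordinates."""
    idx = np.arange(len(src))
    for _ in range(n_rounds):
        A = np.hstack([src[idx], np.ones((len(idx), 1))])   # affine design matrix
        T, *_ = np.linalg.lstsq(A, dst[idx], rcond=None)    # global robust-ish fit
        resid = np.linalg.norm(A @ T - dst[idx], axis=1)    # per-match residuals
        order = np.argsort(resid)
        idx = idx[order[: max(3, int(keep * len(idx)))]]    # trim likely mismatches
    return idx    # indices of retained (purified) matches
```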

  2. The effect of starting point placement technique on thoracic transverse process strength: an ex vivo biomechanical study

    Directory of Open Access Journals (Sweden)

    Burton Douglas C

    2010-07-01

    Full Text Available Abstract Background The use of thoracic pedicle screws in spinal deformity, trauma, and tumor reconstruction is becoming more common. Unsuccessful screw placement may require salvage techniques utilizing transverse process hooks. The effect of different starting point placement techniques on the strength of the transverse process has not previously been reported. The purpose of this paper is to determine the biomechanical properties of the thoracic transverse process following various pedicle screw starting point placement techniques. Methods Forty-seven fresh-frozen human cadaveric thoracic vertebrae from T2 to T9 were disarticulated and matched by bone mineral density (BMD) and transverse process (TP) cross-sectional area. Specimens were randomized to one of four groups: A, control, and three others based on thoracic pedicle screw placement technique: B, straightforward; C, funnel; and D, in-out-in. Initial cortical bone removal for pedicle screw placement was made using a burr at the location on the transverse process or transverse process-laminar junction as published in the original description of each technique. The transverse process was tested by measuring load-to-failure, simulating a hook in compression mode. Analysis of covariance and Pearson correlation coefficients were used to examine the data. Results Technique was a significant predictor of load-to-failure (P = 0.0007). The least squares mean (LS mean) load-to-failure of group A (control) was 377 N, group B (straightforward) 355 N, group C (funnel) 229 N, and group D (in-out-in) 301 N. Significant differences were noted between groups A and C, A and D, B and C, and C and D. BMD (0.925 g/cm2 [range, 0.624-1.301 g/cm2]) was also a significant predictor of load-to-failure for all specimens grouped together (P < 0.05). Level and side tested were not found to significantly correlate with load-to-failure. Conclusions The residual coronal plane compressive strength of the thoracic transverse process

  3. HACCP (Hazard Analysis and Critical Control Points) to guarantee safe water reuse and drinking water production--a case study.

    Science.gov (United States)

    Dewettinck, T; Van Houtte, E; Geenens, D; Van Hege, K; Verstraete, W

    2001-01-01

    To obtain a sustainable water catchment in the dune area of the Flemish west coast, the integration of treated domestic wastewater in the existing potable water production process is planned. The hygienic hazards associated with the introduction of treated domestic wastewater into the water cycle are well recognised. Therefore, the concept of HACCP (Hazard Analysis and Critical Control Points) was used to guarantee hygienically safe drinking water production. Taking into account the literature data on the removal efficiencies of the proposed advanced treatment steps with regard to enteric viruses and protozoa and after setting high quality limits based on the recent progress in quantitative risk assessment, the critical control points (CCPs) and points of attention (POAs) were identified. Based on the HACCP analysis a specific monitoring strategy was developed which focused on the control of these CCPs and POAs.

  4. A Multi-Point Method Considering the Maximum Power Point Tracking Dynamic Process for Aerodynamic Optimization of Variable-Speed Wind Turbine Blades

    Directory of Open Access Journals (Sweden)

    Zhiqiang Yang

    2016-05-01

    Full Text Available Due to the dynamic process of maximum power point tracking (MPPT) caused by turbulence and large rotor inertia, variable-speed wind turbines (VSWTs) cannot maintain the optimal tip speed ratio (TSR) from the cut-in wind speed up to the rated speed. Therefore, in order to increase the total captured wind energy, the existing aerodynamic design for VSWT blades, which only focuses on performance improvement at a single TSR, needs to be improved to a multi-point design. In this paper, based on a closed-loop system of VSWTs, including turbulent wind, rotor, drive train and MPPT controller, the distribution of the operational TSR and its description based on inflow wind energy are investigated. Moreover, a multi-point method considering the MPPT dynamic process for the aerodynamic optimization of VSWT blades is proposed. In the proposed method, the distribution of the operational TSR is obtained through a dynamic simulation of the closed-loop system under a specific turbulent wind, and accordingly the multiple design TSRs and the corresponding weighting coefficients in the objective function are determined. Finally, using the blade of a National Renewable Energy Laboratory (NREL) 1.5 MW wind turbine as the baseline, the proposed method is compared with the conventional single-point optimization method using the commercial software Bladed. Simulation results verify the effectiveness of the proposed method.
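
    The weighting idea can be made concrete with a small sketch. Here the power-coefficient curve, the design TSRs, and the weights are placeholders standing in for the paper's simulation-derived values, not the NREL 1.5 MW data:

```python
import numpy as np

def cp(tsr, peak, width):
    """Hypothetical power-coefficient curve of a candidate blade design."""
    return 0.48 * np.exp(-((tsr - peak) / width) ** 2)

# Assumed operational-TSR distribution, e.g. a histogram collected from a
# turbulent-wind simulation of the rotor + drive-train + MPPT controller loop.
design_tsrs = np.array([6.0, 7.0, 8.0, 9.0])
weights = np.array([0.15, 0.40, 0.30, 0.15])     # fraction of inflow wind energy

def multi_point_objective(peak, width):
    # Energy-weighted Cp replaces the single-point Cp(optimal TSR) objective.
    return float(np.sum(weights * cp(design_tsrs, peak, width)))

print(multi_point_objective(8.0, 1.5))
```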

  5. [Powdered infant formulae preparation guide for hospitals based on Hazard Analysis and Critical Control Points (HACCP) principles].

    Science.gov (United States)

    Vargas-Leguás, H; Rodríguez Garrido, V; Lorite Cuenca, R; Pérez-Portabella, C; Redecillas Ferreiro, S; Campins Martí, M

    2009-06-01

    This guide for the preparation of powdered infant formulae in hospital environments is a collaborative work between several hospital services and is based on national and European regulations, international expert meetings and the recommendations of scientific societies. The guide also uses the Hazard Analysis and Critical Control Point principles proposed by the Codex Alimentarius and emphasises effective verification measures, microbiological controls of the process and the corrective actions to be taken when monitoring indicates that a critical control point is not under control. It is a dynamic guide and specifies the evaluation procedures that allow it to be constantly adapted.

  6. Apparatus and method for implementing power saving techniques when processing floating point values

    Science.gov (United States)

    Kim, Young Moon; Park, Sang Phill

    2017-10-03

    An apparatus and method are described for reducing power when reading and writing graphics data. For example, one embodiment of an apparatus comprises: a graphics processor unit (GPU) to process graphics data including floating point data; a set of registers, at least one of the registers of the set partitioned to store the floating point data; and encode/decode logic to reduce a number of binary 1 values being read from the at least one register by causing a specified set of bit positions within the floating point data to be read out as 0s rather than 1s.

  7. Self-Exciting Point Process Modeling of Conversation Event Sequences

    Science.gov (United States)

    Masuda, Naoki; Takaguchi, Taro; Sato, Nobuo; Yano, Kazuo

    Self-exciting processes of Hawkes type have been used to model various phenomena including earthquakes, neural activities, and views of online videos. Studies of temporal networks have revealed that sequences of social interevent times for individuals are highly bursty. We examine some basic properties of event sequences generated by the Hawkes self-exciting process to show that it generates bursty interevent times for a wide parameter range. Then, we fit the model to data of conversation sequences recorded in company offices in Japan. In this way, we can estimate the relative magnitudes of the self-excitation, its temporal decay, and the base event rate independent of the self-excitation. These variables depend strongly on individuals. We also point out that the Hawkes model has an important limitation: the correlation in the interevent times and the burstiness cannot be independently modulated.
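
    A univariate Hawkes process with exponential kernel, of the kind discussed above, can be simulated by Ogata's thinning algorithm in a few lines. This is a minimal sketch; the parameter values are illustrative, not the office-conversation estimates from the study:

```python
import numpy as np

def simulate_hawkes(mu, alpha, beta, t_max, seed=0):
    """Ogata thinning for lambda(t) = mu + alpha * sum_{t_i < t} exp(-beta (t - t_i))."""
    rng = np.random.default_rng(seed)
    events, t = [], 0.0
    while True:
        hist = np.array(events)
        # Intensity just after t bounds the intensity until the next event,
        # because the exponential kernel only decays between events.
        lam_bar = mu + (alpha * np.exp(-beta * (t - hist)).sum() if events else 0.0)
        t += rng.exponential(1.0 / lam_bar)
        if t >= t_max:
            break
        lam_t = mu + (alpha * np.exp(-beta * (t - hist)).sum() if events else 0.0)
        if rng.uniform() * lam_bar <= lam_t:       # accept candidate event
            events.append(t)                        # self-excitation kicks in
    return np.array(events)

times = simulate_hawkes(mu=0.2, alpha=0.8, beta=1.2, t_max=1000.0)
iet = np.diff(times)
print(iet.std() / iet.mean())   # coefficient of variation > 1 indicates burstiness
```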

  9. Structured Spatio-temporal shot-noise Cox point process models, with a view to modelling forest fires

    DEFF Research Database (Denmark)

    Møller, Jesper; Diaz-Avalos, Carlos

    2010-01-01

    Spatio-temporal Cox point process models with a multiplicative structure for the driving random intensity, incorporating covariate information into temporal and spatial components, and with a residual term modelled by a shot-noise process, are considered. Such models are flexible and tractable fo...... data set consisting of 2796 days and 5834 spatial locations of fires. The model is compared with a spatio-temporal log-Gaussian Cox point process model, and likelihood-based methods are discussed to some extent....

  10. Georeferenced Point Clouds: A Survey of Features and Point Cloud Management

    Directory of Open Access Journals (Sweden)

    Johannes Otepka

    2013-10-01

    Full Text Available This paper presents a survey of georeferenced point clouds. The focus is, on the one hand, on features that originate in the measurement process itself, and on features derived by processing the point cloud. On the other hand, approaches for the processing of georeferenced point clouds are reviewed. This includes data structures, but also spatial processing concepts. We suggest a categorization of features into levels that reflect the amount of processing. Point clouds are found across many disciplines, which is reflected in the versatility of the literature suggesting specific features.

  11. Bubble point pressures of the selected model system for CatLiq® bio-oil process

    DEFF Research Database (Denmark)

    Toor, Saqib Sohail; Rosendahl, Lasse; Baig, Muhammad Noman

    2010-01-01

    The CatLiq® process is a second generation catalytic liquefaction process for the production of bio-oil from WDGS (Wet Distillers Grains with Solubles) at subcritical conditions (280-350 °C and 225-250 bar) in the presence of a homogeneous alkaline and a heterogeneous zirconia catalyst. In this work, the bubble point pressures of a selected model mixture (CO2 + H2O + ethanol + acetic acid + octanoic acid) were measured to investigate the phase boundaries of the CatLiq® process. The bubble points were measured in the JEFRI-DBR high-pressure PVT phase behavior system. The experimental results......

  12. Percolation analysis for cosmic web with discrete points

    Science.gov (United States)

    Zhang, Jiajun; Cheng, Dalong; Chu, Ming-Chung

    2018-01-01

    Percolation analysis has long been used to quantify the connectivity of the cosmic web. Most of the previous work is based on density fields on grids. By smoothing into fields, we lose information about galaxy properties like shape or luminosity. The lack of mathematical modeling also limits our understanding of the percolation analysis. To overcome these difficulties, we have studied percolation analysis based on discrete points. Using a friends-of-friends (FoF) algorithm, we generate the S-b_b relation between the fractional mass of the largest connected group (S) and the FoF linking length (b_b). We propose a new model, the probability cloud cluster expansion theory, to relate the S-b_b relation with correlation functions. We show that the S-b_b relation reflects a combination of all orders of correlation functions. Using N-body simulation, we find that the S-b_b relation is robust against redshift distortion and incompleteness in observation. From the Bolshoi simulation, with halo abundance matching (HAM), we have generated a mock galaxy catalog. Good matching of the projected two-point correlation function with observation is confirmed. However, comparing the mock catalog with the latest galaxy catalog from the Sloan Digital Sky Survey (SDSS) Data Release 12 (DR12), we have found significant differences in their S-b_b relations. This indicates that the mock galaxy catalog cannot accurately retain higher-order correlation functions than the two-point correlation function, which reveals the limit of the HAM method. As a new measurement, the S-b_b relation is applicable to a wide range of data types, fast to compute, robust against redshift distortion and incompleteness, and contains information on all orders of correlation functions.
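
    The S(b_b) measurement itself is straightforward to sketch: group points with a friends-of-friends pass (a KD-tree plus union-find) and take the mass fraction of the largest group. The synthetic uniform points below are a stand-in for a galaxy or halo catalogue:

```python
import numpy as np
from scipy.spatial import cKDTree

def largest_group_fraction(points, b):
    """S for linking length b: fraction of points in the largest FoF group."""
    tree = cKDTree(points)
    parent = np.arange(len(points))

    def find(i):                         # union-find with path halving
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i, j in tree.query_pairs(r=b):   # all "friend" pairs within distance b
        parent[find(i)] = find(j)        # merge their groups

    roots = np.array([find(i) for i in range(len(points))])
    _, counts = np.unique(roots, return_counts=True)
    return counts.max() / len(points)

rng = np.random.default_rng(1)
pts = rng.uniform(0, 100, size=(2000, 3))
for b in (1.0, 2.0, 3.0):                # trace out the S-b_b relation
    print(b, largest_group_fraction(pts, b))
```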

  13. Music analysis and point-set compression

    DEFF Research Database (Denmark)

    Meredith, David

    A musical analysis represents a particular way of understanding certain aspects of the structure of a piece of music. The quality of an analysis can be evaluated to some extent by the degree to which knowledge of it improves performance on tasks such as mistake spotting, memorising a piece...... as the minimum description length principle and relates closely to certain ideas in the theory of Kolmogorov complexity. Inspired by this general principle, the hypothesis explored in this paper is that the best ways of understanding (or explanations for) a piece of music are those that are represented...... by the shortest possible descriptions of the piece. With this in mind, two compression algorithms are presented, COSIATEC and SIATECCompress. Each of these algorithms takes as input an in extenso description of a piece of music as a set of points in pitch-time space representing notes. Each algorithm...

  14. Processing Terrain Point Cloud Data

    KAUST Repository

    DeVore, Ronald; Petrova, Guergana; Hielsberg, Matthew; Owens, Luke; Clack, Billy; Sood, Alok

    2013-01-01

    Terrain point cloud data are typically acquired through some form of Light Detection And Ranging sensing. They form a rich resource that is important in a variety of applications including navigation, line of sight, and terrain visualization

  15. A Combined Control Chart for Identifying Out–Of–Control Points in Multivariate Processes

    Directory of Open Access Journals (Sweden)

    Marroquín–Prado E.

    2010-10-01

    Full Text Available The Hotelling's T2 control chart is widely used to identify out-of-control signals in multivariate processes. However, this chart is not sensitive to small shifts in the process mean vector. In this work we propose a control chart to identify out-of-control signals. The proposed chart is a combination of the Hotelling's T2 chart, the M chart proposed by Hayter et al. (1994), and a new chart based on principal components. The combination of these charts identifies any type and size of change in the process mean vector. Using simulation and the Average Run Length (ARL), the performance of the proposed control chart is evaluated. The ARL is the average number of points within control before an out-of-control point is detected. The results of the simulation show that the proposed chart is more sensitive than each of the three charts individually.
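
    For reference, here is a minimal sketch of the Hotelling T^2 statistic that the combined chart builds on, with a phase-I control limit from the beta distribution. The M chart and the principal-component chart from the paper are not reproduced; the data are synthetic:

```python
import numpy as np
from scipy import stats

def hotelling_t2(X):
    """Phase-I T^2 values for an (n x p) sample matrix:
    T^2_i = (x_i - xbar)' S^{-1} (x_i - xbar)."""
    xbar = X.mean(axis=0)
    S_inv = np.linalg.inv(np.cov(X, rowvar=False))
    d = X - xbar
    return np.einsum("ij,jk,ik->i", d, S_inv, d)

rng = np.random.default_rng(0)
X = rng.multivariate_normal([0.0, 0.0, 0.0], np.eye(3), size=200)
t2 = hotelling_t2(X)

# Phase-I upper control limit: T^2 ~ ((n-1)^2 / n) * Beta(p/2, (n-p-1)/2).
n, p = X.shape
ucl = (n - 1) ** 2 / n * stats.beta.ppf(0.9973, p / 2, (n - p - 1) / 2)
print(np.where(t2 > ucl)[0])   # indices flagged as out-of-control points
```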

  16. The implementation of a Hazard Analysis and Critical Control Point management system in a peanut butter ice cream plant.

    Science.gov (United States)

    Hung, Yu-Ting; Liu, Chi-Te; Peng, I-Chen; Hsu, Chin; Yu, Roch-Chui; Cheng, Kuan-Chen

    2015-09-01

    To ensure the safety of peanut butter ice cream manufacture, a Hazard Analysis and Critical Control Point (HACCP) plan has been designed and applied to the production process. Potential biological, chemical, and physical hazards in each manufacturing procedure were identified. The critical control points for the peanut butter ice cream were then determined to be the pasteurization and freezing processes. The establishment of a monitoring system, corrective actions, verification procedures, and documentation and record keeping followed to complete the HACCP program. The results of this study indicate that implementing the HACCP system in food industries can effectively enhance food safety and quality while improving production management. Copyright © 2015. Published by Elsevier B.V.

  17. The Melting Point of Palladium Using Miniature Fixed Points of Different Ceramic Materials: Part II—Analysis of Melting Curves and Long-Term Investigation

    Science.gov (United States)

    Edler, F.; Huang, K.

    2016-12-01

    Fifteen miniature fixed-point cells made of three different ceramic crucible materials (Al2O3, ZrO2, and Al2O3(86 %)+ZrO2(14 %)) were filled with pure palladium and used to calibrate type B thermocouples (Pt30 %Rh/Pt6 %Rh). A critical point when using miniature fixed points with small amounts of fixed-point material is the analysis of the melting curves, which are characterized by significant slopes during the melting process compared to the flat melting plateaus obtainable using conventional fixed-point cells. The method of the extrapolated starting point temperature using a straight-line approximation of the melting plateau was applied to analyze the melting curves. This method allowed an unambiguous determination of an electromotive force (emf) assignable as the melting temperature. The strict consideration of two constraints resulted in a unique, repeatable and objective method to determine the emf at the melting temperature within an uncertainty of about 0.1 μV. The lifetime and long-term stability of the miniature fixed points were investigated by performing more than 100 melt/freeze cycles for each crucible of the different ceramic materials. No failure of the crucibles occurred, indicating an excellent mechanical stability of the investigated miniature cells. The consequent limitation of heating rates to values below ±3.5 K·min⁻¹ above 1100 °C and the carefully and completely filled crucibles (the liquid palladium occupies the whole volume of the crucible) are the reasons for successfully preventing the crucibles from breaking. The thermal stability of the melting temperature of palladium was excellent when using the crucibles made of Al2O3(86 %)+ZrO2(14 %) and ZrO2. Emf drifts over the total duration of the long-term investigation were below a temperature equivalent of about 0.1 K-0.2 K.

  18. Imitation learning of Non-Linear Point-to-Point Robot Motions using Dirichlet Processes

    DEFF Research Database (Denmark)

    Krüger, Volker; Tikhanoff, Vadim; Natale, Lorenzo

    2012-01-01

    In this paper we discuss the use of the infinite Gaussian mixture model and Dirichlet processes for learning robot movements from demonstrations. The starting point of this work is an earlier paper where the authors learn a non-linear dynamic robot movement model from a small number of observations....... The model in that work is learned using a classical finite Gaussian mixture model (FGMM) where the Gaussian mixtures are appropriately constrained. The problem with this approach is that one needs to make a good guess for how many mixtures the FGMM should use. In this work, we generalize this approach...... our algorithm on the same data that was used in [5], where the authors use motion capture devices to record the demonstrations. As further validation we test our approach on novel data acquired on our iCub in a different demonstration scenario in which the robot is physically driven by the human
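
    The Dirichlet-process idea can be sketched with scikit-learn, whose BayesianGaussianMixture with a Dirichlet-process prior lets the data choose the effective number of components, avoiding the FGMM guess. The synthetic trajectory samples below are a stand-in for the motion-capture and iCub demonstrations:

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Fake (position, velocity) samples from a demonstrated point-to-point motion.
demos = np.vstack([rng.normal(m, 0.05, size=(50, 2)) for m in
                   [(0.0, 0.4), (0.3, 0.6), (0.7, 0.3), (1.0, 0.0)]])

dpgmm = BayesianGaussianMixture(
    n_components=10,                                    # truncation level
    weight_concentration_prior_type="dirichlet_process",
    covariance_type="full",
    random_state=0,
).fit(demos)

# Components with negligible weight are effectively pruned by the prior.
print(np.sum(dpgmm.weights_ > 0.01), "effective components")
```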

  19. Prospects for direct neutron capture measurements on s-process branching point isotopes

    Energy Technology Data Exchange (ETDEWEB)

    Guerrero, C.; Lerendegui-Marco, J.; Quesada, J.M. [Universidad de Sevilla, Dept. de Fisica Atomica, Molecular y Nuclear, Sevilla (Spain); Domingo-Pardo, C. [CSIC-Universidad de Valencia, Instituto de Fisica Corpuscular, Valencia (Spain); Kaeppeler, F. [Karlsruhe Institute of Technology, Institut fuer Kernphysik, Karlsruhe (Germany); Palomo, F.R. [Universidad de Sevilla, Dept. de Ingenieria Electronica, Sevilla (Spain); Reifarth, R. [Goethe-Universitaet Frankfurt am Main, Frankfurt am Main (Germany)

    2017-05-15

    The neutron capture cross sections of several unstable key isotopes acting as branching points in the s-process are crucial for stellar nucleosynthesis studies, but they are very challenging to measure directly due to the difficult production of sufficient sample material, the high activity of the resulting samples, and the actual (n, γ) measurement, where high neutron fluxes and effective background rejection capabilities are required. At present there are about 21 relevant s-process branching point isotopes whose cross sections could not yet be measured over the neutron energy range of interest for astrophysics. However, the situation is changing with some very recent developments and upcoming technologies. This work introduces three techniques that will change the current paradigm in the field: the use of γ-ray imaging techniques in (n, γ) experiments, the production of moderated neutron beams using high-power lasers, and double capture experiments in Maxwellian neutron beams. (orig.)

  20. Determination of the impact of RGB points cloud attribute quality on color-based segmentation process

    Directory of Open Access Journals (Sweden)

    Bartłomiej Kraszewski

    2015-06-01

    Full Text Available The article presents the results of research on the effect that the radiometric quality of point cloud RGB attributes has on color-based segmentation. In the research, a point cloud with a resolution of 5 mm, acquired with a FARO Photon 120 scanner, described a fragment of an office room, and color images were taken by various digital cameras. The images were acquired by an SLR Nikon D3X and an SLR Canon D200 integrated with the laser scanner, a compact camera Panasonic TZ-30, and a mobile phone digital camera. Color information from the images was spatially related to the point cloud in FARO Scene software. The color-based segmentation of the test data was performed with the use of a developed application named "RGB Segmentation". The application was based on the open-source Point Cloud Library (PCL) and allowed the extraction of subsets of points fulfilling the segmentation criteria from the source point cloud using the region growing method. Using the developed application, the segmentation of four tested point clouds containing different RGB attributes from various images was performed. Evaluation of the segmentation process was based on a comparison of segments acquired using the developed application with segments extracted manually by an operator. The following items were compared: the number of obtained segments, the number of correctly identified objects, and the correctness of the segmentation process. The best segmentation correctness and the most identified objects were obtained using the data with RGB attributes from the Nikon D3X images. Based on the results it was found that the quality of the RGB attributes of the point cloud had an impact only on the number of identified objects. In the case of the correctness of the segmentation, as well as its error, no apparent relationship between the quality of color information and the result of the process was found. Keywords: terrestrial laser scanning, color-based segmentation, RGB attribute, region growing method, digital images, point clouds
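
    A simplified region-growing pass in the spirit of the "RGB Segmentation" application can be sketched as follows: grow a segment from a seed point, adding spatial neighbours whose RGB attribute is close to the segment's running mean colour. The radius and colour tolerance are illustrative; the actual application is built on PCL rather than this plain-Python logic:

```python
import numpy as np
from scipy.spatial import cKDTree

def grow_region(xyz, rgb, seed, radius=0.02, color_tol=20.0):
    """Grow one segment from `seed`; xyz is (n, 3) metres, rgb is (n, 3)."""
    rgb = rgb.astype(float)
    tree = cKDTree(xyz)
    member = np.zeros(len(xyz), dtype=bool)
    member[seed] = True
    frontier = [seed]
    mean_rgb, count = rgb[seed].copy(), 1
    while frontier:
        i = frontier.pop()
        for j in tree.query_ball_point(xyz[i], r=radius):
            # Accept spatial neighbours whose colour is near the running mean.
            if not member[j] and np.linalg.norm(rgb[j] - mean_rgb) < color_tol:
                member[j] = True
                frontier.append(j)
                count += 1
                mean_rgb += (rgb[j] - mean_rgb) / count   # update mean colour
    return np.flatnonzero(member)
```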

  1. Two step estimation for Neyman-Scott point process with inhomogeneous cluster centers

    Czech Academy of Sciences Publication Activity Database

    Mrkvička, T.; Muška, Milan; Kubečka, Jan

    2014-01-01

    Roč. 24, č. 1 (2014), s. 91-100 ISSN 0960-3174 R&D Projects: GA ČR(CZ) GA206/07/1392 Institutional support: RVO:60077344 Keywords : bayesian method * clustering * inhomogeneous point process Subject RIV: EH - Ecology, Behaviour Impact factor: 1.623, year: 2014

  2. Application of hazard analysis critical control points (HACCP) to organic chemical contaminants in food.

    Science.gov (United States)

    Ropkins, K; Beck, A J

    2002-03-01

    Hazard Analysis Critical Control Points (HACCP) is a systematic approach to the identification, assessment, and control of hazards that was developed as an effective alternative to conventional end-point analysis to control food safety. It has been described as the most effective means of controlling foodborne diseases, and its application to the control of microbiological hazards has been accepted internationally. By contrast, relatively little has been reported relating to the potential use of HACCP, or HACCP-like procedures, to control chemical contaminants of food. This article presents an overview of the implementation of HACCP and discusses its application to the control of organic chemical contaminants in the food chain. Although this is likely to result in many of the advantages previously identified for microbiological HACCP, that is, more effective, efficient, and economical hazard management, a number of areas are identified that require further research and development. These include: (1) a need to refine the methods of chemical contaminant identification and risk assessment employed, (2) develop more cost-effective monitoring and control methods for routine chemical contaminant surveillance of food, and (3) improve the effectiveness of process optimization for the control of chemical contaminants in food.

  3. Characterization results and Markov chain Monte Carlo algorithms including exact simulation for some spatial point processes

    DEFF Research Database (Denmark)

    Häggström, Olle; Lieshout, Marie-Colette van; Møller, Jesper

    1999-01-01

    The area-interaction process and the continuum random-cluster model are characterized in terms of certain functional forms of their respective conditional intensities. In certain cases, these two point process models can be derived from a bivariate point process model which in many respects...... is simpler to analyse and simulate. Using this correspondence we devise a two-component Gibbs sampler, which can be used for fast and exact simulation by extending the recent ideas of Propp and Wilson. We further introduce a Swendsen-Wang type algorithm. The relevance of the results within spatial statistics...

  4. The application of prototype point processes for the summary and description of California wildfires

    Science.gov (United States)

    Nichols, K.; Schoenberg, F.P.; Keeley, J.E.; Bray, A.; Diez, D.

    2011-01-01

    A method for summarizing repeated realizations of a space-time marked point process, known as prototyping, is discussed and applied to catalogues of wildfires in California. Prototype summaries are constructed for varying time intervals using California wildfire data from 1990 to 2006. Previous work on prototypes for temporal and space-time point processes is extended here to include methods for computing prototypes with marks and the incorporation of prototype summaries into hierarchical clustering algorithms, the latter of which is used to delineate fire seasons in California. Other results include summaries of patterns in the spatial-temporal distribution of wildfires within each wildfire season. © 2011 Blackwell Publishing Ltd.

  5. Instantaneous nonlinear assessment of complex cardiovascular dynamics by Laguerre-Volterra point process models.

    Science.gov (United States)

    Valenza, Gaetano; Citi, Luca; Barbieri, Riccardo

    2013-01-01

    We report an exemplary study of instantaneous assessment of cardiovascular dynamics performed using point-process nonlinear models based on the Laguerre expansion of the linear and nonlinear Wiener-Volterra kernels. As quantifiers, instantaneous measures such as high-order spectral features and Lyapunov exponents can be estimated from a quadratic and cubic autoregressive formulation of the model's first order moment, respectively. Here, these measures are evaluated on heartbeat series from 16 healthy subjects and 14 patients with Congestive Heart Failure (CHF). Data were gathered from the on-line repository PhysioBank, which has been taken as a landmark for testing nonlinear indices. Results show that the proposed nonlinear Laguerre-Volterra point-process methods are able to track the nonlinear and complex cardiovascular dynamics, distinguishing significantly between CHF and healthy heartbeat series.

  6. Students’ Algebraic Thinking Process in Context of Point and Line Properties

    Science.gov (United States)

    Nurrahmi, H.; Suryadi, D.; Fatimah, S.

    2017-09-01

    Learning of school algebra is limited to symbols and operating procedures, so students are able to work on problems that only require the ability to operate with symbols but are unable to generalize a pattern as part of algebraic thinking. The purpose of this study is to create a didactic design that facilitates students' algebraic thinking process through the generalization of patterns, especially in the context of the properties of points and lines. This study used a qualitative method and Didactical Design Research (DDR). The result is that students are able to make factual, contextual, and symbolic generalizations. This happens because the generalization arises based on facts in local terms; the generalization then produces an algebraic formula that is described in the context and perspective of each student. After that, the formula uses algebraic letter symbols derived from the students' own language. It can be concluded that the design has facilitated students' algebraic thinking process through the generalization of patterns, especially in the context of the properties of points and lines. The impact of this study is that this design can be used as an alternative teaching material in the learning of school algebra.

  7. Fast covariance estimation for innovations computed from a spatial Gibbs point process

    DEFF Research Database (Denmark)

    Coeurjolly, Jean-Francois; Rubak, Ege

    In this paper, we derive an exact formula for the covariance of two innovations computed from a spatial Gibbs point process and suggest a fast method for estimating this covariance. We show how this methodology can be used to estimate the asymptotic covariance matrix of the maximum pseudo...

  8. Analysis of tree stand horizontal structure using random point field methods

    Directory of Open Access Journals (Sweden)

    O. P. Sekretenko

    2015-06-01

    Full Text Available This paper uses the model approach to analyze the horizontal structure of forest stands. The main types of models of random point fields and statistical procedures that can be used to analyze spatial patterns of trees of uneven and even-aged stands are described. We show how modern methods of spatial statistics can be used to address one of the objectives of forestry – to clarify the laws of natural thinning of forest stand and the corresponding changes in its spatial structure over time. Studying natural forest thinning, we describe the consecutive stages of modeling: selection of the appropriate parametric model, parameter estimation and generation of point patterns in accordance with the selected model, the selection of statistical functions to describe the horizontal structure of forest stands and testing of statistical hypotheses. We show the possibilities of a specialized software package, spatstat, which is designed to meet the challenges of spatial statistics and provides software support for modern methods of analysis of spatial data. We show that a model of stand thinning that does not consider inter-tree interaction can project the size distribution of the trees properly, but the spatial pattern of the modeled stand is not quite consistent with observed data. Using data of three even-aged pine forest stands of 25, 55, and 90-years old, we demonstrate that the spatial point process models are useful for combining measurements in the forest stands of different ages to study the forest stand natural thinning.

  9. Steam generators secondary side chemical cleaning at Point Lepreau using the Siemen's high temperature process

    International Nuclear Information System (INIS)

    Verma, K.; MacNeil, C.; Odar, S.

    1996-01-01

    The secondary sides of all four steam generators at the Point Lepreau Nuclear Generating Station were cleaned during the 1995 annual outage run-down using the Siemens high temperature chemical cleaning process. Traditionally, all secondary side chemical cleaning exercises in CANDU as well as other nuclear power stations in North America have been conducted using a process developed in conjunction with the Electric Power Research Institute (EPRI). The Siemens high temperature process was applied for the first time in North America at the Point Lepreau Nuclear Generating Station (PLGS). The paper discusses experiences related to the pre- and post-award chemical cleaning activities, the chemical cleaning application, post-cleaning inspection results and waste handling activities. (author)

  10. A generic Transcriptomics Reporting Framework (TRF) for 'omics data processing and analysis.

    Science.gov (United States)

    Gant, Timothy W; Sauer, Ursula G; Zhang, Shu-Dong; Chorley, Brian N; Hackermüller, Jörg; Perdichizzi, Stefania; Tollefsen, Knut E; van Ravenzwaay, Ben; Yauk, Carole; Tong, Weida; Poole, Alan

    2017-12-01

    A generic Transcriptomics Reporting Framework (TRF) is presented that lists parameters that should be reported in 'omics studies used in a regulatory context. The TRF encompasses the processes of transcriptome profiling from data generation to a processed list of differentially expressed genes (DEGs) ready for interpretation. Included within the TRF is a reference baseline analysis (RBA) that encompasses raw data selection; data normalisation; recognition of outliers; and statistical analysis. The TRF itself does not dictate the methodology for data processing, but deals with what should be reported. Its principles are also applicable to sequencing data and other 'omics. In contrast, the RBA specifies a simple data processing and analysis methodology that is designed to provide a comparison point for other approaches and is exemplified here by a case study. By providing transparency on the steps applied during 'omics data processing and analysis, the TRF will increase confidence in the processing of 'omics data and in their regulatory use. Applicability of the TRF is ensured by its simplicity and generality. The TRF can be applied to all types of regulatory 'omics studies, and it can be executed using different commonly available software tools. Crown Copyright © 2017. Published by Elsevier Inc. All rights reserved.

  11. Screw compressor analysis from a vibration point-of-view

    Science.gov (United States)

    Hübel, D.; Žitek, P.

    2017-09-01

    Vibrations are a very typical feature of all compressors and are given great attention in the industry. The reason for this interest is primarily the negative influence that it can have on both the operating staff and the entire machine's service life. The purpose of this work is to describe the methodology of screw compressor analysis from a vibration point-of-view. This analysis is an essential part of the design of vibro-diagnostics of screw compressors with regard to their service life.

  12. Nuclear binding around the RP-process waiting points $^{68}$Se and $^{72}$Kr

    CERN Multimedia

    2002-01-01

    Encouraged by the success of mass determinations of nuclei close to the Z=N line performed at ISOLTRAP during the year 2000 and of the recent decay spectroscopy studies on neutron-deficient Kr isotopes (IS351 collaboration), we aim to measure masses and proton separation energies of the bottleneck nuclei defining the flow of the astrophysical rp-process beyond A$\sim$70. In detail, the program includes mass measurements of the rp-process waiting point nuclei $^{68}$Se and $^{72}$Kr and determination of proton separation energies of the proton-unbound $^{69}$Br and $^{73}$Rb via $\beta$-decays of $^{69}$Kr and $^{73}$Sr, respectively. The aim of the project is to complete the experimental database for astrophysical network calculations and for the liquid-drop type of mass models typically used in the modelling of the astrophysical rp process in the region. The first beamtime is scheduled for August 2001 and the aim is to measure the absolute mass of the waiting-point nucleus $^{72}$Kr.

  13. What's the point? Golden and Labrador retrievers living in kennels do not understand human pointing gestures.

    Science.gov (United States)

    D'Aniello, Biagio; Alterisio, Alessandra; Scandurra, Anna; Petremolo, Emanuele; Iommelli, Maria Rosaria; Aria, Massimo

    2017-07-01

    In many studies that have investigated whether dogs' capacities to understand human pointing gestures are aspects of evolutionary or developmental social competences, family-owned dogs have been compared to shelter dogs. However, for most of these studies, the origins of the shelter dogs were unknown. Some shelter dogs may have lived with families before entering shelters, and from these past experiences, they may have learned to understand human gestures. Furthermore, there is substantial variation in the methodology and analytic approaches used in such studies (e.g. different pointing protocols, different treatment of trials with no-choice response and indoor vs. outdoor experimental arenas). Such differences in the methodologies and analysis techniques used make it difficult to compare results obtained from different studies and may account for the divergent results obtained. We thus attempted to control for several parameters by carrying out a test on dynamic proximal and distal pointing. We studied eleven kennel dogs of known origin that were born and raised in a kennel with limited human interaction. This group was compared to a group of eleven dogs, comparable in terms of breed, sex and age, that had lived with human families since they were puppies. Our results demonstrate that pet dogs outperform kennel dogs in their comprehension of proximal and distal pointing, regardless of whether trials where no choice was made were considered as errors or were excluded from the statistical analysis, meaning that dogs living in kennels do not understand pointing gestures. Even if the genetic effects of the domestication process on human-dog relationships cannot be considered negligible, our data suggest that dogs need to learn human pointing gestures, and thus underscore the importance of ontogenetic processes.

  14. ON THE ESTIMATION OF DISTANCE DISTRIBUTION FUNCTIONS FOR POINT PROCESSES AND RANDOM SETS

    Directory of Open Access Journals (Sweden)

    Dietrich Stoyan

    2011-05-01

    Full Text Available This paper discusses various estimators for the nearest neighbour distance distribution function D of a stationary point process and for the quadratic contact distribution function Hq of a stationary random closed set. It recommends the use of Hanisch's estimator of D, which is of Horvitz-Thompson type, and the minus-sampling estimator of Hq. This recommendation is based on simulations for Poisson processes and Boolean models.
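
    A minus-sampling (border-corrected) estimator of D(r) is easy to sketch: only points farther than r from the window boundary contribute, which removes the bias from neighbours that fall outside the observation window. The unit-square window and Poisson-like test pattern are assumptions of this illustration; Hanisch's estimator refines the same idea:

```python
import numpy as np
from scipy.spatial import cKDTree

def nn_dist_minus_sampling(points, r, window=(0.0, 1.0)):
    """Minus-sampling estimate of D(r) in a square window [lo, hi]^2."""
    lo, hi = window
    tree = cKDTree(points)
    d, _ = tree.query(points, k=2)                 # k=2: nearest neighbour, not self
    nn = d[:, 1]
    border = np.minimum(points - lo, hi - points).min(axis=1)
    eligible = border >= r                         # erode the window by r
    return np.mean(nn[eligible] <= r) if eligible.any() else np.nan

rng = np.random.default_rng(0)
pts = rng.uniform(size=(500, 2))                   # binomial (Poisson-like) pattern
print([round(nn_dist_minus_sampling(pts, r), 3) for r in (0.02, 0.05, 0.1)])
```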

  15. A random point process model for the score in sport matches

    Czech Academy of Sciences Publication Activity Database

    Volf, Petr

    2009-01-01

    Roč. 20, č. 2 (2009), s. 121-131 ISSN 1471-678X R&D Projects: GA AV ČR(CZ) IAA101120604 Institutional research plan: CEZ:AV0Z10750506 Keywords : sport statistics * scoring intensity * Cox’s regression model Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2009/SI/volf-a random point process model for the score in sport matches.pdf

  16. Bayesian analysis of log Gaussian Cox processes for disease mapping

    DEFF Research Database (Denmark)

    Benes, Viktor; Bodlák, Karel; Møller, Jesper

    We consider a data set of locations where people in Central Bohemia have been infected by tick-borne encephalitis, and where population census data and covariates concerning vegetation and altitude are available. The aims are to estimate the risk map of the disease and to study the dependence...... of the risk on the covariates. Instead of using the common area-level approaches we consider a Bayesian analysis for a log Gaussian Cox point process with covariates. Posterior characteristics for a discretized version of the log Gaussian Cox process are computed using Markov chain Monte Carlo methods...

  17. Dual keel Space Station payload pointing system design and analysis feasibility study

    Science.gov (United States)

    Smagala, Tom; Class, Brian F.; Bauer, Frank H.; Lebair, Deborah A.

    1988-01-01

    A Space Station attached Payload Pointing System (PPS) has been designed and analyzed. The PPS is responsible for maintaining fixed payload pointing in the presence of disturbances applied to the Space Station. The payload considered in this analysis is the Solar Optical Telescope. System performance is evaluated via digital time simulations by applying various disturbance forces to the Space Station. The PPS meets the Space Station articulated pointing requirement for all disturbances except Shuttle docking and some centrifuge cases.

  18. Numerical analysis of slowest heating or cooling point in a canned food in oil

    Energy Technology Data Exchange (ETDEWEB)

    Hanzawa, T.; Wang, Q.; Suzuki, M.; Sakai, N. [Tokyo Univ. of Fisheries (Japan)

    1998-06-01

    In the sterilizing process of canned food in oil for fish meat such as tunny, the slowest heating or cooling point is very important for the thermal process determination of the can. To obtain the slowest point, the temperature profiles in the solid food are estimated by numerical calculation from the fundamental equations at unsteady state, taking into account free convection in the space occupied by the oil. The positions of the slowest heating or cooling point in the canned food in oil are obtained accurately, and a correlative equation for the position is obtained numerically under various operating conditions. The calculated temperature profiles and the positions of both slowest points are in sufficiently good agreement with the experimental ones. 4 refs., 9 figs.

  19. [Design of a Hazard Analysis and Critical Control Points (HACCP) plan to assure the safety of a bologna product produced by a meat processing plant].

    Science.gov (United States)

    Bou Rached, Lizet; Ascanio, Norelis; Hernández, Pilar

    2004-03-01

    The Hazard Analysis and Critical Control Point (HACCP) system is a systematic, integral program used to identify and estimate the hazards (microbiological, chemical and physical) and the risks generated during the primary production, processing, storage, distribution, sale and consumption of foods. Establishing an HACCP program has several advantages, among them: it emphasizes prevention rather than detection, diminishes costs, minimizes the risk of manufacturing faulty products, allows greater management confidence, and strengthens national and international competitiveness. The present work is a proposal based on the design of an HACCP program to guarantee the safety of the Special Type Bologna elaborated by a meat products plant, through the determination of hazards (microbiological, chemical or physical), the identification of critical control points (CCPs), the establishment of critical limits, the planning of corrective actions, and the establishment of documentation and verification procedures. The methodology used was based on the application of the seven basic principles set down by the Codex Alimentarius, resulting in the design of this program. In view of the fact that meat products have recently been linked with pathogens such as E. coli O157:H7 and Listeria monocytogenes, these were considered as microbiological hazards in establishing the HACCP plan, whose application will guarantee a safe product.

  20. Accessibility analysis in manufacturing processes using visibility cones

    Institute of Scientific and Technical Information of China (English)

    尹周平; 丁汉; 熊有伦

    2002-01-01

    Accessibility is an important design feature of products, and accessibility analysis has been acknowledged as a powerful tool for solving computational manufacturing problems arising from different manufacturing processes. After exploring the relations among approachability, accessibility and visibility, a general method for accessibility analysis using visibility cones (VC) is proposed. With the definition of the VC of a point, three kinds of visibility of a feature, namely the complete visibility cone (CVC), the partial visibility cone (PVC) and the local visibility cone (LVC), are defined. A novel approach to computing VCs is formulated by identifying C-obstacles in the C-space, for which a general and efficient algorithm is proposed and implemented by making use of visibility culling. Lastly, we discuss briefly how to realize accessibility analysis in numerically controlled (NC) machining planning, coordinate measuring machine (CMM) inspection planning and assembly sequence planning with the proposed methods.

  1. Systematic Assessment of Attenuated Total Reflectance-Fourier Transform Infrared Spectroscopy Coupled with Multivariate Analysis for Forensic Analysis of Black Ball-point Pen Inks

    International Nuclear Information System (INIS)

    Lee, L.C.; Mohamed Rozali Othman; Pua, H.; Lee, L.C.

    2012-01-01

    This manuscript aims to provide a new and non-destructive method for the systematic analysis of inks on a questioned document. Ink samples were analyzed in situ on the paper substrate by micro-ATR-FTIR spectroscopy, and the data obtained were processed and evaluated by a series of multivariate chemometric techniques. Absorbance values from wavenumbers of 2000-675 cm⁻¹ were first processed by cluster analysis (CA), followed by principal component analysis (PCA), to form a set of new variables. Subsequently, the variable set was used for the classification, differentiation and identification of 155 sample pens comprising nine different brands. Results show that the nine black ball-point pen brands could be classified into three main groups via discriminant analysis (DA). Differentiation analyses of the nine pen brands performed using one-way ANOVA indicated that only two pairs of brands could not be differentiated at the 95 % confidence interval. Finally, an identification flow chart was proposed to determine the brand of unknown pen inks. In conclusion, the proposed method for extracting and creating a new variable set from the infrared spectrum was evaluated to be satisfactory for the systematic analysis of inks based on their infrared spectra. (author)
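
    The chemometric pipeline described above (dimensionality reduction on the absorbance matrix followed by discriminant analysis of the brands) can be sketched with scikit-learn. The random spectra below are placeholders for the 155 ink samples, and the component count is an assumption:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(155, 660))       # absorbance over 2000-675 cm^-1 (placeholder)
y = rng.integers(0, 9, size=155)      # nine black ball-point pen brands (placeholder)

# PCA forms the new variable set; discriminant analysis classifies the brands.
model = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
model.fit(X, y)
print(model.score(X, y))              # resubstitution classification accuracy
```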

  2. Badge Office Process Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Haurykiewicz, John Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Dinehart, Timothy Grant [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Parker, Robert Young [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-12

    The purpose of this process analysis was to analyze the Badge Offices’ current processes from a systems perspective and consider ways of pursuing objectives set forth by SEC-PS, namely increased customer flow (throughput) and reduced customer wait times. Information for the analysis was gathered for the project primarily through Badge Office Subject Matter Experts (SMEs), and in-person observation of prevailing processes. Using the information gathered, a process simulation model was constructed to represent current operations and allow assessment of potential process changes relative to factors mentioned previously. The overall purpose of the analysis was to provide SEC-PS management with information and recommendations to serve as a basis for additional focused study and areas for potential process improvements in the future.

  3. [Multiple time scales analysis of spatial differentiation characteristics of non-point source nitrogen loss within watershed].

    Science.gov (United States)

    Liu, Mei-bing; Chen, Xing-wei; Chen, Ying

    2015-07-01

    Identification of the critical source areas of non-point source pollution is an important means to control non-point source pollution within a watershed. In order to further reveal the impact of multiple time scales on the spatial differentiation characteristics of non-point source nitrogen loss, a SWAT model of the Shanmei Reservoir watershed was developed. Based on the simulation of the total nitrogen (TN) loss intensity of all 38 subbasins, the spatial distribution characteristics of nitrogen loss and the critical source areas were analyzed at three time scales: yearly average, monthly average and rainstorm flood process. Furthermore, multiple linear correlation analysis was conducted to analyze the contributions of the natural environment and anthropogenic disturbance to nitrogen loss. The results showed that there were significant spatial differences in TN loss in the Shanmei Reservoir watershed at different time scales, and the degree of spatial differentiation of nitrogen loss was in the order monthly average > yearly average > rainstorm flood process. TN loss load mainly came from the upland Taoxi subbasin, which was identified as the critical source area. At different time scales, land use types (such as farmland and forest) were always the dominant factor affecting the spatial distribution of nitrogen loss, while precipitation and runoff affected nitrogen loss only in months without fertilization and in several storm flood processes on non-fertilization dates. This was mainly due to the significant spatial variation of land use and fertilization, as well as the low spatial variability of precipitation and runoff.

  4. Vibration analysis and vibration damage assessment in nuclear and process equipment

    International Nuclear Information System (INIS)

    Pettigrew, M.J.; Taylor, C.E.; Fisher, N.J.; Yetisir, M.; Smith, B.A.W.

    1997-01-01

    Component failures due to excessive flow-induced vibration are still affecting the performance and reliability of process and nuclear components. The purpose of this paper is to discuss flow-induced vibration analysis and vibration damage prediction. Vibration excitation mechanisms are described with particular emphasis on fluidelastic instability. The dynamic characteristics of process and power equipment are explained. The statistical nature of some parameters, in particular support conditions, is discussed. The prediction of fretting-wear damage is approached from several points of view. An energy approach to formulating fretting-wear damage is proposed. (author)

  5. Implementation of Steiner point of fuzzy set.

    Science.gov (United States)

    Liang, Jiuzhen; Wang, Dejiang

    2014-01-01

    This paper deals with the implementation of the Steiner point of a fuzzy set. Some definitions and properties of the Steiner point are investigated and extended to fuzzy sets. The paper focuses on establishing efficient methods to compute the Steiner point of a fuzzy set, and two strategies are proposed. One is a linear combination of the Steiner points computed from a series of crisp α-cut sets of the fuzzy set. The other is an approximate method, which tries to find the optimal α-cut set approaching the fuzzy set. The stability of the Steiner point of a fuzzy set is also studied. Some experiments on image processing are given, in which the two methods are applied to implement the Steiner point of a fuzzy image; both strategies show their own advantages in computing the Steiner point of a fuzzy set.
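
    The first strategy can be illustrated with a small sketch: compute the Steiner point of each crisp α-cut from its support function (s(K) = (1/π) ∫ over the unit circle of h_K(u) u du in the plane), then combine the cut-level Steiner points linearly. The fuzzy set, its α-cuts, and the combination weights are assumptions of this illustration:

```python
import numpy as np

def steiner_point_2d(points, n_dirs=360):
    """Steiner point of the convex hull of `points` via the support function."""
    theta = np.linspace(0, 2 * np.pi, n_dirs, endpoint=False)
    u = np.stack([np.cos(theta), np.sin(theta)], axis=1)   # unit directions
    h = (points @ u.T).max(axis=0)                          # h_K(u) per direction
    # Discretized (1/pi) * integral of h_K(u) * u over the unit circle.
    return (h[:, None] * u).sum(axis=0) * (2 * np.pi / n_dirs) / np.pi

# A fuzzy set represented by nested alpha-cuts (placeholder point clusters).
rng = np.random.default_rng(0)
base = rng.uniform(-1, 1, size=(200, 2))
alphas = np.array([0.25, 0.5, 0.75, 1.0])
cuts = [base * (1.1 - a) for a in alphas]   # higher alpha -> smaller cut set

weights = alphas / alphas.sum()             # assumed combination weights
s = sum(w * steiner_point_2d(c) for w, c in zip(weights, cuts))
print(s)                                     # combined Steiner point of the fuzzy set
```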

  6. The Hinkley Point decision: An analysis of the policy process

    International Nuclear Information System (INIS)

    Thomas, Stephen

    2016-01-01

    In 2006, the British government launched a policy to build nuclear power reactors based on a claim that the power produced would be competitive with fossil fuel and would require no public subsidy. A decade later, it is not clear how many, if any, orders will be placed and the claims on costs and subsidies have proved false. Despite this failure to deliver, the policy is still being pursued with undiminished determination. The finance model that is now proposed is seen as a model other European countries can follow so the success or otherwise of the British nuclear programme will have implications outside the UK. This paper contends that the checks and balances that should weed out misguided policies, have failed. It argues that the most serious failure is with the civil service and its inability to provide politicians with high quality advice – truth to power. It concludes that the failure is likely to be due to the unwillingness of politicians to listen to opinions that conflict with their beliefs. Other weaknesses include the lack of energy expertise in the media, the unwillingness of the public to engage in the policy process and the impotence of Parliamentary Committees. - Highlights: •Britain's nuclear power policy is failing due to high costs and problems of finance. •This has implications for European countries who want to use the same financing model. •The continued pursuit of a failing policy is due to poor advice from civil servants. •Lack of expertise in the media and lack of public engagement have contributed. •Parliamentary processes have not provided proper critical scrutiny.

  7. Feedwater line break accident analysis for SMART in the view point of minimum departure from nucleate boiling ratio

    International Nuclear Information System (INIS)

    Kim Soo Hyoung; Bae, Kyoo Hwan; Chung, Young Jong; Kim, Keung Koo

    2012-01-01

    The KAERI and KEPCO consortium performed the standard design of SMART (System-integrated Modular Advanced ReacTor) from 2009 to 2011 and obtained standard design approval in July 2012. To confirm the safety of the SMART design, all of the safety-related design basis events were analyzed. A feedwater line break (FLB) is a postulated accident and is a limiting accident for a decrease in heat removal by the secondary system from the viewpoint of peak RCS pressure. It is well known that the departure from nucleate boiling ratio (DNBR) increases with increasing system pressure for conventional nuclear power plants. However, SMART has a comparatively low RCS flow rate, so it may show different DNBR behavior depending on the system pressure. To confirm that SMART is safe in case of an FLB accident, the Korean nuclear regulatory body required a safety analysis from the viewpoint of minimum DNBR (MDNBR) during the licensing review process for standard design approval (SDA) of the SMART design. In this paper, the safety analysis results of the FLB accident for SMART from the viewpoint of MDNBR are described

  8. Feedwater line break accident analysis for SMART in the view point of minimum departure from nucleate boiling ratio

    Energy Technology Data Exchange (ETDEWEB)

    Kim Soo Hyoung; Bae, Kyoo Hwan; Chung, Young Jong; Kim, Keung Koo [KAERI, Daejeon (Korea, Republic of)

    2012-10-15

    The KAERI and KEPCO consortium performed the standard design of SMART (System-integrated Modular Advanced ReacTor) from 2009 to 2011 and obtained standard design approval in July 2012. To confirm the safety of the SMART design, all of the safety-related design basis events were analyzed. A feedwater line break (FLB) is a postulated accident and is a limiting accident for a decrease in heat removal by the secondary system from the viewpoint of peak RCS pressure. It is well known that the departure from nucleate boiling ratio (DNBR) increases with increasing system pressure for conventional nuclear power plants. However, SMART has a comparatively low RCS flow rate, so it may show different DNBR behavior depending on the system pressure. To confirm that SMART is safe in case of an FLB accident, the Korean nuclear regulatory body required a safety analysis from the viewpoint of minimum DNBR (MDNBR) during the licensing review process for standard design approval (SDA) of the SMART design. In this paper, the safety analysis results of the FLB accident for SMART from the viewpoint of MDNBR are described.

  9. Microchip capillary electrophoresis for point-of-care analysis of lithium

    NARCIS (Netherlands)

    Vrouwe, E.X.; Luttge, R.; Vermes, I.; Berg, van den A.

    2007-01-01

    Background: Microchip capillary electrophoresis (CE) is a promising method for chemical analysis of complex samples such as whole blood. We evaluated the method for point-of-care testing of lithium. Methods: Chemical separation was performed on standard glass microchip CE devices with a conductivity

  10. Environmental protection standards - from the point of view of systems analysis

    Energy Technology Data Exchange (ETDEWEB)

    Becker, K

    1978-11-01

    A project of the International Institute of Applied Systems Analysis (IIASA) in Laxenburg castle near Vienna is reviewed where standards for environmental protection are interpreted from the point of view of systems analysis. Some examples are given to show how results are influenced not only by technical and economic factors but also by psychological and political factors.

  11. Environmental protection standards - from the point of view of systems analysis

    International Nuclear Information System (INIS)

    Becker, K.

    1978-01-01

    A project of the International Institute of Applied Systems Analysis (IIASA) in Laxenburg castle near Vienna is reviewed where standards for environmental protection are interpreted from the point of view of systems analysis. Some examples are given to show how results are influenced not only by technical and economic factors but also by psychological and political factors. (orig.) [de

  12. Probabilistic safety assessment and optimal control of hazardous technological systems. A marked point process approach

    International Nuclear Information System (INIS)

    Holmberg, J.

    1997-04-01

    The thesis models risk management as an optimal control problem for a stochastic process. The approach classes the decisions made by management into three categories according to the control methods of a point process: (1) planned process lifetime, (2) modification of the design, and (3) operational decisions. The approach is used for optimization of plant shutdown criteria and surveillance test strategies of a hypothetical nuclear power plant

  13. The Hazard Analysis and Critical Control Points (HACCP) generic model for the production of Thai fermented pork sausage (Nham).

    Science.gov (United States)

    Paukatong, K V; Kunawasen, S

    2001-01-01

    Nham is a traditional Thai fermented pork sausage. The major ingredients of Nham are ground pork meat and shredded pork rind. Nham has been reported to be contaminated with Salmonella spp., Staphylococcus aureus, and Listeria monocytogenes. Therefore, it is a potential cause of foodborne diseases for consumers. A Hazard Analysis and Critical Control Points (HACCP) generic model has been developed for the Nham process. Nham processing plants were observed and a generic flow diagram of Nham processes was constructed. Hazard analysis was then conducted. In addition to the microbial hazards posed by the pathogens previously found in Nham, sodium nitrite and metal were identified as chemical and physical hazards in this product, respectively. Four steps in the Nham process have been identified as critical control points. These steps are the weighing of the nitrite compound, stuffing, fermentation, and labeling. The chemical hazard of nitrite must be controlled during the weighing step. The critical limit of nitrite levels in the Nham mixture has been set at 100-200 ppm. This level is high enough to control Clostridium botulinum but does not cause chemical hazards to the consumer. The physical hazard from metal clips could be prevented by visual inspection of every Nham product during stuffing. The microbiological hazard in Nham could be reduced in the fermentation process. The critical limit of the pH of Nham was set at lower than 4.6. Finally, since this product is not cooked during processing, educating the consumer by providing information on the label, such as "safe if cooked before consumption", could be an alternative way to prevent the microbiological hazards of this product.

  14. Homotopy analysis solutions of point kinetics equations with one delayed precursor group

    International Nuclear Information System (INIS)

    Zhu Qian; Luo Lei; Chen Zhiyun; Li Haofeng

    2010-01-01

    The homotopy analysis method is proposed to obtain series solutions of nonlinear differential equations. The method was applied to the point kinetics equations with one delayed precursor group; analytic solutions were obtained and the algorithm was analysed. The results show that the computation time and precision of the algorithm meet engineering requirements. (authors)
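
    The homotopy series construction itself is lengthy, but a numerical reference solution of the same one-delayed-group point kinetics equations, dn/dt = ((ρ − β)/Λ)n + λC and dC/dt = (β/Λ)n − λC, is easy to sketch. The following Python snippet integrates them with illustrative parameter values, not those of the paper:

        import numpy as np
        from scipy.integrate import solve_ivp

        # One-delayed-group point kinetics; parameter values are illustrative.
        beta, lam, Lambda, rho = 0.0065, 0.08, 1e-4, 0.002  # step reactivity rho

        def rhs(t, y):
            n, c = y
            dn = (rho - beta) / Lambda * n + lam * c
            dc = beta / Lambda * n - lam * c
            return [dn, dc]

        # Start from equilibrium at n = 1: C0 = beta * n / (Lambda * lam).
        y0 = [1.0, beta / (Lambda * lam)]
        sol = solve_ivp(rhs, (0.0, 10.0), y0, method="LSODA",
                        rtol=1e-8, atol=1e-10)  # stiff system
        print("neutron density at t = 10 s:", sol.y[0, -1])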

  15. Numerical analysis of sandwich beam with corrugated core under three-point bending

    Energy Technology Data Exchange (ETDEWEB)

    Wittenbeck, Leszek [Poznan University of Technology, Institute of Mathematics Piotrowo Street No. 5, 60-965 Poznan (Poland); Grygorowicz, Magdalena; Paczos, Piotr [Poznan University of Technology, Institute of Applied Mechanics Jana Pawla II Street No. 24, 60-965 Poznan (Poland)

    2015-03-10

    The strength problem of a sandwich beam with corrugated core under three-point bending is presented. The beam is made of steel and formed from three mutually orthogonal corrugated layers. The finite element analysis (FEA) of the sandwich beam is performed with the use of the FEM system ABAQUS. The relationship between the applied load and deflection in three-point bending is considered.

  16. The process of life-cycle cost analysis on the Fernald Environmental Management Project

    International Nuclear Information System (INIS)

    Chang, D.Y.; Jacoboski, J.A.; Fisher, L.A.; Beirne, P.J.

    1993-01-01

    The Estimating Services Department of the Fernald Environmental Restoration Management Corporation (FERMCO) is formalizing the process of life-cycle cost analysis (LCCA) for the Fernald Environmental Management Project (FEMP). The LCCA process is based on the concepts, principles, and guidelines described by applicable Department of Energy's (DOE) orders, pertinent published literature, and the National Bureau of Standards handbook 135. LCC analyses will be performed following a ten-step process on the FEMP at the earliest possible decision point to support the selection of the least-cost alternatives for achieving the FERMCO mission

  17. Determine point-to-point networking interactions using regular expressions

    Directory of Open Access Journals (Sweden)

    Konstantin S. Deev

    2015-06-01

    Full Text Available As the Internet grows and becomes more popular, the number of concurrent data flows increases, and so does the bandwidth they demand. Providers and corporate customers need the ability to identify point-to-point interactions. The best approach is to use dedicated software and hardware implementations that distribute the load internally, using the principles and approaches described in this paper. This paper presents the principles of building a system that searches for regular expression matches using computation on a graphics adapter in a server station. The significant computing power and parallel-execution capability of a modern graphics processor allow inspection of large amounts of data against sets of rules. Using these characteristics can increase computing power by a factor of 30-40 compared to the same setup on a central processing unit. The potential increase in bandwidth capacity could be used in systems that provide packet analysis, firewalls and network anomaly detectors.
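
    A minimal CPU-side sketch of the rule-matching idea, in Python: each flow payload is tested against a set of compiled regular expressions. The rules and flows below are hypothetical; the paper's contribution is mapping exactly this inner matching loop onto a graphics processor.

        import re

        # Hypothetical rule set: each rule pairs a name with a compiled
        # regular expression matched against reassembled flow payloads.
        rules = {
            "http_get": re.compile(rb"^GET\s+\S+\s+HTTP/1\.[01]"),
            "smtp_helo": re.compile(rb"^(HELO|EHLO)\s+\S+"),
        }

        flows = [
            (("10.0.0.1", 51324), ("93.184.216.34", 80),
             b"GET /index.html HTTP/1.1\r\n"),
            (("10.0.0.2", 40211), ("192.0.2.25", 25),
             b"EHLO mail.example.com\r\n"),
        ]

        # Identify point-to-point interactions by matching every rule against
        # every payload; on a GPU this inner loop is what gets parallelised.
        for src, dst, payload in flows:
            hits = [name for name, rx in rules.items() if rx.search(payload)]
            print(src, "->", dst, "matched:", hits or "none")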

  18. Throughput analysis of point-to-multi-point hybrid FSO/RF network

    KAUST Repository

    Rakia, Tamer; Gebali, Fayez; Yang, Hong-Chuan; Alouini, Mohamed-Slim

    2017-01-01

    This paper presents and analyzes a point-to-multi-point (P2MP) network that uses a number of free-space optical (FSO) links for data transmission from the central node to the different remote nodes. A common backup radio-frequency (RF) link is used

  19. Powder stickiness in milk drying: uncertainty and sensitivity analysis for process understanding

    DEFF Research Database (Denmark)

    Ferrari, Adrián; Gutiérrez, Soledad; Sin, Gürkan

    2017-01-01

    A powder stickiness model based on the glass transition temperature (Gordon–Taylor equations) was built for a production-scale milk drying process (including a spray chamber and internal/external fluid beds). To help process understanding, the model was subjected to sensitivity analysis (SA... for nonlinear error propagation was selected as the main UA approach. SA results show an important local sensitivity on the spray dryer, but at the end of the internal fluid bed (critical point for stickiness) minor local sensitivities were observed. Feed concentrate moisture was found to be the input with the major global sensitivity on the glass transition temperature at the critical point, so it could represent a key variable for helping with stickiness control. UA results show the major model prediction uncertainty on the spray dryer, but it does not represent a stickiness issue since the product...
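
    For readers unfamiliar with the Gordon–Taylor relation underlying the stickiness model, a minimal sketch follows: the glass transition temperature of a solids-water mixture is Tg = (w_s·Tg_s + k·w_w·Tg_w) / (w_s + k·w_w). The constants below are of the order reported in the literature for lactose-water systems and are placeholders, not the values used in the study.

        def gordon_taylor(w_solids, tg_solids, tg_water, k):
            """Glass transition temperature (K) of a solids-water mixture."""
            w_water = 1.0 - w_solids
            return ((w_solids * tg_solids + k * w_water * tg_water)
                    / (w_solids + k * w_water))

        # Illustrative literature-order constants for a lactose-water system;
        # placeholders, not the values fitted in the cited study.
        TG_LACTOSE, TG_WATER, K = 374.0, 136.0, 6.7

        for moisture in (0.02, 0.04, 0.06):
            tg = gordon_taylor(1.0 - moisture, TG_LACTOSE, TG_WATER, K)
            print(f"moisture {moisture:.0%}: Tg = {tg - 273.15:.1f} C")

    Even a few percent of moisture depresses Tg sharply, which is why feed concentrate moisture dominates the stickiness behaviour at the critical point.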

  20. Research on an uplink carrier sense multiple access algorithm of large indoor visible light communication networks based on an optical hard core point process.

    Science.gov (United States)

    Nan, Zhufen; Chi, Xuefen

    2016-12-20

    The IEEE 802.15.7 protocol suggests that it could coordinate the channel access process based on the competitive method of carrier sensing. However, the directionality of light and randomness of diffuse reflection would give rise to a serious imperfect carrier sense (ICS) problem [e.g., hidden node (HN) problem and exposed node (EN) problem], which brings great challenges in realizing the optical carrier sense multiple access (CSMA) mechanism. In this paper, the carrier sense process implemented by diffuse reflection light is modeled as the choice of independent sets. We establish an ICS model with the presence of ENs and HNs for the multi-point to multi-point visible light communication (VLC) uplink communications system. Considering the severe optical ICS problem, an optical hard core point process (OHCPP) is developed, which characterizes the optical CSMA for the indoor VLC uplink communications system. Due to the limited coverage of the transmitted optical signal, in our OHCPP, the ENs within the transmitters' carrier sense region could be retained provided that they could not corrupt the ongoing communications. Moreover, because of the directionality of both light emitting diode (LED) transmitters and receivers, theoretical analysis of the HN problem becomes difficult. In this paper, we derive the closed-form expression for approximating the outage probability and transmission capacity of VLC networks with the presence of HNs and ENs. Simulation results validate the analysis and also show the existence of an optimal physical carrier-sensing threshold that maximizes the transmission capacity for a given emission angle of LED.
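
    The OHCPP construction is not reproduced in this record, but the underlying hard-core idea can be sketched with a Matérn type-II thinning: a contender within the hard-core radius of a lower-marked contender is silenced, mimicking the winner-takes-the-channel effect of carrier sensing. A minimal Python sketch with invented room dimensions and rates:

        import numpy as np

        rng = np.random.default_rng(1)

        # Parent Poisson process on a 10 m x 10 m room; intensity in pts/m^2.
        intensity, side, r_hard = 0.5, 10.0, 1.5
        n = rng.poisson(intensity * side ** 2)
        pts = rng.uniform(0.0, side, size=(n, 2))
        marks = rng.uniform(size=n)  # random "ages" for type-II thinning

        # Keep a point only if no other point within the hard-core radius
        # carries a smaller mark: nearby contenders that lose the contention
        # are silenced, a crude stand-in for optical carrier sensing.
        keep = np.ones(n, dtype=bool)
        for i in range(n):
            d = np.hypot(*(pts - pts[i]).T)
            close = (d < r_hard) & (d > 0)
            if np.any(marks[close] < marks[i]):
                keep[i] = False

        print(f"{n} contenders, {keep.sum()} transmit simultaneously")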

  1. A business process model as a starting point for tight cooperation among organizations

    Directory of Open Access Journals (Sweden)

    O. Mysliveček

    2006-01-01

    Full Text Available Outsourcing and other kinds of tight cooperation among organizations are more and more necessary for success in all markets (markets for high-technology products are particularly affected). Thus it is important for companies to be able to set up all kinds of cooperation effectively. A business process model (BPM) is a suitable starting point for this future cooperation. In this paper the process of setting up such cooperation is outlined, as well as why it is important for business success.

  2. Topobathymetric LiDAR point cloud processing and landform classification in a tidal environment

    Science.gov (United States)

    Skovgaard Andersen, Mikkel; Al-Hamdani, Zyad; Steinbacher, Frank; Rolighed Larsen, Laurids; Brandbyge Ernstsen, Verner

    2017-04-01

    Historically it has been difficult to create high resolution Digital Elevation Models (DEMs) in land-water transition zones due to shallow water depth and often challenging environmental conditions. This information gap has been reflected as a "white ribbon" with no data in the land-water transition zone. In recent years, the technology of airborne topobathymetric Light Detection and Ranging (LiDAR) has proven capable of filling the gap by simultaneously capturing topographic and bathymetric elevation information, using only a single green laser. We collected green LiDAR point cloud data in the Knudedyb tidal inlet system in the Danish Wadden Sea in spring 2014. Creating a DEM from a point cloud requires the general processing steps of data filtering, water surface detection and refraction correction. However, there is no transparent and reproducible method for processing green LiDAR data into a DEM, specifically regarding the procedure of water surface detection and modelling. We developed a step-by-step procedure for creating a DEM from raw green LiDAR point cloud data, including a procedure for making a Digital Water Surface Model (DWSM) (see Andersen et al., 2017). Two different classification analyses were applied to the high resolution DEM: a geomorphometric and a morphological classification, respectively. The classification methods were originally developed for a small test area, but in this work we used them to classify the complete Knudedyb tidal inlet system. References Andersen MS, Gergely Á, Al-Hamdani Z, Steinbacher F, Larsen LR, Ernstsen VB (2017). Processing and performance of topobathymetric lidar data for geomorphometric and morphological classification in a high-energy tidal environment. Hydrol. Earth Syst. Sci., 21: 43-63, doi:10.5194/hess-21-43-2017. Acknowledgements This work was funded by the Danish Council for Independent Research | Natural Sciences through the project "Process-based understanding and

  3. One-point fluctuation analysis of the high-energy neutrino sky

    DEFF Research Database (Denmark)

    Feyereisen, Michael R.; Tamborra, Irene; Ando, Shin'ichiro

    2017-01-01

    We perform the first one-point fluctuation analysis of the high-energy neutrino sky. This method proves especially suited to contemporary neutrino data, as it allows one to study the properties of the astrophysical components of the high-energy flux detected by the IceCube telescope, even...

  4. Analysis of irregularly distributed points

    DEFF Research Database (Denmark)

    Hartelius, Karsten

    1996-01-01

    conditional modes are applied to this problem. The Kalman filter is described as a powerful tool for modelling two-dimensional data. Motivated by the development of the reduced update Kalman filter, we propose a reduced update Kalman smoother which offers considerable computational savings. Kriging... on hybridisation analysis, which comprises matching a grid to an arrayed set of DNA-clones spotted onto a hybridisation filter. The line process has proven to perform a satisfactory modelling of shifted fields (subgrids) in the hybridisation grid, and a two-staged hierarchical grid matching scheme which...

  5. Experimental analysis of influence of different lubricants types on the multi-phase ironing process

    Directory of Open Access Journals (Sweden)

    Milan Djordjević

    2013-05-01

    Full Text Available This paper presents the results of an experimental analysis of the influence of different types of lubricants on the multi-phase ironing process. A special tribological model based on sliding of a metal strip between two contact elements was adopted. The subject of the experimental investigations was the variation of the drawing force, contact pressure and friction coefficient for each type of applied lubricant. The ironing process was conducted in three phases at constant sliding velocity. The objective of this analysis was to compare all the applied lubricants in order to estimate their quality from the point of view of their applicability in the multi-phase ironing process.

  6. EXPERIMENTAL ANALYSIS OF INFLUENCE OF DIFFERENT LUBRICANTS TYPES ON THE MULTI-PHASE IRONING PROCESS

    Directory of Open Access Journals (Sweden)

    Milan Djordjević

    2013-09-01

    Full Text Available This paper presents the results of an experimental analysis of the influence of different types of lubricants on the multi-phase ironing process. A special tribological model based on sliding of a metal strip between two contact elements was adopted. The subject of the experimental investigations was the variation of the drawing force, contact pressure and friction coefficient for each type of applied lubricant. The ironing process was conducted in three phases at constant sliding velocity. The objective of this analysis was to compare all the applied lubricants in order to estimate their quality from the point of view of their applicability in the multi-phase ironing process.

  7. Multiple Monte Carlo Testing with Applications in Spatial Point Processes

    DEFF Research Database (Denmark)

    Mrkvička, Tomáš; Myllymäki, Mari; Hahn, Ute

    with a function as the test statistic, 3) several Monte Carlo tests with functions as test statistics. The rank test has correct (global) type I error in each case, and it is accompanied by a p-value and by a graphical interpretation which shows which subtest or which distances of the used test function(s) lead to rejection at the prescribed significance level of the test. Examples of null hypotheses from point process and random set statistics are used to demonstrate the strength of the rank envelope test. The examples include a goodness-of-fit test with several test functions, goodness-of-fit test...

  8. The effect of post-processing treatments on inflection points in current–voltage curves of roll-to-roll processed polymer photovoltaics

    DEFF Research Database (Denmark)

    Lilliedal, Mathilde Raad; Medford, Andrew James; Vesterager Madsen, Morten

    2010-01-01

    Inflection point behaviour is often observed in the current–voltage (IV) curve of polymer solar cells. This phenomenon is examined in the context of flexible roll-to-roll (R2R) processed polymer solar cells in a large series of devices with a layer structure of PET–ITO–ZnO–P3HT:PCBM–PEDOT:PSS–Ag. The devices were manufactured using a combination of slot-die coating and screen printing; they were then encapsulated by lamination using a polymer based barrier material. All manufacturing steps were carried out in ambient air. The freshly prepared devices showed a consistent inflection point in the IV... characterization of device interfaces was carried out in order to identify possible chemical processes that are related to photo-annealing. A possible mechanism based on ZnO photoconductivity, photooxidation and redistribution of oxygen inside the cell is proposed, and it is anticipated that the findings...

  9. A numerical analysis on forming limits during spiral and concentric single point incremental forming

    Science.gov (United States)

    Gipiela, M. L.; Amauri, V.; Nikhare, C.; Marcondes, P. V. P.

    2017-01-01

    Sheet metal forming is one of the major manufacturing industries, building numerous parts for the aerospace, automotive and medical industries. Due to high demand in the vehicle industry on the one hand and environmental regulations on fuel consumption on the other, researchers are developing energy-efficient sheet metal forming methods to build these parts, replacing the conventional punch and die in order to achieve lightweight parts. One of the most recognized manufacturing processes in this category is Single Point Incremental Forming (SPIF). SPIF is a die-less sheet metal forming process in which a single-point tool incrementally forces individual points of the sheet metal into the plastic deformation zone at any process time. In the present work, the finite element method (FEM) is applied to analyze the forming limits of high strength low alloy steel formed by single point incremental forming (SPIF) with spiral and concentric tool paths. SPIF numerical simulations were modelled with 24 and 29 mm cup depths, and the results were compared with Nakajima results obtained by experiments and FEM. It was found that the cup formed with the Nakajima tool failed at 24 mm, while the cups formed by SPIF surpassed the limit at both depths with both profiles. It was also noticed that the strains achieved with the concentric profile are lower than those with the spiral profile.

  10. Use of the hazard analysis and critical control points (HACCP) risk assessment on a medical device for parenteral application.

    Science.gov (United States)

    Jahnke, Michael; Kühn, Klaus-Dieter

    2003-01-01

    In order to guarantee the consistently high quality of medical products for human use, it is absolutely necessary that flawless hygiene conditions are maintained by the strict observance of hygiene rules. With the growing understanding of the impact of process conditions on the quality of the resulting product, process controls (surveillance) have gained increasing importance to complete the quality profile traditionally defined by post-process product testing. Today, process controls have become an important GMP requirement for the pharmaceutical industry. However, before quality process controls can be introduced, the manufacturing process has to be analyzed, with the focus on its critical quality-influencing steps. The HACCP (Hazard Analysis and Critical Control Points) method is well recognized as a useful tool in the pharmaceutical industry. This risk analysis, following the guidelines of the HACCP method and the monitoring of critical steps during the manufacturing process was applied to the manufacture of methyl methacrylate solution used for bone cement and led to the establishment of a preventative monitoring system and constitutes an effective concept for quality assurance of hygiene and all other parameters influencing the quality of the product.

  11. Site Suitability Analysis for Beekeeping via Analytical Hierarchy Process, Konya Example

    Science.gov (United States)

    Sarı, F.; Ceylan, D. A.

    2017-11-01

    Over the past decade, the importance of beekeeping activities has been emphasized in the fields of biodiversity, ecosystems, agriculture and human health. Efficient management and correct choice of beekeeping activities therefore seem essential to maintain and improve productivity and efficiency. Given this importance, and considering the economic contributions to rural areas, the need for a suitability analysis concept has emerged. Here, the integration of Multi-Criteria Decision Analysis (MCDA) and Geographical Information Systems (GIS) provides efficient solutions to the complex decision-making process for beekeeping activities. In this study, a site suitability analysis via the Analytical Hierarchy Process (AHP) was carried out for Konya city in Turkey. Slope, elevation, aspect, distance to water resources, roads and settlements, precipitation and flora criteria were included to determine suitability. The requirements, expectations and limitations of beekeeping activities were specified with the participation of experts and stakeholders. The final suitability map was validated against 117 existing beekeeping locations and Turkish Statistical Institute 2016 beekeeping statistics for Konya province.
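
    The AHP step can be sketched as follows: criterion weights are the normalised principal eigenvector of a pairwise comparison matrix, checked by a consistency ratio. The matrix below is hypothetical, not the one elicited from the study's experts.

        import numpy as np

        # Hypothetical pairwise comparison matrix for four criteria
        # (slope, distance to water, precipitation, flora) on Saaty's
        # 1-9 scale; illustrative values only.
        A = np.array([
            [1.0, 3.0, 2.0, 5.0],
            [1/3, 1.0, 1/2, 2.0],
            [1/2, 2.0, 1.0, 3.0],
            [1/5, 1/2, 1/3, 1.0],
        ])

        # Priority weights: principal eigenvector of A, normalised to sum 1.
        vals, vecs = np.linalg.eig(A)
        k = np.argmax(vals.real)
        w = np.abs(vecs[:, k].real)
        w /= w.sum()

        # Consistency ratio (CR < 0.1 is conventionally acceptable);
        # RI = 0.90 is Saaty's random index for n = 4.
        lam_max, RI = vals.real[k], 0.90
        CI = (lam_max - len(A)) / (len(A) - 1)
        print("weights:", np.round(w, 3), " CR:", round(CI / RI, 3))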

  12. Clusterless Decoding of Position From Multiunit Activity Using A Marked Point Process Filter

    Science.gov (United States)

    Deng, Xinyi; Liu, Daniel F.; Kay, Kenneth; Frank, Loren M.; Eden, Uri T.

    2016-01-01

    Point process filters have been applied successfully to decode neural signals and track neural dynamics. Traditionally, these methods assume that multiunit spiking activity has already been correctly spike-sorted. As a result, these methods are not appropriate for situations where sorting cannot be performed with high precision such as real-time decoding for brain-computer interfaces. As the unsupervised spike-sorting problem remains unsolved, we took an alternative approach that takes advantage of recent insights about clusterless decoding. Here we present a new point process decoding algorithm that does not require multiunit signals to be sorted into individual units. We use the theory of marked point processes to construct a function that characterizes the relationship between a covariate of interest (in this case, the location of a rat on a track) and features of the spike waveforms. In our example, we use tetrode recordings, and the marks represent a four-dimensional vector of the maximum amplitudes of the spike waveform on each of the four electrodes. In general, the marks may represent any features of the spike waveform. We then use Bayes’ rule to estimate spatial location from hippocampal neural activity. We validate our approach with a simulation study and with experimental data recorded in the hippocampus of a rat moving through a linear environment. Our decoding algorithm accurately reconstructs the rat’s position from unsorted multiunit spiking activity. We then compare the quality of our decoding algorithm to that of a traditional spike-sorting and decoding algorithm. Our analyses show that the proposed decoding algorithm performs equivalently or better than algorithms based on sorted single-unit activity. These results provide a path toward accurate real-time decoding of spiking patterns that could be used to carry out content-specific manipulations of population activity in hippocampus or elsewhere in the brain. PMID:25973549
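
    A minimal sketch of the clusterless idea, assuming a 1-D track and a 1-D amplitude mark (the paper uses four-dimensional tetrode marks): a kernel density estimate of the joint (position, mark) intensity is built from encoding data, and Bayes' rule with a flat prior, and with the no-spike term dropped for brevity, gives a posterior over position. All numbers are synthetic.

        import numpy as np

        rng = np.random.default_rng(2)

        # Encoding data (synthetic): spikes from two unsorted units on a
        # 1-D track; each spike has a position and a 1-D amplitude mark.
        pos_a = rng.normal(30.0, 5.0, 300); mark_a = rng.normal(120.0, 8.0, 300)
        pos_b = rng.normal(70.0, 5.0, 300); mark_b = rng.normal(200.0, 8.0, 300)
        enc_pos = np.concatenate([pos_a, pos_b])
        enc_mark = np.concatenate([mark_a, mark_b])

        xs = np.linspace(0.0, 100.0, 201)   # position grid
        occupancy = np.ones_like(xs)        # assume uniform occupancy

        def gauss(u, h):
            return np.exp(-0.5 * (u / h) ** 2) / (h * np.sqrt(2.0 * np.pi))

        def mark_intensity(x_grid, mark, hx=3.0, hm=10.0):
            """KDE estimate of the joint mark intensity on x_grid."""
            kx = gauss(x_grid[:, None] - enc_pos[None, :], hx)
            km = gauss(mark - enc_mark, hm)
            return (kx * km).sum(axis=1) / occupancy

        # Decode one time bin containing two spikes whose marks resemble
        # the first unit: the log-posterior over position sums the log
        # intensities evaluated at the observed marks.
        log_post = np.zeros_like(xs)
        for m in (118.0, 122.0):
            log_post += np.log(mark_intensity(xs, m) + 1e-12)
        print("decoded position:", xs[np.argmax(log_post)])

    Because the marks select which "unit-like" bump of the joint density contributes, no spike sorting is ever performed.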

  13. Neutron capture at the s-process branching points $^{171}$Tm and $^{204}$Tl

    CERN Multimedia

    Branching points in the s-process are very special isotopes for which there is a competition between neutron capture and the subsequent β-decay chain producing the heavy elements beyond Fe. Typically, the knowledge of the associated capture cross sections is very poor due to the difficulty in obtaining enough material of these radioactive isotopes and in measuring the cross section of a sample with an intrinsic activity; indeed, only 2 out of the 21 $s$-process branching points have ever been measured by using the time-of-flight method. In this experiment we aim at measuring for the first time the capture cross sections of $^{171}$Tm and $^{204}$Tl, both of crucial importance for understanding the nucleosynthesis of heavy elements in AGB stars. The combination of both (n,$\gamma$) measurements on $^{171}$Tm and $^{204}$Tl will allow one to accurately constrain the neutron density and the strength of the $^{13}$C(α,n) source in low mass AGB stars. Additionally, the cross section of $^{204}$Tl is also of cosmo-chrono...

  14. Temporal Information Processing and Stability Analysis of the MHSN Neuron Model in DDF

    Directory of Open Access Journals (Sweden)

    Saket Kumar Choudhary

    2016-12-01

    Full Text Available Implementation of a neuron-like information processing structure at the hardware level is a pressing research problem. In this article, we analyze the modified hybrid spiking neuron model (the MHSN model) in the distributed delay framework (DDF) from the hardware implementation point of view. We investigate its temporal information processing capability in terms of the inter-spike-interval (ISI) distribution. We also perform a stability analysis of the MHSN model, in which we compute the nullclines, the steady-state solution, and the eigenvalues of the MHSN model. During phase plane analysis, we notice that the MHSN model generates limit cycle oscillations, which are an important phenomenon in many biological processes. The qualitative behavior of these limit cycles does not change with variation in the applied input stimulus; however, the delay affects the spiking activity and alters the cycle duration.

  15. Breed differences in dogs' sensitivity to human points: a meta-analysis.

    Science.gov (United States)

    Dorey, Nicole R; Udell, Monique A R; Wynne, Clive D L

    2009-07-01

    The last decade has seen a substantial increase in research on the behavioral and cognitive abilities of pet dogs, Canis familiaris. The most commonly used experimental paradigm is the object-choice task in which a dog is given a choice of two containers and guided to the reinforced object by human pointing gestures. We review here studies of this type and attempt a meta-analysis of the available data. In the meta-analysis breeds of dogs were grouped into the eight categories of the American Kennel Club, and into four clusters identified by Parker and Ostrander [Parker, H.G., Ostrander, E.A., 2005. Canine genomics and genetics: running with the pack. PLoS Genet. 1, 507-513] on the basis of a genetic analysis. No differences in performance between breeds categorized in either fashion were identified. Rather, all dog breeds appear to be similarly and highly successful in following human points to locate desired food. We suggest this result could be due to the paucity of data available in published studies, and the restricted range of breeds tested.

  16. Partial wave analysis using graphics processing units

    Energy Technology Data Exchange (ETDEWEB)

    Berger, Niklaus; Liu Beijiang; Wang Jike, E-mail: nberger@ihep.ac.c [Institute of High Energy Physics, Chinese Academy of Sciences, 19B Yuquan Lu, Shijingshan, 100049 Beijing (China)

    2010-04-01

    Partial wave analysis is an important tool for determining resonance properties in hadron spectroscopy. For large data samples however, the un-binned likelihood fits employed are computationally very expensive. At the Beijing Spectrometer (BES) III experiment, an increase in statistics compared to earlier experiments of up to two orders of magnitude is expected. In order to allow for a timely analysis of these datasets, additional computing power with short turnover times has to be made available. It turns out that graphics processing units (GPUs) originally developed for 3D computer games have an architecture of massively parallel single instruction multiple data floating point units that is almost ideally suited for the algorithms employed in partial wave analysis. We have implemented a framework for tensor manipulation and partial wave fits called GPUPWA. The user writes a program in pure C++ whilst the GPUPWA classes handle computations on the GPU, memory transfers, caching and other technical details. In conjunction with a recent graphics processor, the framework provides a speed-up of the partial wave fit by more than two orders of magnitude compared to legacy FORTRAN code.
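
    The data-parallel structure that makes GPUs attractive here is the per-event independence of an unbinned likelihood. The following toy sketch (a hypothetical two-wave coherent sum, not GPUPWA's model) shows the event-wise intensity and negative log-likelihood whose evaluation is parallelised:

        import numpy as np

        rng = np.random.default_rng(3)
        events = rng.uniform(0.5, 2.0, 100_000)   # observed kinematic variable
        mc = rng.uniform(0.5, 2.0, 50_000)        # MC integration sample

        def intensity(m, c1, c2, m0=1.2, g=0.1):
            """Coherent sum of a Breit-Wigner-like wave and a flat wave."""
            bw = 1.0 / (m0 ** 2 - m ** 2 - 1j * m0 * g)
            return np.abs(c1 * bw + c2) ** 2

        def nll(params):
            c1 = params[0] + 1j * params[1]
            c2 = params[2] + 1j * params[3]
            norm = intensity(mc, c1, c2).mean()   # MC estimate of integral
            return -np.sum(np.log(intensity(events, c1, c2) / norm))

        print("NLL at a trial point:", nll([1.0, 0.0, 0.3, 0.1]))

    Every event's term is independent, so the sum maps naturally onto thousands of floating point units.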

  17. SEM analysis of particle size during conventional treatment of CMP process wastewater

    International Nuclear Information System (INIS)

    Roth, Gary A.; Neu-Baker, Nicole M.; Brenner, Sara A.

    2015-01-01

    Engineered nanomaterials (ENMs) are currently employed by many industries and have different physical and chemical properties from their bulk counterparts that may confer different toxicity. Nanoparticles used or generated in semiconductor manufacturing have the potential to enter the municipal waste stream via wastewater, and their ultimate fate in the ecosystem is currently unknown. This study investigates the fate of ENMs used in chemical mechanical planarization (CMP), a polishing process repeatedly utilized in semiconductor manufacturing. Wastewater sampling was conducted throughout the wastewater treatment (WWT) process at the fabrication plant's on-site wastewater treatment facility. The goal of this study was to assess whether the WWT processes resulted in size-dependent filtration of particles in the nanoscale regime by analyzing samples using scanning electron microscopy (SEM). Statistical analysis demonstrated no significant differences in particle size between sampling points, indicating low or no selectivity of WWT methods for nanoparticles based on size. All nanoparticles appeared to be of similar morphology (near-spherical), with high variability in particle size. EDX verified that the nanoparticles were composed of silicon and/or aluminum oxide. Nanoparticle sizing data compared between sampling points, including the final sampling point before discharge from the facility, suggested that nanoparticles could be released to the municipal waste stream from industrial sources. - Highlights: • The discrete treatments of a semiconductor wastewater treatment system were examined. • A sampling scheme and method for analyzing nanoparticles in wastewater was devised. • The wastewater treatment process studied is not size-selective for nanoparticles

  18. Asymmetric simple exclusion process with position-dependent hopping rates: Phase diagram from boundary-layer analysis.

    Science.gov (United States)

    Mukherji, Sutapa

    2018-03-01

    In this paper, we study a one-dimensional totally asymmetric simple exclusion process with position-dependent hopping rates. Under open boundary conditions, this system exhibits boundary-induced phase transitions in the steady state. Similarly to totally asymmetric simple exclusion processes with uniform hopping, the phase diagram consists of low-density, high-density, and maximal-current phases. In various phases, the shape of the average particle density profile across the lattice including its boundary-layer parts changes significantly. Using the tools of boundary-layer analysis, we obtain explicit solutions for the density profile in different phases. A detailed analysis of these solutions under different boundary conditions helps us obtain the equations for various phase boundaries. Next, we show how the shape of the entire density profile including the location of the boundary layers can be predicted from the fixed points of the differential equation describing the boundary layers. We discuss this in detail through several examples of density profiles in various phases. The maximal-current phase appears to be an especially interesting phase where the boundary layer flows to a bifurcation point on the fixed-point diagram.
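
    The model can be sketched with a simple random-sequential-update Monte Carlo of a TASEP with open boundaries and a position-dependent hopping rate; the rate profile and parameters below are illustrative only.

        import numpy as np

        rng = np.random.default_rng(4)

        L, steps, burnin = 100, 600_000, 300_000
        alpha, beta = 0.3, 0.7                  # entry and exit rates
        p = 1.0 - 0.4 * np.sin(np.pi * np.arange(L) / L)  # slow mid-lattice

        occ = np.zeros(L, dtype=bool)
        profile, samples = np.zeros(L), 0
        for t in range(steps):
            i = int(rng.integers(-1, L))        # -1 encodes an entry attempt
            if i == -1:
                if not occ[0] and rng.random() < alpha:
                    occ[0] = True
            elif i == L - 1:
                if occ[-1] and rng.random() < beta:
                    occ[-1] = False
            elif occ[i] and not occ[i + 1] and rng.random() < p[i]:
                occ[i], occ[i + 1] = False, True
            if t >= burnin and t % 100 == 0:
                profile += occ
                samples += 1

        rho = profile / samples
        print("density near entry / middle / exit:",
              rho[:5].mean(), rho[L // 2 - 2:L // 2 + 3].mean(),
              rho[-5:].mean())

    Sweeping alpha and beta moves the simulated profile between the low-density, high-density and maximal-current phases, whose boundary layers the paper treats analytically.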

  19. ASIC For Complex Fixed-Point Arithmetic

    Science.gov (United States)

    Petilli, Stephen G.; Grimm, Michael J.; Olson, Erlend M.

    1995-01-01

    Application-specific integrated circuit (ASIC) performs 24-bit, fixed-point arithmetic operations on arrays of complex-valued input data. High-performance, wide-band arithmetic logic unit (ALU) designed for use in computing fast Fourier transforms (FFTs) and for performing digital filtering functions. Other applications include general computations involved in analysis of spectra and digital signal processing.

  20. Probabilistic safety assessment and optimal control of hazardous technological systems. A marked point process approach

    Energy Technology Data Exchange (ETDEWEB)

    Holmberg, J [VTT Automation, Espoo (Finland)

    1997-04-01

    The thesis models risk management as an optimal control problem for a stochastic process. The approach classes the decisions made by management into three categories according to the control methods of a point process: (1) planned process lifetime, (2) modification of the design, and (3) operational decisions. The approach is used for optimization of plant shutdown criteria and surveillance test strategies of a hypothetical nuclear power plant. 62 refs. The thesis also includes five previous publications by the author.

  1. [Analysis and research on cleaning points of HVAC systems in public places].

    Science.gov (United States)

    Yang, Jiaolan; Han, Xu; Chen, Dongqing; Jin, Xin; Dai, Zizhu

    2010-03-01

    To analyze the cleaning points of HVAC systems and to provide a scientific basis for regulating their cleaning. Based on survey results on the cleaning of HVAC systems around China over the past three years, we analyze the cleaning points of HVAC systems from various aspects, such as the major health risk factors of HVAC systems, the strategy for formulating the cleaning of HVAC systems, cleaning methods and acceptance points for the air ducts and parts of HVAC systems, on-site and individual protection, waste treatment and the cleaning of removed equipment, inspection of the cleaning results, video recording, and the final acceptance of the cleaning. An analysis of the major health risk factors of HVAC systems and of the strategy for formulating their cleaning is given. Specific methods for cleaning the air ducts, machine units, air ports, coil pipes and water cooling towers of HVAC systems, the acceptance points of HVAC systems, and the requirements of the report on the final acceptance of the cleaning of HVAC systems are proposed. By analyzing the cleaning points of HVAC systems and proposing corresponding measures, this study provides a basis for the scientific and regular conduct of HVAC system cleaning as a novel technical service, and lays a foundation for the revision of the existing cleaning regulations, which may generate technical and social benefits.

  2. Four points function fitted and first derivative procedure for determining the end points in potentiometric titration curves: statistical analysis and method comparison.

    Science.gov (United States)

    Kholeif, S A

    2001-06-01

    A new method belonging to the differential category for determining end points from potentiometric titration curves is presented. It uses a preprocessing step to find first-derivative values by fitting four data points in and around the region of inflection to a non-linear function, and then locates the end point, usually a maximum or minimum, using an inverse parabolic interpolation procedure that has an analytical solution. The behavior and accuracy of the sigmoid and cumulative non-linear functions used are investigated against three factors. A statistical evaluation of the new method using linear least-squares method validation and multifactor data analysis is covered. The new method is applicable to symmetrical and unsymmetrical potentiometric titration curves, and the end point is calculated using numerical procedures only. It outperforms the "parent" regular differential method at almost all factor levels and gives accurate results comparable to the true or estimated true end points. Calculated end points from selected experimental titration curves, compatible with the equivalence-point category of methods such as Gran or Fortuin, are also compared with the new method.
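
    A sketch of the end-point location step on synthetic data: first-derivative estimates are formed around the inflection, and inverse parabolic interpolation through the three points bracketing the derivative maximum gives the end point in closed form. The paper's four-point non-linear fitting stage is replaced here by plain central differences.

        import numpy as np

        # Synthetic titration data: volume (mL) vs potential (mV) around
        # the equivalence point of a symmetric sigmoid curve.
        v = np.linspace(9.0, 11.0, 21)
        e = 400.0 / (1.0 + np.exp(-(v - 10.05) / 0.08))

        # First-derivative estimates at interval midpoints.
        dv = 0.5 * (v[1:] + v[:-1])
        de = np.diff(e) / np.diff(v)

        # Inverse parabolic interpolation through the three points around
        # the maximum of dE/dV locates the end point analytically.
        k = np.argmax(de)
        x0, x1, x2 = dv[k - 1], dv[k], dv[k + 1]
        y0, y1, y2 = de[k - 1], de[k], de[k + 1]
        num = (x1 - x0) ** 2 * (y1 - y2) - (x1 - x2) ** 2 * (y1 - y0)
        den = (x1 - x0) * (y1 - y2) - (x1 - x2) * (y1 - y0)
        end_point = x1 - 0.5 * num / den
        print(f"estimated end point: {end_point:.3f} mL (true 10.050)")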

  3. Point of Care Testing Services Delivery: Policy Analysis using a ...

    African Journals Online (AJOL)

    Annals of Biomedical Sciences ... The service providers (hospital management) and the testing personnel are faced with the task of trying to explain these problems. Objective of the study: To critically do a policy analysis of the problems of point of care testing with the aim of identifying the causes of these problems and ...

  4. Weak interaction rates for Kr and Sr waiting-point nuclei under rp-process conditions

    International Nuclear Information System (INIS)

    Sarriguren, P.

    2009-01-01

    Weak interaction rates are studied in neutron deficient Kr and Sr waiting-point isotopes in ranges of densities and temperatures relevant for the rp process. The nuclear structure is described within a microscopic model (deformed QRPA) that reproduces not only the half-lives but also the Gamow-Teller strength distributions recently measured. The various sensitivities of the decay rates to both density and temperature are discussed. Continuum electron capture is shown to contribute significantly to the weak rates at rp-process conditions.

  5. Seafood safety: economics of hazard analysis and Critical Control Point (HACCP) programmes

    National Research Council Canada - National Science Library

    Cato, James C

    1998-01-01

    .... This document on economic issues associated with seafood safety was prepared to complement the work of the Service in seafood technology, plant sanitation and Hazard Analysis Critical Control Point (HACCP) implementation...

  6. Consumers' price awareness at the point-of-selection

    DEFF Research Database (Denmark)

    Jensen, Birger Boutrup

    This paper focuses on consumers' price information processing at the point-of-selection. Specifically, it updates past results on consumers' price awareness at the point-of-selection - applying both a price-recall and a price-recognition test - and tests hypotheses on potential determinants of consumers' price awareness at the point-of-selection. Both price-memory tests resulted in higher measured price awareness than in any of the past studies. Results also indicate that price recognition is not the most appropriate measure. Finally, a discriminant analysis shows that consumers who are aware of the price at the point-of-selection are more deal prone, more low-price prone, and bought a special-priced item. Implications are discussed.

  7. Absolute quantitative analysis for sorbic acid in processed foods using proton nuclear magnetic resonance spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Ohtsuki, Takashi, E-mail: ohtsuki@nihs.go.jp [National Institute of Health Sciences, 1-18-1 Kamiyoga, Setagaya-ku, Tokyo 158-8501 (Japan); Sato, Kyoko; Sugimoto, Naoki; Akiyama, Hiroshi; Kawamura, Yoko [National Institute of Health Sciences, 1-18-1 Kamiyoga, Setagaya-ku, Tokyo 158-8501 (Japan)

    2012-07-13

    Highlights: • A method using qHNMR was applied and validated to determine SA in processed foods. • This method has good accuracy, precision, selectivity, and linearity. • The proposed method is more rapid and simple than the conventional method. • We found that the proposed method is reliable for the accurate determination of SA. • This method can be used for the monitoring of SA in processed foods. - Abstract: An analytical method using solvent extraction and quantitative proton nuclear magnetic resonance (qHNMR) spectroscopy was applied and validated for the absolute quantification of sorbic acid (SA) in processed foods. The proposed method showed good linearity. The recoveries for samples spiked at the maximum usage level specified for food in Japan and at 0.13 g kg⁻¹ (beverages: 0.013 g kg⁻¹) were greater than 80%, whereas those for samples spiked at 0.063 g kg⁻¹ (beverages: 0.0063 g kg⁻¹) were between 56.9 and 83.5%. The limit of quantification was 0.063 g kg⁻¹ for foods (and 0.0063 g kg⁻¹ for beverages containing Lactobacillus species). Analysis of the SA content of commercial processed foods revealed quantities equal to or greater than those measured using conventional steam-distillation extraction and high-performance liquid chromatography quantification. The proposed method was rapid, simple, accurate, and precise, and provided International System of Units traceability without the need for authentic analyte standards. It could therefore be used as an alternative to the conventional method for the quantification of SA in processed foods.
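
    The absolute quantification step in qHNMR reduces to a ratio of integrals corrected for proton counts, molar masses and weighed masses against an internal standard. A minimal sketch with invented numbers (the standard and the integral values are hypothetical, not those of the study):

        def qhnmr_mass_fraction(i_a, i_std, n_a, n_std, m_a, m_std,
                                mass_std, mass_sample, purity_std):
            """Mass fraction of analyte via internal-standard qHNMR.

            i_*: integrated signal areas, n_*: protons per signal,
            m_*: molar masses (g/mol), mass_*: weighed masses (g).
            """
            return ((i_a / i_std) * (n_std / n_a) * (m_a / m_std)
                    * (mass_std / mass_sample) * purity_std)

        # Illustrative numbers only: sorbic acid (M = 112.13 g/mol) against
        # a hypothetical internal standard (M = 178.19 g/mol, 99.9% purity).
        w = qhnmr_mass_fraction(i_a=0.16, i_std=1.00, n_a=1, n_std=2,
                                m_a=112.13, m_std=178.19, mass_std=0.0100,
                                mass_sample=1.0000, purity_std=0.999)
        print(f"sorbic acid content: {w * 1000:.2f} g/kg")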

  8. Interevent Time Distribution of Renewal Point Process, Case Study: Extreme Rainfall in South Sulawesi

    Science.gov (United States)

    Sunusi, Nurtiti

    2018-03-01

    The study of the time distribution of occurrences of extreme rain phenomena plays a very important role in analysis and weather forecasting for an area. The timing of extreme rainfall is difficult to predict because its occurrence is random. This paper aims to determine the inter-event time distribution of extreme rain events and the minimum waiting time until the occurrence of the next extreme event through a point process approach. The phenomenon of extreme rain events over a given period of time follows a renewal process in which the time between events is a random variable τ. The distribution of the random variable τ is assumed to be Pareto, Log Normal, or Gamma, and a moment method is used to estimate the model parameters. Consider R(t), the time elapsed since the last extreme rain event at a location; if there are no extreme rain events up to t0, there will be an opportunity for an extreme rainfall event in (t0, t0 + δt0). From the three models reviewed, the minimum waiting time until the next extreme rainfall is then determined. The results show that the Log Normal model is better than the Pareto and Gamma models for predicting the next extreme rainfall in South Sulawesi, while the Pareto model cannot be used.
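
    The moment-method fits can be sketched directly: match the sample mean and variance of the inter-event times to the closed-form moments of each candidate distribution. Synthetic waiting times stand in for the South Sulawesi record.

        import numpy as np

        rng = np.random.default_rng(5)
        tau = rng.lognormal(mean=3.0, sigma=0.6, size=400)  # synthetic gaps

        m, v = tau.mean(), tau.var()

        # Gamma(k, theta): mean = k*theta, var = k*theta^2.
        k_hat, theta_hat = m * m / v, v / m

        # Log Normal(mu, sigma^2): mean = exp(mu + sigma^2/2), matched
        # analytically from the sample mean and variance.
        sigma2 = np.log(1.0 + v / m ** 2)
        mu_hat = np.log(m) - 0.5 * sigma2

        # Pareto(alpha, x_m): v/m^2 = 1/(alpha*(alpha - 2)) gives alpha,
        # then x_m follows from the mean (requires alpha > 2).
        alpha_hat = 1.0 + np.sqrt(1.0 + m * m / v)
        xm_hat = m * (alpha_hat - 1.0) / alpha_hat

        print(f"gamma:     k = {k_hat:.2f}, theta = {theta_hat:.2f}")
        print(f"lognormal: mu = {mu_hat:.2f}, sigma = {np.sqrt(sigma2):.2f}")
        print(f"pareto:    alpha = {alpha_hat:.2f}, x_m = {xm_hat:.2f}")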

  9. Analysis of the flight dynamics of the Solar Maximum Mission (SMM) off-sun scientific pointing

    Science.gov (United States)

    Pitone, D. S.; Klein, J. R.; Twambly, B. J.

    1990-01-01

    Algorithms are presented which were created and implemented by the Goddard Space Flight Center's (GSFC's) Solar Maximum Mission (SMM) attitude operations team to support large-angle spacecraft pointing at scientific objectives. The mission objective of the post-repair SMM satellite was to study solar phenomena. However, because the scientific instruments, such as the Coronagraph/Polarimeter (CP) and the Hard X-ray Burst Spectrometer (HXRBS), were able to view objects other than the Sun, attitude operations support for attitude pointing at large angles from the nominal solar-pointing attitudes was required. Subsequently, attitude support for SMM was provided for scientific objectives such as Comet Halley, Supernova 1987A, Cygnus X-1, and the Crab Nebula. In addition, the analysis was extended to include the reverse problem, computing the right ascension and declination of a body given the off-Sun angles. This analysis led to the computation of the orbits of seven new solar comets seen in the field-of-view (FOV) of the CP. The activities necessary to meet these large-angle attitude-pointing sequences, such as slew sequence planning, viewing-period prediction, and tracking-bias computation are described. Analysis is presented for the computation of maneuvers and pointing parameters relative to the SMM-unique, Sun-centered reference frame. Finally, science data and independent attitude solutions are used to evaluate the large-angle pointing performance.

  10. Application of LES for Analysis of Unsteady Effects on Combustion Processes and Misfires in DISI Engine

    Directory of Open Access Journals (Sweden)

    Goryntsev D.

    2013-10-01

    Full Text Available Cycle-to-cycle variations of combustion processes strongly affect the emissions, specific fuel consumption and work output. Internal combustion engines such as Direct Injection Spark-Ignition (DISI) engines are very sensitive to cyclic fluctuations of the flow, mixing and combustion processes. Multi-cycle Large Eddy Simulation (LES) analysis has been used to characterize unsteady effects of combustion processes and misfires in a realistic DISI engine. A qualitative analysis of the intensity of cyclic variations of in-cylinder pressure, temperature and fuel mass fraction is presented. The effect of ignition probability and an analysis of misfires are pointed out. Finally, fuel history effects, along with the effect of residual gas on in-cylinder pressure and temperature as well as misfires, are discussed.

  11. Mixed-Poisson Point Process with Partially-Observed Covariates: Ecological Momentary Assessment of Smoking.

    Science.gov (United States)

    Neustifter, Benjamin; Rathbun, Stephen L; Shiffman, Saul

    2012-01-01

    Ecological Momentary Assessment is an emerging method of data collection in behavioral research that may be used to capture the times of repeated behavioral events on electronic devices, and information on subjects' psychological states through the electronic administration of questionnaires at times selected from a probability-based design as well as the event times. A method for fitting a mixed Poisson point process model is proposed for the impact of partially-observed, time-varying covariates on the timing of repeated behavioral events. A random frailty is included in the point-process intensity to describe variation among subjects in baseline rates of event occurrence. Covariate coefficients are estimated using estimating equations constructed by replacing the integrated intensity in the Poisson score equations with a design-unbiased estimator. An estimator is also proposed for the variance of the random frailties. Our estimators are robust in the sense that no model assumptions are made regarding the distribution of the time-varying covariates or the distribution of the random effects. However, subject effects are estimated under gamma frailties using an approximate hierarchical likelihood. The proposed approach is illustrated using smoking data.
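
    A mixed Poisson point process of this kind is easy to simulate: each subject receives a gamma-distributed frailty multiplying a shared time-varying intensity, and event times are drawn by thinning. The intensity and all parameters below are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(6)

        # Event times for several subjects over T hours: a gamma frailty
        # (mean 1) scales a shared time-varying baseline intensity.
        T, n_subjects, shape = 24.0, 5, 2.0
        lam_max = 0.9  # upper bound on the baseline (events/hour)

        def base(t):
            return 0.5 + 0.4 * np.sin(2.0 * np.pi * t / 24.0)

        for s in range(n_subjects):
            z = rng.gamma(shape, 1.0 / shape)          # subject frailty
            # Thinning: propose from a homogeneous process at rate
            # z * lam_max, then accept with probability base(t)/lam_max.
            n_prop = rng.poisson(z * lam_max * T)
            t_prop = np.sort(rng.uniform(0.0, T, n_prop))
            keep = rng.random(n_prop) < base(t_prop) / lam_max
            print(f"subject {s}: frailty {z:.2f}, {keep.sum()} events")

    The estimating-equation approach of the paper then replaces the integrated intensity with a design-unbiased estimator built from the randomly timed questionnaires.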

  12. Controlling organic chemical hazards in food manufacturing: a hazard analysis critical control points (HACCP) approach.

    Science.gov (United States)

    Ropkins, K; Beck, A J

    2002-08-01

    Hazard analysis by critical control points (HACCP) is a systematic approach to the identification, assessment and control of hazards. Effective HACCP requires the consideration of all hazards, i.e., chemical, microbiological and physical. However, to date most 'in-place' HACCP procedures have tended to focus on the control of microbiological and physical food hazards. In general, the chemical component of HACCP procedures is either ignored or limited to applied chemicals, e.g., food additives and pesticides. In this paper we discuss the application of HACCP to a broader range of chemical hazards, using organic chemical contaminants as examples, and the problems that are likely to arise in the food manufacturing sector. Chemical HACCP procedures are likely to offer many of the advantages previously identified for microbiological HACCP procedures: they are more effective, efficient and economical than conventional end-point-testing methods. However, the high costs of analytical monitoring of chemical contaminants and a limited understanding of formulation and process optimisation as means of controlling chemical contamination of foods are likely to prevent chemical HACCP from becoming as effective as microbiological HACCP.

  13. Identification of critical points of thermal environment in broiler production

    Directory of Open Access Journals (Sweden)

    AG Menezes

    2010-03-01

    Full Text Available This paper describes an exploratory study carried out to determine critical control points and possible risks in hatcheries and broiler farms. The study was based on the identification of the potential hazards existing in broiler production, from the hatchery to the broiler farm, identifying critical control points and defining critical limits. The following rooms were analyzed in the hatchery: egg cold storage, pre-heating, incubator, and hatcher rooms. Two broiler houses were studied on two different farms. The following data were collected in the hatchery and broiler houses: temperature (ºC), relative humidity (%), air velocity (m s⁻¹), ammonia levels, and light intensity (lx). In the broiler house study, a questionnaire using information from the Broiler Production Good Practices (BPGP) manual was applied, and workers were interviewed. Risk analysis matrices were built to determine Critical Control Points (CCPs). After data collection, Statistical Process Control (SPC) was applied through analysis of the Process Capacity Index, using the software program Minitab15®. Environmental temperature and relative humidity were the critical points identified in the hatchery and on both farms. The classes determined as critical control points in the broiler houses were poultry litter, feeding, drinking water, workers' hygiene and health, management and biosecurity, norms and legislation, facilities, and activity planning. It was concluded that CCP analysis, associated with SPC control tools and guidelines of good production practices, may contribute to improving quality control in poultry production.

  14. Analysing the distribution of synaptic vesicles using a spatial point process model

    DEFF Research Database (Denmark)

    Khanmohammadi, Mahdieh; Waagepetersen, Rasmus; Nava, Nicoletta

    2014-01-01

    …functionality by statistically modelling the distribution of the synaptic vesicles in two groups of rats: a control group subjected to sham stress and a stressed group subjected to a single acute foot-shock (FS)-stress episode. We hypothesize that the synaptic vesicles have different spatial distributions in the two groups. The spatial distributions are modelled using spatial point process models with an inhomogeneous conditional intensity and repulsive pairwise interactions. Our results verify the hypothesis that the two groups have different spatial distributions.

  15. SITE SUITABILITY ANALYSIS FOR BEEKEEPING VIA ANALYTICAL HIERARCHY PROCESS, KONYA EXAMPLE

    Directory of Open Access Journals (Sweden)

    F. Sarı

    2017-11-01

    Over the past decade, the importance of beekeeping activities has been emphasized in the fields of biodiversity, ecosystems, agriculture and human health. Thus, efficient management and correct decisions about beekeeping activities seem essential to maintain and improve productivity and efficiency. Due to this importance, and considering the economic contributions to rural areas, the need for a suitability analysis concept has emerged. At this point, the integration of Multi Criteria Decision Analysis (MCDA) and Geographical Information Systems (GIS) provides efficient solutions to the complex structure of the decision-making process for beekeeping activities. In this study, site suitability analysis via the Analytical Hierarchy Process (AHP) was carried out for the city of Konya in Turkey. Slope, elevation, aspect, distance to water resources, roads and settlements, precipitation and flora criteria were included to determine suitability. The requirements, expectations and limitations of beekeeping activities were specified with the participation of experts and stakeholders. The final suitability map was validated against 117 existing beekeeping locations and Turkish Statistical Institute 2016 beekeeping statistics for Konya province.
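
    AHP derives criterion weights from a pairwise comparison matrix, typically via its principal eigenvector, together with a consistency check; a minimal sketch with an illustrative 3-criterion matrix (the study's actual judgments and weights are not reproduced here).

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights and consistency ratio from a pairwise comparison matrix."""
    a = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(a)
    k = np.argmax(eigvals.real)                 # principal (Perron) eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                # normalized priority weights
    n = a.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)        # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41}[n]
    return w, ci / ri                           # weights, consistency ratio

# Illustrative matrix: slope vs. flora vs. distance-to-water
m = [[1, 3, 5],
     [1/3, 1, 2],
     [1/5, 1/2, 1]]
w, cr = ahp_weights(m)
print(w.round(3), round(cr, 3))   # CR < 0.1 is conventionally acceptable
```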

  16. A customizable stochastic state point process filter (SSPPF) for neural spiking activity.

    Science.gov (United States)

    Xin, Yao; Li, Will X Y; Min, Biao; Han, Yan; Cheung, Ray C C

    2013-01-01

    The Stochastic State Point Process Filter (SSPPF) is effective for adaptive signal processing. In particular, it has been successfully applied to neural signal coding/decoding in recent years. Recent work has proven its efficiency for non-parametric coefficient tracking in modeling of the mammalian nervous system. However, existing SSPPF implementations have only been realized on commercial software platforms, which limits their computational capability. In this paper, the first hardware architecture for the SSPPF has been designed and successfully implemented on a field-programmable gate array (FPGA), providing a more efficient means for coefficient tracking in a well-established generalized Laguerre-Volterra model for mammalian hippocampal spiking activity research. By exploiting the intrinsic parallelism of the FPGA, the proposed architecture is able to process matrices or vectors of arbitrary size, and is efficiently scalable. Experimental results show superior performance compared to the software implementation, while maintaining numerical precision. This architecture could also be utilized in future hippocampal cognitive neural prosthesis designs.
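
    The record does not give the SSPPF update equations; as a rough illustration of the underlying task — tracking intensity-model coefficients online from binned spike observations — here is a plain stochastic-gradient point-process (Bernoulli-likelihood) tracker, not the SSPPF itself and not the FPGA algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

# True model: spiking probability per time bin is sigmoid(theta . x)
theta_true = np.array([-2.0, 1.5])
theta_hat = np.zeros(2)
lr = 0.05                                  # learning rate (illustrative)

for t in range(5000):
    x = np.array([1.0, rng.normal()])      # bias term + one covariate
    p = 1.0 / (1.0 + np.exp(-theta_true @ x))
    spike = rng.random() < p               # observed 0/1 event in this bin
    p_hat = 1.0 / (1.0 + np.exp(-theta_hat @ x))
    theta_hat += lr * (spike - p_hat) * x  # gradient of Bernoulli log-likelihood

print(theta_hat.round(2), theta_true)      # estimate should approach the truth
```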

  17. PowerPoint Use and Misuse in Digital Innovation

    DEFF Research Database (Denmark)

    Ciriello, Raffaele; Richter, Alexander; Schwabe, Gerhard

    2015-01-01

    PowerPoint continues to permeate the presentation genre in general and business communication in particular. Whereas PowerPoint’s role in organizational practices has caught increasing research interest, research on PowerPoint in digital innovation is still scarce. This study provides comprehensive insights into PowerPoint use and misuse through an ethnographically informed field study of employee-driven innovation inside a multinational European banking software provider. Drawing on primary data consisting of 62 interviews, 41 slide decks, and longitudinal series of observations and workshops, the paper illustrates how deeply entangled PowerPoint is in digital innovation. Our in-depth analysis of PowerPoint use at different innovation process stages suggests that the tool cannot be simply regarded as beneficial or detrimental for innovation. Instead, we provide a revised dialectical examination…

  18. Modelling point patterns with linear structures

    DEFF Research Database (Denmark)

    Møller, Jesper; Rasmussen, Jakob Gulddahl

    2009-01-01

    …point processes whose realizations contain such linear structures. Such a point process is constructed sequentially by placing one point at a time. The points are placed in such a way that new points are often placed close to previously placed points, and the points form roughly line shaped structures. We consider simulations of this model and compare with real data.
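
    A toy version of such a sequential construction (illustrative only; the authors' model is more carefully specified): each new point either continues a chain near the previous point or starts afresh at a uniform location.

```python
import numpy as np

rng = np.random.default_rng(2)

def linear_structure_points(n=200, p_near=0.8, step=0.02):
    """Place points one at a time on the unit square; with probability
    p_near the next point is a small, nearly-straight step from the last
    one (yielding roughly line-shaped chains), otherwise a new chain
    starts at a uniform location."""
    pts = [rng.uniform(0, 1, 2)]
    heading = rng.uniform(0, 2 * np.pi)
    for _ in range(n - 1):
        if rng.random() < p_near:
            heading += rng.normal(scale=0.3)   # small turn keeps chains straight
            new = pts[-1] + step * np.array([np.cos(heading), np.sin(heading)])
            pts.append(np.clip(new, 0, 1))
        else:
            pts.append(rng.uniform(0, 1, 2))
            heading = rng.uniform(0, 2 * np.pi)
    return np.array(pts)

print(linear_structure_points().shape)          # (200, 2) array of points
```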

  20. The Oil Point Method - A tool for indicative environmental evaluation in material and process selection

    DEFF Research Database (Denmark)

    Bey, Niki

    2000-01-01

    …of environmental evaluation and only approximate information about the product and its life cycle. This dissertation addresses this challenge in presenting a method which is tailored to these requirements of designers - the Oil Point Method (OPM). In providing environmental key information and confining itself to three essential assessment steps, the method enables rough environmental evaluations and supports in this way material- and process-related decision-making in the early stages of design. In its overall structure, the Oil Point Method is related to Life Cycle Assessment - except for two main differences…

  1. A Systematic Approach to Process Evaluation in the Central Oklahoma Turning Point (COTP) Partnership

    Science.gov (United States)

    Tolma, Eleni L.; Cheney, Marshall K.; Chrislip, David D.; Blankenship, Derek; Troup, Pam; Hann, Neil

    2011-01-01

    Formation is an important stage of partnership development. Purpose: To describe the systematic approach to process evaluation of a Turning Point initiative in central Oklahoma during the formation stage. The nine-month collaborative effort aimed to develop an action plan to promote health. Methods: A sound planning framework was used in the…

  2. Comparative study of building footprint estimation methods from LiDAR point clouds

    Science.gov (United States)

    Rozas, E.; Rivera, F. F.; Cabaleiro, J. C.; Pena, T. F.; Vilariño, D. L.

    2017-10-01

    Building area calculation from LiDAR points is still a difficult task with no clear solution. The varied characteristics of buildings, such as shape or size, have made the process too complex to automate. However, several algorithms and techniques have been used in order to obtain an approximated hull. 3D-building reconstruction and urban planning are examples of important applications that benefit from accurate building footprint estimations. In this paper, we have carried out a study of the accuracy of building footprint estimation from LiDAR points. The analysis focuses on the processing steps following object recognition and classification, assuming that labeling of building points has been previously performed. We then perform an in-depth analysis of the influence of point density on the accuracy of the building area estimation. In addition, a set of buildings of different sizes and shapes was manually classified, in such a way that they can be used as a benchmark.

  3. KNOWLEDGE-BASED OBJECT DETECTION IN LASER SCANNING POINT CLOUDS

    Directory of Open Access Journals (Sweden)

    F. Boochs

    2012-07-01

    Object identification and object processing in 3D point clouds have always posed challenges in terms of effectiveness and efficiency. In practice, this process is highly dependent on human interpretation of the scene represented by the point cloud data, as well as the set of modeling tools available for use. Such modeling algorithms are data-driven and concentrate on specific features of the objects, being accessible to numerical models. We present an approach that brings the human expert knowledge about the scene, the objects inside, and their representation by the data and the behavior of algorithms to the machine. This “understanding” enables the machine to assist human interpretation of the scene inside the point cloud. Furthermore, it allows the machine to understand possibilities and limitations of algorithms and to take this into account within the processing chain. This not only assists the researchers in defining optimal processing steps, but also provides suggestions when certain changes or new details emerge from the point cloud. Our approach benefits from the advancement in knowledge technologies within the Semantic Web framework. This advancement has provided a strong base for applications based on knowledge management. In the article we will present and describe the knowledge technologies used for our approach, such as the Web Ontology Language (OWL), used for formulating the knowledge base, and the Semantic Web Rule Language (SWRL) with 3D processing and topologic built-ins, aiming to combine geometrical analysis of 3D point clouds and specialists’ knowledge of the scene and algorithmic processing.

  5. Material-Point-Method Analysis of Collapsing Slopes

    DEFF Research Database (Denmark)

    Andersen, Søren; Andersen, Lars

    2009-01-01

    To understand the dynamic evolution of landslides and predict their physical extent, a computational model is required that is capable of analysing complex material behaviour as well as large strains and deformations. Here, a model is presented based on the so-called generalised-interpolation material-point method. Further, a deformed material description is introduced, based on time integration of the deformation gradient and utilising Gauss quadrature over the volume associated with each material point. The method has been implemented in a Fortran code and employed for the analysis of a landslide that took place during…

  6. Developing a Business Intelligence Process for a Training Module in SharePoint 2010

    Science.gov (United States)

    Schmidtchen, Bryce; Solano, Wanda M.; Albasini, Colby

    2015-01-01

    Prior to this project, training information for the employees of the National Center for Critical Information Processing and Storage (NCCIPS) was stored in an array of unrelated spreadsheets and SharePoint lists that had to be manually updated. By developing a content management system through the SharePoint web application platform, this training system is now highly automated and provides a much less labor-intensive method of storing training data and scheduling training courses. The system was developed using SharePoint Designer, laying out the data structure for the interaction between different lists of data about the employees. The automation of data population inside the lists was accomplished by implementing SharePoint workflows, which essentially lay out the logic for how data are connected and calculated between certain lists. The resulting training system is constructed from a combination of five lists of data, with a single list acting as the user-friendly interface. This interface is populated with the courses required for each employee and includes past and future information about course requirements. The employees of NCCIPS now have the ability to view, log, and schedule their training information and courses with much more ease. This system will relieve a significant amount of manual input and serve as a powerful informational resource for the employees of NCCIPS in the future.

  7. Visualizing nD Point Clouds as Topological Landscape Profiles to Guide Local Data Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Oesterling, Patrick [Univ. of Leipzig (Germany). Computer Science Dept.; Heine, Christian [Univ. of Leipzig (Germany). Computer Science Dept.; Federal Inst. of Technology (ETH), Zurich (Switzerland). Dept. of Computer Science; Weber, Gunther H. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Computational Research Division; Scheuermann, Gerik [Univ. of Leipzig (Germany). Computer Science Dept.

    2012-05-04

    Analyzing high-dimensional point clouds is a classical challenge in visual analytics. Traditional techniques, such as projections or axis-based techniques, suffer from projection artifacts, occlusion, and visual complexity. We propose to split data analysis into two parts to address these shortcomings. First, a structural overview phase abstracts data by its density distribution. This phase performs topological analysis to support accurate and non-overlapping presentation of the high-dimensional cluster structure as a topological landscape profile. Utilizing a landscape metaphor, it presents clusters and their nesting as hills whose height, width, and shape reflect cluster coherence, size, and stability, respectively. A second, local analysis phase utilizes this global structural knowledge to select individual clusters or point sets for further, localized data analysis. Focusing on structural entities significantly reduces visual clutter in established geometric visualizations and permits a clearer, more thorough data analysis. This analysis complements the global topological perspective and enables the user to study subspaces or geometric properties, such as shape.

  8. Development of Point Kernel Shielding Analysis Computer Program Implementing Recent Nuclear Data and Graphic User Interfaces

    International Nuclear Information System (INIS)

    Kang, Sang Ho; Lee, Seung Gi; Chung, Chan Young; Lee, Choon Sik; Lee, Jai Ki

    2001-01-01

    In order to comply with revised national regulations on radiological protection and to implement recent nuclear data and dose conversion factors, KOPEC developed a new point kernel gamma and beta ray shielding analysis computer program. This new code, named VisualShield, adopted mass attenuation coefficients and buildup factors from recent ANSI/ANS standards and flux-to-dose conversion factors from International Commission on Radiological Protection (ICRP) Publication 74 for estimation of the effective/equivalent dose recommended in ICRP 60. VisualShield utilizes graphical user interfaces and 3-D visualization of the geometric configuration for preparing input data sets and analyzing results, which leads users to error-free processing with visual effects. Code validation and data analysis were performed by comparing the results of various calculations to the data outputs of previous programs such as MCNP 4B, ISOSHLD-II, QAD-CGGP, etc.
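
    Per source point, the point kernel method reduces to a simple attenuation-with-buildup formula; a toy sketch for a single shielded point source follows (source strength, attenuation coefficient and buildup factor are illustrative, not VisualShield's library data).

```python
import numpy as np

def point_kernel_flux(s, mu, r, buildup=1.0):
    """Flux from an isotropic point source of strength s (photons/s) at
    distance r (cm) through a shield with attenuation coefficient mu (1/cm):
    phi = B * s * exp(-mu * r) / (4 * pi * r**2)."""
    return buildup * s * np.exp(-mu * r) / (4.0 * np.pi * r**2)

# Illustrative: 1e6 photons/s source, 10 cm of shield, mu = 0.2 /cm, B = 2.5
print(point_kernel_flux(s=1.0e6, mu=0.2, r=10.0, buildup=2.5))
```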

  9. Process energy analysis

    International Nuclear Information System (INIS)

    Kaiser, V.

    1993-01-01

    In Chapter 2 process energy cost analysis for chemical processing is treated in a general way, independent of the specific form of energy and power production. Especially, energy data collection and data treatment, energy accounting (metering, balance setting), specific energy input, and utility energy costs and prices are discussed. (R.P.) 14 refs., 4 figs., 16 tabs

  10. Hazard analysis and critical control point to irradiated food in Brazil

    International Nuclear Information System (INIS)

    Boaratti, Maria de Fatima Guerra

    2004-01-01

    Food borne diseases, in particular gastro-intestinal infections, represent a very large group of pathologies with a strong negative impact on the health of the population because of their widespread nature. Little consideration is given to such conditions because their symptoms are often moderate and self-limiting. This has led to a general underestimation of their importance, and consequently to incorrect practices during the preparation and preservation of food, resulting in the frequent occurrence of outbreaks involving groups of varying numbers of consumers. Despite substantial efforts in the avoidance of contamination, an upward trend in the number of outbreaks of food borne illnesses caused by non-spore-forming pathogenic bacteria is reported in many countries. Good hygienic practices can reduce the level of contamination, but the most important pathogens cannot presently be eliminated from most farms, nor is it possible to eliminate them by primary processing, particularly from those foods which are sold raw. Several decontamination methods exist, but the most versatile treatment among them is ionizing radiation. HACCP (Hazard Analysis and Critical Control Point) is a management system in which food safety is addressed through the analysis and control of biological, chemical, and physical hazards from raw material production, procurement and handling, to manufacturing, distribution and consumption of the finished product. For successful implementation of a HACCP plan, management must be strongly committed to the HACCP concept. A firm commitment to HACCP by top management provides company employees with a sense of the importance of producing safe food. At the same time, it must always be emphasized that, like other intervention strategies, irradiation must be applied as part of a total sanitation program. The benefits of irradiation should never be considered an excuse for poor quality or for poor handling and storage conditions.

  11. Process for quality assurance of welded joints for electrical resistance point welding

    International Nuclear Information System (INIS)

    Schaefer, R.; Singh, S.

    1977-01-01

    In order to guarantee the reproducibility of welded joints of even quality (above all in the metal working industry), it is proposed that before starting resistance point welding, a preheating current be allowed to flow at the site of the weld. A given reduction of the total resistance at the site of the weld determines the moment when the preheating current is switched over to welding current; this value is always predetermined empirically. Further possibilities of controlling the welding process are described, in which measurement of the thermal expansion of the parts is used. A standard welding time is given. The rated course of electrode movement during the process can be predicted, and a running comparison of nominal and actual values can be carried out. (RW) [de]

  12. Linearity analysis and comparison study on the epoc® point-of-care blood analysis system in cardiopulmonary bypass patients

    Directory of Open Access Journals (Sweden)

    Jianing Chen

    2016-03-01

    The epoc® blood analysis system (Epocal Inc., Ottawa, Ontario, Canada) is a newly developed in vitro diagnostic hand-held analyzer for testing whole blood samples at the point of care. It rapidly provides blood gases, electrolytes, ionized calcium, glucose, lactate, and hematocrit/calculated hemoglobin. The analytical performance of the epoc® system was evaluated in a tertiary hospital; see the related research article “Analytical evaluation of the epoc® point-of-care blood analysis system in cardiopulmonary bypass patients” [1]. Data presented are the linearity analysis for 9 parameters and the comparison study in 40 cardiopulmonary bypass patients on 3 epoc® meters, Instrumentation Laboratory GEM4000, Abbott iSTAT, Nova CCX, and Roche Accu-Chek Inform II and Performa glucose meters.

  13. The Impact of the Delivery of Prepared Power Point Presentations on the Learning Process

    Directory of Open Access Journals (Sweden)

    Auksė Marmienė

    2011-04-01

    This article describes the process of the preparation and delivery of PowerPoint presentations and how it can be used by teachers as a resource for classroom teaching. The advantages of this classroom activity, some of the problems it raises, and a few suggestions for dealing with those difficulties are also outlined. The major objective of the present paper is to investigate students' ability to choose the material and content of PowerPoint presentations on professional topics via the Internet, as well as their ability to prepare and deliver the presentation in front of an audience. The factors which determine the choice of the presentation subject are also analysed in this paper. After the delivery, students were requested to self- and peer-assess the difficulties they faced in the preparation and performance of the presentations by writing reports. Learners' attitudes to the choice of the topic of PowerPoint presentations were surveyed by administering a self-assessment questionnaire.

  14. Analysis of Student Satisfaction in The Process of Teaching and Learning Using Importance Performance Analysis

    Science.gov (United States)

    Sembiring, P.; Sembiring, S.; Tarigan, G.; Sembiring, OD

    2017-12-01

    This study aims to determine the level of student satisfaction with the learning process at the University of Sumatera Utara, Indonesia. The study sample consisted of 1,204 students. Students' responses were measured through questionnaires on an adapted 5-point Likert scale and through direct interviews with respondents. The SERVQUAL method was used to measure service quality along five dimensions of service characteristics, namely physical evidence, reliability, responsiveness, assurance and concern. The results of the Importance Performance Analysis reveal that six service attributes must be corrected by the policy makers of the University of Sumatera Utara. The quality of service is still considered low by students.
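
    Importance-Performance Analysis places each attribute on an importance-versus-performance grid, commonly split at the grand means; a minimal sketch (attribute scores are illustrative, not the survey's data).

```python
import numpy as np

def ipa_quadrants(importance, performance):
    """Classify attributes into the four classic IPA quadrants, splitting
    the grid at the grand means (one common convention)."""
    imp, perf = np.asarray(importance, float), np.asarray(performance, float)
    ibar, pbar = imp.mean(), perf.mean()
    labels = []
    for i, p in zip(imp, perf):
        if i >= ibar and p < pbar:
            labels.append("concentrate here")       # important, underperforming
        elif i >= ibar:
            labels.append("keep up the good work")
        elif p < pbar:
            labels.append("low priority")
        else:
            labels.append("possible overkill")
    return labels

# Illustrative 5-point-scale means for six service attributes
print(ipa_quadrants([4.6, 4.2, 3.1, 4.8, 2.9, 3.8],
                    [3.2, 4.5, 3.0, 3.9, 4.1, 3.7]))
```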

  15. Risk-analysis of global climate tipping points

    Energy Technology Data Exchange (ETDEWEB)

    Frieler, Katja; Meinshausen, Malte; Braun, N [Potsdam Institute for Climate Impact Research e.V., Potsdam (Germany). PRIMAP Research Group; and others

    2012-09-15

    There are many elements of the Earth system that are expected to change gradually with increasing global warming. Changes might prove to be reversible after global warming returns to lower levels. But there are others that have the potential of showing a threshold behavior. This means that these changes would imply a transition between qualitatively disparate states which can be triggered by only small shifts in background climate (2). These changes are often expected not to be reversible by returning to the current level of warming. The reason for that is, that many of them are characterized by self-amplifying processes that could lead to a new internally stable state which is qualitatively different from before. There are different elements of the climate system that are already identified as potential tipping elements. This group contains the mass losses of the Greenland and the West-Antarctic Ice Sheet, the decline of the Arctic summer sea ice, different monsoon systems, the degradation of coral reefs, the dieback of the Amazon rainforest, the thawing of the permafrost regions as well as the release of methane hydrates (3). Crucially, these tipping elements have regional to global scale effects on human society, biodiversity and/or ecosystem services. Several examples may have a discernable effect on global climate through a large-scale positive feedback. This means they would further amplify the human induced climate change. These tipping elements pose risks comparable to risks found in other fields of human activity: high-impact events that have at least a few percent chance to occur classify as high-risk events. In many of these examples adaptation options are limited and prevention of occurrence may be a more viable strategy. Therefore, a better understanding of the processes driving tipping points is essential. There might be other tipping elements even more critical but not yet identified. These may also lie within our socio-economic systems that are

  16. Conversion of paper sludge to ethanol, II: process design and economic analysis.

    Science.gov (United States)

    Fan, Zhiliang; Lynd, Lee R

    2007-01-01

    Process design and economics are considered for conversion of paper sludge to ethanol. A particular site, a bleached kraft mill operated in Gorham, NH by Fraser Papers (15 tons dry sludge processed per day), is considered. In addition, profitability is examined for a larger plant (50 dry tons per day) and sensitivity analysis is carried out with respect to capacity, tipping fee, and ethanol price. Conversion based on simultaneous saccharification and fermentation with intermittent feeding is examined, with ethanol recovery provided by distillation and molecular sieve adsorption. It was found that the Fraser plant achieves positive cash flow with or without xylose conversion and mineral recovery. Sensitivity analysis indicates economics are very sensitive to ethanol selling price and scale; significant but less sensitive to the tipping fee, and rather insensitive to the prices of cellulase and power. Internal rates of return exceeding 15% are projected for larger plants at most combinations of scale, tipping fee, and ethanol price. Our analysis lends support to the proposition that paper sludge is a leading point-of-entry and proving ground for emergent industrial processes featuring enzymatic hydrolysis of cellulosic biomass.

  17. Object-Based Point Cloud Analysis of Full-Waveform Airborne Laser Scanning Data for Urban Vegetation Classification

    Directory of Open Access Journals (Sweden)

    Norbert Pfeifer

    2008-08-01

    Airborne laser scanning (ALS) is a remote sensing technique well-suited for 3D vegetation mapping and structure characterization because the emitted laser pulses are able to penetrate small gaps in the vegetation canopy. The backscattered echoes from the foliage, woody vegetation, the terrain, and other objects are detected, leading to a cloud of points. Higher echo densities (>20 echoes/m2) and additional classification variables from full-waveform (FWF) ALS data, namely echo amplitude, echo width and information on multiple echoes from one shot, offer new possibilities for classifying the ALS point cloud. Currently FWF sensor information is hardly used for classification purposes. This contribution presents an object-based point cloud analysis (OBPA) approach, combining segmentation and classification of the 3D FWF ALS points, designed to detect tall vegetation in urban environments. The definition of tall vegetation includes trees and shrubs, but excludes grassland and herbage. In the applied procedure, FWF ALS echoes are segmented by a seeded region growing procedure. All echoes, sorted in descending order by their surface roughness, are used as seed points. Segments are grown based on echo width homogeneity. Next, segment statistics (mean, standard deviation, and coefficient of variation) are calculated by aggregating echo features such as amplitude and surface roughness. For classification, a rule base is derived automatically from a training area using a statistical classification tree. To demonstrate our method we present data from three sites with around 500,000 echoes each. The accuracy of the classified vegetation segments is evaluated for two independent validation sites. In a point-wise error assessment, where the classification is compared with manually classified 3D points, completeness and correctness better than 90% are reached for the validation sites. In comparison to many other algorithms, the proposed 3D point classification works on the original…

  18. Bayesian analysis of spatial point processes in the neighbourhood of Voronoi networks

    DEFF Research Database (Denmark)

    Skare, Øivind; Møller, Jesper; Jensen, Eva Bjørn Vedel

    2007-01-01

    A model for an inhomogeneous Poisson process with high intensity near the edges of a Voronoi tessellation in 2D or 3D is proposed. The model is analysed in a Bayesian setting with priors on nuclei of the Voronoi tessellation and other model parameters. An MCMC algorithm is constructed to sample...

  20. Performance Analysis of a Maximum Power Point Tracking Technique using Silver Mean Method

    Directory of Open Access Journals (Sweden)

    Shobha Rani Depuru

    2018-01-01

    This paper presents a simple and particularly efficacious Maximum Power Point Tracking (MPPT) algorithm based on the Silver Mean Method (SMM). This method operates by choosing a search interval from the P-V characteristics of the given solar array and converges to the MPP of the Solar Photo-Voltaic (SPV) system by shrinking the interval. After achieving the maximum power, the algorithm stops shrinking and maintains constant voltage until the next interval is decided. The tracking capability, efficiency, and performance of the proposed algorithm are validated by simulation and experimental results with a 100 W solar panel under variable temperature and irradiance conditions. The results obtained confirm that even without any perturbation and observation process, the proposed method still outperforms the traditional perturb and observe (P&O) method by demonstrating far better steady-state output, more accuracy and higher efficiency.
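
    A rough sketch of such an interval-shrinking search follows, in the spirit of the paper's method rather than its exact algorithm: interior probe voltages are placed using the silver ratio 1 + √2 and the interval is shrunk toward the higher-power probe; the P-V curve is a toy stand-in for a real panel model.

```python
import numpy as np

DELTA = 1.0 + np.sqrt(2.0)   # the silver mean

def panel_power(v):
    """Toy P-V curve of a 100 W-class panel (illustrative stand-in only)."""
    i = 6.0 * (1.0 - np.exp((v - 21.0) / 1.5))   # crude diode-style I(V)
    return v * max(i, 0.0)

def silver_mean_mpp(f, a=0.0, b=21.0, tol=0.05):
    """Shrink [a, b] toward the maximum of a unimodal f using interior
    probe points placed with the silver ratio."""
    while b - a > tol:
        lo = a + (b - a) / DELTA     # lower probe, ~41.4% into the interval
        hi = b - (b - a) / DELTA     # upper probe, ~58.6% into the interval
        if f(lo) < f(hi):
            a = lo                   # maximum lies in [lo, b]
        else:
            b = hi                   # maximum lies in [a, hi]
    return 0.5 * (a + b)

v_mpp = silver_mean_mpp(panel_power)
print(round(v_mpp, 2), round(panel_power(v_mpp), 1))
```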

  1. Semiclassical analysis of long-wavelength multiphoton processes: The periodically driven harmonic oscillator

    International Nuclear Information System (INIS)

    Fox, Ronald F.; Vela-Arevalo, Luz V.

    2002-01-01

    The problem of multiphoton processes for intense, long-wavelength irradiation of atomic and molecular electrons is presented. The recently developed method of quasiadiabatic time evolution is used to obtain a nonperturbative analysis. When applied to the standard vector potential coupling, an exact auxiliary equation is obtained that is in the electric dipole coupling form. This is achieved through application of the Goeppert-Mayer gauge. While the analysis to this point is general and aimed at microwave irradiation of Rydberg atoms, a Floquet analysis of the auxiliary equation is presented for the special case of the periodically driven harmonic oscillator. Closed form expressions for a complete set of Floquet states are obtained. These are used to demonstrate that for the oscillator case there are no multiphoton resonances

  2. The signer and the sign: cortical correlates of person identity and language processing from point-light displays.

    Science.gov (United States)

    Campbell, Ruth; Capek, Cheryl M; Gazarian, Karine; MacSweeney, Mairéad; Woll, Bencie; David, Anthony S; McGuire, Philip K; Brammer, Michael J

    2011-09-01

    In this study, the first to explore the cortical correlates of signed language (SL) processing under point-light display conditions, the observer identified either a signer or a lexical sign from a display in which different signers were seen producing a number of different individual signs. Many of the regions activated by point-light under these conditions replicated those previously reported for full-image displays, including regions within the inferior temporal cortex that are specialised for face and body-part identification, although such body parts were invisible in the display. Right frontal regions were also recruited - a pattern not usually seen in full-image SL processing. This activation may reflect the recruitment of information about person identity from the reduced display. A direct comparison of identify-signer and identify-sign conditions showed these tasks relied to a different extent on the posterior inferior regions. Signer identification elicited greater activation than sign identification in (bilateral) inferior temporal gyri (BA 37/19), fusiform gyri (BA 37), middle and posterior portions of the middle temporal gyri (BAs 37 and 19), and superior temporal gyri (BA 22 and 42). Right inferior frontal cortex was a further focus of differential activation (signer>sign). These findings suggest that the neural systems supporting point-light displays for the processing of SL rely on a cortical network including areas of the inferior temporal cortex specialized for face and body identification. While this might be predicted from other studies of whole body point-light actions (Vaina, Solomon, Chowdhury, Sinha, & Belliveau, 2001) it is not predicted from the perspective of spoken language processing, where voice characteristics and speech content recruit distinct cortical regions (Stevens, 2004) in addition to a common network. In this respect, our findings contrast with studies of voice/speech recognition (Von Kriegstein, Kleinschmidt, Sterzer

  3. Classification of Phase Transitions by Microcanonical Inflection-Point Analysis

    Science.gov (United States)

    Qi, Kai; Bachmann, Michael

    2018-05-01

    By means of the principle of minimal sensitivity we generalize the microcanonical inflection-point analysis method by probing derivatives of the microcanonical entropy for signals of transitions in complex systems. A strategy of systematically identifying and locating independent and dependent phase transitions of any order is proposed. The power of the generalized method is demonstrated in applications to the ferromagnetic Ising model and a coarse-grained model for polymer adsorption onto a substrate. The results shed new light on the intrinsic phase structure of systems with cooperative behavior.
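
    For reference, the standard microcanonical quantities that the method differentiates are (textbook definitions, not equations reproduced from the paper):

```latex
% Microcanonical entropy from the density of states g(E), with its first
% two derivatives; transitions are signalled by (least-sensitive)
% inflection points of S(E) and of its successive derivatives.
S(E) = k_{\mathrm{B}} \ln g(E), \qquad
\beta(E) = \frac{\mathrm{d}S}{\mathrm{d}E}, \qquad
\gamma(E) = \frac{\mathrm{d}^{2}S}{\mathrm{d}E^{2}} .
```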

  4. IMAGE-PLANE ANALYSIS OF n-POINT-MASS LENS CRITICAL CURVES AND CAUSTICS

    Energy Technology Data Exchange (ETDEWEB)

    Danek, Kamil; Heyrovský, David, E-mail: kamil.danek@utf.mff.cuni.cz, E-mail: heyrovsky@utf.mff.cuni.cz [Institute of Theoretical Physics, Faculty of Mathematics and Physics, Charles University in Prague (Czech Republic)

    2015-06-10

    The interpretation of gravitational microlensing events caused by planetary systems or multiple stars is based on the n-point-mass lens model. The first planets detected by microlensing were well described by the two-point-mass model of a star with one planet. By the end of 2014, four events involving three-point-mass lenses had been announced. Two of the lenses were stars with two planetary companions each; two were binary stars with a planet orbiting one component. While the two-point-mass model is well understood, the same cannot be said for lenses with three or more components. Even the range of possible critical-curve topologies and caustic geometries of the three-point-mass lens remains unknown. In this paper we provide new tools for mapping the critical-curve topology and caustic cusp number in the parameter space of n-point-mass lenses. We perform our analysis in the image plane of the lens. We show that all contours of the Jacobian are critical curves of re-scaled versions of the lens configuration. Utilizing this property further, we introduce the cusp curve to identify cusp-image positions on all contours simultaneously. In order to track cusp-number changes in caustic metamorphoses, we define the morph curve, which pinpoints the positions of metamorphosis-point images along the cusp curve. We demonstrate the usage of both curves on simple two- and three-point-mass lens examples. For the three simplest caustic metamorphoses we illustrate the local structure of the image and source planes.
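
    For context, the n-point-mass lens mapping and its critical curves are commonly written in complex notation as below (a standard background result, not the paper's own derivation):

```latex
% Source position \zeta for image position z, with fractional masses
% \epsilon_i located at z_i; critical curves are where the Jacobian
% determinant of the mapping vanishes.
\zeta = z - \sum_{i=1}^{n} \frac{\epsilon_i}{\bar{z} - \bar{z}_i}, \qquad
\det J = 1 - \left| \sum_{i=1}^{n} \frac{\epsilon_i}{(\bar{z} - \bar{z}_i)^{2}} \right|^{2} = 0 .
```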

  5. Simulation, integration, and economic analysis of gas-to-liquid processes

    International Nuclear Information System (INIS)

    Bao, Buping; El-Halwagi, Mahmoud M.; Elbashir, Nimir O.

    2010-01-01

    Gas-to-liquid (GTL) involves the chemical conversion of natural gas into synthetic crude that can be upgraded and separated into different useful hydrocarbon fractions, including liquid transportation fuels. Such technology can also be used to convert other abundant natural resources such as coal and biomass to fuels and value-added chemicals (referred to as coal-to-liquid (CTL) and biomass-to-liquid (BTL)). A leading GTL technology is the Fischer-Tropsch (FT) process. The objective of this work is to provide a techno-economic analysis of the GTL process and to identify optimization and integration opportunities for cost saving and reduction of energy usage while accounting for the environmental impact. First, a base-case flowsheet is synthesized to include the key processing steps of the plant. Then, a computer-aided process simulation is carried out to determine the key mass and energy flows, performance criteria, and equipment specifications. Next, energy and mass integration studies are performed to address the following items: (a) heating and cooling utilities, (b) combined heat and power (process cogeneration), (c) management of process water, (d) optimization of tail gas allocation, and (e) recovery of catalyst-supporting hydrocarbon solvents. These integration studies are conducted and the results are documented in terms of conserving energy and mass resources as well as providing economic impact. Finally, an economic analysis is undertaken to determine the plant capacity needed to achieve the break-even point and to estimate the return on investment for the base-case study. (author)

  6. Point defect characterization in HAADF-STEM images using multivariate statistical analysis

    International Nuclear Information System (INIS)

    Sarahan, Michael C.; Chi, Miaofang; Masiel, Daniel J.; Browning, Nigel D.

    2011-01-01

    Quantitative analysis of point defects is demonstrated through the use of multivariate statistical analysis. This analysis consists of principal component analysis for dimensional estimation and reduction, followed by independent component analysis to obtain physically meaningful, statistically independent factor images. Results from these analyses are presented in the form of factor images and scores. Factor images show characteristic intensity variations corresponding to physical structure changes, while scores relate how much those variations are present in the original data. The application of this technique is demonstrated on a set of experimental images of dislocation cores along a low-angle tilt grain boundary in strontium titanate. A relationship between chemical composition and lattice strain is highlighted in the analysis results, with picometer-scale shifts in several columns measurable from compositional changes in a separate column. -- Research Highlights: → Multivariate analysis of HAADF-STEM images. → Distinct structural variations among SrTiO3 dislocation cores. → Picometer atomic column shifts correlated with atomic column population changes.
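
    A minimal sketch of the PCA-then-ICA pipeline described above, using scikit-learn on stand-in data (the image patches, component counts, and variance threshold are assumptions, not the paper's settings).

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

# Stand-in data: 500 image patches, 64 pixels each
rng = np.random.default_rng(3)
patches = rng.normal(size=(500, 64))

pca = PCA(n_components=0.95)          # keep components explaining 95% variance
scores = pca.fit_transform(patches)   # dimensional estimation / reduction

ica = FastICA(n_components=scores.shape[1], random_state=0)
sources = ica.fit_transform(scores)   # statistically independent factors

# "Factor images": mix the independent components back into pixel space
factor_images = ica.mixing_.T @ pca.components_
print(factor_images.shape, sources.shape)
```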

  7. An Exploratory Study: A Kinesic Analysis of Academic Library Public Service Points

    Science.gov (United States)

    Kazlauskas, Edward

    1976-01-01

    An analysis of body movements of individuals at reference and circulation public service points in four academic libraries indicated that both receptive and nonreceptive nonverbal behaviors were used by all levels of library employees, and these behaviors influenced patron interaction. (Author/LS)

  8. The influence of biological and technical factors on quantitative analysis of amyloid PET: Points to consider and recommendations for controlling variability in longitudinal data.

    Science.gov (United States)

    Schmidt, Mark E; Chiao, Ping; Klein, Gregory; Matthews, Dawn; Thurfjell, Lennart; Cole, Patricia E; Margolin, Richard; Landau, Susan; Foster, Norman L; Mason, N Scott; De Santi, Susan; Suhy, Joyce; Koeppe, Robert A; Jagust, William

    2015-09-01

    In vivo imaging of amyloid burden with positron emission tomography (PET) provides a means for studying the pathophysiology of Alzheimer's and related diseases. Measurement of subtle changes in amyloid burden requires quantitative analysis of image data. Reliable quantitative analysis of amyloid PET scans acquired at multiple sites and over time requires rigorous standardization of acquisition protocols, subject management, tracer administration, image quality control, and image processing and analysis methods. We review critical points in the acquisition and analysis of amyloid PET, identify ways in which technical factors can contribute to measurement variability, and suggest methods for mitigating these sources of noise. Improved quantitative accuracy could reduce the sample size necessary to detect intervention effects when amyloid PET is used as a treatment end point and allow more reliable interpretation of change in amyloid burden and its relationship to clinical course. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  9. Control system for technological processes in tritium processing plants with process analysis

    International Nuclear Information System (INIS)

    Retevoi, Carmen Maria; Stefan, Iuliana; Balteanu, Ovidiu; Stefan, Liviu; Bucur, Ciprian

    2005-01-01

    Integration of a large variety of installations and equipment into a unitary system for controlling the technological process in tritium processing nuclear facilities appears to be a rather complex approach, particularly when experimental or new technologies are developed. Ensuring a high degree of versatility, allowing easy modifications to configurations and process parameters, is a major requirement imposed on experimental installations. The large amount of data which must be processed, stored and easily accessed for subsequent analyses imposes development of a large information network based on a highly integrated system containing the acquisition, control and technological process analysis data as well as a database system. On such a basis, integrated systems of computation and control able to conduct the technological process can be developed, as well as protection systems for cases of failure or breakdown. The integrated system responds to the control and security requirements in case of emergency and of the technological processes specific to an industry that processes radioactive or toxic substances with severe consequences in case of technological failure, as in the case of a tritium processing nuclear plant. In order to lower the risk of technological failure of these processes, an integrated software, database and process analysis system is developed which, based on an identification algorithm for the parameters important to protection and security systems, will display the process evolution trend. The system was checked on an existing plant that includes a tritium removal unit, finally used in a nuclear power plant, by simulating failure events as well as the process. The system will also include a complete database monitoring all the parameters and process analysis software for the main modules of the tritium processing plant, namely isotope separation, catalytic purification and cryogenic distillation.

  10. Quantification of annual wildfire risk; A spatio-temporal point process approach.

    Directory of Open Access Journals (Sweden)

    Paula Pereira

    2013-10-01

    Policy responses for local and global fire management depend heavily on a proper understanding of fire extent as well as its spatio-temporal variation across any given study area. Annual fire risk maps are important tools for such policy responses, supporting strategic decisions such as location-allocation of equipment and human resources. Here, we define risk of fire in the narrow sense as the probability of its occurrence, without addressing the loss component. In this paper, we study the spatio-temporal point patterns of wildfires and model them by a log Gaussian Cox process. The mean of the predictive distribution of the random intensity function is used, in the narrow sense, as the annual fire risk map for the next year.
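
    A log Gaussian Cox process is a Poisson process whose log-intensity is a Gaussian random field; the sketch below simulates one on a coarse grid (the covariance model and all parameters are illustrative, not the paper's fitted values).

```python
import numpy as np

rng = np.random.default_rng(4)

# Grid over the unit square
n = 32
xs = np.linspace(0, 1, n)
xx, yy = np.meshgrid(xs, xs)
pts = np.column_stack([xx.ravel(), yy.ravel()])

# Gaussian random field with exponential covariance (illustrative choice)
d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
cov = 1.0 * np.exp(-d / 0.2)
field = np.linalg.cholesky(cov + 1e-8 * np.eye(n * n)) @ rng.normal(size=n * n)

mu = 3.0                                   # sets the overall event rate
intensity = np.exp(mu + field)             # random intensity function
cell_area = (1.0 / n) ** 2
counts = rng.poisson(intensity * cell_area).reshape(n, n)
print(counts.sum(), "simulated fire events")
# A risk map corresponds to the (predictive) mean of the intensity surface.
```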

  11. An introduction to nonlinear analysis and fixed point theory

    CERN Document Server

    Pathak, Hemant Kumar

    2018-01-01

    This book systematically introduces the theory of nonlinear analysis, providing an overview of topics such as geometry of Banach spaces, differential calculus in Banach spaces, monotone operators, and fixed point theorems. It also discusses degree theory, nonlinear matrix equations, control theory, differential and integral equations, and inclusions. The book presents surjectivity theorems, variational inequalities, stochastic game theory and mathematical biology, along with a large number of applications of these theories in various other disciplines. Nonlinear analysis is characterised by its applications in numerous interdisciplinary fields, ranging from engineering to space science, hydromechanics to astrophysics, chemistry to biology, theoretical mechanics to biomechanics and economics to stochastic game theory. Organised into ten chapters, the book shows the elegance of the subject and its deep-rooted concepts and techniques, which provide the tools for developing more realistic and accurate models for ...

  12. Proposition for Improvement of Economics Situation with Use of Analysis of Break Even Point

    OpenAIRE

    Starečková, Alena

    2015-01-01

    This bachelor's thesis deals with carrying out a break-even-point analysis in a company, with cost analysis, and with proposals for improving the company's financial situation, particularly from the cost perspective. The first part of the thesis defines the terms and formulas related to break-even-point analysis and to cost issues. The second part then performs a break-even analysis for a specific company, followed by proposals for improving the current situation.
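
    The break-even quantity itself is the classic fixed-costs-over-contribution-margin formula; a minimal sketch with illustrative figures.

```python
def break_even_units(fixed_costs, price, variable_cost):
    """Break-even quantity: fixed costs / contribution margin per unit."""
    return fixed_costs / (price - variable_cost)

# Illustrative: 500,000 fixed costs, price 120, variable cost 80 per unit
print(break_even_units(500_000, 120.0, 80.0))   # -> 12500.0 units
```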

  13. Application of ISO22000 and Failure Mode and Effect Analysis (fmea) for Industrial Processing of Poultry Products

    Science.gov (United States)

    Varzakas, Theodoros H.; Arvanitoyannis, Ioannis S.

    The Failure Mode and Effect Analysis (FMEA) model has been applied for the risk assessment of poultry slaughtering and manufacturing. In this work, a comparison of ISO 22000 analysis with HACCP is carried out for poultry slaughtering, processing and packaging. Critical Control Points and Prerequisite Programs (PrPs) have been identified and implemented in the cause-and-effect diagram (also known as the Ishikawa, tree, or fishbone diagram).

  14. SU-E-T-205: MLC Predictive Maintenance Using Statistical Process Control Analysis.

    Science.gov (United States)

    Able, C; Hampton, C; Baydush, A; Bright, M

    2012-06-01

    MLC failure increases accelerator downtime and negatively affects the clinic treatment delivery schedule. This study investigates the use of Statistical Process Control (SPC), a modern quality control methodology, to retrospectively evaluate MLC performance data, thereby predicting the impending failure of individual MLC leaves. SPC, a methodology which detects exceptional variability in a process, was used to analyze MLC leaf velocity data. An MLC velocity test is performed weekly on all leaves during morning QA. The leaves sweep 15 cm across the radiation field with the gantry pointing down. The leaf speed is analyzed from the generated dynalog file using quality assurance software. MLC leaf speeds in which a known motor failure occurred (8) and those in which no motor replacement was performed (11) were retrospectively evaluated over a 71-week period. SPC individual and moving range (I/MR) charts were used in the analysis. The I/MR chart limits were calculated using the first twenty weeks of data and set at 3 standard deviations from the mean. The MLCs in which a motor failure occurred followed two general trends: (a) no data indicating a change in leaf speed prior to failure (5 of 8) and (b) a series of data points exceeding the limit prior to motor failure (3 of 8). I/MR charts for a high percentage (8 of 11) of the non-replaced MLC motors indicated that only a single point exceeded the limit. These single-point excesses were deemed false positives. SPC analysis using MLC performance data may be helpful in detecting a significant percentage of impending failures of MLC motors. The ability to detect MLC failure may depend on the mode of failure (i.e., gradual or catastrophic). Further study is needed to determine if increasing the sampling frequency could increase reliability. This project was supported by a grant from Varian Medical Systems, Inc. © 2012 American Association of Physicists in Medicine.
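
    A minimal sketch of individuals and moving-range chart limits, shown with the standard I/MR constants (the abstract's variant instead sets limits at the baseline mean ± 3 sample standard deviations, which is easy to swap in); the leaf-speed numbers are illustrative.

```python
import numpy as np

def imr_limits(x):
    """I-chart and MR-chart control limits from a baseline series."""
    x = np.asarray(x, float)
    mr = np.abs(np.diff(x))                # moving ranges between weeks
    sigma_hat = mr.mean() / 1.128          # d2 constant for subgroup size 2
    center = x.mean()
    return (center - 3 * sigma_hat, center + 3 * sigma_hat,   # I-chart LCL/UCL
            0.0, 3.267 * mr.mean())                           # MR-chart LCL/UCL

baseline = np.array([2.51, 2.48, 2.53, 2.50, 2.47, 2.55, 2.49, 2.52,
                     2.50, 2.46, 2.54, 2.51, 2.49, 2.53, 2.48, 2.52,
                     2.50, 2.51, 2.47, 2.54])   # 20 weekly leaf speeds (cm/s)
print(imr_limits(baseline))
```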

  15. Marked point process framework for living probabilistic safety assessment and risk follow-up

    International Nuclear Information System (INIS)

    Arjas, Elja; Holmberg, Jan

    1995-01-01

    We construct a model for living probabilistic safety assessment (PSA) by applying the general framework of marked point processes. The framework provides a theoretically rigorous approach for considering risk follow-up of posterior hazards. In risk follow-up, the hazard of core damage is evaluated synthetically at time points in the past, by using some observed events as logged history and combining it with re-evaluated potential hazards. There are several alternatives for doing this, of which we consider three here, calling them initiating event approach, hazard rate approach, and safety system approach. In addition, for a comparison, we consider a core damage hazard arising in risk monitoring. Each of these four definitions draws attention to a particular aspect in risk assessment, and this is reflected in the behaviour of the consequent risk importance measures. Several alternative measures are again considered. The concepts and definitions are illustrated by a numerical example

  16. Statistical Analysis of Deep Drilling Process Conditions Using Vibrations and Force Signals

    Directory of Open Access Journals (Sweden)

    Syafiq Hazwan

    2016-01-01

    Cooling systems are a key element in the hot forming process of Ultra High Strength Steels (UHSS). Normally, cooling systems are made using deep drilling techniques. Although deep twist drilling offers higher productivity than other drilling techniques, its main problem is premature tool breakage, which affects production quality. In this paper, an analysis of deep twist drill process parameters such as cutting speed, feed rate and depth of cut, using statistical analysis to identify the tool condition, is presented. A comparison between two different tool geometries is also studied. Measured data from vibration and force sensors are analyzed through several statistical parameters such as root mean square (RMS), mean, kurtosis, standard deviation and skewness. It was found that kurtosis and skewness values are the most appropriate parameters to represent the deep twist drill tool condition from the vibration and force data. The condition of the deep twist drill process was classified as good, blunt or fractured. It was also found that different tool geometry parameters affect the performance of the drill. We believe the results of this study are useful in determining a suitable analysis method for developing an online tool condition monitoring system that identifies the tertiary tool life stage and helps avoid tool fracture during the drilling process.
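
    The statistical features named above are straightforward to compute from a raw sensor trace; a minimal sketch on a synthetic vibration signal (the signal itself is a stand-in, not the study's data).

```python
import numpy as np
from scipy import stats

def drill_features(signal):
    """Statistical features commonly used to characterise tool condition."""
    signal = np.asarray(signal, float)
    return {
        "rms": float(np.sqrt(np.mean(signal ** 2))),
        "mean": float(signal.mean()),
        "std": float(signal.std(ddof=1)),
        "kurtosis": float(stats.kurtosis(signal)),   # sensitive to impacts
        "skewness": float(stats.skew(signal)),
    }

# Synthetic vibration trace: Gaussian noise plus sparse large impacts
rng = np.random.default_rng(5)
vib = rng.normal(size=4096)
vib += (rng.random(4096) < 0.01) * rng.normal(size=4096) * 10.0
print(drill_features(vib))
```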

  17. Uncertainty analysis of point by point sampling complex surfaces using touch probe CMMs

    DEFF Research Database (Denmark)

    Barini, Emanuele; Tosello, Guido; De Chiffre, Leonardo

    2007-01-01

    The paper describes a study concerning point by point scanning of complex surfaces using tactile CMMs. A four-factor, two-level full factorial experiment was carried out, involving measurements on a complex surface configuration item comprising a sphere, a cylinder and a cone, combined in a single…

  18. Reassessing Function Points

    Directory of Open Access Journals (Sweden)

    G.R. Finnie

    1997-05-01

    Accurate estimation of the size and development effort for software projects requires estimation models which can be used early enough in the development life cycle to be of practical value. Function Point Analysis (FPA) has become possibly the most widely used estimation technique in practice. However, the technique was developed in the data processing environment of the 1970s and, despite undergoing considerable reassessment and formalisation, still attracts criticism for the weighted scoring it employs and for the way in which the function point score is adjusted for specific system characteristics. This paper reviews the validity of the weighting scheme and the value of adjusting for system characteristics by studying their effect in a sample of 299 software developments. In general, the value adjustment scheme does not appear to cater for differences in productivity. The weighting scheme used to classify system components as simple, average or complex also appears suspect and should be redesigned to provide a more realistic estimate of system functionality.
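
    For readers unfamiliar with the mechanics under discussion, a minimal sketch of the standard FPA arithmetic (IFPUG-style average weights are shown; the component counts and characteristic ratings are illustrative, not from the paper's sample).

```python
# Average complexity weights per component type (simple/average/complex
# weights differ; only the average values are used in this sketch):
# EI = external inputs, EO = external outputs, EQ = external inquiries,
# ILF = internal logical files, EIF = external interface files.
WEIGHTS = {"EI": 4, "EO": 5, "EQ": 4, "ILF": 10, "EIF": 7}

def adjusted_function_points(counts, gsc_scores):
    """counts: components at average complexity; gsc_scores: the 14 general
    system characteristics rated 0-5 (the adjustment the paper questions)."""
    ufp = sum(WEIGHTS[k] * n for k, n in counts.items())   # unadjusted FP
    vaf = 0.65 + 0.01 * sum(gsc_scores)                    # value adjustment
    return ufp * vaf

counts = {"EI": 20, "EO": 12, "EQ": 8, "ILF": 6, "EIF": 3}
print(adjusted_function_points(counts, [3] * 14))          # -> 270.71
```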

  19. Comparison of plastic strains on AA5052 by single point incremental forming process using digital image processing

    International Nuclear Information System (INIS)

    Mugendiran, V.; Gnanavelbabu, A.

    2017-01-01

    In this study, a surface-based strain measurement was used to determine the formability of sheet metal. Strain measurement may employ manual calculation of plastic strains based on a reference circle and the deformed circle; the manual calculation method has a greater margin of error in practical applications. In this paper, an attempt has been made to compare formability estimates from three different approaches, namely the conventional method, the least-squares method and digital image-based strain measurement. As the sheet metal is formed by the single point incremental process, the etched circles deform into approximately elliptical shapes; image acquisition was performed before and after forming. The plastic strains of the deformed circle grids are calculated based on the non-deformed reference. The coordinates of the deformed circles are measured by various image processing steps. Finally, the strains obtained from the deformed circles are used to plot the forming limit diagram. To evaluate the accuracy of the system, the predictions of the forming limit diagram by the conventional, least-squares and digital image-based methods were compared. The conventional and least-squares methods show marginal error when compared with the digital image processing method. Measurement of strain based on image processing agrees well and can be used to improve accuracy and reduce measurement error in the prediction of the forming limit diagram.
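
    The grid-circle measurement reduces to logarithmic strains of the ellipse axes against the initial circle diameter; a minimal sketch (the dimensions are illustrative, not the study's measurements).

```python
import numpy as np

def grid_circle_strains(major_axis, minor_axis, d0):
    """True (logarithmic) major/minor in-plane strains of an etched circle
    of initial diameter d0 deformed into an ellipse whose axes are
    measured from images taken before and after forming."""
    e1 = np.log(major_axis / d0)    # major strain
    e2 = np.log(minor_axis / d0)    # minor strain
    return e1, e2

# Illustrative: a 2.5 mm circle deformed into a 3.1 x 2.7 mm ellipse
print(grid_circle_strains(3.1, 2.7, 2.5))
```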

  20. Comparison of plastic strains on AA5052 by single point incremental forming process using digital image processing

    Energy Technology Data Exchange (ETDEWEB)

    Mugendiran, V.; Gnanavelbabu, A. [Anna University, Chennai, Tamilnadu (India)

    2017-06-15

    In this study, a surface based strain measurement was used to determine the formability of sheet metal. Strain measurement may employ manual calculation of plastic strains based on the reference circle and the deformed circle; the manual calculation method has a greater margin of error in practical applications. In this paper, an attempt has been made to compare formability by implementing three different approaches: namely the conventional method, the least square method and digital image based strain measurement. As the sheet metal is formed by the single point incremental process, the etched circles deform into approximately elliptical shapes; image acquisition was done before and after forming. The plastic strains of the deformed circle grids are calculated based on the non-deformed reference, and the coordinates of the deformed circles are measured through various image processing steps. Finally, the strains obtained from the deformed circles are used to plot the forming limit diagram. To evaluate the accuracy of the system, the conventional, least square and digital image based predictions of the forming limit diagram were compared. The conventional and least square methods show marginal error when compared with the digital image processing based method. Measurement of strain based on image processing agrees well and can be used to improve accuracy and reduce measurement error in the prediction of the forming limit diagram.

  1. Spatial Distribution Characteristics of Healthcare Facilities in Nanjing: Network Point Pattern Analysis and Correlation Analysis

    Directory of Open Access Journals (Sweden)

    Jianhua Ni

    2016-08-01

    Full Text Available The spatial distribution of urban service facilities is largely constrained by the road network. In this study, network point pattern analysis and correlation analysis were used to analyze the relationship between the road network and the healthcare facility distribution. The weighted network kernel density estimation method proposed in this study identifies significant differences between the outside and inside areas of the Ming city wall. The results of network K-function analysis show that private hospitals are more evenly distributed than public hospitals, and pharmacy stores tend to cluster around hospitals along the road network. After computing correlations between the different categories of hospitals and street centrality, we find that the distribution of these hospitals correlates highly with the street centralities, and that the correlations are higher with private and small hospitals than with public and large hospitals. The comprehensive analysis results could help assess the rationality of the existing urban healthcare facility distribution and optimize the location of new healthcare facilities.
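    The network K-function used in the study constrains distances to the road network; as a simpler illustration of the underlying idea, the sketch below computes the planar (Euclidean) Ripley's K without edge correction, which under complete spatial randomness should be close to pi*r^2 (the point pattern is synthetic):

    ```python
    import numpy as np
    from scipy.spatial.distance import pdist

    def ripley_k(points, r, area):
        """Planar Ripley's K at radius r (no edge correction):
        K(r) = area * sum of pairs within r / (n * (n - 1))."""
        n = len(points)
        d = pdist(points)                    # all pairwise distances
        pair_count = 2 * np.sum(d <= r)      # each pair counted for both points
        return area * pair_count / (n * (n - 1))

    rng = np.random.default_rng(1)
    pts = rng.uniform(0, 1, size=(200, 2))   # CSR pattern in a unit square
    for r in (0.05, 0.1, 0.2):
        print(r, ripley_k(pts, r, area=1.0), np.pi * r**2)  # compare with pi*r^2
    ```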

  2. A critical analysis of the tender points in fibromyalgia.

    Science.gov (United States)

    Harden, R Norman; Revivo, Gadi; Song, Sharon; Nampiaparampil, Devi; Golden, Gary; Kirincic, Marie; Houle, Timothy T

    2007-03-01

    To pilot methodologies designed to critically assess the American College of Rheumatology's (ACR) diagnostic criteria for fibromyalgia. Prospective, psychophysical testing. An urban teaching hospital. Twenty-five patients with fibromyalgia and 31 healthy controls (convenience sample). Pressure pain threshold was determined at the 18 ACR tender points and five sham points using an algometer (dolorimeter). The patients' "algometric total scores" (sums of the patients' average pain thresholds at the 18 tender points) were derived, as well as pain thresholds across sham points. The "algometric total score" could differentiate patients with fibromyalgia from normals with an accuracy of 85.7%. Although thresholds differed between sham points and ACR tender points, sham points also could be used for diagnosis (85.7%), and the criteria have not been tested against other painful conditions. The points specified by the ACR were only modestly superior to sham points in making the diagnosis. Most importantly, this pilot suggests single points, smaller groups of points, or sham points may be as effective in diagnosing fibromyalgia as the use of all 18 points, and suggests methodologies to definitively test that hypothesis.

  3. Multi-Class Simultaneous Adaptive Segmentation and Quality Control of Point Cloud Data

    Directory of Open Access Journals (Sweden)

    Ayman Habib

    2016-01-01

    Full Text Available 3D modeling of a given site is an important activity for a wide range of applications including urban planning, as-built mapping of industrial sites, heritage documentation, military simulation, and outdoor/indoor analysis of airflow. Point clouds, which could be either derived from passive or active imaging systems, are an important source for 3D modeling. Such point clouds need to undergo a sequence of data processing steps to derive the necessary information for the 3D modeling process. Segmentation is usually the first step in the data processing chain. This paper presents a region-growing multi-class simultaneous segmentation procedure, where planar, pole-like, and rough regions are identified while considering the internal characteristics (i.e., local point density/spacing and noise level) of the point cloud in question. The segmentation starts with point cloud organization into a kd-tree data structure and characterization process to estimate the local point density/spacing. Then, proceeding from randomly-distributed seed points, a set of seed regions is derived through distance-based region growing, which is followed by modeling of such seed regions into planar and pole-like features. Starting from optimally-selected seed regions, planar and pole-like features are then segmented. The paper also introduces a list of hypothesized artifacts/problems that might take place during the region-growing process. Finally, a quality control process is devised to detect, quantify, and mitigate instances of partially/fully misclassified planar and pole-like features. Experimental results from airborne and terrestrial laser scanning as well as image-based point clouds are presented to illustrate the performance of the proposed segmentation and quality control framework.
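    The distance-based region growing step can be sketched as follows; this shows only connected-component grouping by distance, not the paper's planar/pole-like modeling or quality control, and the radius and test cloud are illustrative:

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    def region_grow(points, radius):
        """Group points into connected regions: a point joins a region if it
        lies within `radius` of any point already in that region."""
        tree = cKDTree(points)
        labels = np.full(len(points), -1, dtype=int)
        region = 0
        for seed in range(len(points)):
            if labels[seed] != -1:
                continue
            stack = [seed]
            labels[seed] = region
            while stack:
                idx = stack.pop()
                for nb in tree.query_ball_point(points[idx], radius):
                    if labels[nb] == -1:
                        labels[nb] = region
                        stack.append(nb)
            region += 1
        return labels

    rng = np.random.default_rng(2)
    cloud = np.vstack([rng.normal(0, 0.1, (50, 3)), rng.normal(5, 0.1, (50, 3))])
    print(np.unique(region_grow(cloud, radius=0.5)))  # two regions expected
    ```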

  4. Analysis on Single Point Vulnerabilities of Plant Control System

    International Nuclear Information System (INIS)

    Chi, Moon Goo; Lee, Eun Chan; Bae, Yeon Kyoung

    2011-01-01

    The Plant Control System (PCS) is a system that controls pumps, valves, dampers, etc. in nuclear power plants with an OPR-1000 design. When there is a failure or spurious actuation of the critical components in the PCS, it can result in unexpected plant trips or transients. From this viewpoint, single point vulnerabilities are evaluated in detail using failure mode effect analyses (FMEA) and fault tree analyses (FTA). This evaluation demonstrates that the PCS has many vulnerable components and the analysis results are provided for OPR-1000 plants for reliability improvements that can reduce their vulnerabilities.

  5. Analysis on Single Point Vulnerabilities of Plant Control System

    Energy Technology Data Exchange (ETDEWEB)

    Chi, Moon Goo; Lee, Eun Chan; Bae, Yeon Kyoung [Korea Hydro and Nuclear Power Co., Daejeon (Korea, Republic of)

    2011-08-15

    The Plant Control System (PCS) is a system that controls pumps, valves, dampers, etc. in nuclear power plants with an OPR-1000 design. When there is a failure or spurious actuation of the critical components in the PCS, it can result in unexpected plant trips or transients. From this viewpoint, single point vulnerabilities are evaluated in detail using failure mode effect analyses (FMEA) and fault tree analyses (FTA). This evaluation demonstrates that the PCS has many vulnerable components and the analysis results are provided for OPR-1000 plants for reliability improvements that can reduce their vulnerabilities.

  6. Trend analysis and change point detection of annual and seasonal temperature series in Peninsular Malaysia

    Science.gov (United States)

    Suhaila, Jamaludin; Yusop, Zulkifli

    2017-06-01

    Most of the trend analyses that have been conducted have not considered the existence of a change point in the time series. If change points occur, a trend analysis will not be able to detect an obvious increasing or decreasing trend over certain parts of the series. Furthermore, the lack of discussion on the possible factors that influenced either the decreasing or the increasing trend in the series needs to be addressed in any trend analysis. Hence, this study investigates the trends and change point detection of mean, maximum and minimum temperature series, both annually and seasonally, in Peninsular Malaysia, and determines the possible factors that could contribute to the significant trends. In this study, Pettitt and sequential Mann-Kendall (SQ-MK) tests were used to examine the occurrence of any abrupt climate changes in the independent series. The analyses of the abrupt changes in temperature series suggested that most of the change points in Peninsular Malaysia were detected during the years 1996, 1997 and 1998. These detection points captured by the Pettitt and SQ-MK tests are possibly related to climatic factors, such as El Niño and La Niña events. The findings also showed that the majority of the significant change points in the series are related to significant trends at the stations. Significant increasing trends of annual and seasonal mean, maximum and minimum temperatures in Peninsular Malaysia were found, with a range of 2-5 °C/100 years during the last 32 years. It was observed that the magnitudes of the increasing trend in minimum temperatures were larger than those of the maximum temperatures for most of the studied stations, particularly at the urban stations. These increases are suspected to be linked with the urban heat island effect in addition to the El Niño event.
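    The Pettitt test mentioned above is easy to implement directly from its rank-based statistic, together with the usual approximate significance probability; a minimal sketch (the synthetic temperature series is an assumption):

    ```python
    import numpy as np

    def pettitt(x):
        """Pettitt change-point test: returns the most probable change point
        and the approximate two-sided significance probability."""
        x = np.asarray(x, dtype=float)
        n = len(x)
        # U_t = sum over i <= t, j > t of sign(x_j - x_i)
        s = np.sign(x[None, :] - x[:, None])       # s[i, j] = sign(x_j - x_i)
        u = np.array([s[: t + 1, t + 1 :].sum() for t in range(n - 1)])
        t_hat = int(np.argmax(np.abs(u)))
        k = np.abs(u[t_hat])
        p = 2.0 * np.exp(-6.0 * k**2 / (n**3 + n**2))  # approximate p-value
        return t_hat + 1, min(p, 1.0)

    rng = np.random.default_rng(3)
    series = np.concatenate([rng.normal(26.0, 0.5, 16), rng.normal(27.0, 0.5, 16)])
    print(pettitt(series))   # change point expected near index 16
    ```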

  7. Formal analysis of design process dynamics

    NARCIS (Netherlands)

    Bosse, T.; Jonker, C.M.; Treur, J.

    2010-01-01

    This paper presents a formal analysis of design process dynamics. Such a formal analysis is a prerequisite to come to a formal theory of design and for the development of automated support for the dynamics of design processes. The analysis was geared toward the identification of dynamic design

  8. Formal Analysis of Design Process Dynamics

    NARCIS (Netherlands)

    Bosse, T.; Jonker, C.M.; Treur, J.

    2010-01-01

    This paper presents a formal analysis of design process dynamics. Such a formal analysis is a prerequisite to come to a formal theory of design and for the development of automated support for the dynamics of design processes. The analysis was geared toward the identification of dynamic design

  9. Research on on-line monitoring technology for steel ball's forming process based on load signal analysis method

    Science.gov (United States)

    Li, Ying-jun; Ai, Chang-sheng; Men, Xiu-hua; Zhang, Cheng-liang; Zhang, Qi

    2013-04-01

    This paper presents a novel on-line monitoring technology for assessing forming quality in the steel ball forming process based on a load signal analysis method, in order to reveal the bottom die's load characteristics in the initial cold heading forging of steel balls. A mechanical model of the cold header production process is established and analyzed using the finite element method, and the maximum cold heading force is calculated. The results prove that monitoring the cold heading process via the upsetting force is reasonable and feasible. Forming defects are reflected in three feature points of the bottom die signal: the initial point, the inflection point, and the peak point. A novel PVDF piezoelectric force sensor, simple in construction and convenient to install, is designed; its sensitivity is calculated and its characteristics are analyzed by FEM. The PVDF piezoelectric force sensor is fabricated to acquire the actual load signals in the cold heading process and is calibrated by a special device. The on-line monitoring measuring system is built. The characteristics of the actual signals recognized by the learning and identification algorithm are consistent with the simulation results. Identification of the actual signals shows that the timing differences of all feature points for qualified products do not exceed ±6 ms, and the amplitude differences are less than ±3%. The calibration and application experiments show that the PVDF force sensor has good static and dynamic performance and is suitable for dynamic measurement of the upsetting force. It greatly improves the level of automation and machining precision. With the damage identification method, which depends on the grade of steel, the equipment capacity factor has been improved to 90%.

  10. Impedance analysis of acupuncture points and pathways

    International Nuclear Information System (INIS)

    Teplan, Michal; Kukucka, Marek; Ondrejkovicová, Alena

    2011-01-01

    Investigation of the impedance characteristics of acupuncture points from the acoustic to the radio frequency range is addressed. Discernment and localization of acupuncture points in an initial single-subject study was unsuccessfully attempted by the impedance map technique. Vector impedance analyses determined possible resonant zones in the MHz region.

  11. Simultaneous colour visualizations of multiple ALS point cloud attributes for land cover and vegetation analysis

    Science.gov (United States)

    Zlinszky, András; Schroiff, Anke; Otepka, Johannes; Mandlburger, Gottfried; Pfeifer, Norbert

    2014-05-01

    LIDAR point clouds hold valuable information for land cover and vegetation analysis, not only in the spatial distribution of the points but also in their various attributes. However, LIDAR point clouds are rarely used for visual interpretation, since for most users, the point cloud is difficult to interpret compared to passive optical imagery. Meanwhile, point cloud viewing software is available allowing interactive 3D interpretation, but typically only one attribute at a time. This results in a large number of points with the same colour, crowding the scene and often obscuring detail. We developed a scheme for mapping information from multiple LIDAR point attributes to the Red, Green, and Blue channels of a widely used LIDAR data format, which are otherwise mostly used to add information from imagery to create "photorealistic" point clouds. The possible combinations of parameters are therefore represented in a wide range of colours, but relative differences in individual parameter values of points can be well understood. The visualization was implemented in OPALS software, using a simple and robust batch script, and is viewer independent since the information is stored in the point cloud data file itself. In our case, the following colour channel assignment delivered best results: Echo amplitude in the Red, echo width in the Green and normalized height above a Digital Terrain Model in the Blue channel. With correct parameter scaling (but completely without point classification), points belonging to asphalt and bare soil are dark red, low grassland and crop vegetation are bright red to yellow, shrubs and low trees are green and high trees are blue. Depending on roof material and DTM quality, buildings are shown from red through purple to dark blue. Erroneously high or low points, or points with incorrect amplitude or echo width usually have colours contrasting from terrain or vegetation. This allows efficient visual interpretation of the point cloud in planar
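    A colour channel assignment of this kind amounts to clipping and rescaling each attribute to a byte range before storing it per point; a minimal sketch (the attribute values and scaling limits are illustrative and would need tuning per dataset, as the authors note):

    ```python
    import numpy as np

    def scale_to_byte(values, lo, hi):
        """Linearly rescale an attribute to 0..255, clipping at chosen limits."""
        v = np.clip((np.asarray(values, float) - lo) / (hi - lo), 0.0, 1.0)
        return (v * 255).astype(np.uint8)

    # Illustrative attribute arrays (one value per point) and scaling limits
    amplitude  = np.array([12.0, 80.0, 150.0])
    echo_width = np.array([4.0, 5.5, 9.0])
    height     = np.array([0.1, 1.2, 18.0])   # normalised height above the DTM

    red   = scale_to_byte(amplitude,  0.0, 200.0)
    green = scale_to_byte(echo_width, 3.0, 10.0)
    blue  = scale_to_byte(height,     0.0, 25.0)
    rgb = np.stack([red, green, blue], axis=1)  # stored per point in the file
    print(rgb)
    ```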

  12. Hygienic-sanitary working practices and implementation of a Hazard Analysis and Critical Control Point (HACCP) plan in lobster processing industries

    Directory of Open Access Journals (Sweden)

    Cristina Farias da Fonseca

    2013-03-01

    Full Text Available This study aimed to verify the hygienic-sanitary working practices and to create and implement a Hazard Analysis and Critical Control Point (HACCP) plan in two lobster processing industries in Pernambuco State, Brazil. The industries studied process frozen whole lobsters, frozen whole cooked lobsters, and frozen lobster tails for exportation. The application of the hygienic-sanitary checklist in the industries analyzed achieved conformity rates over 96% for the aspects evaluated. The use of the HACCP plan resulted in the detection of two critical control points (CCPs), the receiving and classification steps, in the processing of frozen lobster and frozen lobster tails, and an additional critical control point (CCP) was detected during the cooking step of the processing of the whole frozen cooked lobster. The proper implementation of the HACCP plan in the lobster processing industries studied proved to be the safest and most cost-effective method to monitor the hazards at each critical control point (CCP).

  13. Multiview 3D sensing and analysis for high quality point cloud reconstruction

    Science.gov (United States)

    Satnik, Andrej; Izquierdo, Ebroul; Orjesek, Richard

    2018-04-01

    Multiview 3D reconstruction techniques enable digital reconstruction of 3D objects from the real world by fusing different viewpoints of the same object into a single 3D representation. This process is by no means trivial and the acquisition of high quality point cloud representations of dynamic 3D objects is still an open problem. In this paper, an approach for high fidelity 3D point cloud generation using low cost 3D sensing hardware is presented. The proposed approach runs in an efficient low-cost hardware setting based on several Kinect v2 scanners connected to a single PC. It performs autocalibration and runs in real-time exploiting an efficient composition of several filtering methods including Radius Outlier Removal (ROR), Weighted Median filter (WM) and Weighted Inter-Frame Average filtering (WIFA). The performance of the proposed method has been demonstrated through efficient acquisition of dense 3D point clouds of moving objects.
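    Of the filters listed, Radius Outlier Removal is the simplest to sketch: a point is kept only if enough neighbours fall within a given radius. The parameters and test cloud below are illustrative, not those of the paper:

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    def radius_outlier_removal(points, radius, min_neighbors):
        """Keep only points that have at least `min_neighbors` other points
        within `radius` (a common point cloud denoising filter)."""
        tree = cKDTree(points)
        counts = np.array([len(tree.query_ball_point(p, radius)) - 1
                           for p in points])   # -1 excludes the point itself
        return points[counts >= min_neighbors]

    rng = np.random.default_rng(9)
    dense = rng.normal(0, 0.05, (500, 3))      # a dense cluster
    sparse = rng.uniform(-2, 2, (20, 3))       # isolated noise points
    cloud = np.vstack([dense, sparse])
    print(len(radius_outlier_removal(cloud, radius=0.1, min_neighbors=5)))
    ```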

  14. Multivariate analysis and extraction of parameters in resistive RAMs using the Quantum Point Contact model

    Science.gov (United States)

    Roldán, J. B.; Miranda, E.; González-Cordero, G.; García-Fernández, P.; Romero-Zaliz, R.; González-Rodelas, P.; Aguilera, A. M.; González, M. B.; Jiménez-Molinos, F.

    2018-01-01

    A multivariate analysis of the parameters that characterize the reset process in Resistive Random Access Memory (RRAM) has been performed. The different correlations obtained can help to shed light on the current components that contribute to the Low Resistance State (LRS) of the technology considered. In addition, a screening method for the Quantum Point Contact (QPC) current component is presented. For this purpose, the second derivative of the current has been obtained using a novel numerical method which allows the QPC model parameters to be determined. Once the procedure is completed, a whole Resistive Switching (RS) series of thousands of curves is studied by means of a genetic algorithm. The extracted QPC parameter distributions are characterized in depth to obtain information about the filamentary pathways associated with the LRS in the low voltage conduction regime.
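    The paper extracts the second derivative with its own numerical method, which is not reproduced here; as a hedged illustration of one common alternative, a Savitzky-Golay filter can return a smoothed second derivative of a noisy I-V curve directly (the synthetic curve and filter settings are assumptions):

    ```python
    import numpy as np
    from scipy.signal import savgol_filter

    # Synthetic I-V curve (illustrative), uniformly sampled in voltage
    v = np.linspace(0.0, 1.0, 501)
    dv = v[1] - v[0]
    i = 1e-4 * np.sinh(5 * v) + 1e-6 * np.random.default_rng(4).normal(size=v.size)

    # Smoothed second derivative d^2 I / dV^2 via a Savitzky-Golay filter
    d2i = savgol_filter(i, window_length=51, polyorder=3, deriv=2, delta=dv)
    print(d2i[:5])
    ```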

  15. Accuracy of multi-point boundary crossing time analysis

    Directory of Open Access Journals (Sweden)

    J. Vogt

    2011-12-01

    Full Text Available Recent multi-spacecraft studies of solar wind discontinuity crossings using the timing (boundary plane triangulation) method gave boundary parameter estimates that are significantly different from those of the well-established single-spacecraft minimum variance analysis (MVA) technique. A large survey of directional discontinuities in Cluster data turned out to be particularly inconsistent in the sense that multi-point timing analyses did not identify any rotational discontinuities (RDs) whereas the MVA results of the individual spacecraft suggested that RDs form the majority of events. To make multi-spacecraft studies of discontinuity crossings more conclusive, the present report addresses the accuracy of the timing approach to boundary parameter estimation. Our error analysis is based on the reciprocal vector formalism and takes into account uncertainties both in crossing times and in the spacecraft positions. A rigorous error estimation scheme is presented for the general case of correlated crossing time errors and arbitrary spacecraft configurations. Crossing time error covariances are determined through cross correlation analyses of the residuals. The principal influence of the spacecraft array geometry on the accuracy of the timing method is illustrated using error formulas for the simplified case of mutually uncorrelated and identical errors at different spacecraft. The full error analysis procedure is demonstrated for a solar wind discontinuity as observed by the Cluster FGM instrument.
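    The core of the timing method is a small linear solve: each relative crossing time constrains the boundary slowness vector m = n/V, where n is the unit normal and V the boundary speed. A minimal sketch with hypothetical spacecraft positions and times follows (the paper's error analysis is not reproduced):

    ```python
    import numpy as np

    # Spacecraft positions (km) and crossing times (s); values are illustrative
    r = np.array([[0.0, 0.0, 0.0],
                  [100.0, 0.0, 0.0],
                  [0.0, 100.0, 0.0],
                  [0.0, 0.0, 100.0]])
    t = np.array([0.0, 1.0, 0.5, 0.2])

    # Solve (r_i - r_0) . m = (t_i - t_0) with m = n / V
    A = r[1:] - r[0]
    b = t[1:] - t[0]
    m, *_ = np.linalg.lstsq(A, b, rcond=None)
    V = 1.0 / np.linalg.norm(m)       # boundary speed along its normal (km/s)
    n = m * V                         # unit normal vector
    print(n, V)
    ```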

  16. Thermal analysis on x-ray tube for exhaust process

    Science.gov (United States)

    Kumar, Rakesh; Rao Ratnala, Srinivas; Veeresh Kumar, G. B.; Shivakumar Gouda, P. S.

    2018-02-01

    It is of great importance in the use of X-rays for medical purposes that the dose given to both the patient and the operator is carefully controlled. There are many types of X-ray tubes used for different applications, based on their capacity and power supply. In the present work, the maxi ray 165 tube is analysed for the thermal exhaust process with ±5% accuracy. The exhaust process is done to remove all the air particles and to degasify the insert under high vacuum at 2e-05 Torr. The tube glass is made of Pyrex; 95% tungsten and 5% rhenium is used as the target material, for which the melting point is 3350 °C. Various materials are used for various parts; during the operation of the X-ray tube, waste gases are released due to the high temperature, which in turn disturb the flow of electrons. Thus, before the X-ray tube is used for practical applications, it has to undergo the exhaust process. Initially we build the MX 165 model to carry out the thermal analysis, and then we simulate the bearing temperature profiles with the FE model to match the test results within ±5% accuracy. Finally, the critical protocols required for manufacturing processes such as MF heating, E-beam, seasoning and FT are implemented.

  17. Improved DEA Cross Efficiency Evaluation Method Based on Ideal and Anti-Ideal Points

    Directory of Open Access Journals (Sweden)

    Qiang Hou

    2018-01-01

    Full Text Available A new model is introduced in the process of evaluating the efficiency value of decision making units (DMUs) through the data envelopment analysis (DEA) method. Two virtual DMUs, called the ideal point DMU and the anti-ideal point DMU, are combined to form a comprehensive model based on the DEA method. The ideal point DMU adopts a self-assessment system according to the efficiency concept; the anti-ideal point DMU adopts an other-assessment system according to the fairness concept. The two distinctive ideal point models are introduced into the DEA method and combined using a variance ratio. From the new model, a reasonable result can be obtained. Numerical examples are provided to illustrate the newly constructed model and to certify its rationality through comparison with the traditional DEA model.
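    The ideal/anti-ideal cross-efficiency model of the paper builds on the basic DEA linear program; as background, the sketch below solves the standard input-oriented CCR multiplier form for one DMU (the input/output data are hypothetical, and the paper's combined model is not reproduced):

    ```python
    import numpy as np
    from scipy.optimize import linprog

    def ccr_efficiency(X, Y, j0):
        """Input-oriented CCR efficiency of DMU j0 (multiplier form):
        maximise u.y0 subject to v.x0 = 1 and u.Y_j - v.X_j <= 0 for all j."""
        n, m = X.shape            # n DMUs, m inputs
        _, s = Y.shape            # s outputs
        c = np.concatenate([-Y[j0], np.zeros(m)])          # maximise u.y0
        A_ub = np.hstack([Y, -X])                          # u.Y_j - v.X_j <= 0
        b_ub = np.zeros(n)
        A_eq = np.concatenate([np.zeros(s), X[j0]])[None]  # v.x0 = 1
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                      bounds=(0, None))
        return -res.fun

    X = np.array([[2.0, 3.0], [4.0, 2.0], [3.0, 5.0]])   # inputs per DMU
    Y = np.array([[1.0], [2.0], [1.5]])                  # outputs per DMU
    print([round(ccr_efficiency(X, Y, j), 3) for j in range(3)])
    ```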

  18. Dosimetric analysis at ICRU reference points in HDR-brachytherapy of cervical carcinoma.

    Science.gov (United States)

    Eich, H T; Haverkamp, U; Micke, O; Prott, F J; Müller, R P

    2000-01-01

    In vivo dosimetry in bladder and rectum as well as determining doses at the suggested reference points following ICRU report 38 contribute to quality assurance in HDR-brachytherapy of cervical carcinoma, especially to minimize side effects. In order to gain information regarding the radiation exposure at ICRU reference points in the rectum, bladder, ureter and regional lymph nodes, these points were calculated (digitised) by means of orthogonal radiographs of 11 applications in patients with cervical carcinoma who received primary radiotherapy. In addition, the dose at the ICRU rectum reference point was compared to the results of in vivo measurements in the rectum. The in vivo measurements were by a factor of 1.5 below the doses determined for the ICRU rectum reference point (4.05 +/- 0.68 Gy versus 6.11 +/- 1.63 Gy). Reasons for this were: calibration errors, non-orthogonal radiographs, movement of applicator and probe in the time span between X-ray and application, and missing contact between the probe and the anterior rectal wall. The standard deviation of calculations at ICRU reference points was on average +/- 30%. Possible reasons for the relatively large standard deviation were difficulties in defining the points, identifying them on radiographs, and the different locations of the applicators. Although 3D CT, US or MR based treatment planning using dose volume histogram analysis is more and more established, this simple procedure of marking and digitising the ICRU reference points lengthened treatment planning by only 5 to 10 minutes. The advantages of in vivo dosimetry are easy practicability and the possibility to determine rectum doses during radiation. The advantages of computer-aided planning at ICRU reference points are that calculations are available before radiation and that they can still be taken into account for treatment planning. Both methods should be applied in HDR-brachytherapy of cervical carcinoma.

  19. Ergodic Capacity Analysis of Free-Space Optical Links with Nonzero Boresight Pointing Errors

    KAUST Repository

    Ansari, Imran Shafique; Alouini, Mohamed-Slim; Cheng, Julian

    2015-01-01

    A unified capacity analysis of a free-space optical (FSO) link that accounts for nonzero boresight pointing errors and both types of detection techniques (i.e. intensity modulation/direct detection as well as heterodyne detection) is addressed.

  20. Multivariate Analysis for the Processing of Signals

    Directory of Open Access Journals (Sweden)

    Beattie J.R.

    2014-01-01

    Full Text Available Real-world experiments are becoming increasingly more complex, needing techniques capable of tracking this complexity. Signal based measurements are often used to capture this complexity, where a signal is a record of a sample's response to a parameter (e.g. time, displacement, voltage, wavelength) that is varied over a range of values. In signals, the responses at each value of the varied parameter are related to each other, depending on the composition or state of the sample being measured. Since signals contain multiple information points, they have rich information content but are generally complex to comprehend. Multivariate Analysis (MA) has profoundly transformed their analysis by allowing gross simplification of the tangled web of variation. In addition, MA has also provided the advantage of being much more robust to the influence of noise than univariate methods of analysis. In recent years, there has been a growing awareness that the nature of the multivariate methods allows exploitation of its benefits for purposes other than data analysis, such as pre-processing of signals with the aim of eliminating irrelevant variations prior to analysis of the signal of interest. It has been shown that exploiting multivariate data reduction in an appropriate way can allow high fidelity denoising (removal of irreproducible non-signals, consistent and reproducible noise-insensitive correction of baseline distortions (removal of reproducible non-signals, accurate elimination of interfering signals (removal of reproducible but unwanted signals and the standardisation of signal amplitude fluctuations. At present, the field is relatively small but the possibilities for much wider application are considerable. Where signal properties are suitable for MA (such as the signal being stationary along the x-axis, these signal based corrections have the potential to be highly reproducible, and highly adaptable and are applicable in situations where the data is noisy or

  1. Spectrum of classes of point emitters of electromagnetic wave fields.

    Science.gov (United States)

    Castañeda, Román

    2016-09-01

    The spectrum of classes of point emitters has been introduced as a numerical tool suitable for the design, analysis, and synthesis of non-paraxial optical fields in arbitrary states of spatial coherence. In this paper, the polarization state of planar electromagnetic wave fields is included in the spectrum of classes, thus increasing its modeling capabilities. In this context, optical processing is realized as a filtering on the spectrum of classes of point emitters, performed by the complex degree of spatial coherence and the two-point correlation of polarization, which could be implemented dynamically by using programmable optical devices.

  2. PowerPivot for Business Intelligence Using Excel and SharePoint

    CERN Document Server

    Ralston, Barry

    2011-01-01

    PowerPivot comprises a set of technologies for easy access to data mining and business intelligence analysis from Microsoft Excel and SharePoint. Power users and developers alike can create sophisticated, online analytic processing (OLAP) solutions using PowerPivot for Excel, and then share those solutions with other users via PowerPivot for SharePoint. Data can be pulled in from any of the leading database platforms, as well as from spreadsheets and flat files. PowerPivot for Business Intelligence Using Excel and SharePoint is your key to mastering PowerPivot. The book takes a scenario-based

  3. A note on the statistical analysis of point judgment matrices

    Directory of Open Access Journals (Sweden)

    MG Kabera

    2013-06-01

    Full Text Available The Analytic Hierarchy Process is a multicriteria decision making technique developed by Saaty in the 1970s. The core of the approach is the pairwise comparison of objects according to a single criterion using a 9-point ratio scale and the estimation of weights associated with these objects based on the resultant judgment matrix. In the present paper some statistical approaches to extracting the weights of objects from a judgment matrix are reviewed and new ideas which are rooted in the traditional method of paired comparisons are introduced.
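    The core of Saaty's approach referenced here is the extraction of weights as the principal eigenvector of the pairwise judgment matrix; a minimal sketch including the consistency ratio follows (the judgment matrix is illustrative):

    ```python
    import numpy as np

    def ahp_weights(A):
        """Principal-eigenvector weights from a positive reciprocal
        pairwise comparison matrix, plus Saaty's consistency ratio."""
        vals, vecs = np.linalg.eig(A)
        k = np.argmax(vals.real)
        w = np.abs(vecs[:, k].real)
        w /= w.sum()
        n = A.shape[0]
        ci = (vals[k].real - n) / (n - 1)          # consistency index
        ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n)    # Saaty's random index
        return w, (ci / ri if ri else None)

    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])
    w, cr = ahp_weights(A)
    print(w, cr)   # CR < 0.1 is conventionally regarded as acceptable
    ```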

  4. Sensing with Superconducting Point Contacts

    Directory of Open Access Journals (Sweden)

    Argo Nurbawono

    2012-05-01

    Full Text Available Superconducting point contacts have been used for measuring magnetic polarizations, identifying magnetic impurities, electronic structures, and even the vibrational modes of small molecules. Due to the intrinsically small energy scale in the subgap structures of the supercurrent, determined by the size of the superconducting energy gap, superconductors provide ultrahigh sensitivities for high resolution spectroscopies. The so-called Andreev reflection process between a normal metal and a superconductor carries complex and rich information which can be utilized as a powerful sensor when fully exploited. In this review, we discuss recent experimental and theoretical developments in supercurrent transport through superconducting point contacts and their relevance to sensing applications, and we highlight current issues and potentials. A true utilization of the method based on Andreev reflection analysis opens up possibilities for a new class of ultrasensitive sensors.

  5. Bayesian Estimation Of Shift Point In Poisson Model Under Asymmetric Loss Functions

    Directory of Open Access Journals (Sweden)

    uma srivastava

    2012-01-01

    Full Text Available The paper deals with estimating the shift point which occurs in a sequence of independent observations from a Poisson model in statistical process control. The shift occurs in the sequence when m observations (life data) have been made. Bayes estimators of the shift point m and of the process means before and after the shift are derived for symmetric and asymmetric loss functions under informative and non-informative priors. A sensitivity analysis of the Bayes estimators is carried out by simulation and numerical comparison using R. The results show the effectiveness of the estimators for a shift in a sequence of Poisson observations.
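    Under conjugate Gamma priors on the Poisson rates before and after the shift, the posterior over the shift point m has a closed form; the sketch below (with assumed hyperparameters a, b and a uniform prior on m, not necessarily the paper's choices) illustrates the idea:

    ```python
    import numpy as np
    from scipy.special import gammaln

    def shift_point_posterior(x, a=1.0, b=1.0):
        """Posterior over the shift point m for Poisson counts with independent
        Gamma(a, b) priors on the rates before and after the shift (uniform
        prior on m). Terms constant in m are dropped."""
        x = np.asarray(x)
        n = len(x)

        def log_marg(seg):
            s, k = seg.sum(), len(seg)
            return (a * np.log(b) - gammaln(a)
                    + gammaln(a + s) - (a + s) * np.log(b + k))

        logp = np.array([log_marg(x[:m]) + log_marg(x[m:]) for m in range(1, n)])
        p = np.exp(logp - logp.max())
        return p / p.sum()   # posterior P(m | x) for m = 1..n-1

    rng = np.random.default_rng(5)
    counts = np.concatenate([rng.poisson(2.0, 20), rng.poisson(6.0, 20)])
    post = shift_point_posterior(counts)
    print(np.argmax(post) + 1)   # posterior mode, expected near 20
    ```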

  6. Scientific information processing procedures

    Directory of Open Access Journals (Sweden)

    García, Maylin

    2013-07-01

    Full Text Available The paper systematizes several theoretical viewpoints on the skill of scientific information processing and decomposes it into sub-skills. Several methods, such as analysis, synthesis, induction, deduction and document analysis, were used to build up a theoretical framework. Interviews and a survey of professionals in training, together with a case study, were carried out to evaluate the results. All professionals in the sample improved their performance in scientific information processing.

  7. Detection of bursts in extracellular spike trains using hidden semi-Markov point process models.

    Science.gov (United States)

    Tokdar, Surya; Xi, Peiyi; Kelly, Ryan C; Kass, Robert E

    2010-08-01

    Neurons in vitro and in vivo have epochs of bursting or "up state" activity during which firing rates are dramatically elevated. Various methods of detecting bursts in extracellular spike trains have appeared in the literature, the most widely used apparently being Poisson Surprise (PS). A natural description of the phenomenon assumes (1) there are two hidden states, which we label "burst" and "non-burst," (2) the neuron evolves stochastically, switching at random between these two states, and (3) within each state the spike train follows a time-homogeneous point process. If in (2) the transitions from non-burst to burst and burst to non-burst states are memoryless, this becomes a hidden Markov model (HMM). For HMMs, the state transitions follow exponential distributions, and are highly irregular. Because observed bursting may in some cases be fairly regular, exhibiting inter-burst intervals with small variation, we relaxed this assumption. When more general probability distributions are used to describe the state transitions the two-state point process model becomes a hidden semi-Markov model (HSMM). We developed an efficient Bayesian computational scheme to fit HSMMs to spike train data. Numerical simulations indicate the method can perform well, sometimes yielding very different results than those based on PS.
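    For comparison, the Poisson Surprise statistic that the paper benchmarks against can be computed directly: it is the negative log-probability of observing at least that many spikes in the burst window under a homogeneous Poisson baseline. The spike times below are illustrative, and the HSMM fitting itself is not reproduced:

    ```python
    import numpy as np
    from scipy import stats

    def poisson_surprise(spike_times, i, j, rate):
        """Surprise S = -log P(at least that many spikes in the burst window
        under a homogeneous Poisson process with the baseline `rate`)."""
        n = j - i + 1                             # spikes in the candidate burst
        T = spike_times[j] - spike_times[i]       # burst duration
        # P(N >= n) with N ~ Poisson(rate * T); sf(n - 1) = P(N > n - 1)
        p = stats.poisson.sf(n - 1, rate * T)
        return -np.log(p)

    spikes = np.array([0.00, 0.41, 0.80, 0.83, 0.86, 0.88, 0.91, 1.60])
    baseline = len(spikes) / (spikes[-1] - spikes[0])
    print(poisson_surprise(spikes, 2, 6, baseline))  # dense run scores high
    ```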

  8. Phase-equilibria for design of coal-gasification processes: dew points of hot gases containing condensible tars. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Prausnitz, J.M.

    1980-05-01

    This research is concerned with the fundamental physical chemistry and thermodynamics of condensation of tars (dew points) from the vapor phase at advanced temperatures and pressures. Fundamental quantitative understanding of dew points is important for rational design of heat exchangers to recover sensible heat from hot, tar-containing gases that are produced in coal gasification. This report includes essentially six contributions toward establishing the desired understanding: (1) Characterization of Coal Tars for Dew-Point Calculations; (2) Fugacity Coefficients for Dew-Point Calculations in Coal-Gasification Process Design; (3) Vapor Pressures of High-Molecular-Weight Hydrocarbons; (4) Estimation of Vapor Pressures of High-Boiling Fractions in Liquefied Fossil Fuels Containing Heteroatoms Nitrogen or Sulfur; and (5) Vapor Pressures of Heavy Liquid Hydrocarbons by a Group-Contribution Method.

  9. Topological data analysis of contagion maps for examining spreading processes on networks

    KAUST Repository

    Taylor, Dane

    2015-07-21

    Social and biological contagions are influenced by the spatial embeddedness of networks. Historically, many epidemics spread as a wave across part of the Earth's surface; however, in modern contagions long-range edges (for example, due to airline transportation or communication media) allow clusters of a contagion to appear in distant locations. Here we study the spread of contagions on networks through a methodology grounded in topological data analysis and nonlinear dimension reduction. We construct 'contagion maps' that use multiple contagions on a network to map the nodes as a point cloud. By analysing the topology, geometry and dimensionality of manifold structure in such point clouds, we reveal insights to aid in the modelling, forecast and control of spreading processes. Our approach highlights contagion maps also as a viable tool for inferring low-dimensional structure in networks.

  10. Topological data analysis of contagion maps for examining spreading processes on networks.

    Science.gov (United States)

    Taylor, Dane; Klimm, Florian; Harrington, Heather A; Kramár, Miroslav; Mischaikow, Konstantin; Porter, Mason A; Mucha, Peter J

    2015-07-21

    Social and biological contagions are influenced by the spatial embeddedness of networks. Historically, many epidemics spread as a wave across part of the Earth's surface; however, in modern contagions long-range edges (for example, due to airline transportation or communication media) allow clusters of a contagion to appear in distant locations. Here we study the spread of contagions on networks through a methodology grounded in topological data analysis and nonlinear dimension reduction. We construct 'contagion maps' that use multiple contagions on a network to map the nodes as a point cloud. By analysing the topology, geometry and dimensionality of manifold structure in such point clouds, we reveal insights to aid in the modelling, forecast and control of spreading processes. Our approach highlights contagion maps also as a viable tool for inferring low-dimensional structure in networks.

  11. Topological data analysis of contagion maps for examining spreading processes on networks

    Science.gov (United States)

    Taylor, Dane; Klimm, Florian; Harrington, Heather A.; Kramár, Miroslav; Mischaikow, Konstantin; Porter, Mason A.; Mucha, Peter J.

    2015-07-01

    Social and biological contagions are influenced by the spatial embeddedness of networks. Historically, many epidemics spread as a wave across part of the Earth's surface; however, in modern contagions long-range edges (for example, due to airline transportation or communication media) allow clusters of a contagion to appear in distant locations. Here we study the spread of contagions on networks through a methodology grounded in topological data analysis and nonlinear dimension reduction. We construct 'contagion maps' that use multiple contagions on a network to map the nodes as a point cloud. By analysing the topology, geometry and dimensionality of manifold structure in such point clouds, we reveal insights to aid in the modelling, forecast and control of spreading processes. Our approach highlights contagion maps also as a viable tool for inferring low-dimensional structure in networks.

  12. Topological data analysis of contagion maps for examining spreading processes on networks

    KAUST Repository

    Taylor, Dane; Klimm, Florian; Harrington, Heather A.; Kramá r, Miroslav; Mischaikow, Konstantin; Porter, Mason A.; Mucha, Peter J.

    2015-01-01

    Social and biological contagions are influenced by the spatial embeddedness of networks. Historically, many epidemics spread as a wave across part of the Earth's surface; however, in modern contagions long-range edges - for example, due to airline transportation or communication media - allow clusters of a contagion to appear in distant locations. Here we study the spread of contagions on networks through a methodology grounded in topological data analysis and nonlinear dimension reduction. We construct 'contagion maps' that use multiple contagions on a network to map the nodes as a point cloud. By analysing the topology, geometry and dimensionality of manifold structure in such point clouds, we reveal insights to aid in the modelling, forecast and control of spreading processes. Our approach highlights contagion maps also as a viable tool for inferring low-dimensional structure in networks.

  13. Benefits analysis of Soft Open Points for electrical distribution network operation

    International Nuclear Information System (INIS)

    Cao, Wanyu; Wu, Jianzhong; Jenkins, Nick; Wang, Chengshan; Green, Timothy

    2016-01-01

    Highlights: • An analysis framework was developed to quantify the operational benefits. • The framework considers both network reconfiguration and SOP control. • Benefits were analyzed through both quantitative and sensitivity analysis. - Abstract: Soft Open Points (SOPs) are power electronic devices installed in place of normally-open points in electrical power distribution networks. They are able to provide active power flow control, reactive power compensation and voltage regulation under normal network operating conditions, as well as fast fault isolation and supply restoration under abnormal conditions. A steady state analysis framework was developed to quantify the operational benefits of a distribution network with SOPs under normal network operating conditions. A generic power injection model was developed and used to determine the optimal SOP operation using an improved Powell’s Direct Set method. Physical limits and power losses of the SOP device (based on back to back voltage-source converters) were considered in the model. Distribution network reconfiguration algorithms, with and without SOPs, were developed and used to identify the benefits of using SOPs. Test results on a 33-bus distribution network compared the benefits of using SOPs, traditional network reconfiguration and the combination of both. The results showed that using only one SOP achieved a similar improvement in network operation compared to the case of using network reconfiguration with all branches equipped with remotely controlled switches. A combination of SOP control and network reconfiguration provided the optimal network operation.

  14. Information resources and the correlation of response patterns between biological end points

    Energy Technology Data Exchange (ETDEWEB)

    Malling, H.V. [National Institute of Environmental Health Sciences, Research Triangle Park, NC (United States); Wassom, J.S. [Oak Ridge National Laboratory, TN (United States)

    1990-12-31

    This paper focuses on the analysis of information for mutagenesis, a biological end point that is important in the overall process of assessing possible adverse health effects from chemical exposure. 17 refs.

  15. Teacher training of secondary teachers, oriented from the point of view of practice

    Science.gov (United States)

    Hai, Tuong Duy; Huong, Nguyen Thanh

    2018-01-01

    The article presents a point of view on teacher training based on an analysis of teaching and learning practices in the subject disciplines of high school. Based on the results of an analysis of the teaching faculty and of the students' learning process in these disciplines, it offers reference points for the ongoing training of secondary teachers so that they can adapt to educational practice.

  16. Relatively Inexact Proximal Point Algorithm and Linear Convergence Analysis

    Directory of Open Access Journals (Sweden)

    Ram U. Verma

    2009-01-01

    Full Text Available Based on a notion of relatively maximal (m)-relaxed monotonicity, the approximation solvability of a general class of inclusion problems is discussed, while generalizing Rockafellar's theorem (1976) on linear convergence using the proximal point algorithm in a real Hilbert space setting. The convergence analysis based on this new model is simpler and more compact than that of the celebrated technique of Rockafellar, in which the Lipschitz continuity at 0 of the inverse of the set-valued mapping is applied. Furthermore, it can be used to generalize the Yosida approximation, which, in turn, can be applied to first-order evolution equations as well as evolution inclusions.

  17. Tube Bulge Process : Theoretical Analysis and Finite Element Simulations

    International Nuclear Information System (INIS)

    Velasco, Raphael; Boudeau, Nathalie

    2007-01-01

    This paper is focused on the determination of the mechanical characteristics of tubular materials using the tube bulge process. A comparative study is made between two different models: a theoretical model and finite element analysis. The theoretical model is fully developed, based first on a geometrical analysis of the tube profile during bulging, which is assumed to deform in arcs of circles. Strain and stress analyses complete the theoretical model, which allows the tube thickness and the state of stress to be evaluated at any point of the free bulge region. Free bulging of a 304L stainless steel is simulated using Ls-Dyna 970. To validate the FE simulation approach, a comparison between the theoretical and finite element models is carried out on several parameters such as: thickness variation at the pole of the free bulge region with bulge height, tube thickness variation with the z axial coordinate, and von Mises stress variation with plastic strain. Finally, the influence of deviations in the geometrical parameters on the flow stress curve is observed using the analytical model: deviations of the tube outer diameter, its initial thickness and the bulge height measurement are taken into account to obtain a resulting error on plastic strain and von Mises stress.

  18. Datum Feature Extraction and Deformation Analysis Method Based on Normal Vector of Point Cloud

    Science.gov (United States)

    Sun, W.; Wang, J.; Jin, F.; Liang, Z.; Yang, Y.

    2018-04-01

    In order to solve the problem of lacking an applicable analysis method when applying three-dimensional laser scanning technology to the field of deformation monitoring, an efficient method for extracting datum features and analysing deformation based on the normal vectors of a point cloud is proposed. Firstly, a kd-tree is used to establish the topological relation. Datum points are detected by tracking the point cloud normal vectors, which are determined from the normal vectors of local planes. Then, cubic B-spline curve fitting is performed on the datum points. Finally, the datum elevation and the inclination angle of the radial point are calculated according to the fitted curve, and the deformation information is analyzed. The proposed approach was verified on a real large-scale tank data set captured with a terrestrial laser scanner in a chemical plant. The results show that the method can obtain the entire information of the monitored object quickly and comprehensively, and accurately reflect the deformation of the datum features.
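    Estimating the normal vector of a point from the plane best fitting its neighbourhood is typically done with a local PCA/SVD; a minimal sketch follows (the neighbourhood size is illustrative, and the paper's datum tracking and B-spline fitting are not reproduced):

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    def estimate_normals(points, k=10):
        """Normal vector per point: the direction of least variance of the
        k nearest neighbours, i.e. the smallest singular vector of the
        centred neighbourhood."""
        tree = cKDTree(points)
        normals = np.empty_like(points)
        for i, p in enumerate(points):
            _, idx = tree.query(p, k=k)
            nbrs = points[idx] - points[idx].mean(axis=0)
            _, _, vt = np.linalg.svd(nbrs, full_matrices=False)
            normals[i] = vt[-1]        # direction of least variance
        return normals

    # Points on the plane z = 0 should yield normals close to (0, 0, +/-1)
    rng = np.random.default_rng(6)
    plane = np.column_stack([rng.uniform(0, 1, (100, 2)), np.zeros(100)])
    print(estimate_normals(plane)[:3])
    ```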

  19. One-point fluctuation analysis of the high-energy neutrino sky

    Energy Technology Data Exchange (ETDEWEB)

    Feyereisen, Michael R.; Ando, Shin' ichiro [GRAPPA Institute, University of Amsterdam, Science Park 904, 1098 XH Amsterdam (Netherlands); Tamborra, Irene, E-mail: m.r.feyereisen@uva.nl, E-mail: tamborra@nbi.ku.dk, E-mail: s.ando@uva.nl [Niels Bohr International Academy, Niels Bohr Institute, Blegdamsvej 17, 2100 Copenhagen (Denmark)

    2017-03-01

    We perform the first one-point fluctuation analysis of the high-energy neutrino sky. This method reveals itself to be especially suited to contemporary neutrino data, as it allows the study of the properties of the astrophysical components of the high-energy flux detected by the IceCube telescope, even with low statistics and in the absence of point source detection. Besides the veto-passing atmospheric foregrounds, we adopt a simple model of the high-energy neutrino background by assuming two main extra-galactic components: star-forming galaxies and blazars. By leveraging multi-wavelength data from Herschel and Fermi, we predict the spectral and anisotropic probability distributions for their expected neutrino counts in IceCube. We find that star-forming galaxies are likely to remain a diffuse background due to the poor angular resolution of IceCube, and we determine an upper limit on the number of shower events that can reasonably be associated to blazars. We also find that upper limits on the contribution of blazars to the measured flux are unfavourably affected by the skewness of the blazar flux distribution. One-point event clustering and likelihood analyses of the IceCube HESE data suggest that this method has the potential to dramatically improve over more conventional model-based analyses, especially for the next generation of neutrino telescopes.

  20. Single cell analysis of G1 checkpoints: the relationship between the restriction point and phosphorylation of pRb

    International Nuclear Information System (INIS)

    Martinsson, Hanna-Stina; Starborg, Maria; Erlandsson, Fredrik; Zetterberg, Anders

    2005-01-01

    Single cell analysis allows high resolution investigation of temporal relationships between transition events in G1. It has been suggested that phosphorylation of the retinoblastoma tumor suppressor protein (pRb) is the molecular mechanism behind passage through the restriction point (R). We performed a detailed single cell study of the temporal relationship between R and pRb phosphorylation in human fibroblasts using time lapse video-microscopy combined with immunocytochemistry. Four principally different criteria for pRb phosphorylation were used, namely (i) phosphorylation of residues Ser 795 and Ser 780, (ii) degree of pRb-association with the nuclear structure, a property that is closely related with pRb phosphorylation status, (iii) release of the transcription factor E2F-1 from pRb, and (iv) accumulation of cyclin E, which is dependent on phosphorylation of pRb. The analyses of individual cells revealed that passage through R preceded phosphorylation of pRb, which occurs in a gradually increasing proportion of cells in late G1. Our data clearly suggest that pRb phosphorylation is not the molecular mechanism behind the passage through R. The restriction point and phosphorylation of pRb thus seem to represent two separate checkpoints in G1.

  1. Spatially explicit spectral analysis of point clouds and geospatial data

    Science.gov (United States)

    Buscombe, Daniel D.

    2015-01-01

    The increasing use of spatially explicit analyses of high-resolution spatially distributed data (imagery and point clouds) for the purposes of characterising spatial heterogeneity in geophysical phenomena necessitates the development of custom analytical and computational tools. In recent years, such analyses have become the basis of, for example, automated texture characterisation and segmentation, roughness and grain size calculation, and feature detection and classification, from a variety of data types. In this work, much use has been made of statistical descriptors of localised spatial variations in amplitude variance (roughness), however the horizontal scale (wavelength) and spacing of roughness elements is rarely considered. This is despite the fact that the ratio of characteristic vertical to horizontal scales is not constant and can yield important information about physical scaling relationships. Spectral analysis is a hitherto under-utilised but powerful means to acquire statistical information about relevant amplitude and wavelength scales, simultaneously and with computational efficiency. Further, quantifying spatially distributed data in the frequency domain lends itself to the development of stochastic models for probing the underlying mechanisms which govern the spatial distribution of geological and geophysical phenomena. The software package PySESA (Python program for Spatially Explicit Spectral Analysis) has been developed for generic analyses of spatially distributed data in both the spatial and frequency domains. Developed predominantly in Python, it accesses libraries written in Cython and C++ for efficiency. It is open source and modular, therefore readily incorporated into, and combined with, other data analysis tools and frameworks with particular utility for supporting research in the fields of geomorphology, geophysics, hydrography, photogrammetry and remote sensing. The analytical and computational structure of the toolbox is
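    PySESA's own API is not shown here; as a generic illustration of the kind of spectral analysis described, the sketch below estimates the power spectral density of a synthetic one-dimensional roughness profile with Welch's method and recovers its dominant wavelengths (all values are assumptions for the example):

    ```python
    import numpy as np
    from scipy.signal import welch

    # Synthetic elevation profile: two roughness wavelengths plus noise
    dx = 0.05                                  # point spacing (m)
    x = np.arange(0, 100, dx)
    z = 0.3 * np.sin(2 * np.pi * x / 5.0) \
      + 0.1 * np.sin(2 * np.pi * x / 0.8) \
      + 0.02 * np.random.default_rng(7).normal(size=x.size)

    # Power spectral density over spatial frequency (cycles per metre)
    f, psd = welch(z, fs=1.0 / dx, nperseg=1024)
    peaks = f[np.argsort(psd)[-2:]]            # dominant roughness frequencies
    print(1.0 / peaks)                         # ~5 m and ~0.8 m wavelengths
    ```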

  2. Spatially explicit spectral analysis of point clouds and geospatial data

    Science.gov (United States)

    Buscombe, Daniel

    2016-01-01

    The increasing use of spatially explicit analyses of high-resolution spatially distributed data (imagery and point clouds) for the purposes of characterising spatial heterogeneity in geophysical phenomena necessitates the development of custom analytical and computational tools. In recent years, such analyses have become the basis of, for example, automated texture characterisation and segmentation, roughness and grain size calculation, and feature detection and classification, from a variety of data types. In this work, much use has been made of statistical descriptors of localised spatial variations in amplitude variance (roughness), however the horizontal scale (wavelength) and spacing of roughness elements is rarely considered. This is despite the fact that the ratio of characteristic vertical to horizontal scales is not constant and can yield important information about physical scaling relationships. Spectral analysis is a hitherto under-utilised but powerful means to acquire statistical information about relevant amplitude and wavelength scales, simultaneously and with computational efficiency. Further, quantifying spatially distributed data in the frequency domain lends itself to the development of stochastic models for probing the underlying mechanisms which govern the spatial distribution of geological and geophysical phenomena. The software package PySESA (Python program for Spatially Explicit Spectral Analysis) has been developed for generic analyses of spatially distributed data in both the spatial and frequency domains. Developed predominantly in Python, it accesses libraries written in Cython and C++ for efficiency. It is open source and modular, therefore readily incorporated into, and combined with, other data analysis tools and frameworks with particular utility for supporting research in the fields of geomorphology, geophysics, hydrography, photogrammetry and remote sensing. The analytical and computational structure of the toolbox is described

  3. StreakDet data processing and analysis pipeline for space debris optical observations

    Science.gov (United States)

    Virtanen, Jenni; Flohrer, Tim; Muinonen, Karri; Granvik, Mikael; Torppa, Johanna; Poikonen, Jonne; Lehti, Jussi; Santti, Tero; Komulainen, Tuomo; Naranen, Jyri

    We describe a novel data processing and analysis pipeline for optical observations of space debris. The monitoring of space object populations requires reliable acquisition of observational data, to support the development and validation of space debris environment models, and the build-up and maintenance of a catalogue of orbital elements. In addition, data is needed for the assessment of conjunction events and for the support of contingency situations or launches. The currently available, mature image processing algorithms for detection and astrometric reduction of optical data cover objects that cross the sensor field-of-view comparatively slowly, and within a rather narrow, predefined range of angular velocities. By applying specific tracking techniques, the objects appear point-like or as short trails in the exposures. However, the general survey scenario is always a “track before detect” problem, resulting in streaks, i.e., object trails of arbitrary lengths, in the images. The scope of the ESA-funded StreakDet (Streak detection and astrometric reduction) project is to investigate solutions for detecting and reducing streaks from optical images, particularly in the low signal-to-noise ratio (SNR) domain, where algorithms are not readily available yet. For long streaks, the challenge is to extract precise position information and related registered epochs with sufficient precision. Although some considerations for low-SNR processing of streak-like features are available in the current image processing and computer vision literature, there is a need to discuss and compare these approaches for space debris analysis, in order to develop and evaluate prototype implementations. In the StreakDet project, we develop algorithms applicable to single images (as compared to consecutive frames of the same field) obtained with any observing scenario, including space-based surveys and both low- and high-altitude populations. The proposed processing pipeline starts from the

  4. Throughput analysis of point-to-multi-point hybrid FSO/RF network

    KAUST Repository

    Rakia, Tamer

    2017-07-31

    This paper presents and analyzes a point-to-multi-point (P2MP) network that uses a number of free-space optical (FSO) links for data transmission from the central node to the different remote nodes. A common backup radio-frequency (RF) link is used by the central node for data transmission to any remote node in case of the failure of any one of the FSO links. We develop a cross-layer Markov chain model to study the throughput from the central node to a tagged remote node. Numerical examples are presented to compare the performance of the proposed P2MP hybrid FSO/RF network with that of a P2MP FSO-only network, and show that the P2MP hybrid FSO/RF network achieves considerable performance improvement over the P2MP FSO-only network.
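
    As a rough illustration of how a Markov chain yields a throughput estimate in this kind of model (this is not the authors' cross-layer formulation), the sketch below computes the stationary distribution of a small hypothetical availability chain and the resulting mean rate to a tagged node.

```python
# Toy Markov-chain throughput computation. States and rates are invented:
# 0 = FSO link up, 1 = FSO down but backup RF serving the tagged node,
# 2 = FSO down and the RF link busy with another node.
import numpy as np

P = np.array([[0.95, 0.04, 0.01],   # hypothetical one-step transition matrix
              [0.60, 0.30, 0.10],
              [0.50, 0.20, 0.30]])
rate = np.array([10.0, 1.0, 0.0])   # Gb/s delivered to the tagged node per state

# Stationary distribution: left eigenvector of P for eigenvalue 1.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi /= pi.sum()

throughput = pi @ rate               # long-run mean rate to the tagged node
print(pi, throughput)
```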

  5. Comparative analysis among several methods used to solve the point kinetic equations

    International Nuclear Information System (INIS)

    Nunes, Anderson L.; Goncalves, Alessandro da C.; Martinez, Aquilino S.; Silva, Fernando Carvalho da

    2007-01-01

    The main objective of this work is to develop a methodology for comparing several methods for solving the point kinetics equations. The evaluated methods are: the finite differences method, the stiffness confinement method, the improved stiffness confinement method and the piecewise constant approximations method. These methods were implemented and compared through a systematic analysis that consists basically of determining which method consumes the least computational time with the highest precision. A relative performance factor, whose function is to combine both criteria, was calculated in order to reach this goal. Through the analysis of the performance factor it is possible to choose the best method for the solution of the point kinetics equations. (author)
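
    For orientation, the simplest of the compared approaches can be sketched in a few lines: explicit finite differences applied to the point kinetics equations with a single delayed-neutron group, dn/dt = ((rho - beta)/Lambda) n + lambda C and dC/dt = (beta/Lambda) n - lambda C. All parameter values below are illustrative, not taken from the paper.

```python
# Explicit finite-difference integration of one-group point kinetics.
# Parameter values are illustrative only.
import numpy as np

beta, lam, Lam = 0.0065, 0.08, 1e-4   # delayed fraction, decay const (1/s), generation time (s)
rho = 0.5 * beta                      # step reactivity insertion
dt, t_end = 1e-5, 1.0                 # stiffness forces a very small time step

n, C = 1.0, beta / (Lam * lam)        # equilibrium initial conditions
for _ in range(int(t_end / dt)):
    dn = ((rho - beta) / Lam) * n + lam * C
    dC = (beta / Lam) * n - lam * C
    n, C = n + dt * dn, C + dt * dC

print(f"relative power after {t_end} s: {n:.4f}")
```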

  6. Comparative analysis among several methods used to solve the point kinetic equations

    Energy Technology Data Exchange (ETDEWEB)

    Nunes, Anderson L.; Goncalves, Alessandro da C.; Martinez, Aquilino S.; Silva, Fernando Carvalho da [Universidade Federal, Rio de Janeiro, RJ (Brazil). Coordenacao dos Programas de Pos-graduacao de Engenharia. Programa de Engenharia Nuclear; E-mails: alupo@if.ufrj.br; agoncalves@con.ufrj.br; aquilino@lmp.ufrj.br; fernando@con.ufrj.br

    2007-07-01

    The main objective of this work is to develop a methodology for comparing several methods for solving the point kinetics equations. The evaluated methods are: the finite differences method, the stiffness confinement method, the improved stiffness confinement method and the piecewise constant approximations method. These methods were implemented and compared through a systematic analysis that consists basically of determining which method consumes the least computational time with the highest precision. A relative performance factor, whose function is to combine both criteria, was calculated in order to reach this goal. Through the analysis of the performance factor it is possible to choose the best method for the solution of the point kinetics equations. (author)

  7. Three-dimensional distribution of cortical synapses: a replicated point pattern-based analysis

    Science.gov (United States)

    Anton-Sanchez, Laura; Bielza, Concha; Merchán-Pérez, Angel; Rodríguez, José-Rodrigo; DeFelipe, Javier; Larrañaga, Pedro

    2014-01-01

    The biggest problem when analyzing the brain is that its synaptic connections are extremely complex. Generally, the billions of neurons making up the brain exchange information through two types of highly specialized structures: chemical synapses (the vast majority) and so-called gap junctions (a substrate of one class of electrical synapse). Here we are interested in exploring the three-dimensional spatial distribution of chemical synapses in the cerebral cortex. Recent research has shown that the three-dimensional spatial distribution of synapses in layer III of the neocortex can be modeled by a random sequential adsorption (RSA) point process, i.e., synapses are distributed in space almost randomly, with the only constraint that they cannot overlap. In this study we hypothesize that RSA processes can also explain the distribution of synapses in all cortical layers. We also investigate whether there are differences in both the synaptic density and spatial distribution of synapses between layers. Using combined focused ion beam milling and scanning electron microscopy (FIB/SEM), we obtained three-dimensional samples from the six layers of the rat somatosensory cortex and identified and reconstructed the synaptic junctions. A total tissue volume of approximately 4500 μm³ and around 4000 synapses from three different animals were analyzed. Different samples, layers and/or animals were aggregated and compared using RSA replicated spatial point processes. The results showed no significant differences in the synaptic distribution across the different rats used in the study. We found that RSA processes described the spatial distribution of synapses in all samples of each layer. We also found that the synaptic distribution in layers II to VI conforms to a common underlying RSA process with different densities per layer. Interestingly, the results showed that synapses in layer I had a slightly different spatial distribution from the other layers. PMID:25206325
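
    The RSA model referred to above is easy to simulate; the toy sketch below places points uniformly at random in a box and accepts them only if they do not overlap previously accepted points. The box size, exclusion radius, and target count are invented for illustration.

```python
# Toy 3-D random sequential adsorption (RSA): uniform proposals, accepted
# only if they keep a hard-core separation from all accepted points.
import numpy as np

rng = np.random.default_rng(1)
box, radius, target = 10.0, 0.5, 300   # micrometres, say; all values invented
points = []

attempts = 0
while len(points) < target and attempts < 200_000:
    attempts += 1
    p = rng.uniform(0, box, size=3)
    if all(np.linalg.norm(p - q) >= 2 * radius for q in points):
        points.append(p)

points = np.array(points)
print(f"placed {len(points)} non-overlapping points in {attempts} attempts")
```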

  8. Trends in business process analysis: from verification to process mining

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Cardoso, J.; Cordeiro, J.; Filipe, J.

    2007-01-01

    Business process analysis ranges from model verification at design-time to the monitoring of processes at runtime. Much progress has been achieved in process verification. Today we are able to verify the entire reference model of SAP without any problems. Moreover, more and more processes leave

  9. Nonlinear bending and collapse analysis of a poked cylinder and other point-loaded cylinders

    International Nuclear Information System (INIS)

    Sobel, L.H.

    1983-06-01

    This paper analyzes the geometrically nonlinear bending and collapse behavior of an elastic, simply supported cylindrical shell subjected to an inward-directed point load applied at midlength. The large displacement analysis results for this thin (R/t = 638) poked cylinder were obtained from the STAGSC-1 finite element computer program. STAGSC-1 results are also presented for two other point-loaded shell problems: a pinched cylinder (R/t = 100), and a venetian blind (R/t = 250).

  10. Some probabilistic properties of fractional point processes

    KAUST Repository

    Garra, Roberto

    2017-05-16

    In this article, the first hitting times of generalized Poisson processes $N^f(t)$, related to Bernstein functions $f$, are studied. For the space-fractional Poisson processes $N^\alpha(t)$, $t > 0$ (corresponding to $f = x^\alpha$), the hitting probabilities $P\{T_k^\alpha < \infty\}$ are explicitly obtained and analyzed. The processes $N^f(t)$ are time-changed Poisson processes $N(H^f(t))$ with subordinators $H^f(t)$, and here we study $N(\sum_{j=1}^{n} H^{f_j}(t))$ and obtain probabilistic features of these extended counting processes. A section of the paper is devoted to processes of the form $N(G_{H,\nu}(t))$, where $G_{H,\nu}(t)$ are generalized grey Brownian motions. This involves the theory of time-dependent fractional operators of the McBride form. While the time-fractional Poisson process is a renewal process, we prove that the space-time Poisson process is no longer a renewal process.

  11. Analysis of hygienic critical control points in boar semen production.

    Science.gov (United States)

    Schulze, M; Ammon, C; Rüdiger, K; Jung, M; Grobbel, M

    2015-02-01

    The present study addresses the microbiological results of a quality control audit in artificial insemination (AI) boar studs in Germany and Austria. The raw and processed semen of 344 boars in 24 AI boar studs were analyzed. Bacteria were found in 26% (88 of 344) of the extended ejaculates and 66.7% (18 of 24) of the boar studs. The bacterial species found in the AI dose were not cultured from the respective raw semen in 95.5% (84 of 88) of the positive samples. These data, together with the fact that in most cases all the samples from one stud were contaminated with identical bacteria (species and resistance profile), indicate contamination during processing. Microbiological investigations of the equipment and the laboratory environment during semen processing in 21 AI boar studs revealed nine hygienic critical control points (HCCP), which were addressed after the first audit. On the basis of the analysis of the contamination rates of the ejaculate samples, improvements in the hygiene status were already present in the second audit (P = 0.0343, F-test). Significant differences were observed for heating cabinets (improvement, P = 0.0388) and manual operating elements (improvement, P = 0.0002). The odds ratio of finding contaminated ejaculates in the first and second audit was 1.68 (with the 95% confidence interval ranging from 1.04 to 2.69). Furthermore, an overall good hygienic status was shown for extenders, the inner face of dilution tank lids, dyes, and ultrapure water treatment plants. Among the nine HCCP considered, the most heavily contaminated samples, as assessed by the median scores throughout all the studs, were found in the sinks and/or drains. High numbers (>10³ colony-forming units/cm²) of bacteria were found in the heating cabinets, ejaculate transfer, manual operating elements, and laboratory surfaces. In conclusion, the present study emphasizes the need for both training of the laboratory staff in monitoring HCCP in routine semen

  12. Gaussian process regression analysis for functional data

    CERN Document Server

    Shi, Jian Qing

    2011-01-01

    Gaussian Process Regression Analysis for Functional Data presents nonparametric statistical methods for functional regression analysis, specifically the methods based on a Gaussian process prior in a functional space. The authors focus on problems involving functional response variables and mixed covariates of functional and scalar variables. Covering the basics of Gaussian process regression, the first several chapters discuss functional data analysis, theoretical aspects based on the asymptotic properties of Gaussian process regression models, and new methodological developments for high-dimensional data.
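
    For readers unfamiliar with the basic building block, a minimal Gaussian process regression example using scikit-learn is shown below; the book itself develops functional-data extensions well beyond this sketch.

```python
# Generic Gaussian process regression sketch (scikit-learn), not the book's
# functional-data methodology. Data are synthetic.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(40, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)

# Squared-exponential kernel plus a learned noise term.
kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.01)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

X_new = np.linspace(0, 10, 5).reshape(-1, 1)
mean, std = gpr.predict(X_new, return_std=True)  # posterior mean and pointwise sd
print(np.c_[X_new, mean, std])
```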

  13. Probabilistic Design in a Sheet Metal Stamping Process under Failure Analysis

    International Nuclear Information System (INIS)

    Buranathiti, Thaweepat; Cao, Jian; Chen, Wei; Xia, Z. Cedric

    2005-01-01

    Sheet metal stamping processes have been widely implemented in many industries due to their repeatability and productivity. In general, the simulations for a sheet metal forming process involve nonlinearity, complex material behavior and tool-material interaction. Instabilities in terms of tearing and wrinkling are major concerns in many sheet metal stamping processes. In this work, a sheet metal stamping process of a mild steel for a wheelhouse used in the automobile industry is studied by using an explicit nonlinear finite element code and incorporating failure analysis (tearing and wrinkling) and design under uncertainty. Margins of tearing and wrinkling are quantitatively defined via stress-based criteria for system-level design. The forming process utilizes drawbeads instead of the blank holder force to restrain the blank. The main parameters of interest in this work are friction conditions, drawbead configurations, sheet metal properties, and numerical errors. A robust design model is created to conduct a probabilistic design, which is made possible for this complex engineering process via an efficient uncertainty propagation technique. The method, called the weighted three-point-based method, estimates the statistical characteristics (mean and variance) of the responses of interest (margins of failures), and provides a systematic approach to designing a sheet metal forming process under the framework of design under uncertainty.

  14. Acceleration of Meshfree Radial Point Interpolation Method on Graphics Hardware

    International Nuclear Information System (INIS)

    Nakata, Susumu

    2008-01-01

    This article describes a parallel computational technique to accelerate the radial point interpolation method (RPIM)-based meshfree method using graphics hardware. RPIM is one of the meshfree partial differential equation solvers that do not require a mesh structure for the analysis targets. In this paper, a technique for accelerating RPIM using graphics hardware is presented. In the method, the computation process is divided into small processes suitable for processing on the parallel architecture of the graphics hardware in a single-instruction multiple-data manner.

  15. European monetary union: limits to growth or bifurcation point

    Directory of Open Access Journals (Sweden)

    Oleksandr Sharov

    2014-02-01

    The paper presents the background and process of the establishment of the EU monetary union with regard to the historical experience of European countries, including previous attempts at currency integration between separate countries. The author also analyzes methods of solving various theoretical and practical problems arising during the process. In particular, it is pointed out that the majority of the problems were caused by neglecting monetary integration principles, the need for observing which had been clearly stated at the preliminary stages of the integration process. Special emphasis is placed on reviewing the current development stage of the monetary union, in particular with regard to problems caused by the financial crisis in the "peripheral countries" of the Union as well as by the concurrent intensification of cooperation in the field of banking and fiscal issues. In this context, the trends of further European monetary integration are also considered. Based on this analysis, the author concludes that the European Monetary Union has exhausted its energy for development along the previously assigned trajectory and reached a bifurcation point, whereas its further improvement or gradual preservation and decline depend upon the direction in which this point is passed.

  16. Calorimetry end-point predictions

    International Nuclear Information System (INIS)

    Fox, M.A.

    1981-01-01

    This paper describes a portion of the work presently in progress at Rocky Flats in the field of calorimetry. In particular, calorimetry end-point predictions are outlined. The problems associated with end-point predictions and the progress made in overcoming these obstacles are discussed. The two major problems, noise and an accurate description of the heat function, are dealt with to obtain the most accurate results. Data are taken from an actual calorimeter and are processed by means of three different noise reduction techniques. The processed data are then utilized by one to four algorithms, depending on the accuracy desired, to determine the end-point.
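
    One common flavour of end-point prediction, which may or may not match the Rocky Flats algorithms, is to fit an exponential approach-to-equilibrium model to the early part of a run and read off the asymptote; a minimal sketch with synthetic data:

```python
# Exponential approach-to-equilibrium fit for end-point prediction.
# Model form and data are illustrative, not the paper's algorithms.
import numpy as np
from scipy.optimize import curve_fit

def equilibration(t, w_inf, a, tau):
    return w_inf + a * np.exp(-t / tau)

rng = np.random.default_rng(2)
t = np.linspace(0, 120, 60)                       # minutes into the run
w = equilibration(t, 5.0, -2.0, 45.0) + 0.01 * rng.standard_normal(t.size)

popt, _ = curve_fit(equilibration, t, w, p0=(w[-1], w[0] - w[-1], 30.0))
print(f"predicted end-point: {popt[0]:.3f} (true 5.0)")
```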

  17. Point process models for spatio-temporal distance sampling data from a large-scale survey of blue whales

    KAUST Repository

    Yuan, Yuan; Bachl, Fabian E.; Lindgren, Finn; Borchers, David L.; Illian, Janine B.; Buckland, Stephen T.; Rue, Haavard; Gerrodette, Tim

    2017-01-01

    Distance sampling is a widely used method for estimating wildlife population abundance. The fact that conventional distance sampling methods are partly design-based constrains the spatial resolution at which animal density can be estimated using these methods. Estimates are usually obtained at survey stratum level. For an endangered species such as the blue whale, it is desirable to estimate density and abundance at a finer spatial scale than stratum. Temporal variation in the spatial structure is also important. We formulate the process generating distance sampling data as a thinned spatial point process and propose model-based inference using a spatial log-Gaussian Cox process. The method adopts a flexible stochastic partial differential equation (SPDE) approach to model spatial structure in density that is not accounted for by explanatory variables, and integrated nested Laplace approximation (INLA) for Bayesian inference. It allows simultaneous fitting of detection and density models and permits prediction of density at an arbitrarily fine scale. We estimate blue whale density in the Eastern Tropical Pacific Ocean from thirteen shipboard surveys conducted over 22 years. We find that higher blue whale density is associated with colder sea surface temperatures in space, and although there is some positive association between density and mean annual temperature, our estimates are consistent with no trend in density across years. Our analysis also indicates that there is substantial spatially structured variation in density that is not explained by available covariates.

  18. Point process models for spatio-temporal distance sampling data from a large-scale survey of blue whales

    KAUST Repository

    Yuan, Yuan

    2017-12-28

    Distance sampling is a widely used method for estimating wildlife population abundance. The fact that conventional distance sampling methods are partly design-based constrains the spatial resolution at which animal density can be estimated using these methods. Estimates are usually obtained at survey stratum level. For an endangered species such as the blue whale, it is desirable to estimate density and abundance at a finer spatial scale than stratum. Temporal variation in the spatial structure is also important. We formulate the process generating distance sampling data as a thinned spatial point process and propose model-based inference using a spatial log-Gaussian Cox process. The method adopts a flexible stochastic partial differential equation (SPDE) approach to model spatial structure in density that is not accounted for by explanatory variables, and integrated nested Laplace approximation (INLA) for Bayesian inference. It allows simultaneous fitting of detection and density models and permits prediction of density at an arbitrarily fine scale. We estimate blue whale density in the Eastern Tropical Pacific Ocean from thirteen shipboard surveys conducted over 22 years. We find that higher blue whale density is associated with colder sea surface temperatures in space, and although there is some positive association between density and mean annual temperature, our estimates are consistent with no trend in density across years. Our analysis also indicates that there is substantial spatially structured variation in density that is not explained by available covariates.
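
    The core idea of a thinned point process, as opposed to the paper's full SPDE/INLA machinery, can be illustrated in a few lines: simulate an inhomogeneous Poisson process by Lewis-Shedler thinning, then thin again with a distance-dependent detection probability. The intensity surface and detection scale below are invented.

```python
# Thinned spatial point process sketch: inhomogeneous Poisson "animals"
# plus half-normal detection from a transect. All values are invented.
import numpy as np

rng = np.random.default_rng(3)

def intensity(x, y):                 # hypothetical density surface on [0,1]^2
    return 200.0 * np.exp(-3.0 * y)  # more animals at low y (e.g. colder water)

# Lewis-Shedler thinning: simulate from a dominating homogeneous process.
lam_max = 200.0
n = rng.poisson(lam_max)             # unit square, so expected count = lam_max
x, y = rng.uniform(size=n), rng.uniform(size=n)
keep = rng.uniform(size=n) < intensity(x, y) / lam_max
x, y = x[keep], y[keep]

# Detection thinning: half-normal detection from a transect line at y = 0.5.
sigma = 0.1
p_detect = np.exp(-0.5 * ((y - 0.5) / sigma) ** 2)
detected = rng.uniform(size=x.size) < p_detect
print(f"{x.size} animals present, {detected.sum()} detected")
```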

  19. A Comparative Study of Applying Active-Set and Interior Point Methods in MPC for Controlling Nonlinear pH Process

    Directory of Open Access Journals (Sweden)

    Syam Syafiie

    2014-06-01

    A comparative study of Model Predictive Control (MPC) using the active-set method and interior point methods is proposed as a control technique for a highly non-linear pH process. The process is a strong acid-strong base system. A strong acid, hydrochloric acid (HCl), and a strong base, sodium hydroxide (NaOH), with the presence of the buffer solution sodium bicarbonate (NaHCO3), are used in a neutralization process flowing into a reactor. The non-linear pH neutralization model governing this process is represented by multi-linear models. The performance of both controllers is studied by evaluating their set-point tracking and disturbance rejection. In addition, the optimization times of the two methods are compared; both MPC controllers show similar performance, with no overshoot, offset, or oscillation. However, the conventional active-set method gives a shorter control action time for small-scale optimization problems than MPC using the interior point method for pH control.
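
    The optimization step inside MPC is typically a quadratic program; the sketch below contrasts an SQP/active-set-style solver with an interior-point-style solver on a generic two-variable QP using SciPy. It is not the paper's pH model, and the solver labels are loose characterizations of SLSQP and trust-constr.

```python
# Toy constrained QP of the kind solved at each MPC step; problem data invented.
import numpy as np
from scipy.optimize import minimize, LinearConstraint

H = np.array([[2.0, 0.0], [0.0, 2.0]])   # quadratic cost on two control moves
g = np.array([-2.0, -5.0])               # linear term from the set-point error

def cost(u):
    return 0.5 * u @ H @ u + g @ u

bounds = [(0.0, 2.0)] * 2                                    # actuator limits
lc = LinearConstraint(np.array([[1.0, 1.0]]), -np.inf, 3.0)  # total-move limit

for method in ("SLSQP", "trust-constr"):  # SQP-style vs interior-point-style
    res = minimize(cost, x0=np.zeros(2), method=method,
                   bounds=bounds, constraints=[lc])
    print(method, res.x.round(4), round(float(res.fun), 4))
```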

  20. Intensity-dependent point spread image processing

    International Nuclear Information System (INIS)

    Cornsweet, T.N.; Yellott, J.I.

    1984-01-01

    There is ample anatomical, physiological and psychophysical evidence that the mammalian retina contains networks that mediate interactions among neighboring receptors, resulting in intersecting transformations between input images and their corresponding neural output patterns. The almost universally accepted view is that the principal form of interaction involves lateral inhibition, resulting in an output pattern that is the convolution of the input with a "Mexican hat" or difference-of-Gaussians spread function, having a positive center and a negative surround. A closely related process is widely applied in digital image processing, and in photography as "unsharp masking". The authors show that a simple and fundamentally different process, involving no inhibitory or subtractive terms, can also account for the physiological and psychophysical findings that have been attributed to lateral inhibition. This process also results in a number of fundamental effects that occur in mammalian vision and that would be of considerable significance in robotic vision, but which cannot be explained by lateral inhibitory interaction.
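
    The conventional difference-of-Gaussians ("Mexican hat") operation that the authors contrast their model with can be sketched directly with SciPy; the image and filter widths below are illustrative.

```python
# Difference-of-Gaussians filtering as in unsharp-masking style models of
# lateral inhibition; a generic sketch, not the authors' alternative model.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(4)
image = np.zeros((64, 64))
image[:, 32:] = 1.0                           # a step edge
image += 0.02 * rng.standard_normal(image.shape)

center = gaussian_filter(image, sigma=1.0)    # narrow positive centre
surround = gaussian_filter(image, sigma=3.0)  # broad negative surround
dog = center - surround                       # convolution with a DoG kernel

print(dog[32, 28:36].round(3))                # response peaks at the edge
```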

  1. Kinematic and dynamic analysis of a serial-link robot for inspection process in EAST vacuum vessel

    Energy Technology Data Exchange (ETDEWEB)

    Peng Xuebing, E-mail: pengxb@ipp.ac.cn [Institute of Plasma Physics, Chinese Academy of Sciences, P.O. Box 1126, Shushanhu Road 350, Hefei, Anhui 230031 (China); Yuan Jianjun; Zhang Weijun [Research Institute of Robotics, Mechanical Engineering School, Shanghai Jiao Tong University, No.800, Dong Chuan Road, Min Hang District, Shanghai 200240 (China); Yang Yang; Song Yuntao [Institute of Plasma Physics, Chinese Academy of Sciences, P.O. Box 1126, Shushanhu Road 350, Hefei, Anhui 230031 (China)

    2012-08-15

    Highlights: • A serial-link robot, FIVIR, is proposed for inspection of EAST PFCs between plasma shots. • The FIVIR has a function-modular design and a specially designed curvilinear mechanism for axes 4-6. • The D-H coordinate systems and the forward and inverse kinematic models can be easily established and solved for the FIVIR. • The FIVIR can fulfill the required workspace and has good dynamic performance in the inspection process. - Abstract: The present paper introduces a serial-link robot named the flexible in-vessel inspection robot (FIVIR), developed for the Experimental Advanced Superconducting Tokamak (EAST). The task of the robot is to carry process tools, such as a viewing camera and a leakage detector, to inspect the components installed inside the EAST vacuum vessel. The FIVIR can help in understanding the physical phenomena which could occur in the vacuum vessel during plasma operation and could become part of the EAST remote handling system if needed. The FIVIR was designed for ease of control and good mechanical properties, which led to its function-modular design. The workspace simulation and kinematic analysis are given in this paper. The dynamic behavior of the FIVIR is studied by multi-body system simulation using the ADAMS software. The study shows that the FIVIR has good kinematic and dynamic performance and can fulfill the design requirements for the inspection process in the EAST vacuum vessel.

  2. Process correlation analysis model for process improvement identification.

    Science.gov (United States)

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment identifying strengths and weaknesses, and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the software process improvement effort as the base. CMMI defines a set of process areas involved in software development and what is to be carried out in these process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort required and the lack of expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  3. Chain digitisation support by point-of-sale systems: an analysis of the Dutch product software market.

    NARCIS (Netherlands)

    Plomp, M.G.A.; Rijn, G. van; Batenburg, R.S.

    2012-01-01

    Point-of-sale (POS) systems increasingly support more retail processes than just the basic cash functionality. But to what extent do they support chain digitisation, i.e., interorganisational processes as the exchange of order and sales information? We develop a two-dimensional maturity model for

  4. Chain digitisation support by point-of-sale systems: An analysis of the Dutch product software market

    NARCIS (Netherlands)

    Plomp, M.G.A.; van Rijn, G.; Batenburg, R.S.

    2012-01-01

    Point-of-sale (POS) systems increasingly support more retail processes than just the basic cash functionality. But to what extent do they support chain digitisation, i.e., interorganisational processes as the exchange of order and sales information? We develop a two-dimensional maturity model for

  5. [Evaluation of a new blood gas analysis system: RapidPoint 500(®)].

    Science.gov (United States)

    Nicolas, Thierry; Cabrolier, Nadège; Bardonnet, Karine; Davani, Siamak

    2013-01-01

    We present here an evaluation of a new blood gas analysis system, the RapidPoint 500(®) (Siemens Healthcare Diagnostics). The aim of this research was to compare the ergonomics and analytical performance of this analyser with those of the RapidLab 1265 for the following parameters: pH, partial oxygen pressure, partial carbon dioxide pressure, sodium, potassium, ionized calcium, lactate and the CO-oximetry parameters: hemoglobin, oxyhemoglobin, carboxyhemoglobin, methemoglobin, reduced hemoglobin, neonatal bilirubin; as well as with the Dimension Vista 500 results for chloride and glucose. The Valtec protocol, recommended by the French Society of Clinical Biology (SFBC), was used to analyze the study results. The experiment was carried out over a period of one month in the Department of medical biochemistry. One hundred sixty-five samples from adult patients admitted to the ER or hospitalized in intensive care were tested. The RapidPoint 500(®) was highly satisfactory from an ergonomic point of view. Intra- and inter-assay coefficients of variation (CV) with the three control levels were below those recommended by the SFBC for all parameters, and the comparative study gave coefficients of determination higher than 0.91. Taken together, the RapidPoint 500(®) appears fully satisfactory in terms of ergonomics and analytical performance.

  6. Moments analysis of concurrent Poisson processes

    International Nuclear Information System (INIS)

    McBeth, G.W.; Cross, P.

    1975-01-01

    A moments analysis of concurrent Poisson processes has been carried out. Equations are given which relate combinations of distribution moments to sums of products involving the number of counts associated with the processes and the mean rate of the processes. Elimination of background is discussed, and equations suitable for processing random radiation, parent-daughter pairs in the presence of background, and triple and double correlations in the presence of background are given. The theory of identification of the four principal radioactive series by moments analysis is discussed. (Auth.)

  7. Identification of the Key Fields and Their Key Technical Points of Oncology by Patent Analysis.

    Science.gov (United States)

    Zhang, Ting; Chen, Juan; Jia, Xiaofeng

    2015-01-01

    This paper aims to identify the key fields and their key technical points of oncology by patent analysis. Patents of oncology applied from 2006 to 2012 were searched in the Thomson Innovation database. The key fields and their key technical points were determined by analyzing the Derwent Classification (DC) and the International Patent Classification (IPC), respectively. Patent applications in the top ten DC occupied 80% of all the patent applications of oncology, which were the ten fields of oncology to be analyzed. The number of patent applications in these ten fields of oncology was standardized based on patent applications of oncology from 2006 to 2012. For each field, standardization was conducted separately for each of the seven years (2006-2012) and the mean of the seven standardized values was calculated to reflect the relative amount of patent applications in that field; meanwhile, regression analysis using time (year) and the standardized values of patent applications in seven years (2006-2012) was conducted so as to evaluate the trend of patent applications in each field. Two-dimensional quadrant analysis, together with the professional knowledge of oncology, was taken into consideration in determining the key fields of oncology. The fields located in the quadrant with high relative amount or increasing trend of patent applications are identified as key ones. By using the same method, the key technical points in each key field were identified. Altogether 116,820 patents of oncology applied from 2006 to 2012 were retrieved, and four key fields with twenty-nine key technical points were identified, including "natural products and polymers" with nine key technical points, "fermentation industry" with twelve ones, "electrical medical equipment" with four ones, and "diagnosis, surgery" with four ones. The results of this study could provide guidance on the development direction of oncology, and also help researchers broaden innovative ideas and discover new
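
    A stripped-down version of the described quadrant analysis, with invented counts, might look as follows: standardize each year's counts across fields, then combine the mean standardized level with the linear trend over years.

```python
# Two-dimensional quadrant analysis sketch; the counts are hypothetical.
import numpy as np

years = np.arange(2006, 2013)
fields = ["natural products and polymers",
          "electrical medical equipment",
          "some declining field"]
counts = np.array([                    # hypothetical applications per field/year
    [900, 950, 1000, 1100, 1200, 1300, 1400],
    [300, 320, 310, 330, 340, 360, 380],
    [800, 780, 700, 650, 600, 550, 500]], dtype=float)

# Standardize within each year (column) across fields, then summarize.
z = (counts - counts.mean(axis=0)) / counts.std(axis=0)
for name, zi in zip(fields, z):
    level = zi.mean()                    # relative amount of applications
    slope = np.polyfit(years, zi, 1)[0]  # trend over 2006-2012
    key = "key field" if (level > 0 or slope > 0) else "not key"
    print(f"{name:32s} level={level:+.2f} trend={slope:+.3f} -> {key}")
```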

  8. Pseudo-dynamic source modelling with 1-point and 2-point statistics of earthquake source parameters

    KAUST Repository

    Song, S. G.; Dalguer, L. A.; Mai, Paul Martin

    2013-01-01

    statistical framework that governs the finite-fault rupture process with 1-point and 2-point statistics of source parameters in order to quantify the variability of finite source models for future scenario events. We test this method by extracting 1-point

  9. Optimization of the single point incremental forming process for titanium sheets by using response surface

    Directory of Open Access Journals (Sweden)

    Saidi Badreddine

    2016-01-01

    The single point incremental forming process is well known to be perfectly suited for prototyping and small series. One of its fields of applicability is the medical area, for the forming of titanium prostheses or titanium medical implants. However, this process is not yet very industrialized, mainly due to its geometrical inaccuracy and its non-homogeneous thickness distribution. Moreover, considerable forces can occur; they must be controlled in order to preserve the tooling. In this paper, a numerical approach is proposed in order to minimize the maximum force reached during the incremental forming of titanium sheets and to maximize the minimal thickness. A response surface methodology is used to find the optimal values of two input parameters of the process, the punch diameter and the vertical step size of the tool path.

  10. Semantic multimedia analysis and processing

    CERN Document Server

    Spyrou, Evaggelos; Mylonas, Phivos

    2014-01-01

    Broad in scope, Semantic Multimedia Analysis and Processing provides a complete reference of techniques, algorithms, and solutions for the design and the implementation of contemporary multimedia systems. Offering a balanced, global look at the latest advances in semantic indexing, retrieval, analysis, and processing of multimedia, the book features the contributions of renowned researchers from around the world. Its contents are based on four fundamental thematic pillars: 1) information and content retrieval, 2) semantic knowledge exploitation paradigms, 3) multimedia personalization, and 4)

  11. Temperature calibration procedure for thin film substrates for thermo-ellipsometric analysis using melting point standards

    International Nuclear Information System (INIS)

    Kappert, Emiel J.; Raaijmakers, Michiel J.T.; Ogieglo, Wojciech; Nijmeijer, Arian; Huiskes, Cindy; Benes, Nieck E.

    2015-01-01

    Highlights: • Facile temperature calibration method for thermo-ellipsometric analysis. • The melting point of thin films of indium, lead, zinc, and water can be detected by ellipsometry. • In-situ calibration of ellipsometry hot stage, without using any external equipment. • High-accuracy temperature calibration (±1.3 °C). - Abstract: Precise and accurate temperature control is pertinent to studying thermally activated processes in thin films. Here, we present a calibration method for the substrate–film interface temperature using spectroscopic ellipsometry. The method is adapted from temperature calibration methods that are well developed for thermogravimetric analysis and differential scanning calorimetry instruments, and is based on probing a transition temperature. Indium, lead, and zinc could be spread on a substrate, and the phase transition of these metals could be detected by a change in the Ψ signal of the ellipsometer. For water, the phase transition could be detected by a loss of signal intensity as a result of light scattering by the ice crystals. The combined approach allowed for construction of a linear calibration curve with an accuracy of 1.3 °C or lower over the full temperature range
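
    The calibration step itself reduces to a simple regression of the known transition temperatures of the standards on the temperatures read from the hot stage; a minimal sketch with hypothetical readings:

```python
# Linear temperature calibration from melting-point standards.
import numpy as np

# Known transition temperatures (degrees C): water (melting), indium, lead, zinc.
T_true = np.array([0.0, 156.6, 327.5, 419.5])
# Hypothetical hot-stage readings at which the ellipsometric signal changed.
T_read = np.array([1.2, 158.9, 331.0, 424.1])

slope, intercept = np.polyfit(T_read, T_true, 1)
print(f"T_corrected = {slope:.4f} * T_read {intercept:+.2f}")
```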

  12. Material-Point Analysis of Large-Strain Problems

    DEFF Research Database (Denmark)

    Andersen, Søren

    The aim of this thesis is to apply and improve the material-point method for modelling of geotechnical problems. One of the geotechnical phenomena that is a subject of active research is the study of landslides. A large amount of research is focused on determining when slopes become unstable. Hence, it is possible to predict if a certain slope is stable using commercial finite element or finite difference software such as PLAXIS, ABAQUS or FLAC. However, the dynamics during a landslide are less explored. The material-point method (MPM) is a novel numerical method aimed at analysing problems involving materials subjected to large strains in a dynamical time–space domain. This thesis explores the material-point method with the specific aim of improving the performance for geotechnical problems. Large-strain geotechnical problems such as landslides pose a major challenge to model numerically. Employing

  13. Chemical process hazards analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-02-01

    The Office of Worker Health and Safety (EH-5) under the Assistant Secretary for the Environment, Safety and Health of the US Department of Energy (DOE) has published two handbooks for use by DOE contractors managing facilities and processes covered by the Occupational Safety and Health Administration (OSHA) Rule for Process Safety Management of Highly Hazardous Chemicals (29 CFR 1910.119), herein referred to as the PSM Rule. The PSM Rule contains an integrated set of chemical process safety management elements designed to prevent chemical releases that can lead to catastrophic fires, explosions, or toxic exposures. The purpose of the two handbooks, "Process Safety Management for Highly Hazardous Chemicals" and "Chemical Process Hazards Analysis," is to facilitate implementation of the provisions of the PSM Rule within the DOE. The purpose of this handbook, "Chemical Process Hazards Analysis," is to facilitate, within the DOE, the performance of chemical process hazards analyses (PrHAs) as required under the PSM Rule. It provides basic information for the performance of PrHAs, and should not be considered a complete resource on PrHA methods. Likewise, to determine if a facility is covered by the PSM Rule, the reader should refer to the handbook "Process Safety Management for Highly Hazardous Chemicals" (DOE-HDBK-1101-96). Promulgation of the PSM Rule has heightened the awareness of chemical safety management issues within the DOE. This handbook is intended for use by DOE facilities and processes covered by the PSM Rule to facilitate contractor implementation of the PrHA element of the PSM Rule. However, contractors whose facilities and processes are not covered by the PSM Rule may also use this handbook as a basis for conducting process hazards analyses as part of their good management practices. This handbook explains the minimum requirements for PrHAs outlined in the PSM Rule. Nowhere have requirements been added beyond what is specifically required by the rule.

  14. From Takeoff to Landing: Looking at the Design Process for the Development of NASA Blast at Thanksgiving Point

    Directory of Open Access Journals (Sweden)

    Stephen Ashton

    2011-01-01

    In this article we discuss the design process used to develop the NASA Blast exhibition at Thanksgiving Point, a museum complex in Lehi, Utah. This was a class project for the Advanced Instructional Design class at Brigham Young University. In an attempt to create a new discourse (Krippendorff, 2006) for Thanksgiving Point visitors and staff members, the design class used a very fluid design approach, utilizing brainstorming, research, class member personas, and prototyping to create ideas for the new exhibition. Because of the nature of the experience, the design class developed their own techniques to enhance their design process. The result was a compelling narrative that brought all the elements of the exhibition together in a cohesive piece.

  15. Distributed maximum power point tracking in wind micro-grids

    Directory of Open Access Journals (Sweden)

    Carlos Andrés Ramos-Paja

    2012-06-01

    With the aim of reducing the hardware requirements of micro-grids based on wind generators, a distributed maximum power point tracking algorithm is proposed. Such a solution reduces the number of current sensors and processing devices needed to maximize the power extracted from the micro-grid, reducing the application cost. The analysis of the optimal operating points of the wind generator was performed experimentally, which in addition provides realistic model parameters. Finally, the proposed solution was validated by means of detailed simulations performed in the power electronics software PSIM, contrasting the achieved performance with traditional solutions.
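
    Although the paper's distributed algorithm is more elaborate, the basic maximum power point tracking loop can be illustrated with a bare-bones perturb-and-observe tracker; the power curve below is a made-up stand-in for a real turbine characteristic.

```python
# Bare-bones perturb-and-observe MPPT loop; the power curve is invented.
def power(v):
    return -(v - 30.0) ** 2 + 900.0      # toy curve with a maximum at v = 30

v, step = 20.0, 0.5
p_prev = power(v)
direction = +1.0
for _ in range(200):
    v += direction * step                # perturb the operating point
    p = power(v)
    if p < p_prev:                       # observe: power dropped, so reverse
        direction = -direction
    p_prev = p

print(f"settled near v = {v:.1f} (true optimum 30.0)")
```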

  16. A Practical Decision-Analysis Process for Forest Ecosystem Management

    Science.gov (United States)

    H. Michael Rauscher; F. Thomas Lloyd; David L. Loftis; Mark J. Twery

    2000-01-01

    Many authors have pointed out the need to firm up the 'fuzzy' ecosystem management paradigm and develop operationally practical processes to allow forest managers to accommodate more effectively the continuing rapid change in societal perspectives and goals. There are three spatial scales where clear, precise, practical ecosystem management processes are...

  17. Two-stage process analysis using the process-based performance measurement framework and business process simulation

    NARCIS (Netherlands)

    Han, K.H.; Kang, J.G.; Song, M.S.

    2009-01-01

    Many enterprises have recently been pursuing process innovation or improvement to attain their performance goals. To align a business process with enterprise performances, this study proposes a two-stage process analysis for process (re)design that combines the process-based performance measurement

  18. Analysis of a simple pendulum driven at its suspension point

    International Nuclear Information System (INIS)

    Yoshida, S; Findley, T

    2005-01-01

    To familiarize undergraduate students with the dynamics of a damped driven harmonic oscillator, a simple pendulum was set up and driven at its suspension point under different damping conditions. From the time domain analysis, the decay constant was estimated and used to predict the frequency response. The simple pendulum was then driven at a series of frequencies near the resonance. By measuring the maximum amplitude at each driving frequency, the frequency response was determined. With one free parameter, which was determined under the first damping condition, the predicted frequency responses showed good agreement with the measured frequency responses under all damping conditions
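
    The measured response can be compared against the standard damped driven oscillator prediction A(omega) = F0 / sqrt((omega0^2 - omega^2)^2 + (2*gamma*omega)^2); a short sketch with illustrative values:

```python
# Frequency response of a damped driven oscillator; parameters illustrative.
import numpy as np

omega0 = 2 * np.pi * 1.0    # natural frequency (rad/s) for a ~1 Hz pendulum
gamma = 0.05                # decay constant from a time-domain ring-down (1/s)
drive = np.linspace(0.9, 1.1, 9) * omega0

# Steady-state amplitude (per unit drive amplitude) near resonance.
amplitude = 1.0 / np.sqrt((omega0**2 - drive**2) ** 2 + (2 * gamma * drive) ** 2)
for w, a in zip(drive, amplitude):
    print(f"f = {w / (2 * np.pi):.3f} Hz  relative amplitude = {a:.3f}")
```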

  19. Simulated interprofessional education: an analysis of teaching and learning processes.

    Science.gov (United States)

    van Soeren, Mary; Devlin-Cop, Sandra; Macmillan, Kathleen; Baker, Lindsay; Egan-Lee, Eileen; Reeves, Scott

    2011-11-01

    Simulated learning activities are increasingly being used in health professions and interprofessional education (IPE). Specifically, IPE programs are frequently adopting role-play simulations as a key learning approach. Despite this widespread adoption, there is little empirical evidence exploring the teaching and learning processes embedded within this type of simulation. This exploratory study provides insight into the nature of these processes through the use of qualitative methods. A total of 152 clinicians, 101 students and 9 facilitators, representing a range of health professions, participated in video-recorded role-plays and debrief sessions. Videotapes were analyzed to explore emerging issues and themes related to teaching and learning processes related to this type of interprofessional simulated learning experience. In addition, three focus groups were conducted with a subset of participants to explore perceptions of their educational experiences. Five key themes emerged from the data analysis: enthusiasm and motivation, professional role assignment, scenario realism, facilitator style and background and team facilitation. Our findings suggest that program developers need to be mindful of these five themes when using role-plays in an interprofessional context and point to the importance of deliberate and skilled facilitation in meeting desired learning outcomes.

  20. Music analysis and point-set compression

    DEFF Research Database (Denmark)

    Meredith, David

    2015-01-01

    COSIATEC, SIATECCompress and Forth’s algorithm are point-set compression algorithms developed for discovering repeated patterns in music, such as themes and motives that would be of interest to a music analyst. To investigate their effectiveness and versatility, these algorithms were evaluated on three analytical tasks that depend on the discovery of repeated patterns: classifying folk song melodies into tune families, discovering themes and sections in polyphonic music, and discovering subject and countersubject entries in fugues. Each algorithm computes a compressed encoding of a point-set representation of a musical object in the form of a list of compact patterns, each pattern being given with a set of vectors indicating its occurrences. However, the algorithms adopt different strategies in their attempts to discover encodings that maximize compression. The best-performing algorithm on the folk
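
    A highly simplified version of the SIA step that underlies these algorithms can be written in a few lines: group points of a point-set representation by the translation vectors that map them onto other points in the set, yielding the maximal translatable patterns. The tiny (onset, pitch) set below is invented.

```python
# Simplified SIA-style discovery of maximal translatable patterns.
from collections import defaultdict

# Tiny invented "melody" as (onset, pitch) pairs: a three-note figure
# followed by the same figure transposed up a fifth four beats later.
points = {(0, 60), (1, 62), (2, 64), (4, 67), (5, 69), (6, 71)}

patterns = defaultdict(list)
for p in sorted(points):
    for q in sorted(points):
        if q > p:                            # consider each ordered pair once
            v = (q[0] - p[0], q[1] - p[1])
            patterns[v].append(p)            # p translates by v within the set

# Report the largest patterns; the vector (4, 7) recovers the opening figure.
for v, pat in sorted(patterns.items(), key=lambda kv: -len(kv[1]))[:3]:
    print(v, pat)
```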

  1. Meta-Analysis of Effect Sizes Reported at Multiple Time Points Using General Linear Mixed Model

    Science.gov (United States)

    Musekiwa, Alfred; Manda, Samuel O. M.; Mwambi, Henry G.; Chen, Ding-Geng

    2016-01-01

    Meta-analysis of longitudinal studies combines effect sizes measured at pre-determined time points. The most common approach involves performing separate univariate meta-analyses at individual time points. This simplistic approach ignores dependence between longitudinal effect sizes, which might result in less precise parameter estimates. In this paper, we show how to conduct a meta-analysis of longitudinal effect sizes where we contrast different covariance structures for dependence between effect sizes, both within and between studies. We propose new combinations of covariance structures for the dependence between effect size and utilize a practical example involving meta-analysis of 17 trials comparing postoperative treatments for a type of cancer, where survival is measured at 6, 12, 18 and 24 months post randomization. Although the results from this particular data set show the benefit of accounting for within-study serial correlation between effect sizes, simulations are required to confirm these results. PMID:27798661

  2. Analysis of Multicomponent Adsorption Close to a Dew Point

    DEFF Research Database (Denmark)

    Shapiro, Alexander; Stenby, Erling Halfdan

    1998-01-01

    We develop the potential theory of multicomponent adsorption close to a dew point. The approach is based on an asymptotic adsorption equation (AAE) which is valid in a vicinity of the dew point. By this equation the thickness of the liquid film is expressed through thermodynamic characteristics... and the direct calculations, even if the mixture is not close to a dew point. Key Words: adsorption; potential theory; multicomponent; dew point.

  3. Analysis of Arbitrary Reflector Antennas Applying the Geometrical Theory of Diffraction Together with the Master Points Technique

    Directory of Open Access Journals (Sweden)

    María Jesús Algar

    2013-01-01

    An efficient approach for the analysis of arbitrarily fed, surface-conformed reflector antennas is presented. The near field at a large number of sampling points in the aperture of the reflector is obtained by applying the Geometrical Theory of Diffraction (GTD). A new technique named Master Points has been developed to reduce the complexity of the ray-tracing computations. The combination of GTD and Master Points reduces the time requirements of this kind of analysis. To validate the new approach, several reflectors and the effects on the radiation pattern caused by shifting the feed and introducing different obstacles have been considered, concerning both simple and complex geometries. The results of these analyses have been compared with Method of Moments (MoM) results.

  4. Analysis of step-up transformer tap change on the quantities at the point of connection to transmission grid

    Directory of Open Access Journals (Sweden)

    Đorđević Dragan

    2017-01-01

    The analysis of the effect of a step-up transformer tap change on the quantities at the point of connection to the transmission grid is presented in this paper. The point of connection of generator TENT A6 has been analyzed, and a detailed model of this generator is available in the software package DIgSILENT PowerFactory. A comparison of the effect of a step-up transformer tap change on the quantities at the point of connection during automatic and manual operation of the voltage regulator has been conducted. In order to analyze manual operation of the voltage regulator, different ways of modeling this mode have been compared. Several generator operating points, selected to represent the need for a tap change, have been analyzed. The analyses have also been performed taking into account the voltage-reactive stiffness at the point of connection.

  5. Analysis of relationship between registration performance of point cloud statistical model and generation method of corresponding points

    International Nuclear Information System (INIS)

    Yamaoka, Naoto; Watanabe, Wataru; Hontani, Hidekata

    2010-01-01

    When constructing a statistical point cloud model, we usually need to calculate corresponding points, and the constructed statistical model will differ depending on the method used to calculate them. This article examines the effect on statistical models of human organs of different methods for calculating corresponding points. We validated the performance of the statistical models by registering an organ surface in a 3D medical image. We compare two methods for calculating corresponding points. The first, 'Generalized Multi-Dimensional Scaling (GMDS)', determines the corresponding points from the shapes of two curved surfaces. The second approach, the 'Entropy-based Particle system', chooses corresponding points by statistically analyzing a number of curved surfaces. Using these methods we constructed the statistical models, and using these models we conducted registration with the medical image. For the estimation, we use non-parametric belief propagation; this method estimates not only the position of the organ but also the probability density of the organ position. We evaluate how the two different methods of calculating corresponding points affect the statistical model through changes in the probability density at each point. (author)

  6. Discussion of "Modern statistics for spatial point processes"

    DEFF Research Database (Denmark)

    Jensen, Eva Bjørn Vedel; Prokesová, Michaela; Hellmund, Gunnar

    2007-01-01

    The paper ‘Modern statistics for spatial point processes’ by Jesper Møller and Rasmus P. Waagepetersen is based on a special invited lecture given by the authors at the 21st Nordic Conference on Mathematical Statistics, held at Rebild, Denmark, in June 2006. At the conference, Antti

  7. Finite Elements on Point Based Surfaces

    NARCIS (Netherlands)

    Clarenz, U.; Rumpf, M.; Telea, A.

    2004-01-01

    We present a framework for processing point-based surfaces via partial differential equations (PDEs). Our framework efficiently and effectively brings well-known PDE-based processing techniques to the field of point-based surfaces. Our method is based on the construction of local tangent planes and

  8. Parametric Architectural Design with Point-clouds

    DEFF Research Database (Denmark)

    Zwierzycki, Mateusz; Evers, Henrik Leander; Tamke, Martin

    2016-01-01

    This paper investigates the efforts and benefits of the implementation of point clouds into architectural design processes and tools. Based on a study of the principal work processes of designers with point clouds, the prototypical plugin/library - Volvox - was developed for the parametric modelling

  9. Study on characteristic points of boiling curve by using wavelet analysis and genetic algorithm

    International Nuclear Information System (INIS)

    Wei Huiming; Su Guanghui; Qiu Suizheng; Yang Xingbo

    2009-01-01

    Based on the wavelet analysis theory of signal singularity detection, the critical heat flux (CHF) and the minimum film boiling starting point (q_min) of boiling curves can be detected and analyzed using wavelet multi-resolution analysis. To predict the CHF in engineering, empirical relations were obtained based on a genetic algorithm. The results of wavelet detection and genetic algorithm prediction agree with experimental data very well. (authors)
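
    As a rough illustration of wavelet-based singularity detection (not the authors' exact procedure), the sketch below applies a continuous wavelet transform to a synthetic boiling-curve-like signal and looks for large fine-scale coefficients at the slope breaks.

```python
# CWT-based singularity detection on a synthetic piecewise signal.
import numpy as np
import pywt

x = np.linspace(0, 10, 1000)
# Piecewise signal with slope discontinuities at x = 4 and x = 7,
# mimicking regime changes along a boiling curve.
q = np.piecewise(x, [x < 4, (x >= 4) & (x < 7), x >= 7],
                 [lambda v: v**2,
                  lambda v: 16 - 3 * (v - 4),
                  lambda v: 7 + 2 * (v - 7)])

coef, _ = pywt.cwt(q, scales=np.arange(1, 32), wavelet="mexh")
fine = np.abs(coef[0])[50:-50]          # finest scale; drop edge effects
xs = x[50:-50]
idx = np.argsort(fine)[-4:]             # crude: largest-modulus locations
print(sorted(np.round(xs[idx], 2)))     # expect clusters near x = 4 and x = 7
```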

  10. Point Cloud Based Change Detection - an Automated Approach for Cloud-based Services

    Science.gov (United States)

    Collins, Patrick; Bahr, Thomas

    2016-04-01

    The fusion of stereo photogrammetric point clouds with LiDAR data or terrain information derived from SAR interferometry has significant potential for 3D topographic change detection. In the present case study, the latest point cloud generation and analysis capabilities are used to examine a landslide that occurred in the village of Malin in Maharashtra, India, on 30 July 2014, and affected an area of ca. 44,000 m². It focuses on Pléiades high resolution satellite imagery and the Airbus DS WorldDEM™ as a product of the TanDEM-X mission. This case study was performed using the COTS software package ENVI 5.3. Integration of custom processes and automation is supported by IDL (Interactive Data Language). Thus, ENVI analytics is run via the object-oriented and IDL-based ENVITask API. The pre-event topography is represented by the WorldDEM™ product, delivered with a raster of 12 m x 12 m and based on the EGM2008 geoid (called pre-DEM). For the post-event situation a Pléiades 1B stereo image pair of the affected AOI was obtained. The ENVITask "GeneratePointCloudsByDenseImageMatching" was implemented to extract passive point clouds in LAS format from the panchromatic stereo datasets: • A dense image-matching algorithm is used to identify corresponding points in the two images. • A block adjustment is applied to refine the 3D coordinates that describe the scene geometry. • Additionally, the WorldDEM™ was input to constrain the range of heights in the matching area, and subsequently the length of the epipolar line. The "PointCloudFeatureExtraction" task was executed to generate the post-event digital surface model from the photogrammetric point clouds (called post-DEM). Post-processing consisted of the following steps: • Adding the geoid component (EGM2008) to the post-DEM. • Pre-DEM reprojection to the UTM Zone 43N (WGS-84) coordinate system and resizing. • Subtraction of the pre-DEM from the post-DEM. • Filtering and threshold based classification of

  11. Scalets, wavelets and (complex) turning point quantization

    Science.gov (United States)

    Handy, C. R.; Brooks, H. A.

    2001-05-01

    Despite the many successes of wavelet analysis in image and signal processing, the incorporation of continuous wavelet transform theory within quantum mechanics has lacked a compelling, first principles, motivating analytical framework, until now. For arbitrary one-dimensional rational fraction Hamiltonians, we develop a simple, unified formalism, which clearly underscores the complementary, and mutually interdependent, role played by moment quantization theory (i.e. via scalets, as defined herein) and wavelets. This analysis involves no approximation of the Hamiltonian within the (equivalent) wavelet space, and emphasizes the importance of (complex) multiple turning point contributions in the quantization process. We apply the method to three illustrative examples. These include the (double-well) quartic anharmonic oscillator potential problem, V(x) = Z²x² + gx⁴, the quartic potential, V(x) = x⁴, and the very interesting and significant non-Hermitian potential V(x) = -(ix)³, recently studied by Bender and Boettcher.

  12. Identification of 'Point A' as the prevalent source of error in cephalometric analysis of lateral radiographs.

    Science.gov (United States)

    Grogger, P; Sacher, C; Weber, S; Millesi, G; Seemann, R

    2018-04-10

    Deviations in measuring dentofacial components in a lateral X-ray represent a major hurdle in the subsequent treatment of dysgnathic patients. In a retrospective study, we investigated the most prevalent source of error in the following commonly used cephalometric measurements: the angles Sella-Nasion-Point A (SNA), Sella-Nasion-Point B (SNB) and Point A-Nasion-Point B (ANB); the Wits appraisal; the anteroposterior dysplasia indicator (APDI); and the overbite depth indicator (ODI). Preoperative lateral radiographic images of patients with dentofacial deformities were collected and the landmarks digitally traced by three independent raters. Cephalometric analysis was automatically performed based on 1116 tracings. Error analysis identified the x-coordinate of Point A as the prevalent source of error in all investigated measurements, except SNB, in which it is not incorporated. In SNB, the y-coordinate of Nasion predominated error variance. SNB showed the lowest inter-rater variation. In addition, our observations confirmed previous studies showing that landmark identification variance follows characteristic error envelopes, here in the highest number of tracings analysed up to now. Variance orthogonal to defining planes was of relevance, while variance parallel to planes was not. Taking these findings into account, orthognathic surgeons as well as orthodontists would be able to perform cephalometry more accurately and accomplish better therapeutic results. Copyright © 2018 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  13. Accuracy of heart rate variability estimation by photoplethysmography using a smartphone: Processing optimization and fiducial point selection.

    Science.gov (United States)

    Ferrer-Mileo, V; Guede-Fernandez, F; Fernandez-Chimeno, M; Ramos-Castro, J; Garcia-Gonzalez, M A

    2015-08-01

    This work compares several fiducial points for detecting the arrival of a new pulse in a photoplethysmographic signal acquired with the built-in camera of a smartphone or with a photoplethysmograph. An optimization of the signal preprocessing stage has also been performed. Finally, we characterize the error produced when the best cutoff frequencies and fiducial point are used for smartphones and the photoplethysmograph, and assess whether the error of smartphones can reasonably be explained by variations in pulse transit time. The results reveal that the peak of the first derivative and the minimum of the second derivative of the pulse wave have the lowest error. Moreover, for these points, high-pass filtering the signal with a cutoff between 0.1 and 0.8 Hz and low-pass filtering around 2.7 Hz or 3.5 Hz give the best results. Finally, the error in smartphones is slightly higher than in a photoplethysmograph.
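
    As an illustration of the best-performing fiducial point reported above (the peak of the first derivative), a minimal scipy-based sketch; the filter design and thresholds are assumptions, with cutoffs chosen inside the ranges given in the abstract:

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def ppg_fiducial_points(ppg, fs):
    """Detect pulse arrivals as peaks of the first derivative of a PPG signal.

    ppg : 1-D array of photoplethysmographic samples.
    fs  : sampling rate in Hz.
    """
    # Band-pass the raw signal (high-pass ~0.5 Hz, low-pass ~3 Hz, assumed).
    b, a = butter(2, [0.5 / (fs / 2), 3.0 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, ppg)

    # First derivative; its maxima mark the steep systolic upslope.
    deriv = np.gradient(filtered) * fs

    # Enforce a 0.3 s refractory period (max ~200 bpm) between detections.
    peaks, _ = find_peaks(deriv, distance=int(0.3 * fs),
                          height=np.percentile(deriv, 75))
    return peaks  # sample indices of pulse arrivals
```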

  14. Process Integration Analysis of an Industrial Hydrogen Production Process

    OpenAIRE

    Stolten, Detlef; Grube, Thomas; Tock, Laurence; Maréchal, François; Metzger, Christian; Arpentinier, Philippe

    2010-01-01

    The energy efficiency of an industrial hydrogen production process using steam methane reforming (SMR) combined with the water gas shift reaction (WGS) is analyzed using process integration techniques based on heat cascade calculation and pinch analysis, with the aim of identifying potential measures to enhance the process performance. The challenge is to satisfy the high-temperature heat demand of the SMR reaction while minimizing the consumption of natural gas to feed the combustion and to expl...
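
    A minimal sketch of the heat-cascade (problem-table) calculation at the core of pinch analysis; the stream data and minimum approach temperature below are purely illustrative, not taken from the study:

```python
# Problem-table heat cascade (pinch analysis), minimal sketch.
# Each stream: (supply T [degC], target T [degC], CP = m*cp [kW/K]).
streams = [(600, 200, 2.0),   # hot stream, e.g. reformer flue gas (assumed)
           (25, 550, 1.8)]    # cold stream, e.g. SMR feed preheat (assumed)
dT_min = 20.0                 # minimum approach temperature (assumed)

# Shift hot streams down and cold streams up by dT_min/2.
shifted = []
for ts, tt, cp in streams:
    shift = -dT_min / 2 if ts > tt else dT_min / 2
    shifted.append((ts + shift, tt + shift, cp))

# Temperature interval boundaries, high to low.
bounds = sorted({t for ts, tt, _ in shifted for t in (ts, tt)}, reverse=True)

# Net heat surplus/deficit in each interval, cascaded downwards.
cascade, heat = [0.0], 0.0
for hi, lo in zip(bounds, bounds[1:]):
    net = sum(cp * (hi - lo) * (1 if ts > tt else -1)
              for ts, tt, cp in shifted
              if min(ts, tt) <= lo and max(ts, tt) >= hi)
    heat += net
    cascade.append(heat)

# The hot utility requirement lifts the most negative cascade entry to zero;
# the pinch sits where the corrected cascade touches zero.
q_hot = max(0.0, -min(cascade))
print("minimum hot utility [kW]:", q_hot)
print("minimum cold utility [kW]:", cascade[-1] + q_hot)
```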

  15. Conditional analysis of mixed Poisson processes with baseline counts: implications for trial design and analysis.

    Science.gov (United States)

    Cook, Richard J; Wei, Wei

    2003-07-01

    The design of clinical trials is typically based on marginal comparisons of a primary response under two or more treatments. The considerable gains in efficiency afforded by models conditional on one or more baseline responses have been extensively studied for Gaussian models. The purpose of this article is to present methods for the design and analysis of clinical trials in which the response is a count or a point process, and a corresponding baseline count is available prior to randomization. The methods are based on a conditional negative binomial model for the response given the baseline count and can be used to examine the effect of introducing selection criteria on power and sample size requirements. We show that designs based on this approach are more efficient than those proposed by McMahon et al. (1994).
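
    A brief simulation sketch of the mixed Poisson setup underlying the conditional negative binomial model: subject-specific event rates drawn from a gamma distribution, with baseline and on-study counts Poisson given the rate (all parameter values are assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n, shape, mean_rate = 10_000, 2.0, 1.5   # assumed values

# Mixed Poisson: each subject has a latent gamma rate; counts are
# Poisson given that rate, so marginally negative binomial.
rates = rng.gamma(shape, mean_rate / shape, size=n)
baseline = rng.poisson(rates)            # pre-randomization count
response = rng.poisson(rates)            # on-study count (no treatment effect)

# Conditioning on the baseline sharpens inference: within baseline strata
# the response variance is much closer to its mean than marginally.
print("marginal mean/var:", response.mean(), response.var())
for k in range(4):
    s = response[baseline == k]
    print(f"baseline={k}: mean={s.mean():.2f} var={s.var():.2f} (n={len(s)})")
```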

  16. Seed Dispersal, Microsites or Competition—What Drives Gap Regeneration in an Old-Growth Forest? An Application of Spatial Point Process Modelling

    Directory of Open Access Journals (Sweden)

    Georg Gratzer

    2018-04-01

    Full Text Available The spatial structure of trees is a template for forest dynamics and the outcome of a variety of processes in ecosystems. Identifying the contribution and magnitude of the different drivers is an age-old task in plant ecology. Recently, the modelling of a spatial point process was used to identify factors driving the spatial distribution of trees at stand scales. Processes driving the coexistence of trees, however, frequently unfold within gaps, and questions on the role of within-gap resource heterogeneity have become central issues in community ecology. We tested the applicability of a spatial point process modelling approach for quantifying the effects of seed dispersal, within-gap light environment, microsite heterogeneity, and competition on the generation of within-gap spatial structure of small tree seedlings in a temperate, old-growth, mixed-species forest. By fitting a non-homogeneous Neyman–Scott point process model, we could disentangle the role of seed dispersal from niche partitioning for within-gap tree establishment and did not detect seed densities as a factor explaining the clustering of small trees. We found only a very weak indication of partitioning of within-gap light among the three species and detected a clear niche segregation of Picea abies (L.) Karst. on nurse logs. The other two dominating species, Abies alba Mill. and Fagus sylvatica L., did not show signs of within-gap segregation.
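
    For readers unfamiliar with the model class, a minimal sketch simulating a stationary Neyman–Scott cluster process of the Thomas type (Poisson parents, Gaussian-displaced offspring); the study fitted a non-homogeneous variant with covariates, and all parameter values here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_thomas(kappa, mu, sigma, window=1.0):
    """Thomas process: a Neyman-Scott cluster process on [0, window]^2.

    kappa : intensity of (unobserved) parent points per unit area
    mu    : mean number of offspring per parent
    sigma : std. dev. of the isotropic Gaussian offspring displacement
    """
    n_parents = rng.poisson(kappa * window**2)
    parents = rng.uniform(0, window, size=(n_parents, 2))
    pts = []
    for p in parents:
        n_off = rng.poisson(mu)
        pts.append(p + rng.normal(0, sigma, size=(n_off, 2)))
    pts = np.vstack(pts) if pts else np.empty((0, 2))
    # Keep only offspring falling inside the observation window.
    inside = np.all((pts >= 0) & (pts <= window), axis=1)
    return pts[inside]

seedlings = simulate_thomas(kappa=25, mu=8, sigma=0.03)
print(len(seedlings), "points; expected ~", 25 * 8)
```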

  17. ANALYSIS, THEMATIC MAPS AND DATA MINING FROM POINT CLOUD TO ONTOLOGY FOR SOFTWARE DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    R. Nespeca

    2016-06-01

    Full Text Available The primary purpose of the survey for the restoration of Cultural Heritage is the interpretation of the state of building preservation. For this, the advantages of the remote sensing systems that generate dense point clouds (range-based or image-based) are not limited only to the acquired data. The paper shows that it is possible to extract very useful diagnostic information using spatial annotation, with algorithms already implemented in open-source software. Generally, the drawing of degradation maps is the result of manual work, and so depends on the subjectivity of the operator. This paper describes a method of extraction and visualization of information obtained by mathematical procedures that are quantitative, repeatable and verifiable. The case study is a part of the east facade of the Eglise collégiale Saint-Maurice, also called Notre Dame des Grâces, in Caromb, in southern France. The work was conducted on the matrix of information contained in the ASCII-format point cloud. The first result is the extraction of new geometric descriptors. We first create digital maps of the calculated quantities. Subsequently, we move to semi-quantitative analyses that transform the new data into useful information. We have written algorithms for accurate selection, for the segmentation of the point cloud, and for automatic calculation of the real surface and the volume. Furthermore, we have created graphs of the spatial distribution of the descriptors. This work shows that by working at the data-processing stage we can transform the point cloud into an enriched database: the use, management and mining of the data become easy, fast and effective for everyone involved in the restoration process.
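
    A minimal sketch of deriving one local geometric descriptor (a PCA-based roughness/planarity measure) from an ASCII x-y-z point cloud, in the spirit of the enrichment described above; the file name and neighbourhood size are assumptions:

```python
import numpy as np
from scipy.spatial import cKDTree

# Load an ASCII point cloud: one "x y z" triple per line (file name assumed).
xyz = np.loadtxt("facade_points.xyz")[:, :3]

tree = cKDTree(xyz)
k = 30                                   # neighbourhood size (assumed)
_, idx = tree.query(xyz, k=k)

roughness = np.empty(len(xyz))
for i, nb in enumerate(idx):
    pts = xyz[nb] - xyz[nb].mean(axis=0)
    # Smallest eigenvalue of the local covariance ~ out-of-plane variance:
    # a simple per-point roughness/planarity descriptor.
    eigvals = np.linalg.eigvalsh(pts.T @ pts / k)
    roughness[i] = eigvals[0] / eigvals.sum()

# The descriptor can be appended as an extra column of the enriched database.
np.savetxt("facade_enriched.xyz", np.column_stack([xyz, roughness]))
```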

  18. Application of Hazard Analysis and Critical Control Points (HACCP) to the Cultivation Line of Mushroom and Other Cultivated Edible Fungi.

    Science.gov (United States)

    Pardo, José E; de Figueirêdo, Vinícius Reis; Alvarez-Ortí, Manuel; Zied, Diego C; Peñaranda, Jesús A; Dias, Eustáquio Souza; Pardo-Giménez, Arturo

    2013-09-01

    The Hazard analysis and critical control points (HACCP) system is a preventive approach which seeks to ensure food safety and security. It allows product protection and correction of errors, reduces the costs derived from quality defects and reduces final overcontrol. In this paper, the system is applied to the cultivation line of mushrooms and other edible cultivated fungi. Of all stages of the process, only the reception of covering materials (stage 1) and compost (stage 3), pre-fruiting and induction (stage 6) and harvest (stage 7) have been considered critical control points (CCPs). The main hazards found were the presence of unauthorized phytosanitary products or doses above the permitted level (stages 6 and 7), and the presence of pathogenic bacteria (stages 1 and 3) and/or heavy metals (stage 3). The implementation of this knowledge will allow any plant dedicated to mushroom or other edible fungi cultivation to self-monitor its production based on the HACCP system.

  19. Experimental investigation of the hot point generation in the Z pinch plasma

    International Nuclear Information System (INIS)

    Afonin, V.I.; Podgornov, V.A.; Litvin, D.N.; Senik, A.V.

    1999-01-01

    Experiments on the explosion of thin composite (W-Al-W, W-SiO2-W) wires in the diode of the SIGNAL fast high-current generator, with a load current amplitude of about 200 kA and a rise time of about 50 ns, were carried out to study the possibility of controlling the generation of hot points in Z pinch plasma. The parameters of the generated hot points were studied using X-ray techniques. Analysis of the experimental results shows the possibility of controlling this process.

  20. A Unified Point Process Probabilistic Framework to Assess Heartbeat Dynamics and Autonomic Cardiovascular Control

    Directory of Open Access Journals (Sweden)

    Zhe eChen

    2012-02-01

    Full Text Available In recent years, time-varying inhomogeneous point process models have been introduced for the assessment of instantaneous heartbeat dynamics as well as specific cardiovascular control mechanisms and hemodynamics. Assessment of the model's statistics is established through the Wiener-Volterra theory and a multivariate autoregressive (AR) structure. A variety of instantaneous cardiovascular metrics, such as heart rate (HR), heart rate variability (HRV), respiratory sinus arrhythmia (RSA), and baroreceptor-cardiac reflex (baroreflex) sensitivity (BRS), are derived within a parametric framework and instantaneously updated with adaptive and local maximum likelihood estimation algorithms. Inclusion of second-order nonlinearities, with subsequent bispectral quantification in the frequency domain, further allows for the definition of instantaneous metrics of nonlinearity. Here we organize a comprehensive review of the devised methods as applied to experimental recordings from healthy subjects during propofol anesthesia. Collective results reveal interesting dynamic trends across the different pharmacological interventions operated within each anesthesia session, confirming the ability of the algorithm to track important changes in cardiorespiratory elicited interactions, and pointing to our mathematical approach as a promising tool for accurate, noninvasive monitoring in clinical practice.

  1. Identifying Organizational Inefficiencies with Pictorial Process Analysis (PPA

    Directory of Open Access Journals (Sweden)

    David John Patrishkoff

    2013-11-01

    Full Text Available Pictorial Process Analysis (PPA) was created by the author in 2004. PPA is a unique methodology which offers ten layers of additional analysis when compared to standard process mapping techniques. The goal of PPA is to identify and eliminate waste, inefficiencies and risk in manufacturing or transactional business processes at five levels in an organization. The highest level assessed is process management, followed by the process work environment, detailed work habits, process performance metrics and general attitudes towards the process. This detailed process assessment and analysis is carried out during process improvement brainstorming efforts and Kaizen events. PPA creates a detailed visual efficiency rating for each step of the process under review. A selection of 54 pictorial Inefficiency Icons (cards) is available to highlight major inefficiencies and risks present in the business process under review. These inefficiency icons were identified during the author's independent research on the topic of why things go wrong in business. This paper highlights how PPA was developed and shows the steps required to conduct Pictorial Process Analysis on a sample manufacturing process. The author has successfully used PPA to dramatically improve business processes in over 55 different industries since 2004.

  2. Assessment of hygiene standards and Hazard Analysis Critical Control Points implementation on passenger ships.

    Science.gov (United States)

    Mouchtouri, Varavara; Malissiova, Eleni; Zisis, Panagiotis; Paparizou, Evina; Hadjichristodoulou, Christos

    2013-01-01

    The level of hygiene on ferries can have an impact on travellers' health. The aim of this study was to assess the hygiene standards of ferries in Greece and to investigate whether Hazard Analysis Critical Control Points (HACCP) implementation contributes to the hygiene status, and particularly food safety, aboard passenger ships. Hygiene inspections on 17 ferries in Greece were performed using a standardized inspection form with a 135-point scale. Thirty-four water and 17 food samples were collected and analysed. About 65% (11/17) of ferries scored >100 points. Ferries with HACCP received significantly higher scores during inspection compared to those without HACCP. Of the 17 food samples, only one was found positive for Salmonella spp. Implementation of management systems including HACCP principles can help to raise the level of hygiene aboard passenger ships.

  3. Materials, process, product analysis of coal process technology. Phase I final report

    Energy Technology Data Exchange (ETDEWEB)

    Saxton, J. C.; Roig, R. W.; Loridan, A.; Leggett, N. E.; Capell, R. G.; Humpstone, C. C.; Mudry, R. N.; Ayres, E.

    1976-02-01

    The purpose of materials-process-product analysis is a systematic evaluation of alternative manufacturing processes--in this case processes for converting coal into energy and material products that can supplement or replace petroleum-based products. The methodological steps in the analysis include: definition of functional operations that enter into coal conversion processes, and modeling of alternative, competing methods to accomplish these functions; compilation of all feasible conversion processes that can be assembled from combinations of competing methods for the functional operations; systematic, iterative evaluation of all feasible conversion processes under a variety of economic situations, environmental constraints, and projected technological advances; and aggregative assessments (economic and environmental) of various industrial development scenarios. An integral part of the present project is additional development of the existing computer model to include: a data base for coal-related materials and coal conversion processes; and an algorithmic structure that facilitates the iterative, systematic evaluations in response to exogenously specified variables, such as tax policy, environmental limitations, and changes in process technology and costs. As an analytical tool, the analysis is intended to satisfy the needs of an analyst working at the process selection level, for example, with respect to the allocation of RD&D funds to competing technologies.

  4. Point-and-Click Pedagogy: Is it Effective for Teaching Information Technology?

    Directory of Open Access Journals (Sweden)

    Mark Angolia

    2016-09-01

    Full Text Available This paper assesses the effectiveness of the adoption of curriculum content developed and supported by a global academic university-industry alliance sponsored by one of the world’s largest information technology software providers. Academic alliances promote practical and future-oriented education while providing access to proprietary software and technology. Specifically, this paper addresses a lack of quantitative analysis to substantiate the perceived benefits of using information technology “point-and-click” instructional pedagogy to teach fundamental business processes and concepts. The analysis of over 800 test questions from 229 students allowed inferences regarding the utilization of self-directed “point-and-click” driven case studies employed to teach software applications of business processes needed for supply chain management. Correlation studies and analysis of variance investigated data collected from 10 individual course sections over a two-and-one-half-year period in a four-year public university. The data showed statistically significant positive correlations between the pedagogy and conceptual learning. Further, the research provided evidence that the methodology is equally effective for teaching information technology applications using either face-to-face or distance education delivery methods.

  5. Evaluation of the H-point standard additions method (HPSAM) and the generalized H-point standard additions method (GHPSAM) for the UV-analysis of two-component mixtures.

    Science.gov (United States)

    Hund, E; Massart, D L; Smeyers-Verbeke, J

    1999-10-01

    The H-point standard additions method (HPSAM) and two versions of the generalized H-point standard additions method (GHPSAM) are evaluated for the UV analysis of two-component mixtures. Synthetic mixtures of anhydrous caffeine and phenazone, as well as of atovaquone and proguanil hydrochloride, were used. Furthermore, the method was applied to pharmaceutical formulations that contain these compounds as active drug substances. This paper shows both the difficulties related to the methods and the conditions under which acceptable results can be obtained.
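
    A minimal sketch of the core HPSAM computation: two standard-addition lines, recorded at two wavelengths, intersect at the H-point, whose abscissa equals minus the analyte concentration; the data arrays are illustrative, not from the study:

```python
import numpy as np

# Added standard concentrations and absorbances at two wavelengths
# (illustrative numbers only).
c_added = np.array([0.0, 2.0, 4.0, 6.0, 8.0])           # e.g. mg/L
abs_l1 = np.array([0.210, 0.330, 0.452, 0.570, 0.691])  # wavelength 1
abs_l2 = np.array([0.155, 0.233, 0.309, 0.388, 0.465])  # wavelength 2

# Fit one straight line per wavelength: A = m*c + b.
m1, b1 = np.polyfit(c_added, abs_l1, 1)
m2, b2 = np.polyfit(c_added, abs_l2, 1)

# The H-point is the intersection of the two lines; its abscissa equals
# minus the analyte concentration in the sample.
c_h = (b2 - b1) / (m1 - m2)
print("analyte concentration estimate:", -c_h)
```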

  6. Computer-aided analysis of cutting processes for brittle materials

    Science.gov (United States)

    Ogorodnikov, A. I.; Tikhonov, I. N.

    2017-12-01

    This paper is focused on 3D computer simulation of cutting processes for brittle materials and silicon wafers. Computer-aided analysis of wafer scribing and dicing is carried out with the use of the ANSYS CAE (computer-aided engineering) software, and a parametric model of the processes is created by means of the internal ANSYS APDL programming language. Different types of tool tip geometry are analyzed to obtain the internal stresses, such as a four-sided pyramid with an included angle of 120° and a tool inclination angle to the normal axis of 15°. The quality of the workpieces after cutting is studied by optical microscopy to verify the FE (finite-element) model. The disruption of the material structure during scribing occurs near the scratch and propagates into the wafer or over its surface for a short range. The deformation area along the scratch looks like a ragged band, but the stress width is rather small. The theory of cutting brittle semiconductor and optical materials is developed on the basis of the advanced theory of metal turning. The decrease of stress intensity along the normal from the tool tip to the scribe line can be predicted using the developed theory and the verified FE model. The crystal quality and the dimensions of defects are determined by the mechanics of scratching, which depends on the shape of the diamond tip, the scratching direction, the velocity of the cutting tool and the applied force loads. The disruption is a rate-sensitive process, and it depends on the cutting thickness. The application of numerical techniques, such as FE analysis, to cutting problems enhances understanding and promotes the further development of existing machining technologies.

  7. Rapid Process to Generate Beam Envelopes for Optical System Analysis

    Science.gov (United States)

    Howard, Joseph; Seals, Lenward

    2012-01-01

    The task of evaluating obstructions in the optical throughput of an optical system requires the use of two disciplines, and hence two models: optical models for the details of optical propagation, and mechanical models for determining the actual structure that exists in the optical system. Previous analysis methods for creating beam envelopes (or cones of light) for use in this obstruction analysis were found to be cumbersome to calculate and took significant time and resources to complete. A new process was developed that takes less time to complete beam envelope analysis, is more accurate and less dependent upon manual node tracking to create the beam envelopes, and eases the burden on the mechanical CAD (computer-aided design) designers to form the beam solids. This algorithm allows rapid generation of beam envelopes for optical system obstruction analysis. Ray trace information is taken from optical design software and used to generate CAD objects that represent the boundary of the beam envelopes for detailed analysis in mechanical CAD software. Matlab is used to call ray trace data from the optical model for all fields and entrance pupil points of interest. These are chosen to be the edge of each space, so that these rays produce the bounding volume for the beam. The x and y global coordinate data are collected on the surface planes of interest, typically an image of the field and entrance pupil internal to the optical system. This x and y coordinate data is then evaluated using a convex hull algorithm, which removes interior points that are unnecessary for producing the bounding volume of interest. At this point, tolerances can be applied to expand the size of either the field or aperture, depending on the allocations. Once this minimum set of coordinates on the pupil and field is obtained, a new set of rays is generated between the field plane and aperture plane (or vice-versa). These rays are then evaluated at planes between the aperture and field, at a
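
    A minimal sketch of the convex hull step described above, reducing the ray-intersection points on a plane of interest to the boundary set that bounds the beam envelope; the point data and the 5% tolerance are illustrative:

```python
import numpy as np
from scipy.spatial import ConvexHull

# x, y intersection coordinates of traced rays on a plane of interest
# (random stand-in for data exported from the optical design software).
rng = np.random.default_rng(2)
pts = rng.normal(size=(500, 2))

# The convex hull removes interior points that do not bound the beam.
hull = ConvexHull(pts)
boundary = pts[hull.vertices]            # ordered boundary points

# Optional tolerance: expand the hull about its centroid by 5% (assumed).
centroid = boundary.mean(axis=0)
expanded = centroid + 1.05 * (boundary - centroid)
print(len(pts), "->", len(boundary), "boundary points")
```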

  8. On the thermochemical conversions of hard coal pitches in the process of raising the softening point to 358-363 K

    Energy Technology Data Exchange (ETDEWEB)

    Kekin, N.A.; Belkina, T.V.; Stepanenko, M.A.; Gordienko, V.G.

    1983-09-01

    High-resolution paramagnetic resonance and infrared spectroscopy are used to obtain data on the nature of changes in the hydrogen content of various groups in the soluble fractions of raw pitch and its thermal products during the process of producing binders with an increased softening point of 358-363 K. It was shown that thermal treatment of pitch during the process of raising the softening point leads to enrichment of the pitch structure in aromatic hydrogen and to a reduction of hydrogen in aliphatic bonds. The basis of these conversions is the splitting off of CH₃ groups and the formation of new structures containing CH₂ groups. (11 refs.)

  9. FE-Analysis of Stretch-Blow Moulded Bottles Using an Integrative Process Simulation

    Science.gov (United States)

    Hopmann, C.; Michaeli, W.; Rasche, S.

    2011-05-01

    The two-stage stretch-blow moulding process has been established for the large-scale production of high-quality PET containers with excellent mechanical and optical properties. The total production costs of a bottle are significantly driven by the material costs. Due to this dominant share of the bottle material, the PET industry is interested in reducing the total production costs through optimised material efficiency. However, a reduced material inventory means decreasing wall thicknesses and with it a reduction of the bottle properties (e.g. mechanical properties, barrier properties). Therefore, there is often a trade-off between a minimal bottle weight and adequate properties of the bottle. In order to achieve these objectives, Computer Aided Engineering (CAE) techniques can assist the designer of new stretch-blow moulded containers. Hence, tools such as process simulation and structural analysis have become important in the blow moulding sector. The Institute of Plastics Processing (IKV) at RWTH Aachen University, Germany, has developed an integrative three-dimensional process simulation which models the complete path of a preform through a stretch-blow moulding machine. At first, the reheating of the preform is calculated by a thermal simulation. Afterwards, the inflation of the preform to a bottle is calculated by finite element analysis (FEA). The results of this step are e.g. the local wall thickness distribution and the local biaxial stretch ratios. Not only the material distribution but also the material properties that result from the deformation history of the polymer have a significant influence on the bottle properties. Therefore, a correlation between the material properties and stretch ratios is considered in an integrative simulation approach developed at IKV. The results of the process simulation (wall thickness, stretch ratios) are transferred to a further simulation program and mapped onto the bottle's FE mesh. This approach allows a local

  10. A Selection Approach for Optimized Problem-Solving Process by Grey Relational Utility Model and Multicriteria Decision Analysis

    Directory of Open Access Journals (Sweden)

    Chih-Kun Ke

    2012-01-01

    Full Text Available In business enterprises, especially the manufacturing industry, various problem situations may occur during the production process. A situation denotes an evaluation point used to determine the status of a production process. A problem may occur if there is a discrepancy between the actual situation and the desired one. Thus, a problem-solving process is often initiated to achieve the desired situation. In this process, how to determine which action should be taken to resolve the situation becomes an important issue. Therefore, this work uses a selection approach for an optimized problem-solving process to assist workers in taking a reasonable action. A grey relational utility model and a multicriteria decision analysis are used to determine the optimal selection order of candidate actions. The selection order is presented to the worker as an adaptive recommended solution. The worker chooses a reasonable problem-solving action based on the selection order. This work uses a high-tech company's knowledge base log as the analysis data. Experimental results demonstrate that the proposed selection approach is effective.
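
    A minimal sketch of a grey relational grade computation of the kind that can underlie such a ranking of candidate actions; the decision matrix, criteria directions and distinguishing coefficient are all illustrative assumptions:

```python
import numpy as np

# Rows: candidate actions; columns: evaluation criteria (illustrative).
X = np.array([[0.7, 120.0, 3.0],
              [0.9,  90.0, 4.0],
              [0.6, 150.0, 2.0]])
benefit = np.array([True, False, True])  # False = cost-type criterion
zeta = 0.5                               # distinguishing coefficient

# Normalize each criterion to [0, 1], flipping cost-type criteria.
lo, hi = X.min(axis=0), X.max(axis=0)
norm = np.where(benefit, (X - lo) / (hi - lo), (hi - X) / (hi - lo))

# Deviation from the ideal reference sequence (all ones after normalizing).
delta = np.abs(1.0 - norm)
coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

# Grey relational grade: mean coefficient per action; larger is better.
grade = coeff.mean(axis=1)
print("selection order (best first):", np.argsort(grade)[::-1])
```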

  11. Spatial Stochastic Point Models for Reservoir Characterization

    Energy Technology Data Exchange (ETDEWEB)

    Syversveen, Anne Randi

    1997-12-31

    The main part of this thesis discusses stochastic modelling of geology in petroleum reservoirs. A marked point model is defined for objects against a background in a two-dimensional vertical cross section of the reservoir. The model handles conditioning on observations from more than one well for each object and contains interaction between objects, and the objects have the correct length distribution when penetrated by wells. The model is developed in a Bayesian setting. The model and the simulation algorithm are demonstrated by means of an example with simulated data. The thesis also deals with object recognition in image analysis, in a Bayesian framework, and with a special type of spatial Cox processes called log-Gaussian Cox processes. In these processes, the logarithm of the intensity function is a Gaussian process. The class of log-Gaussian Cox processes provides flexible models for clustering. The distribution of such a process is completely characterized by the intensity and the pair correlation function of the Cox process. 170 refs., 37 figs., 5 tabs.
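
    A minimal sketch of simulating a log-Gaussian Cox process on a grid: the Gaussian field is approximated by smoothing white noise, and cell counts are Poisson given the resulting intensity; all parameter values are illustrative:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(3)
n, cell = 200, 1.0 / 200                 # grid resolution on the unit square

# Approximate a Gaussian random field by smoothing white noise, then
# standardize and set the mean/scale of the log-intensity (assumed values).
field = gaussian_filter(rng.normal(size=(n, n)), sigma=8)
field = (field - field.mean()) / field.std()
log_intensity = 5.0 + 1.2 * field        # mu + sigma * GRF

# Cox process: given the field, cell counts are independent Poisson.
lam = np.exp(log_intensity) * cell**2    # expected points per grid cell
counts = rng.poisson(lam)
print("total simulated points:", counts.sum())
```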

  12. Process analysis of recycled thermoplasts from consumer electronics by laser-induced plasma spectroscopy.

    Science.gov (United States)

    Fink, Herbert; Panne, Ulrich; Niessner, Reinhard

    2002-09-01

    An experimental setup for the direct elemental analysis of recycled thermoplasts from consumer electronics by laser-induced plasma spectroscopy (LIPS, or laser-induced breakdown spectroscopy, LIBS) was realized. The combination of an echelle spectrograph, featuring high resolution with broad spectral coverage, with multivariate methods, such as PLS, PCR, and variable subset selection via a genetic algorithm, resulted in considerable improvements in selectivity and sensitivity for this complex matrix. With normalization to carbon as an internal standard, the limits of detection were in the ppm range. A preliminary pattern recognition study points to the possibility of polymer recognition via the line-rich echelle spectra. Several experiments at an extruder within a recycling plant successfully demonstrated the capability of LIPS for different kinds of routine on-line process analysis.

  13. Evaluation of transport safety analysis processes of radioactive material performed by a regulatory body

    International Nuclear Information System (INIS)

    Mattar, Patricia Morais

    2017-01-01

    Radioactive substances have many beneficial applications, ranging from power generation to uses in medicine, industry and agriculture. As a rule, they are produced in places different from where they are used and therefore need to be transported. In order for transport to take place safely and efficiently, national and international standards must be complied with. This research assesses the safety analysis processes for the transport of radioactive material carried out by the regulatory body in Brazil, from the point of view of their compliance with the International Atomic Energy Agency (IAEA) standards. The self-assessment methodology named SARIS, developed by the IAEA, was used. The following steps were carried out: evaluation of the Diagnosis and Process Mapping; responses to the SARIS Question Set and complementary questions; SWOT analysis; interviews with stakeholders; and evaluation of a TranSAS mission conducted by the IAEA in 2002. Considering only the SARIS questions, the processes are 100% adherent. The deepening of the research, however, led to the development of twenty-two improvement proposals and the identification of nine good practices. The results showed that the safety analysis processes for the transport of radioactive material are being carried out in a structured, safe and reliable way, but also that there is much opportunity for improvement. The formulation of an action plan based on the presented proposals can bring the regulatory body many benefits. This would be an important step towards convening an external evaluation, providing greater reliability and transparency to the regulatory body's processes. (author)

  14. On the estimation of the spherical contact distribution Hs(y) for spatial point processes

    International Nuclear Information System (INIS)

    Doguwa, S.I.

    1990-08-01

    Ripley (1977, Journal of the Royal Statistical Society B, 39, 172-212) proposed an estimator for the spherical contact distribution Hs(y) of a spatial point process observed in a bounded planar region. However, this estimator is not defined for some distances of interest in this bounded region. A new estimator for Hs(y) is proposed for use with a regular grid of sampling locations. This new estimator is defined for all distances of interest. It also appears to have a smaller bias and a smaller mean squared error than the previously suggested alternative. (author). 11 refs, 4 figs, 1 tab
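
    A minimal sketch of an empirical spherical contact distribution computed from a regular grid of sampling locations, as suggested above; the point pattern is simulated and edge effects are ignored for brevity:

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(4)
points = rng.uniform(0, 1, size=(300, 2))   # observed pattern (illustrative)

# Regular grid of sampling locations, as suggested for the new estimator.
g = np.linspace(0.05, 0.95, 30)
grid = np.array([(x, y) for x in g for y in g])

# Distance from each sampling location to the nearest pattern point.
d, _ = cKDTree(points).query(grid)

def Hs(y):
    """Empirical spherical contact distribution at distance y."""
    return np.mean(d <= y)

for y in (0.02, 0.05, 0.1):
    # For a homogeneous Poisson process, Hs(y) = 1 - exp(-lambda*pi*y^2).
    print(y, Hs(y), 1 - np.exp(-300 * np.pi * y**2))
```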

  15. Performance Analysis of Entropy Methods on K Means in Clustering Process

    Science.gov (United States)

    Dicky Syahputra Lubis, Mhd.; Mawengkang, Herman; Suwilo, Saib

    2017-12-01

    K Means is a non-hierarchical data clustering method that partitions data into one or more clusters, so that data with the same characteristics are grouped into the same cluster and data with different characteristics are grouped into other clusters. The purpose of this clustering is to minimize the objective function set for the clustering process, which generally attempts to minimize variation within a cluster and maximize the variation between clusters. However, a main disadvantage of this method is that the number k is often not known in advance. Furthermore, randomly chosen starting points may place two initial centroids too close to each other. Therefore, the entropy method is used to determine the starting points for K Means; this method can be used to assign weights and take a decision over a set of alternatives. Entropy measures the discriminating power of attributes across the data set: under the entropy criterion, attributes with the highest variation receive the highest weight. The entropy method can thus support the K Means process by determining the starting points, which are usually chosen at random, so that the iteration converges faster than the standard K Means process. On the postoperative patient dataset from the UCI Machine Learning Repository, using only 12 records as a worked example, the entropy-based initialization reached the desired end result in only two iterations.
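
    A minimal sketch of the entropy weighting idea referred to above: Shannon entropy is computed per attribute, and low-entropy (high-variation) attributes receive high weight; the data matrix is illustrative:

```python
import numpy as np

# Rows: records; columns: attributes (illustrative data matrix).
X = np.abs(np.random.default_rng(5).normal(size=(12, 4))) + 1e-9

# Normalize each attribute so its column sums to one.
P = X / X.sum(axis=0)

# Shannon entropy per attribute, scaled to [0, 1] by ln(n).
n = X.shape[0]
entropy = -(P * np.log(P)).sum(axis=0) / np.log(n)

# Degree of diversification: attributes with lower entropy vary more
# across records and therefore receive a higher weight.
weights = (1 - entropy) / (1 - entropy).sum()
print("attribute weights:", np.round(weights, 3))
```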

  16. The chaotic points and XRD analysis of Hg-based superconductors

    Energy Technology Data Exchange (ETDEWEB)

    Aslan, Oe [Anatuerkler Educational Consultancy and Trading Company, Orhan Veli Kanik Cad., 6/1, Kavacik 34810 Beykoz, Istanbul (Turkey); Oezdemir, Z Gueven [Physics Department, Yildiz Technical University, Davutpasa Campus, Esenler 34210, Istanbul (Turkey); Keskin, S S [Department of Environmental Eng., University of Marmara, Ziverbey, 34722, Istanbul (Turkey); Onbasli, Ue, E-mail: ozdenaslan@yahoo.co [Physics Department, University of Marmara, Ridvan Pasa Cad. 3. Sok. 85/12 Goztepe, Istanbul (Turkey)

    2009-03-01

    In this article, high-Tc mercury-based cuprate superconductors with different oxygen doping rates have been examined by means of magnetic susceptibility (magnetization) versus temperature data and X-ray diffraction pattern analysis. The under-, optimally and over-oxygen-doped regimes have been identified from the magnetic susceptibility versus temperature data of the superconducting samples by extracting the Meissner critical transition temperature Tc and the paramagnetic Meissner temperature TPME, referred to as the critical quantum chaos points. Moreover, the optimally oxygen-doped samples have been investigated under both a.c. and d.c. magnetic fields. The related a.c. data for virgin (uncut) and cut samples with optimal doping have been obtained under an a.c. magnetic field of 1 Gauss. For the cut sample of rectangular shape, the chaotic points have been found to occur at 122 and 140 K, respectively. The Meissner critical temperature of 140 K is a new world record for high-temperature oxide superconductors under normal atmospheric pressure. Moreover, the crystallographic lattice parameters of the superconducting samples, determined from the XRD patterns, are of crucial importance in calculating the Josephson penetration depth. From the XRD data obtained for the under- and optimally doped samples, the crystal symmetries have been found to be tetragonal.

  17. Analysis methods of stochastic transient electro–magnetic processes in electric traction system

    Directory of Open Access Journals (Sweden)

    T. M. Mishchenko

    2013-04-01

    Full Text Available Purpose. To develop the essence and basic characteristics of calculation methods for transient electromagnetic processes in the elements and devices of non-linear dynamic electric traction systems, taking into account the stochastic changes of voltages and currents in the traction networks of the power supply subsystem and in the power circuits of electric rolling stock. Methodology. Classical methods of non-linear electrical engineering are used in the research, as well as methods of probability theory, especially the theory of stationary ergodic and non-stationary stochastic processes. Findings. Using the above-mentioned methods, an equivalent circuit and a system of nonlinear integro-differential equations for the electromagnetic condition of the double-track inter-substation zone of an alternating current electric traction system are drawn up. The calculations yield the electric traction current distribution in the feeder zones. Originality. The paper is significant from a scientific point of view owing to the methods, which take into account the probabilistic character of changes in traction voltages and currents; the research also develops efficient methods of nonlinear circuit analysis. Practical value. The practical value of the research lies in applying the methods to the analysis of electromagnetic and electric energy processes in the traction power supply system in the case of high-speed train traffic.

  18. Finding a single point of truth

    Energy Technology Data Exchange (ETDEWEB)

    Sokolov, S.; Thijssen, H. [Autodesk Inc, Toronto, ON (Canada); Laslo, D.; Martin, J. [Autodesk Inc., San Rafael, CA (United States)

    2010-07-01

    Electric utilities collect large volumes of data at every level of their business, including SCADA, Smart Metering and Smart Grid initiatives, LIDAR and other 3D imagery surveys. Different types of database systems are used to store the information, rendering data flow within the utility business process extremely complicated. The industry trend has been to endure redundancy of data input and maintenance of multiple copies of the same data across different solution data sets. Efforts have been made to improve the situation with point-to-point interfaces, but with the tools and solutions available today, a single point of truth can be achieved. Consolidated and validated data can be published into a data warehouse at the right point in the process, making the information available to all other enterprise systems and solutions. This paper explained how the single-point-of-truth spatial data warehouse and process automation services can be configured to streamline the flow of data within the utility business process using the initiate-plan-execute-close (IPEC) utility workflow model. The paper first discussed geospatial challenges faced by utilities and then presented the approach and technology aspects. It was concluded that the adoption of systems and solutions that can function with and be controlled by the IPEC workflow can provide significant improvement for utility operations, particularly if those systems are coupled with a spatial data warehouse that reflects a single point of truth. 6 refs., 3 figs.

  19. An integrated study of surface roughness in EDM process using regression analysis and GSO algorithm

    Science.gov (United States)

    Zainal, Nurezayana; Zain, Azlan Mohd; Sharif, Safian; Nuzly Abdull Hamed, Haza; Mohamad Yusuf, Suhaila

    2017-09-01

    The aim of this study is to develop an integrated study of surface roughness (Ra) in the die-sinking electrical discharge machining (EDM) process of Ti-6Al-4V titanium alloy with a positive-polarity copper-tungsten (Cu-W) electrode. Regression analysis and the glowworm swarm optimization (GSO) algorithm were considered for the modelling and optimization process. Pulse on-time (A), pulse off-time (B), peak current (C) and servo voltage (D) were selected as the machining parameters, at various levels. The experiments were conducted based on a two-level full factorial design of experiments (DOE) with added center points. Moreover, mathematical models with linear and two-factor interaction (2FI) effects of the chosen parameters were developed. The validity of the fit and the adequacy of the developed mathematical models were tested by means of analysis of variance (ANOVA) and the F-test. The statistical analysis showed that the 2FI model outperformed the linear model, achieving the lowest value of Ra against the experimental result.
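
    A minimal sketch of fitting a two-factor-interaction (2FI) response model on coded factors by ordinary least squares; the factorial design is generic and the Ra responses are simulated, not the study's data:

```python
import numpy as np
from itertools import product, combinations

rng = np.random.default_rng(8)

# Two-level full factorial in four coded factors A-D, plus two center points.
runs = np.array(list(product([-1.0, 1.0], repeat=4)))
runs = np.vstack([runs, np.zeros((2, 4))])

# Illustrative Ra responses with a main effect of A and an A*C interaction.
ra = (3.0 + 0.4 * runs[:, 0] - 0.3 * runs[:, 2]
      + 0.25 * runs[:, 0] * runs[:, 2] + rng.normal(0, 0.05, len(runs)))

# Design matrix: intercept, main effects, all two-factor interactions (2FI).
cols = [np.ones(len(runs))] + [runs[:, i] for i in range(4)]
cols += [runs[:, i] * runs[:, j] for i, j in combinations(range(4), 2)]
D = np.column_stack(cols)

# Least-squares fit of the 2FI model.
beta, *_ = np.linalg.lstsq(D, ra, rcond=None)
print("fitted coefficients:", np.round(beta, 3))
```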

  20. Hazard analysis of critical control points assessment as a tool to respond to emerging infectious disease outbreaks.

    Directory of Open Access Journals (Sweden)

    Kelly L Edmunds

    Full Text Available Highly pathogenic avian influenza virus (HPAI) strain H5N1 has had direct and indirect economic impacts arising from direct mortality and control programmes in over 50 countries reporting poultry outbreaks. HPAI H5N1 is now reported as the most widespread and expensive zoonotic disease recorded and continues to pose a global health threat. The aim of this research was to assess the potential of utilising Hazard Analysis of Critical Control Points (HACCP) assessments in providing a framework for a rapid response to emerging infectious disease outbreaks. This novel approach applies a scientific process, widely used in food production systems, to assess risks related to a specific emerging health threat within a known zoonotic disease hotspot. We conducted a HACCP assessment for HPAI viruses within Vietnam's domestic poultry trade and relate our findings to the existing literature. Our HACCP assessment identified poultry flock isolation, transportation, slaughter, preparation and consumption as critical control points for Vietnam's domestic poultry trade. Introduction of the preventative measures highlighted through this HACCP evaluation would reduce the risks posed by HPAI viruses and the pressure on the national economy. We conclude that this HACCP assessment provides compelling evidence for the future potential that HACCP analyses could play in initiating a rapid response to emerging infectious diseases.

  1. Sample to answer visualization pipeline for low-cost point-of-care blood cell counting

    Science.gov (United States)

    Smith, Suzanne; Naidoo, Thegaran; Davies, Emlyn; Fourie, Louis; Nxumalo, Zandile; Swart, Hein; Marais, Philip; Land, Kevin; Roux, Pieter

    2015-03-01

    We present a sample-to-answer visualization pipeline for point-of-care blood cell counting applications. Effective and low-cost point-of-care medical diagnostic tests provide developing countries and rural communities with accessible healthcare solutions [1], and can be particularly beneficial for blood cell count tests, which are often the starting point in the process of diagnosing a patient [2]. The initial focus of this work is on total white and red blood cell counts, using a microfluidic cartridge [3] for sample processing. Analysis of the processed samples has been implemented by means of two main optical visualization systems developed in-house: 1) a fluidic operation analysis system using high-speed video data to determine volumes, mixing efficiency and flow rates, and 2) a microscopy analysis system to investigate the homogeneity and concentration of blood cells. Fluidic parameters were derived from the optical flow [4] as well as from color-based segmentation of the different fluids using a hue-saturation-value (HSV) color space. Cell count estimates were obtained using automated microscopy analysis and were compared to a widely accepted manual method for cell counting using a hemocytometer [5]. The results using the first-iteration microfluidic device [3] showed that the simplest, and thus lowest-cost, approach to microfluidic component implementation was not adequate compared to techniques based on manual cell counting principles. An improved microfluidic design has been developed to incorporate enhanced mixing and metering components, which together with this work provides the foundation on which to successfully implement automated, rapid and low-cost blood cell counting tests.
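
    A minimal sketch of the HSV color-based segmentation step mentioned above, using OpenCV; the file name and hue bounds are assumptions:

```python
import cv2
import numpy as np

# Frame from the high-speed video of the microfluidic cartridge (assumed file).
frame = cv2.imread("cartridge_frame.png")

# Convert to hue-saturation-value space; hue isolates fluid color largely
# independently of illumination intensity.
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

# Segment one fluid by its hue range (bounds are illustrative; OpenCV hue
# runs 0-179). Repeat with other ranges for each fluid in the cartridge.
lower = np.array([100, 60, 40])          # bluish fluid, assumed
upper = np.array([130, 255, 255])
mask = cv2.inRange(hsv, lower, upper)

# The fluid volume is proportional to segmented area times channel depth.
print("segmented pixels:", int(cv2.countNonZero(mask)))
```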

  2. Root cause analysis with enriched process logs

    NARCIS (Netherlands)

    Suriadi, S.; Ouyang, C.; Aalst, van der W.M.P.; Hofstede, ter A.H.M.; La Rosa, M.; Soffer, P.

    2013-01-01

    In the field of process mining, the use of event logs for the purpose of root cause analysis is increasingly studied. In such an analysis, the availability of attributes/features that may explain the root cause of some phenomena is crucial. Currently, the process of obtaining these attributes from

  3. Change-point analysis data of neonatal diffusion tensor MRI in preterm and term-born infants

    Directory of Open Access Journals (Sweden)

    Dan Wu

    2017-06-01

    Full Text Available The data presented in this article are related to the research article entitled “Mapping the Critical Gestational Age at Birth that Alters Brain Development in Preterm-born Infants using Multi-Modal MRI” (Wu et al., 2017) [1]. Brain immaturity at birth poses critical neurological risks in preterm-born infants. We used a novel change-point model to analyze the critical gestational age at birth (GAB) that could affect postnatal development, based on diffusion tensor MRI (DTI) acquired from 43 preterm and 43 term-born infants in 126 brain regions. In the corresponding research article, we presented change-point analysis of fractional anisotropy (FA) and mean diffusivity (MD) measurements in these infants. In this article, we offer the relative changes of the axial and radial diffusivities (AD and RD) in relation to the change of FA and the FA-based change-points, and we also provide the AD- and RD-based change-point results.
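
    A minimal sketch of a change-point fit of the kind described: piecewise-linear regression of a regional DTI metric against gestational age at birth, with the breakpoint chosen by least squares; all data below are simulated for illustration:

```python
import numpy as np

rng = np.random.default_rng(6)
ga = rng.uniform(26, 41, size=86)              # gestational age at birth (wk)
true_cp = 32.0                                  # simulated change-point
fa = 0.35 + 0.02 * np.minimum(ga - true_cp, 0) + rng.normal(0, 0.01, ga.size)

def sse_for_cp(cp):
    # Design matrix with a slope change at cp: intercept, ga, hinge term.
    X = np.column_stack([np.ones_like(ga), ga, np.maximum(ga - cp, 0)])
    beta, res, *_ = np.linalg.lstsq(X, fa, rcond=None)
    return res[0] if res.size else np.sum((fa - X @ beta) ** 2)

# Grid search over candidate change-points.
cands = np.linspace(28, 39, 111)
best = cands[np.argmin([sse_for_cp(c) for c in cands])]
print("estimated change-point (weeks):", round(best, 2))
```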

  4. Mesh Processing in Medical Image Analysis

    DEFF Research Database (Denmark)

    The following topics are dealt with: mesh processing; medical image analysis; interactive freeform modeling; statistical shape analysis; clinical CT images; statistical surface recovery; automated segmentation; cerebral aneurysms; and real-time particle-based representation....

  5. Accuracy of Point-of-care Ultrasonography for Diagnosing Acute Appendicitis: A Systematic Review and Meta-analysis.

    Science.gov (United States)

    Matthew Fields, J; Davis, Joshua; Alsup, Carl; Bates, Amanda; Au, Arthur; Adhikari, Srikar; Farrell, Isaac

    2017-09-01

    The use of ultrasonography (US) to diagnose appendicitis is well established. More recently, point-of-care ultrasonography (POCUS) has also been studied for the diagnosis of appendicitis and may prove a valuable diagnostic tool. The purpose of this study was, through systematic review and meta-analysis, to identify the test characteristics of POCUS, specifically US performed by a nonradiologist physician, in accurately diagnosing acute appendicitis in patients of any age. We conducted a thorough and systematic literature search of English-language articles published on point-of-care, physician-performed transabdominal US used for the diagnosis of acute appendicitis from 1980 to May 2015 using OVID MEDLINE In-Process & Other Non-indexed Citations and Scopus. Studies were selected and subsequently independently abstracted by two trained reviewers. A random-effects pooled analysis was used to construct a hierarchical summary receiver operator characteristic curve, and a meta-regression was performed. The quality of studies was assessed using the QUADAS-2 tool. Our search yielded 5,792 unique studies, and we included 21 of these in our final review. The prevalence of disease in this study was 29.8% (range = 6.4%-75.4%). The sensitivity and specificity of POCUS in diagnosing appendicitis were 91% (95% confidence interval [CI] = 83%-96%) and 97% (95% CI = 91%-99%), respectively. The positive and negative predictive values were 91% and 94%, respectively. Studies performed by emergency physicians had slightly lower test characteristics (sensitivity = 80%, specificity = 92%). There was significant heterogeneity between studies (I² = 99%, 95% CI = 99%-100%) and the quality of the reported studies was moderate, mostly due to unclear reporting of the blinding of physicians and of the timing of scanning and patient enrollment. Several of the studies were performed by a single operator, and the education and training of the operators were variably reported. Point-of-care US has relatively

  6. Preliminary hazards analysis -- vitrification process

    International Nuclear Information System (INIS)

    Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P.

    1994-06-01

    This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to performance of the Final Safety Analysis Report performed during the facility's construction and testing. It should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry, et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment

  8. Risk analysis: opening the process

    International Nuclear Information System (INIS)

    Hubert, Ph.; Mays, C.

    1998-01-01

    This conference on risk analysis took place in Paris, 11-14 October 1999. Over 200 papers were presented in the seven following sessions: perception; environment and health; pervasive risks; objects and products; personal and collective involvement; assessment and valuation; management. A rational approach to risk analysis has been developed in the three last decades. Techniques for risk assessment have been thoroughly enhanced, risk management approaches have been developed, decision making processes have been clarified, and the social dimensions of risk perception and management have been investigated. Nevertheless this construction is being challenged by recent events which reveal how deficits in stakeholder involvement, openness and democratic procedures can undermine risk management actions. Indeed, most components of the overall risk analysis process may be radically called into question. Food safety has lately been a prominent issue, but new debates appear, or old debates are revisited, in the domains of public health, consumer product safety, waste management, environmental risks, nuclear installations, automobile safety and pollution. To meet the growing pressures for efficiency, openness, accountability, and multi-partner communication in risk analysis, institutional changes are underway in many European countries. However, the need for stakeholders to develop better insight into the process may lead to an evolution of all the components of risk analysis, even in its most 'technical' steps. For stakeholders of different professional backgrounds, political projects, and responsibilities, risk identification procedures must be rendered understandable, quantitative risk assessment must be intelligible and accommodated in action proposals, ranging from countermeasures to educational programs to insurance mechanisms. Management formats must be open to local and political input and other types of operational feedback. (authors)

  9. Cognitive Task Analysis of the Battalion Level Visualization Process

    National Research Council Canada - National Science Library

    Leedom, Dennis K; McElroy, William; Shadrick, Scott B; Lickteig, Carl; Pokorny, Robet A; Haynes, Jacqueline A; Bell, James

    2007-01-01

    ... position or as a battalion Operations Officer or Executive Officer. Bases on findings from the cognitive task analysis, 11 skill areas were identified as potential focal points for future training development...

  10. Nonlinear analysis and control of a continuous fermentation process

    DEFF Research Database (Denmark)

    Szederkényi, G.; Kristensen, Niels Rode; Hangos, K.M

    2002-01-01

    Different types of nonlinear controllers are designed and compared for a simple continuous bioreactor operating near optimal productivity. This operating point is located close to a fold bifurcation point. Nonlinear analysis of stability, controllability and zero dynamics is used to investigate o... are recommended for the simple fermenter. Passivity-based controllers have been found to be globally stable and not very sensitive to uncertainties in the reaction rate or the controller parameters, but they require full nonlinear state feedback.

  11. Analysis of parameters for technological equipment of parallel kinematics based on rods of variable length for processing accuracy assurance

    Science.gov (United States)

    Koltsov, A. G.; Shamutdinov, A. H.; Blokhin, D. A.; Krivonos, E. V.

    2018-01-01

    A new classification of parallel kinematics mechanisms based on a symmetry coefficient, which is proportional to the mechanism stiffness and to the accuracy of processing products using the technological equipment under study, is proposed. A new version of the Stewart platform with a high symmetry coefficient is presented for analysis. The workspace of the mechanism under study is described; this space is a complex solid figure whose end points are reached by the center of the mobile platform, which moves parallel to the base plate. Parameters affecting the processing accuracy, namely the static and dynamic stiffness and the natural vibration frequencies, are determined. A capability assessment of the mechanism operating under various loads, taking into account resonance phenomena at different points of the workspace, was conducted. The study showed that the stiffness, and therefore the processing accuracy, of the above-mentioned mechanisms are comparable with the stiffness and accuracy of medium-sized series-produced machines.

  12. AUTOMATED VOXEL MODEL FROM POINT CLOUDS FOR STRUCTURAL ANALYSIS OF CULTURAL HERITAGE

    Directory of Open Access Journals (Sweden)

    G. Bitelli

    2016-06-01

    Full Text Available In the context of cultural heritage, an accurate and comprehensive digital survey of a historical building is today essential in order to measure its geometry in detail for documentation or restoration purposes, for supporting special studies regarding materials and constructive characteristics, and finally for structural analysis. Some proven geomatic techniques, such as photogrammetry and terrestrial laser scanning, are increasingly used to survey buildings of different complexity and dimensions; one typical product is in the form of point clouds. We developed a semi-automatic procedure to convert point clouds, acquired by laser scanning or digital photogrammetry, into a filled volume model of the whole structure. The filled volume model, in a voxel format, can be useful for further analysis and also for the generation of a Finite Element Model (FEM) of the surveyed building. In this paper a new approach is presented with the aim of decreasing operator intervention in the workflow and obtaining a better description of the structure. In order to achieve this result, a voxel model with variable resolution is produced. Different parameters are compared and the different steps of the procedure are tested and validated in the case study of the North tower of the San Felice sul Panaro Fortress, a monumental historical building located in San Felice sul Panaro (Modena, Italy) that was hit by an earthquake in 2012.
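
    A minimal sketch of the point-cloud-to-voxel conversion at the start of such a pipeline, snapping points to a regular grid and marking occupied cells; filling interior voxels and the variable resolution described above are further steps, and the voxel size and input points are illustrative:

```python
import numpy as np

def voxelize(xyz, voxel_size=0.1):
    """Convert an (n, 3) point cloud to a boolean occupancy grid."""
    origin = xyz.min(axis=0)
    idx = np.floor((xyz - origin) / voxel_size).astype(int)
    shape = idx.max(axis=0) + 1
    grid = np.zeros(shape, dtype=bool)
    grid[idx[:, 0], idx[:, 1], idx[:, 2]] = True   # occupied voxels
    return grid, origin

# Illustrative usage with random points standing in for a tower survey.
pts = np.random.default_rng(7).uniform(0, 5, size=(100_000, 3))
grid, origin = voxelize(pts, voxel_size=0.25)
print("grid shape:", grid.shape, "filled fraction:", round(grid.mean(), 3))
```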

  13. Portable Dew Point Mass Spectrometry System for Real-Time Gas and Moisture Analysis

    Science.gov (United States)

    Arkin, C.; Gillespie, Stacey; Ratzel, Christopher

    2010-01-01

    A portable instrument incorporates both mass spectrometry and dew point measurement to provide real-time, quantitative gas measurements of helium, nitrogen, oxygen, argon, and carbon dioxide, along with real-time, quantitative moisture analysis. The Portable Dew Point Mass Spectrometry (PDP-MS) system comprises a single quadrupole mass spectrometer and a high vacuum system consisting of a turbopump and a diaphragm-backing pump. A capacitive membrane dew point sensor was placed upstream of the MS, but still within the pressure-flow control pneumatic region. Pressure-flow control was achieved with an upstream precision metering valve, a capacitance diaphragm gauge, and a downstream mass flow controller. User configurable LabVIEW software was developed to provide real-time concentration data for the MS, dew point monitor, and sample delivery system pressure control, pressure and flow monitoring, and recording. The system has been designed to include in situ, NIST-traceable calibration. Certain sample tubing retains sufficient water that even if the sample is dry, the sample tube will desorb water to an amount resulting in moisture concentration errors up to 500 ppm for as long as 10 minutes. It was determined that Bev-A-Line IV was the best sample line to use. As a result of this issue, it is prudent to add a high-level humidity sensor to PDP-MS so such events can be prevented in the future.

  14. Conducting Qualitative Data Analysis: Qualitative Data Analysis as a Metaphoric Process

    Science.gov (United States)

    Chenail, Ronald J.

    2012-01-01

    In the second of a series of "how-to" essays on conducting qualitative data analysis, Ron Chenail argues the process can best be understood as a metaphoric process. From this orientation he suggests researchers follow Kenneth Burke's notion of metaphor and see qualitative data analysis as the analyst systematically considering the "this-ness" of…

  15. Investigation of the s-process branch-point nucleus {sup 86}Rb at HIγS

    Energy Technology Data Exchange (ETDEWEB)

    Erbacher, Philipp; Glorius, Jan; Reifarth, Rene; Sonnabend, Kerstin [Goethe Universitaet Frankfurt am Main (Germany); Isaak, Johann; Loeher, Bastian; Savran, Deniz [GSI Helmholzzentrum fuer Schwerionenforschung (Germany); Tornow, Werner [Duke University (United States)

    2016-07-01

    The branch-point nucleus {sup 86}Rb determines the isotopic abundance ratio {sup 86}Sr/{sup 87}Sr in s-process nucleosynthesis. Thus, stellar parameters such as temperature and neutron density and their evolution in time as simulated by modern s-process network calculations can be constrained by a comparison of the calculated isotopic ratio with the one observed in SiC meteoritic grains. To this end, the radiative neutron-capture cross section of the unstable isotope {sup 86}Rb has to be known with sufficient accuracy. Since the short half-life of {sup 86}Rb prohibits the direct measurement, the nuclear-physics input to a calculation of the cross section has to be measured. For this reason, the γ-ray strength function of {sup 87}Rb was measured using the γ{sup 3} setup at the High Intensity γ-ray Source facility at TUNL in Durham, USA. First experimental results are presented.

  16. Design and fabrication of a diffractive beam splitter for dual-wavelength and concurrent irradiation of process points.

    Science.gov (United States)

    Amako, Jun; Shinozaki, Yu

    2016-07-11

    We report on a dual-wavelength diffractive beam splitter designed for use in parallel laser processing. This novel optical element generates two beam arrays of different wavelengths and allows their overlap at the process points on a workpiece. To design the deep surface-relief profile of a splitter using a simulated annealing algorithm, we introduce a heuristic but practical scheme to determine the maximum depth and the number of quantization levels. The designed corrugations were fabricated in a photoresist by maskless grayscale exposure using a high-resolution spatial light modulator. We characterized the photoresist splitter, thereby validating the proposed beam-splitting concept.
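
    The design step, a quantized surface-relief profile optimized by simulated annealing, can be sketched for the simpler single-wavelength case. Everything below (period sampling, number of quantization levels, target orders, cost weights, cooling schedule) is an assumed toy setup, not the authors' parameters; the far field of a phase-only periodic element is modeled as the DFT of exp(iφ).

        import numpy as np

        rng = np.random.default_rng(0)
        N, levels = 128, 8                 # samples per period, relief quantization levels
        orders = [-3, -1, 1, 3]            # diffraction orders that should share the light

        def order_powers(phase):
            # far field of a phase-only periodic element: DFT of exp(i*phase)
            amps = np.fft.fft(np.exp(1j * phase)) / N
            return np.abs(amps[np.mod(orders, N)]) ** 2

        def cost(phase):
            p = order_powers(phase)
            # reward total efficiency, penalize non-uniformity among target orders
            return -p.sum() + 5.0 * (p.max() - p.min())

        phase = 2 * np.pi * rng.integers(0, levels, N) / levels
        cur, T = cost(phase), 1.0
        for _ in range(20000):
            i = rng.integers(N)
            old = phase[i]
            phase[i] = 2 * np.pi * rng.integers(levels) / levels   # propose one-pixel change
            new = cost(phase)
            if new < cur or rng.random() < np.exp((cur - new) / T):
                cur = new                  # accept the move
            else:
                phase[i] = old             # reject: restore the pixel
            T *= 0.9997                    # geometric cooling schedule

        print("efficiencies in target orders:", np.round(order_powers(phase), 3))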

  17. Recommendations for dealing with waste contaminated with Ebola virus: a Hazard Analysis of Critical Control Points approach.

    Science.gov (United States)

    Edmunds, Kelly L; Elrahman, Samira Abd; Bell, Diana J; Brainard, Julii; Dervisevic, Samir; Fedha, Tsimbiri P; Few, Roger; Howard, Guy; Lake, Iain; Maes, Peter; Matofari, Joseph; Minnigh, Harvey; Mohamedani, Ahmed A; Montgomery, Maggie; Morter, Sarah; Muchiri, Edward; Mudau, Lutendo S; Mutua, Benedict M; Ndambuki, Julius M; Pond, Katherine; Sobsey, Mark D; van der Es, Mike; Zeitoun, Mark; Hunter, Paul R

    2016-06-01

    To assess, within communities experiencing Ebola virus outbreaks, the risks associated with the disposal of human waste and to generate recommendations for mitigating such risks. A team with expertise in the Hazard Analysis of Critical Control Points framework identified waste products from the care of individuals with Ebola virus disease and constructed, tested and confirmed flow diagrams showing the creation of such products. After listing potential hazards associated with each step in each flow diagram, the team conducted a hazard analysis, determined critical control points and made recommendations to mitigate the transmission risks at each control point. The collection, transportation, cleaning and shared use of blood-soiled fomites and the shared use of latrines contaminated with blood or bloodied faeces appeared to be associated with particularly high levels of risk of Ebola virus transmission. More moderate levels of risk were associated with the collection and transportation of material contaminated with bodily fluids other than blood, shared use of latrines soiled with such fluids, the cleaning and shared use of fomites soiled with such fluids, and the contamination of the environment during the collection and transportation of blood-contaminated waste. The risk of the waste-related transmission of Ebola virus could be reduced by the use of full personal protective equipment, appropriate hand hygiene and an appropriate disinfectant after careful cleaning. Use of the Hazard Analysis of Critical Control Points framework could facilitate rapid responses to outbreaks of emerging infectious disease.

  18. Spectral analysis of growing graphs a quantum probability point of view

    CERN Document Server

    Obata, Nobuaki

    2017-01-01

    This book is designed as a concise introduction to the recent achievements on spectral analysis of graphs or networks from the point of view of quantum (or non-commutative) probability theory. The main topics are spectral distributions of the adjacency matrices of finite or infinite graphs and their limit distributions for growing graphs. The main vehicle is quantum probability, an algebraic extension of the traditional probability theory, which provides a new framework for the analysis of adjacency matrices revealing their non-commutative nature. For example, the method of quantum decomposition makes it possible to study spectral distributions by means of interacting Fock spaces or equivalently by orthogonal polynomials. Various concepts of independence in quantum probability and corresponding central limit theorems are used for the asymptotic study of spectral distributions for product graphs. This book is written for researchers, teachers, and students interested in graph spectra, their (asymptotic) spectr...
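
    As a minimal illustration of a "spectral distribution of a growing graph" (an example of ours, not taken from the book), the sketch below computes the adjacency eigenvalues of the path graph P_n, for which the closed form 2cos(kπ/(n+1)) is known, and whose empirical spectral distribution approaches the arcsine law on [-2, 2] as n grows.

        import numpy as np

        def path_adjacency(n):
            """Adjacency matrix of the path graph on n vertices."""
            A = np.zeros((n, n))
            off = np.arange(n - 1)
            A[off, off + 1] = A[off + 1, off] = 1
            return A

        for n in (10, 100, 1000):
            eig = np.linalg.eigvalsh(path_adjacency(n))
            exact = 2 * np.cos(np.arange(1, n + 1) * np.pi / (n + 1))
            # numerical spectrum matches the closed form at every size
            print(n, np.allclose(np.sort(eig), np.sort(exact)))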

  19. IRB Process Improvements: A Machine Learning Analysis.

    Science.gov (United States)

    Shoenbill, Kimberly; Song, Yiqiang; Cobb, Nichelle L; Drezner, Marc K; Mendonca, Eneida A

    2017-06-01

    Clinical research involving humans is critically important, but it is a lengthy and expensive process. Most studies require institutional review board (IRB) approval. Our objective is to identify predictors of delays or accelerations in the IRB review process and apply this knowledge to inform process change in an effort to improve IRB efficiency, transparency, consistency and communication. We analyzed timelines of protocol submissions to determine protocol or IRB characteristics associated with different processing times. Our evaluation included single variable analysis to identify significant predictors of IRB processing time and machine learning methods to predict processing times through the IRB review system. Based on initial identified predictors, changes to IRB workflow and staffing procedures were instituted and we repeated our analysis. Our analysis identified several predictors of delays in the IRB review process including type of IRB review to be conducted, whether a protocol falls under Veteran's Administration purview and specific staff in charge of a protocol's review. We have identified several predictors of delays in IRB protocol review processing times using statistical and machine learning methods. Application of this knowledge to process improvement efforts in two IRBs has led to increased efficiency in protocol review. The workflow and system enhancements that are being made support our four-part goal of improving IRB efficiency, consistency, transparency, and communication.
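
    As an illustration of the kind of analysis described (not the authors' code or data), the sketch below fits a gradient-boosted regression to synthetic protocol records and ranks the predictors of review time. All feature names and values are fabricated placeholders loosely mirroring the predictors named in the abstract.

        import numpy as np
        from sklearn.ensemble import GradientBoostingRegressor

        rng = np.random.default_rng(0)
        n = 500
        # hypothetical features: review type (0=exempt, 1=expedited, 2=full board),
        # VA-purview flag, anonymized staff-assignment id
        X = np.column_stack([
            rng.integers(0, 3, n),
            rng.integers(0, 2, n),
            rng.integers(0, 5, n),
        ])
        # synthetic processing time in days, loosely tied to the features
        days = 20 + 25 * X[:, 0] + 15 * X[:, 1] + 3 * X[:, 2] + rng.normal(0, 5, n)

        model = GradientBoostingRegressor().fit(X, days)
        for name, imp in zip(["review_type", "va_purview", "staff_id"],
                             model.feature_importances_):
            print(f"{name}: {imp:.2f}")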

  20. Some probabilistic properties of fractional point processes

    KAUST Repository

    Garra, Roberto; Orsingher, Enzo; Scavino, Marco

    2017-01-01

    $P\{T_k^{(\alpha)} < \infty\}$ are explicitly obtained and analyzed. The processes $N^f(t)$ are time-changed Poisson processes $N(H^f(t))$ with subordinators $H^f(t)$, and here we study $N(\sum_{j=1}^{n} H^{f_j}(t))$ and obtain probabilistic features...

  1. CPAS Preflight Drop Test Analysis Process

    Science.gov (United States)

    Englert, Megan E.; Bledsoe, Kristin J.; Romero, Leah M.

    2015-01-01

    Throughout the Capsule Parachute Assembly System (CPAS) drop test program, the CPAS Analysis Team has developed a simulation and analysis process to support drop test planning and execution. This process includes multiple phases focused on developing test simulations and communicating results to all groups involved in the drop test. CPAS Engineering Development Unit (EDU) series drop test planning begins with the development of a basic operational concept for each test. Trajectory simulation tools include the Flight Analysis and Simulation Tool (FAST) for single bodies, and the Automatic Dynamic Analysis of Mechanical Systems (ADAMS) simulation for the mated vehicle. Results are communicated to the team at the Test Configuration Review (TCR) and Test Readiness Review (TRR), as well as at Analysis Integrated Product Team (IPT) meetings in earlier and intermediate phases of the pre-test planning. The ability to plan and communicate efficiently with rapidly changing objectives and tight schedule constraints is a necessity for safe and successful drop tests.

  2. The (n, $\\gamma$) reaction in the s-process branching point $^{59}$Ni

    CERN Multimedia

    We propose to measure the $^{59}$Ni(n,$\alpha$)$^{56}$Fe cross section at the neutron time-of-flight (n_TOF) facility with a dedicated chemical vapor deposition (CVD) diamond detector. The (n,$\alpha$) reaction in radioactive $^{59}$Ni is of relevance in nuclear astrophysics, as it can be seen as a first branching point in the astrophysical s-process. Its relevance in nuclear technology is especially related to material embrittlement in stainless steel. There is a strong discrepancy between the available experimental data and the evaluated nuclear data files for this isotope. The aim of the measurement is to clarify this disagreement. The clear energy separation of the reaction products of neutron-induced reactions in $^{59}$Ni makes it a very suitable candidate for a first cross-section measurement with the CVD diamond detector, which should serve in the future for similar measurements at n_TOF.

  3. Application of wavelet transform in seismic signal processing

    International Nuclear Information System (INIS)

    Ghasemi, M. R.; Mohammadzadeh, A.; Salajeghe, E.

    2005-01-01

    Wavelet transform is a new tool for signal analysis which can perform simultaneous time and frequency representations of a signal. Under Multi-Resolution Analysis, one can quickly determine details of signals and their properties using Fast Wavelet Transform algorithms. In this paper, for a better physical understanding of a signal and of the basic algorithms, Multi-Resolution Analysis together with wavelet transforms in the form of Digital Signal Processing will be discussed. For Seismic Signal Processing (SSP), sets of orthonormal Daubechies wavelets are suggested. When dealing with the application of wavelets in SSP, two tasks are of interest: denoising the signal and compressing the data contained in the signal, both of which are important in seismic data processing. Using these techniques, the EL-Centro and Nagan signals were remodeled with 25% of the total points, with satisfactory results and an acceptable error drift. Thus, the 1559 and 2500 points of the EL-Centro and Nagan seismic curves were reduced to 389 and 625 points, respectively, with a very reasonable error drift, details of which are recorded in the paper. Finally, future progress in signal processing based on wavelet theory is outlined.
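
    The compression experiment described above can be reproduced in outline with PyWavelets (an assumed stand-in; the paper names no software package): decompose a record with an orthonormal Daubechies wavelet, keep roughly the largest 25% of coefficients, and reconstruct.

        import numpy as np
        import pywt

        rng = np.random.default_rng(0)
        t = np.linspace(0, 10, 2000)
        # toy accelerogram: decaying oscillation plus noise (not the El Centro record)
        accel = np.sin(2 * np.pi * 1.5 * t) * np.exp(-0.3 * t) + 0.05 * rng.normal(size=t.size)

        coeffs = pywt.wavedec(accel, "db4", level=5)       # multi-resolution decomposition
        flat = np.concatenate(coeffs)
        thresh = np.quantile(np.abs(flat), 0.75)           # keep ~25% of coefficients
        kept = [pywt.threshold(c, thresh, mode="hard") for c in coeffs]
        recon = pywt.waverec(kept, "db4")[: accel.size]

        err = np.linalg.norm(recon - accel) / np.linalg.norm(accel)
        print(f"relative reconstruction error: {err:.3f}")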

  4. Nuclear structure and weak rates of heavy waiting point nuclei under rp-process conditions

    Science.gov (United States)

    Nabi, Jameel-Un; Böyükata, Mahmut

    2017-01-01

    The structure and the weak interaction mediated rates of the heavy waiting point (WP) nuclei 80Zr, 84Mo, 88Ru, 92Pd and 96Cd along the N = Z line were studied within the interacting boson model-1 (IBM-1) and the proton-neutron quasi-particle random phase approximation (pn-QRPA). The energy levels of the N = Z WP nuclei were calculated by fitting the essential parameters of the IBM-1 Hamiltonian, and their geometric shapes were predicted by plotting potential energy surfaces (PESs). Half-lives, continuum electron capture rates, positron decay rates, electron capture cross sections of WP nuclei, energy rates of β-delayed protons and their emission probabilities were later calculated using the pn-QRPA. The calculated Gamow-Teller strength distributions were compared with previous calculations. We present positron decay and continuum electron capture rates on these WP nuclei under rp-process conditions using the same model. For the rp-process conditions, the calculated total weak rates are twice the Skyrme HF+BCS+QRPA rates for 80Zr. For the remaining nuclei the two calculations compare well. The electron capture rates are significant and compete well with the corresponding positron decay rates under rp-process conditions. The findings of the present study support the conclusion that electron capture rates form an integral part of the weak rates under rp-process conditions and play an important role in nuclear model calculations.

  5. Fuzzy Risk Analysis for a Production System Based on the Nagel Point of a Triangle

    Directory of Open Access Journals (Sweden)

    Handan Akyar

    2016-01-01

    Full Text Available Ordering and ranking fuzzy numbers and their comparisons play a significant role in decision-making problems such as social and economic systems, forecasting, optimization, and risk analysis problems. In this paper, a new method for ordering triangular fuzzy numbers using the Nagel point of a triangle is presented. With the aid of the proposed method, reasonable properties of ordering fuzzy numbers are verified. Certain comparative examples are given to illustrate the advantages of the new method. Many papers have been devoted to studies on fuzzy ranking methods, but some of these studies have certain shortcomings. The proposed method overcomes the drawbacks of the existing methods in the literature. The suggested method can order triangular fuzzy numbers as well as crisp numbers and fuzzy numbers with the same centroid point. An application to the fuzzy risk analysis problem is given, based on the suggested ordering approach.
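
    To make the geometric idea concrete: a triangular fuzzy number (a1, a2, a3) can be identified with the triangle having vertices (a1, 0), (a2, 1), (a3, 0), and the Nagel point has barycentric coordinates (s - a : s - b : s - c), where a, b, c are the side lengths opposite the vertices and s is the semiperimeter. The sketch below computes it; treating the Nagel point's coordinates as a ranking score is an illustrative reading, not the paper's full procedure (which also handles crisp numbers, i.e. degenerate triangles).

        import numpy as np

        def nagel_point(A, B, C):
            """Nagel point of triangle ABC, barycentric coords (s-a : s-b : s-c)."""
            A, B, C = map(np.asarray, (A, B, C))
            a = np.linalg.norm(B - C)   # side opposite A
            b = np.linalg.norm(C - A)   # side opposite B
            c = np.linalg.norm(A - B)   # side opposite C
            s = (a + b + c) / 2
            return ((s - a) * A + (s - b) * B + (s - c) * C) / s

        def fuzzy_nagel(tfn):
            """Nagel point of the triangle induced by a triangular fuzzy number."""
            a1, a2, a3 = tfn
            return nagel_point((a1, 0.0), (a2, 1.0), (a3, 0.0))

        for tfn in [(0.1, 0.3, 0.5), (0.2, 0.3, 0.4)]:   # same mode, different spreads
            print(tfn, "->", fuzzy_nagel(tfn))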

  6. The solution of the neutron point kinetics equation with stochastic extension: an analysis of two moments

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Milena Wollmann da; Vilhena, Marco Tullio M.B.; Bodmann, Bardo Ernst J.; Vasques, Richard, E-mail: milena.wollmann@ufrgs.br, E-mail: vilhena@mat.ufrgs.br, E-mail: bardobodmann@ufrgs.br, E-mail: richard.vasques@fulbrightmail.org [Universidade Federal do Rio Grande do Sul (UFRGS), Porto Alegre, RS (Brazil). Programa de Pos-Graduacao em Engenharia Mecanica

    2015-07-01

    The neutron point kinetics equation, which models the time-dependent behavior of nuclear reactors, is often used to understand the dynamics of nuclear reactor operations. It consists of a system of coupled differential equations that models the interaction between (i) the neutron population; and (ii) the concentration of the delayed neutron precursors, which are radioactive isotopes formed in the fission process that decay through neutron emission. These equations are deterministic in nature, and therefore can provide only average values of the modeled populations. However, the actual dynamical process is stochastic: the neutron density and the delayed neutron precursor concentrations vary randomly with time. To address this stochastic behavior, Hayes and Allen have generalized the standard deterministic point kinetics equation. They derived a system of stochastic differential equations that can accurately model the random behavior of the neutron density and the precursor concentrations in a point reactor. Due to the stiffness of these equations, this system was numerically implemented using a stochastic piecewise constant approximation method (Stochastic PCA). Here, we present a study of the influence of stochastic fluctuations on the results of the neutron point kinetics equation. We reproduce the stochastic formulation introduced by Hayes and Allen and compute Monte Carlo numerical results for examples with constant and time-dependent reactivity, comparing these results with stochastic and deterministic methods found in the literature. Moreover, we introduce a modified version of the stochastic method to obtain a non-stiff solution, analogous to a previously derived deterministic approach. (author)
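
    For reference, a deterministic baseline (not the stochastic PCA scheme of the paper): with one delayed-neutron group the point kinetics equations read dn/dt = ((ρ - β)/Λ)n + λC and dC/dt = (β/Λ)n - λC. The sketch below integrates them with a stiff solver for a step reactivity insertion; the parameter values are illustrative assumptions.

        import numpy as np
        from scipy.integrate import solve_ivp

        beta, Lam, lam = 0.0065, 1.0e-4, 0.08   # illustrative one-group parameters
        rho = 0.002                              # step reactivity insertion

        def kinetics(t, y):
            n, C = y
            return [(rho - beta) / Lam * n + lam * C,
                    beta / Lam * n - lam * C]

        # start from equilibrium: C0 = beta * n0 / (Lam * lam)
        n0 = 1.0
        y0 = [n0, beta * n0 / (Lam * lam)]
        sol = solve_ivp(kinetics, (0.0, 1.0), y0, method="Radau")  # stiff system
        print("n(1 s) / n(0) =", sol.y[0, -1] / n0)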

  7. The solution of the neutron point kinetics equation with stochastic extension: an analysis of two moments

    International Nuclear Information System (INIS)

    Silva, Milena Wollmann da; Vilhena, Marco Tullio M.B.; Bodmann, Bardo Ernst J.; Vasques, Richard

    2015-01-01

    The neutron point kinetics equation, which models the time-dependent behavior of nuclear reactors, is often used to understand the dynamics of nuclear reactor operations. It consists of a system of coupled differential equations that models the interaction between (i) the neutron population; and (ii) the concentration of the delayed neutron precursors, which are radioactive isotopes formed in the fission process that decay through neutron emission. These equations are deterministic in nature, and therefore can provide only average values of the modeled populations. However, the actual dynamical process is stochastic: the neutron density and the delayed neutron precursor concentrations vary randomly with time. To address this stochastic behavior, Hayes and Allen have generalized the standard deterministic point kinetics equation. They derived a system of stochastic differential equations that can accurately model the random behavior of the neutron density and the precursor concentrations in a point reactor. Due to the stiffness of these equations, this system was numerically implemented using a stochastic piecewise constant approximation method (Stochastic PCA). Here, we present a study of the influence of stochastic fluctuations on the results of the neutron point kinetics equation. We reproduce the stochastic formulation introduced by Hayes and Allen and compute Monte Carlo numerical results for examples with constant and time-dependent reactivity, comparing these results with stochastic and deterministic methods found in the literature. Moreover, we introduce a modified version of the stochastic method to obtain a non-stiff solution, analogous to a previously derived deterministic approach. (author)

  8. IRSN global process for leading a comprehensive fire safety analysis for nuclear installations

    International Nuclear Information System (INIS)

    Ormieres, Yannick; Lacoue, Jocelyne

    2013-01-01

    A fire safety analysis (FSA) is requested to justify the adequacy of the fire protection measures set by the operator. A recent document written by IRSN outlines a global process for such a comprehensive fire safety analysis. Following the evolution of French nuclear fire safety regulation from prescriptive requirements to objective requirements, the proposed fire safety justification process focuses on compliance with performance criteria for fire protection measures. These performance criteria are related to the vulnerability of targets to the effects of fire, and are not based solely on the radiological consequences outside the installation caused by a fire. In its FSA, the operator has to define the safety functions that should continue to fulfil their mission even in the case of fire, in order to comply with nuclear safety objectives. Then, in order to maintain these safety functions, the operator has to justify the adequacy of fire protection measures defined according to defence-in-depth principles. To reach this objective, the analysis process is based on the identification of the targets to be protected in order to maintain safety functions, taking into account facility characteristics. These targets include structures, systems, components and personnel important to safety. Facility characteristics include, for all operating conditions, potential ignition sources and fire protection systems. One of the key points of the fire analysis is the assessment of possible fire scenarios in the facility. Given the large number of possible fire scenarios, it is necessary to evaluate 'reference fires', the worst cases among all possible fire scenarios, which are used by the operator for the design of fire protection measures. (authors)

  9. Beginning SharePoint 2010 Development

    CERN Document Server

    Fox, Steve

    2010-01-01

    Discover how to take advantage of the many new features in SharePoint 2010. SharePoint provides content management (enterprise content management, Web content management, records management, and more), workflow, and social media features, and the new version boasts enhanced capabilities. This introductory-level book walks you through the process of learning, developing, and deploying SharePoint 2010 solutions. You'll leverage your existing skills and tools to grasp the fundamental programming concepts and practices of SharePoint 2010. The author clearly explains how to develop your first application...

  10. LIFE CYCLE ASSESSMENT AND HAZARD ANALYSIS AND CRITICAL CONTROL POINTS TO THE PASTA PRODUCT

    Directory of Open Access Journals (Sweden)

    Yulexis Meneses Linares

    2016-10-01

    Full Text Available The objective of this work is to combine the Life Cycle Assessment (LCA) and Hazard Analysis and Critical Control Points (HACCP) methodologies for the determination of the risks that food production poses to human health and the ecosystem. The environmental performance of pasta production in the "Marta Abreu" Pasta Factory of Cienfuegos is assessed; the critical control points, determined by the biological hazards (fungi and pests) and the physical hazards (wood, paper, thread and ferromagnetic particles), were the raw materials (flour, semolina and their mixtures) and their disposition and extraction. Resources are the most affected damage category, due to the consumption of fossil fuels.

  11. Radionuclides in Bayer process residues: previous analysis for radiological protection

    International Nuclear Information System (INIS)

    Cuccia, Valeria; Rocha, Zildete; Oliveira, Arno H. de

    2011-01-01

    Naturally occurring radionuclides are present in many natural resources. Human activities may enhance the concentrations of radionuclides and/or the potential of exposure to naturally occurring radioactive material (NORM). Industrial residues containing radionuclides have been receiving considerable global attention because of the large amounts of NORM-containing wastes and the potential long-term risks of long-lived radionuclides. As part of this global concern, this work focuses on the characterization of radioactivity in the main residues of the Bayer process for alumina production: red mud and sand samples. Usually, the residues of the Bayer process are collectively named red mud. However, in the industry where the samples were collected, there is an additional separation of the residues into sand and red mud. The analytical techniques used were gamma spectrometry (HPGe detector) and neutron activation analysis. The concentrations of radionuclides are higher in the red mud than in the sand. These solid residues present enhanced activity concentrations when compared to bauxite. Further uses of the residues as building material must be evaluated more thoroughly from the radiological point of view, due to their potential to enhance radiological exposure, especially through radon emission. (author)

  12. Material-Point Method Analysis of Bending in Elastic Beams

    DEFF Research Database (Denmark)

    Andersen, Søren Mikkel; Andersen, Lars

    2007-01-01

    The aim of this paper is to test different types of spatial interpolation for the material-point method. The interpolations include quadratic elements and cubic splines. A brief introduction to the material-point method is given. Simple linear-elastic problems are tested, including the classical...... cantilevered beam problem. As shown in the paper, the use of negative shape functions is not consistent with the material-point method in its current form, necessitating other types of interpolation such as cubic splines in order to obtain smoother representations of field quantities. It is shown...
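
    A concrete example of the smoother interpolants discussed: the quadratic B-spline shape function below is non-negative, C1-continuous, and forms a partition of unity on a regular grid, which is exactly the property that avoids the negative-shape-function problem. The definition is the standard quadratic B-spline, stated here as an assumed example rather than the paper's exact choice (the paper tests cubic splines).

        import numpy as np

        def quad_bspline(x, h=1.0):
            """Standard quadratic B-spline shape function on a grid of spacing h."""
            r = np.abs(x) / h
            out = np.zeros_like(r)
            inner = r <= 0.5
            outer = (r > 0.5) & (r <= 1.5)
            out[inner] = 0.75 - r[inner] ** 2
            out[outer] = 0.5 * (1.5 - r[outer]) ** 2
            return out

        # verify non-negativity and partition of unity at arbitrary particle positions
        x = np.linspace(0.0, 1.0, 1000)          # particle positions within one cell
        nodes = np.arange(-2, 4)                 # grid nodes whose support can overlap
        weights = sum(quad_bspline(x - n) for n in nodes)
        print(np.all(weights >= 0), np.allclose(weights, 1.0))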

  13. Do acupuncture points exist?

    International Nuclear Information System (INIS)

    Yan Xiaohui; Zhang Xinyi; Liu Chenglin; Dang, Ruishan; Huang Yuying; He Wei; Ding Guanghong

    2009-01-01

    We used synchrotron x-ray fluorescence analysis to probe the distribution of four chemical elements in and around acupuncture points, two located in the forearm and two in the lower leg. Three of the four acupuncture points showed significantly elevated concentrations of elements Ca, Fe, Cu and Zn in relation to levels in the surrounding tissue, with similar elevation ratios for Cu and Fe. The mapped distribution of these elements implies that each acupuncture point seems to be elliptical with the long axis along the meridian. (note)

  14. Do acupuncture points exist?

    Energy Technology Data Exchange (ETDEWEB)

    Yan Xiaohui; Zhang Xinyi [Department of Physics, Surface Physics Laboratory (State Key Laboratory), and Synchrotron Radiation Research Center of Fudan University, Shanghai 200433 (China); Liu Chenglin [Physics Department of Yancheng Teachers' College, Yancheng 224002 (China); Dang, Ruishan [Second Military Medical University, Shanghai 200433 (China); Huang Yuying; He Wei [Beijing Synchrotron Radiation Facility, Institute of High Energy Physics, Chinese Academy of Sciences, Beijing 100039 (China); Ding Guanghong [Shanghai Research Center of Acupuncture and Meridian, Pudong, Shanghai 201203 (China)

    2009-05-07

    We used synchrotron x-ray fluorescence analysis to probe the distribution of four chemical elements in and around acupuncture points, two located in the forearm and two in the lower leg. Three of the four acupuncture points showed significantly elevated concentrations of elements Ca, Fe, Cu and Zn in relation to levels in the surrounding tissue, with similar elevation ratios for Cu and Fe. The mapped distribution of these elements implies that each acupuncture point seems to be elliptical with the long axis along the meridian. (note)

  15. Predicting seizures in untreated temporal lobe epilepsy using point-process nonlinear models of heartbeat dynamics.

    Science.gov (United States)

    Valenza, G; Romigi, A; Citi, L; Placidi, F; Izzi, F; Albanese, M; Scilingo, E P; Marciani, M G; Duggento, A; Guerrisi, M; Toschi, N; Barbieri, R

    2016-08-01

    Symptoms of temporal lobe epilepsy (TLE) are frequently associated with autonomic dysregulation, whose underlying biological processes are thought to strongly contribute to sudden unexpected death in epilepsy (SUDEP). While abnormal cardiovascular patterns commonly occur during ictal events, putative patterns of autonomic cardiac effects during pre-ictal (PRE) periods (i.e. periods preceding seizures) are still unknown. In this study, we investigated TLE-related heart rate variability (HRV) through instantaneous, nonlinear estimates of cardiovascular oscillations during inter-ictal (INT) and PRE periods. ECG recordings from 12 patients with TLE were processed to extract standard HRV indices, as well as indices of instantaneous HRV complexity (dominant Lyapunov exponent and entropy) and higher-order statistics (bispectra) obtained through definition of inhomogeneous point-process nonlinear models, employing Volterra-Laguerre expansions of linear, quadratic, and cubic kernels. Experimental results demonstrate that the best INT vs. PRE classification performance (balanced accuracy: 73.91%) was achieved only when retaining the time-varying, nonlinear, and non-stationary structure of heartbeat dynamical features. The proposed approach opens novel important avenues in predicting ictal events using information gathered from cardiovascular signals exclusively.
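
    For orientation, the sketch below computes two of the standard time-domain HRV indices mentioned above, SDNN and RMSSD, from a series of RR intervals; it is not the point-process Volterra-Laguerre model used in the study, and the RR series is synthetic.

        import numpy as np

        rng = np.random.default_rng(0)
        rr = 0.8 + 0.05 * rng.standard_normal(300)   # synthetic RR intervals [s]

        sdnn = np.std(rr, ddof=1)                    # overall variability
        rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))   # beat-to-beat variability
        print(f"SDNN = {sdnn * 1000:.1f} ms, RMSSD = {rmssd * 1000:.1f} ms")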

  16. Point Analysis in Java applied to histological images of the perforant pathway: a user's account.

    Science.gov (United States)

    Scorcioni, Ruggero; Wright, Susan N; Patrick Card, J; Ascoli, Giorgio A; Barrionuevo, Germán

    2008-01-01

    The freeware Java tool Point Analysis in Java (PAJ), created to perform 3D point analysis, was tested in an independent laboratory setting. The input data consisted of images of the hippocampal perforant pathway from serial immunocytochemical localizations of the rat brain in multiple views at different resolutions. The low magnification set (x2 objective) comprised the entire perforant pathway, while the high magnification set (x100 objective) allowed the identification of individual fibers. A preliminary stereological study revealed a striking linear relationship between the fiber count at high magnification and the optical density at low magnification. PAJ enabled fast analysis for down-sampled data sets and a friendly interface with automated plot drawings. Noted strengths included the multi-platform support as well as the free availability of the source code, conducive to a broad user base and maximum flexibility for ad hoc requirements. PAJ has great potential to extend its usability by (a) improving its graphical user interface, (b) increasing its input size limit, (c) improving response time for large data sets, and (d) potentially being integrated with other Java graphical tools such as ImageJ.

  17. Groundwater Mixing Process Identification in Deep Mines Based on Hydrogeochemical Property Analysis

    Directory of Open Access Journals (Sweden)

    Bo Liu

    2016-12-01

    Full Text Available Karst collapse columns, as potential water passageways for mine water inrush, are always considered a critical problem for the development of deep mining techniques. This study aims to identify the mixing process of groundwater derived from two different limestone karst-fissure aquifer systems. Based on an analysis of the hydrogeochemical properties of the mine groundwater, the hydraulic connection between the karst-fissure aquifer systems under study was revealed. In this paper, a Piper diagram was used to calculate the mixing ratios at different sampling points in the aquifer systems, and the PHREEQC Interactive model (Version 2.5, USGS, Reston, VA, USA, 2001) was applied to modify the mixing ratios and to model the water-rock interactions during the mixing processes. The analysis results show that the highest mixing ratio is 0.905 in the C12 borehole, which is located nearest to the #2 karst collapse column, and that the mixing ratio decreases with increasing distance from the #2 karst collapse column. This demonstrates that the groundwater of the two aquifers mixed through the passage of the #2 karst collapse column. As a result, the proposed Piper-PHREEQC based method can provide accurate identification of the water conductivity of karst collapse columns, and can be applied in practice.
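
    For a conservative constituent, the two-end-member mixing ratio underlying such an analysis reduces to x = (C_sample - C_B) / (C_A - C_B), where A and B are the end-member waters. The sketch below applies this to made-up chloride concentrations; the values and the choice of tracer are assumptions, and the real study balances several major ions via the Piper diagram and PHREEQC.

        def mixing_ratio(c_sample, c_end_a, c_end_b):
            """Fraction of end-member A in a two-end-member conservative mix."""
            return (c_sample - c_end_b) / (c_end_a - c_end_b)

        # hypothetical Cl- concentrations [mg/L]: aquifer A, aquifer B, and boreholes
        c_a, c_b = 120.0, 20.0
        for name, c in [("C12", 110.5), ("C7", 65.0), ("C3", 28.0)]:
            print(f"{name}: fraction of aquifer-A water = {mixing_ratio(c, c_a, c_b):.2f}")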

  18. Explore Stochastic Instabilities of Periodic Points by Transition Path Theory

    Science.gov (United States)

    Cao, Yu; Lin, Ling; Zhou, Xiang

    2016-06-01

    We consider the noise-induced transitions from a linearly stable periodic orbit consisting of T periodic points in the randomly perturbed discrete logistic map. Traditional large deviation theory and asymptotic analysis in the small-noise limit cannot distinguish the quantitative differences in noise-induced stochastic instability among the T periodic points. To attack this problem, we generalize transition path theory to discrete-time, continuous-space stochastic processes. In our first criterion to quantify the relative instability among the T periodic points, we use the distribution of the last passage location for transitions from the whole periodic orbit to a prescribed disjoint set. This distribution is related to the individual contributions of each periodic point to the transition rate. The second criterion is based on the competency of the transition paths associated with each periodic point. Both criteria utilize the reactive probability current of transition path theory. Our numerical results for the logistic map reveal the transition mechanism of escaping from the stable periodic orbit and identify which periodic point is more prone to lose stability so as to make successful transitions under random perturbations.
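
    A direct Monte Carlo version of the first criterion can be sketched as follows (a brute-force stand-in for the transition-path-theory computation, with assumed parameter values): simulate the noisy logistic map started on its stable period-2 orbit at r = 3.2 and record which periodic point the trajectory was nearest to just before leaving a neighborhood of the orbit.

        import numpy as np

        rng = np.random.default_rng(1)
        r, sigma, radius = 3.2, 0.07, 0.2
        # stable period-2 points of x -> r*x*(1-x): (r+1 +- sqrt((r-3)(r+1))) / (2r)
        orbit = (r + 1 + np.array([1.0, -1.0]) * np.sqrt((r - 3) * (r + 1))) / (2 * r)

        last_passage = np.zeros(2)               # escape counts per periodic point
        for _ in range(1000):
            x, nearest = orbit[0], 0
            for _ in range(10000):               # cap the trajectory length
                x = r * x * (1 - x) + sigma * rng.standard_normal()
                if np.min(np.abs(x - orbit)) > radius:
                    last_passage[nearest] += 1   # escaped: credit the last point visited
                    break
                nearest = int(np.argmin(np.abs(x - orbit)))

        print("escape fractions via each periodic point:",
              last_passage / last_passage.sum())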

  19. Computerization of the safeguards analysis decision process

    International Nuclear Information System (INIS)

    Ehinger, M.H.

    1990-01-01

    This paper reports that safeguards regulations are evolving to meet new demands for timeliness and sensitivity in detecting the loss or unauthorized use of sensitive nuclear materials. The opportunities to meet new rules, particularly in bulk processing plants, involve developing techniques which use modern, computerized process control and information systems. Using these computerized systems in the safeguards analysis involves all the challenges of the man-machine interface experienced in the typical process control application and adds new dimensions to accuracy requirements, data analysis, and alarm resolution in the regulatory environment

  20. Retro-Techno-Economic Analysis: Using (Bio)Process Systems Engineering Tools to Attain Process Target Values

    DEFF Research Database (Denmark)

    Furlan, Felipe F.; Costa, Caliane B B; Secchi, Argimiro R.

    2016-01-01

    Economic analysis, allied to process systems engineering tools, can provide useful insights about process techno-economic feasibility. More interestingly, rather than being used to evaluate specific process conditions, this techno-economic analysis can be turned upside down to achieve target valu...