WorldWideScience

Sample records for point process analysis

  1. Residual analysis for spatial point processes

    DEFF Research Database (Denmark)

    Baddeley, A.; Turner, R.; Møller, Jesper

    We define residuals for point process models fitted to spatial point pattern data, and propose diagnostic plots based on these residuals. The techniques apply to any Gibbs point process model, which may exhibit spatial heterogeneity, interpoint interaction and dependence on spatial covariates. Ou...... or covariate effects. Q-Q plots of the residuals are effective in diagnosing interpoint interaction. Some existing ad hoc statistics of point patterns (quadrat counts, scan statistic, kernel smoothed intensity, Berman's diagnostic) are recovered as special cases....
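A minimal sketch of the raw-residual idea for a spatial point process (illustrative only, not the authors' implementation; the homogeneous intensity, unit-square window and quadrat grid are arbitrary choices). Observed quadrat counts minus expected counts give raw residuals that sum to the total count minus the integrated intensity:

```python
import math
import random

def simulate_homogeneous_poisson(lam, seed=0):
    """Simulate a homogeneous Poisson process on the unit square:
    a Poisson(lam) number of points (Knuth's multiplication method)
    at uniform locations."""
    rng = random.Random(seed)
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            break
        k += 1
    return [(rng.random(), rng.random()) for _ in range(k)]

def quadrat_residuals(points, lam, n=2):
    """Raw residuals on an n-by-n quadrat grid: observed count in each
    quadrat minus the expected count (intensity times quadrat area)."""
    expected = lam / (n * n)
    res = [[-expected] * n for _ in range(n)]
    for x, y in points:
        i = min(int(x * n), n - 1)
        j = min(int(y * n), n - 1)
        res[i][j] += 1.0
    return res
```

By construction the residuals sum to N minus the integrated intensity, so unusually large per-quadrat values flag heterogeneity or a misspecified intensity.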

  2. Bayesian analysis of Markov point processes

    DEFF Research Database (Denmark)

    Berthelsen, Kasper Klitgaard; Møller, Jesper

    2006-01-01

    Recently Møller, Pettitt, Berthelsen and Reeves introduced a new MCMC methodology for drawing samples from a posterior distribution when the likelihood function is only specified up to a normalising constant. We illustrate the method in the setting of Bayesian inference for Markov point processes...... a partially ordered Markov point process as the auxiliary variable. As the method requires simulation from the "unknown" likelihood, perfect simulation algorithms for spatial point processes become useful....
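The key trick of the auxiliary-variable MCMC referred to above is that drawing auxiliary data from the proposed parameter makes the unknown normalising constants cancel in the acceptance ratio. A toy sketch under strong simplifying assumptions: a Poisson likelihood is treated as if its normalising constant were unknown (so "perfect simulation" is just an ordinary random draw), with a Uniform(0, 10) prior and a symmetric random-walk proposal; none of this is the paper's point process setting:

```python
import math
import random

rng = random.Random(42)

def draw_poisson(lam):
    # Knuth's multiplication method for a Poisson variate.
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

# Synthetic data: 30 counts from Poisson(4); we pretend the
# normalising constant exp(-n*lam) of the likelihood is unknown.
data = [draw_poisson(4.0) for _ in range(30)]
s_data = sum(data)

def exchange_sampler(n_iter=4000, prop_sd=0.5):
    """Auxiliary-variable (exchange-type) sampler.  Each step draws
    auxiliary data from the proposed parameter, so only the
    unnormalised likelihood terms survive:
    log a = (S_data - S_aux) * (log lam' - log lam)."""
    lam, out = 4.0, []
    for _ in range(n_iter):
        lam_new = lam + rng.gauss(0.0, prop_sd)
        if 0.0 < lam_new < 10.0:
            s_aux = sum(draw_poisson(lam_new) for _ in range(len(data)))
            log_a = (s_data - s_aux) * (math.log(lam_new) - math.log(lam))
            if math.log(rng.random()) < log_a:
                lam = lam_new
        out.append(lam)
    return out

chain = exchange_sampler()
post_mean = sum(chain[500:]) / len(chain[500:])
```

The chain concentrates near the data-generating value without the normalising constant ever being evaluated, which is the same cancellation that makes the method workable for Markov point processes with perfect simulation.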

  3. Fingerprint Analysis with Marked Point Processes

    DEFF Research Database (Denmark)

    Forbes, Peter G. M.; Lauritzen, Steffen; Møller, Jesper

    We present a framework for fingerprint matching based on marked point process models. An efficient Monte Carlo algorithm is developed to calculate the marginal likelihood ratio for the hypothesis that two observed prints originate from the same finger against the hypothesis that they originate from...... different fingers. Our model achieves good performance on an NIST-FBI fingerprint database of 258 matched fingerprint pairs....

  4. On statistical analysis of compound point process

    Czech Academy of Sciences Publication Activity Database

    Volf, Petr

    2006-01-01

Roč. 35, 2-3 (2006), s. 389-396 ISSN 1026-597X R&D Projects: GA ČR(CZ) GA402/04/1294 Institutional research plan: CEZ:AV0Z10750506 Keywords: counting process * compound process * hazard function * Cox model Subject RIV: BB - Applied Statistics, Operational Research

  5. Scattering analysis of point processes and random measures

    International Nuclear Information System (INIS)

    Hanisch, K.H.

    1984-01-01

    In the present paper scattering analysis of point processes and random measures is studied. Known formulae which connect the scattering intensity with the pair distribution function of the studied structures are proved in a rigorous manner with tools of the theory of point processes and random measures. For some special fibre processes the scattering intensity is computed. For a class of random measures, namely for 'grain-germ-models', a new formula is proved which yields the pair distribution function of the 'grain-germ-model' in terms of the pair distribution function of the underlying point process (the 'germs') and of the mean structure factor and the mean squared structure factor of the particles (the 'grains'). (author)
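The connection between a point pattern and its scattering intensity can be illustrated with the empirical structure factor S(q) = |Σ_j exp(i q·x_j)|² / N (a standard definition, not the paper's fibre- or grain-germ formulae). For a perfect lattice S(q) shows Bragg peaks at reciprocal lattice vectors and vanishes between them:

```python
import cmath

def structure_factor(points, q):
    """Empirical structure factor S(q) = |sum_j exp(i q . x_j)|^2 / N
    for a 2-D point pattern; q is a (qx, qy) wave vector."""
    n = len(points)
    amp = sum(cmath.exp(1j * (q[0] * x + q[1] * y)) for x, y in points)
    return abs(amp) ** 2 / n

# A 4 x 4 square lattice with unit spacing.
lattice = [(float(i), float(j)) for i in range(4) for j in range(4)]

s_zero = structure_factor(lattice, (0.0, 0.0))            # forward peak: N
s_bragg = structure_factor(lattice, (2 * cmath.pi, 0.0))  # Bragg peak: N
s_gap = structure_factor(lattice, (cmath.pi, 0.0))        # destructive: ~0
```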

  6. Seeking a fingerprint: analysis of point processes in actigraphy recording

    Science.gov (United States)

    Gudowska-Nowak, Ewa; Ochab, Jeremi K.; Oleś, Katarzyna; Beldzik, Ewa; Chialvo, Dante R.; Domagalik, Aleksandra; Fąfrowicz, Magdalena; Marek, Tadeusz; Nowak, Maciej A.; Ogińska, Halszka; Szwed, Jerzy; Tyburczyk, Jacek

    2016-05-01

    Motor activity of humans displays complex temporal fluctuations which can be characterised by scale-invariant statistics, thus demonstrating that structure and fluctuations of such kinetics remain similar over a broad range of time scales. Previous studies on humans regularly deprived of sleep or suffering from sleep disorders predicted a change in the invariant scale parameters with respect to those for healthy subjects. In this study we investigate the signal patterns from actigraphy recordings by means of characteristic measures of fractional point processes. We analyse spontaneous locomotor activity of healthy individuals recorded during a week of regular sleep and a week of chronic partial sleep deprivation. Behavioural symptoms of lack of sleep can be evaluated by analysing statistics of duration times during active and resting states, and alteration of behavioural organisation can be assessed by analysis of power laws detected in the event count distribution, distribution of waiting times between consecutive movements and detrended fluctuation analysis of recorded time series. We claim that among different measures characterising complexity of the actigraphy recordings and their variations implied by chronic sleep distress, the exponents characterising slopes of survival functions in resting states are the most effective biomarkers distinguishing between healthy and sleep-deprived groups.
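The proposed biomarker rests on estimating the slope of a survival function on log-log axes. A hedged sketch of that step (the Pareto durations, quantile band and plain least-squares fit are illustrative choices, not the study's actigraphy pipeline):

```python
import math
import random

def pareto_sample(alpha, n, seed=7):
    """Durations with power-law survival S(x) = x**(-alpha), x >= 1
    (Pareto), via inverse-transform sampling."""
    rng = random.Random(seed)
    return [(1.0 - rng.random()) ** (-1.0 / alpha) for _ in range(n)]

def survival_slope(samples, lo=0.5, hi=0.99):
    """Estimate the tail exponent as minus the least-squares slope of
    log S(x) against log x, using the empirical survival function
    restricted to the quantile band [lo, hi]."""
    xs = sorted(samples)
    n = len(xs)
    pts = []
    for i, x in enumerate(xs):
        rank = (i + 1) / (n + 1)
        s = 1.0 - rank                 # empirical survival probability
        if lo <= rank <= hi and s > 0.0 and x > 0.0:
            pts.append((math.log(x), math.log(s)))
    mx = sum(p[0] for p in pts) / len(pts)
    my = sum(p[1] for p in pts) / len(pts)
    slope = (sum((px - mx) * (py - my) for px, py in pts)
             / sum((px - mx) ** 2 for px, py in pts))
    return -slope

est = survival_slope(pareto_sample(1.5, 20000))  # should be near 1.5
```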

  7. Process for structural geologic analysis of topography and point data

    Science.gov (United States)

    Eliason, Jay R.; Eliason, Valerie L. C.

    1987-01-01

A quantitative method of geologic structural analysis of digital terrain data is described for implementation on a computer. Assuming selected valley segments are controlled by the underlying geologic structure, topographic lows in the terrain data, defining valley bottoms, are detected, filtered and accumulated into a series of line segments defining contiguous valleys. The line segments are then vectorized to produce vector segments, defining valley segments, which may be indicative of the underlying geologic structure. Coplanar analysis is performed on vector segment pairs to determine which vectors produce planes that represent underlying geologic structure. Point data such as fracture phenomena, which can be related to fracture planes in 3-dimensional space, can be analyzed to define common plane orientations and locations. The vectors, points, and planes are displayed in various formats for interpretation.
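The coplanar-analysis step above reduces to elementary vector geometry: two vector segments sharing a point span a candidate plane whose normal is their cross product, and fracture points can then be tested by their distance to that plane. A minimal sketch (the specific filtering and accumulation logic of the patented process is not reproduced):

```python
def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def plane_from_vectors(p, u_end, v_end):
    """Plane through point p spanned by the vector segments p->u_end
    and p->v_end; returns (unit normal n, offset d) with the plane
    given by dot(n, x) = d."""
    n = cross(sub(u_end, p), sub(v_end, p))
    norm = dot(n, n) ** 0.5
    n = (n[0] / norm, n[1] / norm, n[2] / norm)
    return n, dot(n, p)

def distance_to_plane(x, n, d):
    """Signed distance of point x from the plane dot(n, x) = d."""
    return dot(n, x) - d

# Two valley vectors from the origin span the plane z = x + y.
normal, d = plane_from_vectors((0.0, 0.0, 0.0), (1.0, 0.0, 1.0), (0.0, 1.0, 1.0))
on_plane = distance_to_plane((2.0, 3.0, 5.0), normal, d)   # lies on z = x + y
off_plane = distance_to_plane((0.0, 0.0, 1.0), normal, d)  # off the plane
```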

  8. Framework for adaptive multiscale analysis of nonhomogeneous point processes.

    Science.gov (United States)

    Helgason, Hannes; Bartroff, Jay; Abry, Patrice

    2011-01-01

We develop the methodology for hypothesis testing and model selection in nonhomogeneous Poisson processes, with an eye toward the application of modeling and variability detection in heart beat data. Modeling the process's non-constant rate function using templates of simple basis functions, we develop the generalized likelihood ratio statistic for a given template and a multiple testing scheme to model-select from a family of templates. A dynamic programming algorithm inspired by network flows is used to compute the maximum likelihood template in a multiscale manner. In a numerical example, the proposed procedure is nearly as powerful as the super-optimal procedures that know the true template size and true partition, respectively. Extensions to general history-dependent point processes are discussed.
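The generalized likelihood ratio idea can be shown in its simplest form: compare a constant-rate Poisson model against a two-piece template with a change at the midpoint (a deliberately tiny stand-in for the paper's template families and dynamic program). For a segment with n events on length L, the maximised Poisson log-likelihood is n·log(n/L) − n:

```python
import math

def seg_loglik(n, length):
    """Maximised Poisson log-likelihood of n events on an interval of
    the given length (rate fixed at its MLE n/length)."""
    return 0.0 if n == 0 else n * math.log(n / length) - n

def glr_split(times, T):
    """Generalised likelihood ratio statistic comparing a constant
    rate on [0, T] against a two-piece rate changing at T/2."""
    n1 = sum(1 for t in times if t < T / 2)
    n2 = len(times) - n1
    ll_null = seg_loglik(len(times), T)
    ll_alt = seg_loglik(n1, T / 2) + seg_loglik(n2, T / 2)
    return 2.0 * (ll_alt - ll_null)

burst = [0.01 * i for i in range(50)]   # all 50 events in [0, 0.5)
even = [i / 100 for i in range(100)]    # evenly spread on [0, 1)
g_burst = glr_split(burst, 1.0)         # large: rate is nonhomogeneous
g_even = glr_split(even, 1.0)           # ~0: constant rate fits equally well
```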

  9. Geometric anisotropic spatial point pattern analysis and Cox processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Toftaker, Håkon

...In particular we study Cox process models with an elliptical pair correlation function, including shot noise Cox processes and log Gaussian Cox processes, and we develop estimation procedures using summary statistics and Bayesian methods. Our methodology is illustrated on real and synthetic datasets of spatial...

  10. The Hinkley Point decision: An analysis of the policy process

    International Nuclear Information System (INIS)

    Thomas, Stephen

    2016-01-01

In 2006, the British government launched a policy to build nuclear power reactors based on a claim that the power produced would be competitive with fossil fuel and would require no public subsidy. A decade later, it is not clear how many, if any, orders will be placed and the claims on costs and subsidies have proved false. Despite this failure to deliver, the policy is still being pursued with undiminished determination. The finance model that is now proposed is seen as a model other European countries can follow, so the success or otherwise of the British nuclear programme will have implications outside the UK. This paper contends that the checks and balances that should weed out misguided policies have failed. It argues that the most serious failure is with the civil service and its inability to provide politicians with high quality advice – truth to power. It concludes that the failure is likely to be due to the unwillingness of politicians to listen to opinions that conflict with their beliefs. Other weaknesses include the lack of energy expertise in the media, the unwillingness of the public to engage in the policy process and the impotence of Parliamentary Committees. - Highlights: •Britain's nuclear power policy is failing due to high costs and problems of finance. •This has implications for European countries that want to use the same financing model. •The continued pursuit of a failing policy is due to poor advice from civil servants. •Lack of expertise in the media and lack of public engagement have contributed. •Parliamentary processes have not provided proper critical scrutiny.

  11. Definition of distance for nonlinear time series analysis of marked point process data

    Energy Technology Data Exchange (ETDEWEB)

Iwayama, Koji, E-mail: koji@sat.t.u-tokyo.ac.jp [Research Institute for Food and Agriculture, Ryukoku University, 1-5 Yokotani, Seta Oe-cho, Otsu-Shi, Shiga 520-2194 (Japan); Hirata, Yoshito; Aihara, Kazuyuki [Institute of Industrial Science, The University of Tokyo, 4-6-1 Komaba, Meguro-ku, Tokyo 153-8505 (Japan)

    2017-01-30

Marked point process data are time series of discrete events accompanied by some values, such as economic trades, earthquakes, and lightning strikes. A distance for marked point process data allows us to apply nonlinear time series analysis to such data. We propose a distance for marked point process data which can be calculated much faster than the existing distance when the number of marks is small. Furthermore, under some assumptions, the Kullback–Leibler divergences between posterior distributions for neighbors defined by this distance are small. We performed some numerical simulations showing that analysis based on the proposed distance is effective. - Highlights: • A new distance for marked point process data is proposed. • The distance can be computed fast enough for a small number of marks. • The method to optimize parameter values of the distance is also proposed. • Numerical simulations indicate that the analysis based on the distance is effective.
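For intuition about what a distance between marked event sequences looks like, here is a classical edit-style metric in the spirit of the Victor-Purpura spike-train distance, extended with a mark-mismatch cost. This is explicitly not the distance proposed in the paper; it is a well-known baseline sketched for illustration:

```python
def marked_event_distance(a, b, q=1.0, mark_cost=1.0):
    """Edit-style distance between two time-ordered marked event
    sequences (lists of (time, mark) pairs): deleting or inserting an
    event costs 1, shifting an event by dt costs q*|dt|, and matching
    events with different marks adds mark_cost.  Computed by dynamic
    programming over event prefixes."""
    na, nb = len(a), len(b)
    d = [[0.0] * (nb + 1) for _ in range(na + 1)]
    for i in range(1, na + 1):
        d[i][0] = float(i)
    for j in range(1, nb + 1):
        d[0][j] = float(j)
    for i in range(1, na + 1):
        for j in range(1, nb + 1):
            (ta, ma), (tb, mb) = a[i - 1], b[j - 1]
            match = q * abs(ta - tb) + (0.0 if ma == mb else mark_cost)
            d[i][j] = min(d[i - 1][j] + 1.0,        # delete from a
                          d[i][j - 1] + 1.0,        # insert from b
                          d[i - 1][j - 1] + match)  # shift (and re-mark)
    return d[na][nb]

train = [(0.1, "A"), (0.5, "B"), (0.9, "A")]
same = marked_event_distance(train, train)                    # 0: identical
extra = marked_event_distance(train, train + [(1.2, "A")])    # 1: one insertion
shift = marked_event_distance([(0.0, "A")], [(0.3, "A")])     # 0.3: one shift
```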

  12. Poisson branching point processes

    International Nuclear Information System (INIS)

    Matsuo, K.; Teich, M.C.; Saleh, B.E.A.

    1984-01-01

We investigate the statistical properties of a special branching point process. The initial process is assumed to be a homogeneous Poisson point process (HPP). The initiating events at each branching stage are carried forward to the following stage. In addition, each initiating event independently contributes a nonstationary Poisson point process (whose rate is a specified function) located at that point. The additional contributions from all points of a given stage constitute a doubly stochastic Poisson point process (DSPP) whose rate is a filtered version of the initiating point process at that stage. The process studied is a generalization of a Poisson branching process in which random time delays are permitted in the generation of events. Particular attention is given to the limit in which the number of branching stages is infinite while the average number of added events per event of the previous stage is infinitesimal. In the special case when the branching is instantaneous this limit of continuous branching corresponds to the well-known Yule–Furry process with an initial Poisson population. The Poisson branching point process provides a useful description for many problems in various scientific disciplines, such as the behavior of electron multipliers, neutron chain reactions, and cosmic ray showers.
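A single branching stage of such a cascade is easy to simulate and check against the mean count formula E[N] = λT(1 + μ), where μ is the mean number of added events per initiating event. A hedged sketch (the uniform delay distribution and parameter values are arbitrary choices, not the paper's rate functions):

```python
import math
import random

def branching_cascade(rate, horizon, mean_offspring, stages, seed=3):
    """One realisation of a branching point process: stage 0 is a
    homogeneous Poisson process on [0, horizon]; at each further stage
    every event independently spawns Poisson(mean_offspring) delayed
    events, and initiating events are carried forward.  Delays are
    drawn uniformly (an arbitrary illustrative choice)."""
    rng = random.Random(seed)

    def poisson(lam):
        t, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= rng.random()
            if p <= t:
                return k
            k += 1

    events = [rng.random() * horizon for _ in range(poisson(rate * horizon))]
    for _ in range(stages):
        children = []
        for t in events:
            children.extend(t + rng.random() * horizon
                            for _ in range(poisson(mean_offspring)))
        events = events + children   # carry initiating events forward
    return events

# One stage with lam*T = 5 and mean_offspring = 2: expected total 15.
counts = [len(branching_cascade(5.0, 1.0, 2.0, 1, seed=s)) for s in range(400)]
mean_total = sum(counts) / len(counts)
```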

  13. Second-order analysis of structured inhomogeneous spatio-temporal point processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Ghorbani, Mohammad

Statistical methodology for spatio-temporal point processes is in its infancy. We consider second-order analysis based on pair correlation functions and K-functions for, first, general inhomogeneous spatio-temporal point processes and, second, inhomogeneous spatio-temporal Cox processes. Assuming spatio-temporal separability of the intensity function, we clarify different meanings of second-order spatio-temporal separability. One is second-order spatio-temporal independence and relates e.g. to log-Gaussian Cox processes with an additive covariance structure of the underlying spatio-temporal Gaussian process. Another concerns shot-noise Cox processes with a separable spatio-temporal covariance density. We propose diagnostic procedures for checking hypotheses of second-order spatio-temporal separability, which we apply on simulated and real data (the UK 2001 epidemic foot and mouth disease data).

  14. Hazard analysis and critical control point (HACCP) for an ultrasound food processing operation.

    Science.gov (United States)

    Chemat, Farid; Hoarau, Nicolas

    2004-05-01

Emerging technologies, such as ultrasound (US), used for food and drink production often introduce hazards for product safety. Classical quality control methods are inadequate to control these hazards. Hazard analysis and critical control point (HACCP) is the most secure and cost-effective method for controlling possible product contamination or cross-contamination due to physical or chemical hazards during production. The following case study on the application of HACCP to a US food-processing operation demonstrates how the hazards at the critical control points of the process are effectively controlled through the implementation of HACCP.

  15. Implementation of hazard analysis and critical control point (HACCP) in dried anchovy production process

    Science.gov (United States)

    Citraresmi, A. D. P.; Wahyuni, E. E.

    2018-03-01

The aim of this study was to inspect the implementation of Hazard Analysis and Critical Control Point (HACCP) for the identification and prevention of potential hazards in the production process of dried anchovy at PT. Kelola Mina Laut (KML), Lobuk unit, Sumenep. Cold storage is needed at each anchovy processing step in order to maintain the product's physical and chemical condition. In addition, a quality assurance system should be implemented to maintain product quality. The research was conducted using a survey method, following the whole anchovy production process from the receipt of raw materials to the packaging of the final product. Data were analysed descriptively. Implementation of HACCP at PT. KML, Lobuk unit, Sumenep was conducted by applying Pre-Requisite Programs (PRP) and a preparation stage consisting of 5 initial stages and the 7 principles of HACCP. The results showed that CCPs were found in the boiling step, with the significant hazard of Listeria monocytogenes bacteria, and in the final sorting step, with the significant hazard of foreign-material contamination in the product. Actions taken were controlling the boiling temperature to 100–105 °C for 3–5 minutes and training the sorting employees.

  16. Hierarchical spatial point process analysis for a plant community with high biodiversity

    DEFF Research Database (Denmark)

    Illian, Janine B.; Møller, Jesper; Waagepetersen, Rasmus

    2009-01-01

    A complex multivariate spatial point pattern of a plant community with high biodiversity is modelled using a hierarchical multivariate point process model. In the model, interactions between plants with different post-fire regeneration strategies are of key interest. We consider initially a maxim...

  17. Analysis of the stochastic channel model by Saleh & Valenzuela via the theory of point processes

    DEFF Research Database (Denmark)

    Jakobsen, Morten Lomholt; Pedersen, Troels; Fleury, Bernard Henri

    2012-01-01

...and underlying features, like the intensity function of the component delays and the delay-power intensity. The flexibility and clarity of the mathematical instruments utilized to obtain these results lead us to conjecture that the theory of spatial point processes provides a unifying mathematical framework...

  18. Multiscale change-point analysis of inhomogeneous Poisson processes using unbalanced wavelet decompositions

    NARCIS (Netherlands)

    Jansen, M.H.; Di Bucchianico, A.; Mattheij, R.M.M.; Peletier, M.A.

    2006-01-01

We present a continuous wavelet analysis of count data with time-varying intensities. The objective is to extract intervals with significant intensities from background intervals. This includes the precise starting point of the significant interval, its exact duration and the (average) level of

  19. Spatial point process analysis for a plant community with high biodiversity

    DEFF Research Database (Denmark)

    Illian, Janine; Møller, Jesper; Waagepetersen, Rasmus Plenge

A complex multivariate spatial point pattern for a plant community with high biodiversity is modelled using a hierarchical multivariate point process model. In the model, interactions between plants with different post-fire regeneration strategies are of key interest. We consider initially a maximum likelihood approach to inference, where problems arise due to unknown interaction radii for the plants. We next demonstrate that a Bayesian approach provides a flexible framework for incorporating prior information concerning the interaction radii. From an ecological perspective, we are able both...

  20. Analysis of residual stress state in sheet metal parts processed by single point incremental forming

    Science.gov (United States)

    Maaß, F.; Gies, S.; Dobecki, M.; Brömmelhoff, K.; Tekkaya, A. E.; Reimers, W.

    2018-05-01

The mechanical properties of formed metal components are highly affected by the prevailing residual stress state. A selective induction of residual compressive stresses in the component can improve product properties such as the fatigue strength. By means of single point incremental forming (SPIF), the residual stress state can be influenced by adjusting the process parameters during the manufacturing process. To achieve a fundamental understanding of the residual stress formation caused by the SPIF process, a valid numerical process model is essential. Within the scope of this paper, the significance of kinematic hardening effects on the determined residual stress state is presented based on numerical simulations. The effect of the unclamping step after the manufacturing process is also analyzed. An average deviation of 18 % between the residual stress amplitudes in the clamped and unclamped conditions reveals that the unclamping step needs to be considered to reach a high numerical prediction quality.

  1. Analysis of multi-species point patterns using multivariate log Gaussian Cox processes

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus; Guan, Yongtao; Jalilian, Abdollah

Multivariate log Gaussian Cox processes are flexible models for multivariate point patterns. However, they have so far only been applied in bivariate cases. In this paper we move beyond the bivariate case in order to model multi-species point patterns of tree locations. In particular we address the problems of identifying parsimonious models and of extracting biologically relevant information from the fitted models. The latent multivariate Gaussian field is decomposed into components given in terms of random fields common to all species and components which are species specific. This allows...... of the data. The selected number of common latent fields provides an index of complexity of the multivariate covariance structure. Hierarchical clustering is used to identify groups of species with similar patterns of dependence on the common latent fields.

  2. Quality control for electron beam processing of polymeric materials by end-point analysis

    International Nuclear Information System (INIS)

    DeGraff, E.; McLaughlin, W.L.

    1981-01-01

Properties of certain plastics, e.g. polytetrafluoroethylene, polyethylene, ethylene vinyl acetate copolymer, can be modified selectively by ionizing radiation. One of the advantages of this treatment over chemical methods is better control of the process and the end-product properties. The most convenient method of dosimetry for monitoring quality control is post-irradiation evaluation of the plastic itself, e.g., melt index and melting point determination. It is shown that by proper calibration in terms of total dose and sufficiently reproducible radiation effects, such product test methods provide convenient and meaningful analyses. Other appropriate standardized analytical methods include stress-crack resistance, stress-strain-to-fracture testing and solubility determination. Standard routine dosimetry over the dose and dose rate ranges of interest confirm that measured product end points can be correlated with calibrated values of absorbed dose in the product within uncertainty limits of the measurements. (author)

  3. Point process modeling and estimation: Advances in the analysis of dynamic neural spiking data

    Science.gov (United States)

    Deng, Xinyi

    2016-08-01

    A common interest of scientists in many fields is to understand the relationship between the dynamics of a physical system and the occurrences of discrete events within such physical system. Seismologists study the connection between mechanical vibrations of the Earth and the occurrences of earthquakes so that future earthquakes can be better predicted. Astrophysicists study the association between the oscillating energy of celestial regions and the emission of photons to learn the Universe's various objects and their interactions. Neuroscientists study the link between behavior and the millisecond-timescale spike patterns of neurons to understand higher brain functions. Such relationships can often be formulated within the framework of state-space models with point process observations. The basic idea is that the dynamics of the physical systems are driven by the dynamics of some stochastic state variables and the discrete events we observe in an interval are noisy observations with distributions determined by the state variables. This thesis proposes several new methodological developments that advance the framework of state-space models with point process observations at the intersection of statistics and neuroscience. In particular, we develop new methods 1) to characterize the rhythmic spiking activity using history-dependent structure, 2) to model population spike activity using marked point process models, 3) to allow for real-time decision making, and 4) to take into account the need for dimensionality reduction for high-dimensional state and observation processes. We applied these methods to a novel problem of tracking rhythmic dynamics in the spiking of neurons in the subthalamic nucleus of Parkinson's patients with the goal of optimizing placement of deep brain stimulation electrodes. We developed a decoding algorithm that can make decision in real-time (for example, to stimulate the neurons or not) based on various sources of information present in
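The state-space-with-point-process-observations idea can be sketched with a one-dimensional toy filter: a log-rate following a Gaussian random walk, observed through Poisson counts, with each update computed by Newton iterations on the log posterior (a Laplace-style approximation in the general spirit of point-process filtering; the model, step sizes and variances here are illustrative assumptions, not the thesis's algorithms):

```python
import math

def point_process_filter(counts, dt=1.0, state_var=0.05, x0=0.0, v0=1.0):
    """Approximate filter for a state-space model with point-process
    observations: log-rate x_k follows a Gaussian random walk and
    counts obey n_k ~ Poisson(exp(x_k) * dt).  Each update finds the
    posterior mode by Newton's method and a Gaussian curvature-based
    variance update."""
    x, v = x0, v0
    path = []
    for n in counts:
        xp, vp = x, v + state_var               # predict
        x = xp
        for _ in range(20):                     # Newton mode-finding
            grad = -(x - xp) / vp + n - math.exp(x) * dt
            hess = -1.0 / vp - math.exp(x) * dt
            x -= grad / hess
        v = 1.0 / (1.0 / vp + math.exp(x) * dt)  # posterior variance
        path.append(x)
    return path

# With a steady count of 5 events per bin the filtered rate exp(x)
# should settle near 5.
path = point_process_filter([5] * 100)
rate = math.exp(path[-1])
```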

  4. Thinning spatial point processes into Poisson processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Schoenberg, Frederic Paik

    2010-01-01

In this paper we describe methods for randomly thinning certain classes of spatial point processes. In the case of a Markov point process, the proposed method involves a dependent thinning of a spatial birth-and-death process, where clans of ancestors associated with the original points are identified, and where we simulate backwards and forwards in order to obtain the thinned process. In the case of a Cox process, a simple independent thinning technique is proposed. In both cases, the thinning results in a Poisson process if and only if the true Papangelou conditional intensity is used, and, thus, can be used as a graphical exploratory tool for inspecting the goodness-of-fit of a spatial point process model. Several examples, including clustered and inhibitive point processes, are considered.
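The paper's dependent thinning of Markov processes is involved, but the underlying thinning identity is the classical one and runs just as well in the opposite direction (Lewis-Shedler): thin a homogeneous Poisson process with retention probability λ(x)/λ_max to obtain an inhomogeneous Poisson process. A sketch under that simpler, standard setting:

```python
import math
import random

def inhomogeneous_by_thinning(lam_max, lam_fn, trials=200, seed=11):
    """Lewis-Shedler thinning on [0, 1]: simulate a homogeneous
    Poisson process at rate lam_max and keep each point x
    independently with probability lam_fn(x) / lam_max.  Returns the
    mean number of retained points over many realisations."""
    rng = random.Random(seed)

    def poisson(lam):
        t, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= rng.random()
            if p <= t:
                return k
            k += 1

    total = 0
    for _ in range(trials):
        pts = [rng.random() for _ in range(poisson(lam_max))]
        total += sum(1 for x in pts if rng.random() < lam_fn(x) / lam_max)
    return total / trials

# Target intensity lam(x) = 100x on [0, 1]; the integrated intensity,
# and hence the expected retained count, is 50.
mean_count = inhomogeneous_by_thinning(100.0, lambda x: 100.0 * x)
```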

  5. Thinning spatial point processes into Poisson processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Schoenberg, Frederic Paik

This paper describes methods for randomly thinning certain classes of spatial point processes. In the case of a Markov point process, the proposed method involves a dependent thinning of a spatial birth-and-death process, where clans of ancestors associated with the original points are identified, and where one simulates backwards and forwards in order to obtain the thinned process. In the case of a Cox process, a simple independent thinning technique is proposed. In both cases, the thinning results in a Poisson process if and only if the true Papangelou conditional intensity is used, and thus can be used as a diagnostic for assessing the goodness-of-fit of a spatial point process model. Several examples, including clustered and inhibitive point processes, are considered.

  6. Detecting determinism from point processes.

    Science.gov (United States)

    Andrzejak, Ralph G; Mormann, Florian; Kreuz, Thomas

    2014-12-01

    The detection of a nonrandom structure from experimental data can be crucial for the classification, understanding, and interpretation of the generating process. We here introduce a rank-based nonlinear predictability score to detect determinism from point process data. Thanks to its modular nature, this approach can be adapted to whatever signature in the data one considers indicative of deterministic structure. After validating our approach using point process signals from deterministic and stochastic model dynamics, we show an application to neuronal spike trains recorded in the brain of an epilepsy patient. While we illustrate our approach in the context of temporal point processes, it can be readily applied to spatial point processes as well.

  7. Change point analysis and assessment

    DEFF Research Database (Denmark)

    Müller, Sabine; Neergaard, Helle; Ulhøi, John Parm

    2011-01-01

    The aim of this article is to develop an analytical framework for studying processes such as continuous innovation and business development in high-tech SME clusters that transcends the traditional qualitative-quantitative divide. It integrates four existing and well-recognized approaches...... to studying events, processes and change, mamely change-point analysis, event-history analysis, critical-incident technique and sequence analysis....

  8. Aftershock identification problem via the nearest-neighbor analysis for marked point processes

    Science.gov (United States)

    Gabrielov, A.; Zaliapin, I.; Wong, H.; Keilis-Borok, V.

    2007-12-01

    The centennial observations on the world seismicity have revealed a wide variety of clustering phenomena that unfold in the space-time-energy domain and provide most reliable information about the earthquake dynamics. However, there is neither a unifying theory nor a convenient statistical apparatus that would naturally account for the different types of seismic clustering. In this talk we present a theoretical framework for nearest-neighbor analysis of marked processes and obtain new results on hierarchical approach to studying seismic clustering introduced by Baiesi and Paczuski (2004). Recall that under this approach one defines an asymmetric distance D in space-time-energy domain such that the nearest-neighbor spanning graph with respect to D becomes a time- oriented tree. We demonstrate how this approach can be used to detect earthquake clustering. We apply our analysis to the observed seismicity of California and synthetic catalogs from ETAS model and show that the earthquake clustering part is statistically different from the homogeneous part. This finding may serve as a basis for an objective aftershock identification procedure.
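The Baiesi-Paczuski construction referenced above assigns each event to its nearest earlier neighbour under a space-time-magnitude distance; the parent links then form a time-oriented spanning tree. A minimal sketch (the distance form η = t·r^d·10^(−b·m) follows the cited approach, but the parameter values and the tiny catalog are illustrative assumptions):

```python
def bp_distance(parent, child, b=1.0, d=2.0):
    """Baiesi-Paczuski-style distance from an earlier event to a later
    one: eta = t * r**d * 10**(-b * m_parent), with t the time gap,
    r the epicentral distance, d the fractal dimension of epicentres
    and b the Gutenberg-Richter b-value.  Events are
    (time, x, y, magnitude) tuples; later "parents" get distance inf."""
    t = child[0] - parent[0]
    if t <= 0:
        return float("inf")
    r2 = (child[1] - parent[1]) ** 2 + (child[2] - parent[2]) ** 2
    return t * r2 ** (d / 2.0) * 10.0 ** (-b * parent[3])

def nearest_parent(catalog, j, **kw):
    """Index of the event minimising the distance to event j over all
    other events; linking every event to its nearest parent yields a
    time-oriented tree."""
    dists = [(bp_distance(catalog[i], catalog[j], **kw), i)
             for i in range(len(catalog)) if i != j]
    return min(dists)[1]

# Tiny synthetic catalog: a distant background event, a mainshock,
# and a nearby event shortly after the mainshock.
catalog = [
    (-10.0, 50.0, 50.0, 3.0),   # distant background event
    (0.0, 0.0, 0.0, 6.0),       # mainshock
    (0.5, 0.1, 0.0, 2.0),       # nearby, soon after: aftershock-like
]
parent_of_last = nearest_parent(catalog, 2)   # links to the mainshock
```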

  9. Strong approximations and sequential change-point analysis for diffusion processes

    DEFF Research Database (Denmark)

    Mihalache, Stefan-Radu

    2012-01-01

    In this paper ergodic diffusion processes depending on a parameter in the drift are considered under the assumption that the processes can be observed continuously. Strong approximations by Wiener processes for a stochastic integral and for the estimator process constructed by the one...

  10. Second-order analysis of inhomogeneous spatial point processes with proportional intensity functions

    DEFF Research Database (Denmark)

    Guan, Yongtao; Waagepetersen, Rasmus; Beale, Colin M.

    2008-01-01

    of the intensity functions. The first approach is based on nonparametric kernel-smoothing, whereas the second approach uses a conditional likelihood estimation approach to fit a parametric model for the pair correlation function. A great advantage of the proposed methods is that they do not require the often...... to two spatial point patterns regarding the spatial distributions of birds in the U.K.'s Peak District in 1990 and 2004....
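For readers unfamiliar with K-functions, the simplest (homogeneous, uncorrected) estimator counts close pairs; a real second-order analysis like the one above would add edge correction and inhomogeneity weights. A hedged sketch of the naive form only:

```python
def ripley_k(points, r, area=1.0):
    """Naive estimate of Ripley's K at distance r for a pattern
    observed on a window of the given area, ignoring edge correction
    (a real analysis would need one):
    K(r) = area / n^2 * #{ordered pairs i != j with d_ij <= r}."""
    n = len(points)
    close = sum(1 for i in range(n) for j in range(n)
                if i != j and
                (points[i][0] - points[j][0]) ** 2 +
                (points[i][1] - points[j][1]) ** 2 <= r * r)
    return area * close / (n * n)

# Four points on the corners of a 0.1 x 0.1 square: all 12 ordered
# pairs lie within r = 0.2, so K(0.2) = 1 * 12 / 16 = 0.75, while no
# pair is within r = 0.05.
pts = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (0.1, 0.1)]
k_close = ripley_k(pts, 0.2)
k_far = ripley_k(pts, 0.05)
```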

  11. Hazard rate model and statistical analysis of a compound point process

    Czech Academy of Sciences Publication Activity Database

    Volf, Petr

    2005-01-01

Roč. 41, č. 6 (2005), s. 773-786 ISSN 0023-5954 R&D Projects: GA ČR(CZ) GA402/04/1294 Institutional research plan: CEZ:AV0Z10750506 Keywords: counting process * compound process * Cox regression model * intensity Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.343, year: 2005

  12. Fixed-point signal processing

    CERN Document Server

    Padgett, Wayne T

    2009-01-01

This book is intended to fill the gap between the "ideal precision" digital signal processing (DSP) that is widely taught, and the limited precision implementation skills that are commonly required in fixed-point processors and field programmable gate arrays (FPGAs). These skills are often neglected at the university level, particularly for undergraduates. We have attempted to create a resource both for a DSP elective course and for the practicing engineer with a need to understand fixed-point implementation. Although we assume a background in DSP, Chapter 2 contains a review of basic theory.
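The gap the book describes is easy to demonstrate: in Q15 fixed point (16-bit signed, 15 fractional bits) a multiply yields 30 fractional bits that must be rounded back to 15. A minimal sketch of that one operation (in Python for illustration; a DSP would do this in integer registers):

```python
def to_q15(x):
    """Quantise a real number in [-1, 1) to Q15: 16-bit signed fixed
    point with 15 fractional bits, saturating at the type limits."""
    return max(-32768, min(32767, int(round(x * 32768.0))))

def q15_mul(a, b):
    """Q15 multiply: the 30-fractional-bit product is rounded back to
    15 fractional bits (add half an LSB, then arithmetic shift)."""
    return (a * b + (1 << 14)) >> 15

half = to_q15(0.5)                 # 16384
quarter = q15_mul(half, half)      # 0.5 * 0.5 in Q15
err = abs(quarter / 32768.0 - 0.25)
```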

  13. Modelling estimation and analysis of dynamic processes from image sequences using temporal random closed sets and point processes with application to the cell exocytosis and endocytosis

    OpenAIRE

    Díaz Fernández, Ester

    2010-01-01

In this thesis, new models and methodologies are introduced for the analysis of dynamic processes characterized by image sequences with spatio-temporal overlap. Such spatio-temporal overlap exists in many natural phenomena and should be addressed properly in several scientific disciplines such as Microscopy, Material Sciences, Biology, Geostatistics or Communication Networks. This work is related to the Point Process and Random Closed Set theories, within Stochastic Ge...

  14. Bayesian analysis of spatial point processes in the neighbourhood of Voronoi networks

    DEFF Research Database (Denmark)

    Skare, Øivind; Møller, Jesper; Jensen, Eva Bjørn Vedel

    2007-01-01

    A model for an inhomogeneous Poisson process with high intensity near the edges of a Voronoi tessellation in 2D or 3D is proposed. The model is analysed in a Bayesian setting with priors on nuclei of the Voronoi tessellation and other model parameters. An MCMC algorithm is constructed to sample...

  15. Bayesian analysis of spatial point processes in the neighbourhood of Voronoi networks

    DEFF Research Database (Denmark)

    Skare, Øivind; Møller, Jesper; Vedel Jensen, Eva B.

    A model for an inhomogeneous Poisson process with high intensity near the edges of a Voronoi tessellation in 2D or 3D is proposed. The model is analysed in a Bayesian setting with priors on nuclei of the Voronoi tessellation and other model parameters. An MCMC algorithm is constructed to sample...

  16. Comparison of second-generation processes for the conversion of sugarcane bagasse to liquid biofuels in terms of energy efficiency, pinch point analysis and Life Cycle Analysis

    International Nuclear Information System (INIS)

    Petersen, A.M.; Melamu, Rethabi; Knoetze, J.H.; Görgens, J.F.

    2015-01-01

    Highlights: • Process evaluation of thermochemical and biological routes for bagasse to fuels. • Pinch point analysis increases overall efficiencies by reducing utility consumption. • Advanced biological route increased efficiency and local environmental impacts. • Thermochemical routes have the highest efficiencies and low life cycle impacts. - Abstract: Three alternative processes for the production of liquid transportation biofuels from sugar cane bagasse were compared from the perspective of energy efficiency, using process modelling, Process Environmental Assessments and Life Cycle Assessment. Bio-ethanol via two biological processes was considered, i.e. Separate Hydrolysis and Fermentation (Process 1) and Simultaneous Saccharification and Fermentation (Process 2), in comparison to Gasification and Fischer-Tropsch synthesis for the production of synthetic fuels (Process 3). The energy efficiency of each process scenario was maximised by pinch point analysis for heat integration. The more advanced bio-ethanol process was Process 2, with a higher energy efficiency of 42.3%. Heat integration was critical for Process 3, whereby the energy efficiency was increased from 51.6% to 55.7%. For both the Process Environmental Assessment and Life Cycle Assessment, Process 3 had the least potential for detrimental environmental impacts, due to its relatively high energy efficiency. Process 2 had the greatest Process Environmental Impact due to the intensive use of processing chemicals. Regarding the Life Cycle Assessments, Process 1 was the most severe due to its low energy efficiency

  17. Modeling fixation locations using spatial point processes.

    Science.gov (United States)

    Barthelmé, Simon; Trukenbrod, Hans; Engbert, Ralf; Wichmann, Felix

    2013-10-01

    Whenever eye movements are measured, a central part of the analysis has to do with where subjects fixate and why they fixated where they fixated. To a first approximation, a set of fixations can be viewed as a set of points in space; this implies that fixations are spatial data and that the analysis of fixation locations can be beneficially thought of as a spatial statistics problem. We argue that thinking of fixation locations as arising from point processes is a very fruitful framework for eye-movement data, helping turn qualitative questions into quantitative ones. We provide a tutorial introduction to some of the main ideas of the field of spatial statistics, focusing especially on spatial Poisson processes. We show how point processes help relate image properties to fixation locations. In particular we show how point processes naturally express the idea that image features' predictability for fixations may vary from one image to another. We review other methods of analysis used in the literature, show how they relate to point process theory, and argue that thinking in terms of point processes substantially extends the range of analyses that can be performed and clarifies their interpretation.
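The inhomogeneous spatial Poisson model advocated here is easy to experiment with via Lewis–Shedler thinning: simulate a homogeneous Poisson process at the maximum rate, then keep each point with probability proportional to the local intensity. Below is a minimal sketch on the unit square; the toy "salience map" `lam` and all names are illustrative assumptions, not the authors' code.

```python
import math
import random

def poisson_draw(mu, rng):
    """Draw from Poisson(mu) by Knuth's multiplication method
    (fine for moderate mu)."""
    L = math.exp(-mu)
    k, p = 0, 1.0
    while p > L:
        k += 1
        p *= rng.random()
    return k - 1

def sample_inhomogeneous_poisson(lam, lam_max, rng):
    """Lewis-Shedler thinning on the unit square: propose points from a
    homogeneous Poisson process with rate lam_max, retain each with
    probability lam(x, y) / lam_max."""
    n = poisson_draw(lam_max, rng)
    proposals = [(rng.random(), rng.random()) for _ in range(n)]
    return [(x, y) for (x, y) in proposals if rng.random() < lam(x, y) / lam_max]

rng = random.Random(42)
lam = lambda x, y: 100.0 * x        # toy "salience" map: denser to the right
fixations = sample_inhomogeneous_poisson(lam, 100.0, rng)
# Under this intensity the mean x-coordinate is 2/3 in expectation,
# well right of the uniform value 0.5.
mean_x = sum(x for x, _ in fixations) / len(fixations)
```

Replacing `lam` with an image-based salience map gives exactly the "image properties drive fixation locations" setup the abstract describes.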

  18. Processing Terrain Point Cloud Data

    KAUST Repository

    DeVore, Ronald

    2013-01-10

    Terrain point cloud data are typically acquired through some form of Light Detection And Ranging sensing. They form a rich resource that is important in a variety of applications including navigation, line of sight, and terrain visualization. Processing terrain data has not received the attention of other forms of surface reconstruction or of image processing. The goal of terrain data processing is to convert the point cloud into a succinct representation system that is amenable to the various application demands. The present paper presents a platform for terrain processing built on the following principles: (i) measuring distortion in the Hausdorff metric, which we argue is a good match for the application demands, (ii) a multiscale representation based on tree approximation using local polynomial fitting. The basic elements held in the nodes of the tree can be efficiently encoded, transmitted, visualized, and utilized for the various target applications. Several challenges emerge because of the variable resolution of the data, missing data, occlusions, and noise. Techniques for identifying and handling these challenges are developed. © 2013 Society for Industrial and Applied Mathematics.
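Principle (i), measuring distortion in the Hausdorff metric, can be illustrated for finite point sets. This is a brute-force sketch: the names are illustrative, and a real terrain pipeline would use spatial indexing rather than the O(|A|·|B|) double loop.

```python
def hausdorff(A, B):
    """Symmetric Hausdorff distance between two finite 2-D point sets:
    the largest distance from a point in one set to its nearest
    neighbour in the other."""
    def d(p, q):
        return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5
    def directed(S, T):
        return max(min(d(p, q) for q in T) for p in S)
    return max(directed(A, B), directed(B, A))

# A coarse "approximation" that is everywhere within 0.1 of the
# original samples has Hausdorff distortion 0.1.
terrain = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
approx = [(0.0, 0.1), (1.0, 0.0), (2.0, -0.1)]
dist = hausdorff(terrain, approx)
```

The appeal for terrain data is that this metric bounds the worst-case surface error, which is what navigation and line-of-sight applications care about.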

  19. Parametric statistical change point analysis

    CERN Document Server

    Chen, Jie

    2000-01-01

    This work is an in-depth study of the change point problem from a general point of view and a further examination of change point analysis of the most commonly used statistical models. Change point problems are encountered in such disciplines as economics, finance, medicine, psychology, signal processing, and geology, to mention only several. The exposition is clear and systematic, with a great deal of introductory material included. Different models are presented in each chapter, including gamma and exponential models, rarely examined thus far in the literature. Other models covered in detail are the multivariate normal, univariate normal, regression, and discrete models. Extensive examples throughout the text emphasize key concepts, and different methodologies are used, namely the likelihood ratio criterion, and the Bayesian and information criterion approaches. A comprehensive bibliography and two indices complete the study.

  20. Processing Terrain Point Cloud Data

    KAUST Repository

    DeVore, Ronald; Petrova, Guergana; Hielsberg, Matthew; Owens, Luke; Clack, Billy; Sood, Alok

    2013-01-01

    Terrain point cloud data are typically acquired through some form of Light Detection And Ranging sensing. They form a rich resource that is important in a variety of applications including navigation, line of sight, and terrain visualization

  1. Inhomogeneous Markov point processes by transformation

    DEFF Research Database (Denmark)

    Jensen, Eva B. Vedel; Nielsen, Linda Stougaard

    2000-01-01

    We construct parametrized models for point processes, allowing for both inhomogeneity and interaction. The inhomogeneity is obtained by applying parametrized transformations to homogeneous Markov point processes. An interesting model class, which can be constructed by this transformation approach......, is that of exponential inhomogeneous Markov point processes. Statistical inference For such processes is discussed in some detail....

  2. Testing Local Independence between Two Point Processes

    DEFF Research Database (Denmark)

    Allard, Denis; Brix, Anders; Chadæuf, Joël

    2001-01-01

    Independence test, Inhomogeneous point processes, Local test, Monte Carlo, Nonstationary, Rotations, Spatial pattern, Tiger bush

  3. Hygienic-sanitary working practices and implementation of a Hazard Analysis and Critical Control Point (HACCP) plan in lobster processing industries

    Directory of Open Access Journals (Sweden)

    Cristina Farias da Fonseca

    2013-03-01

    This study aimed to verify the hygienic-sanitary working practices and to create and implement a Hazard Analysis and Critical Control Point (HACCP) plan in two lobster processing industries in Pernambuco State, Brazil. The industries studied process frozen whole lobsters, frozen whole cooked lobsters, and frozen lobster tails for exportation. The application of the hygienic-sanitary checklist in the industries analyzed achieved conformity rates over 96% for the aspects evaluated. The use of the Hazard Analysis and Critical Control Point (HACCP) plan resulted in the detection of two critical control points (CCPs), namely the receiving and classification steps in the processing of frozen lobster and frozen lobster tails, and an additional critical control point (CCP) was detected during the cooking step of processing of the whole frozen cooked lobster. The proper implementation of the Hazard Analysis and Critical Control Point (HACCP) plan in the lobster processing industries studied proved to be the safest and most cost-effective method to monitor the hazards at each critical control point (CCP).

  4. Parametric methods for spatial point processes

    DEFF Research Database (Denmark)

    Møller, Jesper

    (This text is submitted for the volume ‘A Handbook of Spatial Statistics' edited by A.E. Gelfand, P. Diggle, M. Fuentes, and P. Guttorp, to be published by Chapman and Hall/CRC Press, and planned to appear as Chapter 4.4 with the title ‘Parametric methods'.) 1 Introduction This chapter considers inference procedures for parametric spatial point process models. The widespread use of sensible but ad hoc methods based on summary statistics of the kind studied in Chapter 4.3 has through the last two decades been supplemented by likelihood-based methods for parametric spatial point process models...... is studied in Section 4, and Bayesian inference in Section 5. On one hand, as the development in computer technology and computational statistics continues, computationally-intensive simulation-based methods for likelihood inference probably will play an increasing role for statistical analysis of spatial...

  5. Lévy based Cox point processes

    DEFF Research Database (Denmark)

    Hellmund, Gunnar; Prokesová, Michaela; Jensen, Eva Bjørn Vedel

    2008-01-01

    In this paper we introduce Lévy-driven Cox point processes (LCPs) as Cox point processes with driving intensity function Λ defined by a kernel smoothing of a Lévy basis (an independently scattered, infinitely divisible random measure). We also consider log Lévy-driven Cox point processes (LLCPs......) with Λ equal to the exponential of such a kernel smoothing. Special cases are shot noise Cox processes, log Gaussian Cox processes, and log shot noise Cox processes. We study the theoretical properties of Lévy-based Cox processes, including moment properties described by nth-order product densities...

  6. State estimation for temporal point processes

    NARCIS (Netherlands)

    van Lieshout, Maria Nicolette Margaretha

    2015-01-01

    This paper is concerned with combined inference for point processes on the real line observed in a broken interval. For such processes, the classic history-based approach cannot be used. Instead, we adapt tools from sequential spatial point processes. For a range of models, the marginal and

  7. Poisson point processes imaging, tracking, and sensing

    CERN Document Server

    Streit, Roy L

    2010-01-01

    This overview of non-homogeneous and multidimensional Poisson point processes and their applications features mathematical tools and applications from emission- and transmission-computed tomography to multiple target tracking and distributed sensor detection.

  8. Statistical aspects of determinantal point processes

    DEFF Research Database (Denmark)

    Lavancier, Frédéric; Møller, Jesper; Rubak, Ege

    The statistical aspects of determinantal point processes (DPPs) seem largely unexplored. We review the appealing properties of DPPs, demonstrate that they are useful models for repulsiveness, detail a simulation procedure, and provide freely available software for simulation and statistical infer...

  9. Point Information Gain and Multidimensional Data Analysis

    Directory of Open Access Journals (Sweden)

    Renata Rychtáriková

    2016-10-01

    We generalize the point information gain (PIG) and derived quantities, i.e., point information gain entropy (PIE) and point information gain entropy density (PIED), for the case of the Rényi entropy and simulate the behavior of PIG for typical distributions. We also use these methods for the analysis of multidimensional datasets. We demonstrate the main properties of PIE/PIED spectra for real data with the examples of several images and discuss further possible utilizations in other fields of data processing.
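The Rényi entropy underlying the PIE/PIED quantities follows directly from its definition, H_α = (1 − α)⁻¹ log Σᵢ pᵢ^α, with the Shannon entropy recovered in the limit α → 1. The sketch below is a minimal illustration; the example distributions and tolerances are invented, not taken from the article.

```python
import math

def renyi_entropy(p, alpha):
    """Renyi entropy H_alpha of a discrete distribution p (natural log).
    The alpha = 1 case is the Shannon limit, handled separately."""
    if abs(alpha - 1.0) < 1e-12:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return math.log(sum(pi ** alpha for pi in p if pi > 0)) / (1.0 - alpha)

uniform = [0.25] * 4
peaked = [0.85, 0.05, 0.05, 0.05]
# The uniform distribution maximises H_alpha at every order alpha,
# and for uniform p the value is log(4) regardless of alpha.
for a in (0.5, 1.0, 2.0):
    assert renyi_entropy(uniform, a) > renyi_entropy(peaked, a)
```

Varying α reweights rare versus frequent outcomes, which is what lets PIG-style spectra discriminate distributions that share the same Shannon entropy.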

  10. Modern Statistics for Spatial Point Processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Waagepetersen, Rasmus

    2007-01-01

    We summarize and discuss the current state of spatial point process theory and directions for future research, making an analogy with generalized linear models and random effect models, and illustrating the theory with various examples of applications. In particular, we consider Poisson, Gibbs...

  11. Modern statistics for spatial point processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Waagepetersen, Rasmus

    We summarize and discuss the current state of spatial point process theory and directions for future research, making an analogy with generalized linear models and random effect models, and illustrating the theory with various examples of applications. In particular, we consider Poisson, Gibbs...

  12. Stochastic analysis for Poisson point processes Malliavin calculus, Wiener-Itô chaos expansions and stochastic geometry

    CERN Document Server

    Peccati, Giovanni

    2016-01-01

    Stochastic geometry is the branch of mathematics that studies geometric structures associated with random configurations, such as random graphs, tilings and mosaics. Due to its close ties with stereology and spatial statistics, the results in this area are relevant for a large number of important applications, e.g. to the mathematical modeling and statistical analysis of telecommunication networks, geostatistics and image analysis. In recent years – due mainly to the impetus of the authors and their collaborators – a powerful connection has been established between stochastic geometry and the Malliavin calculus of variations, which is a collection of probabilistic techniques based on the properties of infinite-dimensional differential operators. This has led in particular to the discovery of a large number of new quantitative limit theorems for high-dimensional geometric objects. This unique book presents an organic collection of authoritative surveys written by the principal actors in this rapidly evolvi...

  13. Extreme values, regular variation and point processes

    CERN Document Server

    Resnick, Sidney I

    1987-01-01

    Extreme Values, Regular Variation and Point Processes is a readable and efficient account of the fundamental mathematical and stochastic process techniques needed to study the behavior of extreme values of phenomena based on independent and identically distributed random variables and vectors. It presents a coherent treatment of the distributional and sample path fundamental properties of extremes and records. It emphasizes the core primacy of three topics necessary for understanding extremes: the analytical theory of regularly varying functions; the probabilistic theory of point processes and random measures; and the link to asymptotic distribution approximations provided by the theory of weak convergence of probability measures in metric spaces. The book is self-contained and requires an introductory measure-theoretic course in probability as a prerequisite. Almost all sections have an extensive list of exercises which extend developments in the text, offer alternate approaches, test mastery and provide for enj...

  14. Determinantal point process models on the sphere

    DEFF Research Database (Denmark)

    Møller, Jesper; Nielsen, Morten; Porcu, Emilio

    We consider determinantal point processes on the d-dimensional unit sphere Sd. These are finite point processes exhibiting repulsiveness and with moment properties determined by a certain determinant whose entries are specified by a so-called kernel, which we assume is a complex covariance function defined on Sd × Sd. We review the appealing properties of such processes, including their specific moment properties, density expressions and simulation procedures. Particularly, we characterize and construct isotropic DPP models on Sd, where it becomes essential to specify the eigenvalues and eigenfunctions in a spectral representation for the kernel, and we figure out how repulsive isotropic DPPs can be. Moreover, we discuss the shortcomings of adapting existing models for isotropic covariance functions and consider strategies for developing new models, including a useful spectral approach....

  15. Estimating Function Approaches for Spatial Point Processes

    Science.gov (United States)

    Deng, Chong

    Spatial point pattern data consist of locations of events that are often of interest in biological and ecological studies. Such data are commonly viewed as a realization from a stochastic process called a spatial point process. To fit a parametric spatial point process model to such data, likelihood-based methods have been widely studied. However, while maximum likelihood estimation is often too computationally intensive for Cox and cluster processes, pairwise likelihood methods such as composite likelihood and Palm likelihood usually suffer from a loss of information due to ignoring the correlation among pairs. For many types of correlated data other than spatial point processes, when likelihood-based approaches are not desirable, estimating functions have been widely used for model fitting. In this dissertation, we explore estimating function approaches for fitting spatial point process models. These approaches, which are based on asymptotically optimal estimating function theories, can be used to incorporate the correlation among data and yield more efficient estimators. We conducted a series of studies to demonstrate that these estimating function approaches are good alternatives to balance the trade-off between computational complexity and estimation efficiency. First, we propose a new estimating procedure that improves the efficiency of the pairwise composite likelihood method in estimating clustering parameters. Our approach combines estimating functions derived from pairwise composite likelihood estimation and estimating functions that account for correlations among the pairwise contributions. Our method can be used to fit a variety of parametric spatial point process models and can yield more efficient estimators for the clustering parameters than pairwise composite likelihood estimation. We demonstrate its efficacy through a simulation study and an application to the longleaf pine data. Second, we further explore the quasi-likelihood approach on fitting

  16. Food safety and nutritional quality for the prevention of non communicable diseases: the Nutrient, hazard Analysis and Critical Control Point process (NACCP).

    Science.gov (United States)

    Di Renzo, Laura; Colica, Carmen; Carraro, Alberto; Cenci Goga, Beniamino; Marsella, Luigi Tonino; Botta, Roberto; Colombo, Maria Laura; Gratteri, Santo; Chang, Ting Fa Margherita; Droli, Maurizio; Sarlo, Francesca; De Lorenzo, Antonino

    2015-04-23

    The important role of food and nutrition in public health is being increasingly recognized as crucial for its potential impact on health-related quality of life and the economy, both at the societal and individual levels. The prevalence of non-communicable diseases calls for a reformulation of our view of food. The Hazard Analysis and Critical Control Point (HACCP) system, first implemented in the EU with the Directive 43/93/CEE, later replaced by Regulation CE 178/2002 and Regulation CE 852/2004, is the internationally agreed approach for food safety control. Our aim is to develop a new procedure for the assessment of the Nutrient, hazard Analysis and Critical Control Point (NACCP) process, for total quality management (TQM), and to optimize nutritional levels. NACCP was based on four general principles: i) guarantee of health maintenance; ii) evaluate and assure the nutritional quality of food and TQM; iii) give correct information to the consumers; iv) ensure an ethical profit. There are three stages for the application of the NACCP process: 1) application of NACCP for quality principles; 2) application of NACCP for health principles; 3) implementation of the NACCP process. The actions are: 1) identification of nutritional markers, which must remain intact throughout the food supply chain; 2) identification of critical control points, which must be monitored in order to minimize the likelihood of a reduction in quality; 3) establishment of critical limits to maintain adequate levels of nutrients; 4) establishment and implementation of effective monitoring procedures of critical control points; 5) establishment of corrective actions; 6) identification of metabolic biomarkers; 7) evaluation of the effects of food intake, through the application of specific clinical trials; 8) establishment of procedures for consumer information; 9) implementation of the Health claim Regulation EU 1924/2006; 10) starting a training program. We calculate the risk assessment as follows

  17. Point cloud processing for smart systems

    Directory of Open Access Journals (Sweden)

    Jaromír Landa

    2013-01-01

    High population as well as economic tension emphasises the necessity of effective city management – from land use planning to urban green maintenance. The management effectiveness is based on precise knowledge of the city environment. Point clouds generated by mobile and terrestrial laser scanners provide precise data about objects in the scanner vicinity. From these data the state of the roads, buildings, trees and other objects important for the decision-making process can be obtained. Generally, they can support the idea of "smart" or at least "smarter" cities. Unfortunately, the point clouds do not provide this type of information automatically. It has to be extracted. This extraction is done by expert personnel or by object recognition software. As the point clouds can represent large areas (streets or even cities), usage of expert personnel to identify the required objects can be very time-consuming, and therefore cost-ineffective. Object recognition software allows us to detect and identify required objects semi-automatically or automatically. The first part of the article reviews and analyses the current state of the art in point cloud object recognition techniques. The following part presents common formats used for point cloud storage and frequently used software tools for point cloud processing. Further, a method for extraction of geospatial information about detected objects is proposed. Therefore, the method can be used not only to recognize the existence and shape of certain objects, but also to retrieve their geospatial properties. These objects can later be directly used in various GIS systems for further analyses.

  18. Analysis of irregularly distributed points

    DEFF Research Database (Denmark)

    Hartelius, Karsten

    1996-01-01

    conditional modes are applied to this problem. The Kalman filter is described as a powerful tool for modelling two-dimensional data. Motivated by the development of the reduced update Kalman filter we propose a reduced update Kalman smoother which offers considerable computational savings. Kriging...... on hybridisation analysis, which comprises matching a grid to an arrayed set of DNA clones spotted onto a hybridisation filter. The line process has proven to perform a satisfactory modelling of shifted fields (subgrids) in the hybridisation grid, and a two-staged hierarchical grid matching scheme which...

  19. Statistical aspects of determinantal point processes

    DEFF Research Database (Denmark)

    Lavancier, Frédéric; Møller, Jesper; Rubak, Ege Holger

    The statistical aspects of determinantal point processes (DPPs) seem largely unexplored. We review the appealing properties of DPPs, demonstrate that they are useful models for repulsiveness, detail a simulation procedure, and provide freely available software for simulation and statistical...... inference. We pay special attention to stationary DPPs, where we give a simple condition ensuring their existence, construct parametric models, describe how they can be well approximated so that the likelihood can be evaluated and realizations can be simulated, and discuss how statistical inference...

  20. Transforming spatial point processes into Poisson processes using random superposition

    DEFF Research Database (Denmark)

    Møller, Jesper; Berthelsen, Kasper Klitgaard

    with a complementary spatial point process Y  to obtain a Poisson process X∪Y  with intensity function β. Underlying this is a bivariate spatial birth-death process (Xt,Yt) which converges towards the distribution of (X,Y). We study the joint distribution of X and Y, and their marginal and conditional distributions....... In particular, we introduce a fast and easy simulation procedure for Y conditional on X. This may be used for model checking: given a model for the Papangelou intensity of the original spatial point process, this model is used to generate the complementary process, and the resulting superposition is a Poisson...... process with intensity function β if and only if the true Papangelou intensity is used. Whether the superposition is actually such a Poisson process can easily be examined using well known results and fast simulation procedures for Poisson processes. We illustrate this approach to model checking...

  1. [Design of a Hazard Analysis and Critical Control Points (HACCP) plan to assure the safety of a bologna product produced by a meat processing plant].

    Science.gov (United States)

    Bou Rached, Lizet; Ascanio, Norelis; Hernández, Pilar

    2004-03-01

    The Hazard Analysis and Critical Control Point (HACCP) is a systematic integral program used to identify and estimate the hazards (microbiological, chemical and physical) and the risks generated during the primary production, processing, storage, distribution, sale and consumption of foods. Establishing an HACCP program has several advantages, among them: it emphasizes prevention rather than detection, diminishes costs, minimizes the risk of manufacturing faulty products, allows greater confidence for management, and strengthens national and international competitiveness. The present work is a proposal based on the design of an HACCP program to guarantee the safety of the Special Type Bologna elaborated by a meat products industry, through the determination of hazards (microbiological, chemical or physical), the identification of critical control points (CCP), the establishment of critical limits, the planning of corrective actions, and the establishment of documentation and verification procedures. The methodology used was based on the application of the seven basic principles established by the Codex Alimentarius, obtaining the design of this program. In view of the fact that meat products have recently been linked with pathogens like E. coli O157:H7 and Listeria monocytogenes, these were contemplated as microbiological hazards for the establishment of the HACCP plan, whose application will guarantee the obtaining of a safe product.

  2. Post-Processing in the Material-Point Method

    DEFF Research Database (Denmark)

    Andersen, Søren; Andersen, Lars Vabbersgaard

    The material-point method (MPM) is a numerical method for dynamic or static analysis of solids using a discretization in time and space. The method has been shown to be successful in modelling physical problems involving large deformations, which are difficult to model with traditional numerical tools such as the finite element method. In the material-point method, a set of material points is utilized to track the problem in time and space, while a computational background grid is utilized to obtain spatial derivatives relevant to the physical problem. Currently, the research within the material-point method...... The first idea involves associating a volume with each material point and displaying the deformation of this volume. In the discretization process, the physical domain is divided into a number of smaller volumes each represented by a simple shape; here quadrilaterals are chosen for the presented...

  3. Risk analysis of hematopoietic stem cell transplant process: failure mode, effect, and criticality analysis and hazard analysis critical control point methods integration based on guidelines to good manufacturing practice for medicinal product ANNEX 20 (February 2008).

    Science.gov (United States)

    Gianassi, S; Bisin, S; Bindi, B; Spitaleri, I; Bambi, F

    2010-01-01

    The collection and handling of hematopoietic stem cells (HSCs) must meet high quality requirements. An integrated Quality Risk Management can help to identify and contain potential risks related to HSC production. Risk analysis techniques allow one to "weigh" identified hazards, considering the seriousness of their effects, frequency, and detectability, seeking to prevent the most harmful hazards. The Hazard Analysis and Critical Control Point method, recognized as the most appropriate technique to identify risks associated with physical, chemical, and biological hazards for cellular products, consists of classifying finished product specifications and limits of acceptability, identifying all off-specifications, defining activities that can cause them, and finally establishing both a monitoring system for each Critical Control Point and corrective actions for deviations. The severity of possible effects on patients, as well as the occurrence and detectability of critical parameters, are measured on quantitative scales (Risk Priority Number [RPN]). Risk analysis was performed with this technique on the manipulation process of HPC performed at our blood center. The data analysis showed that the hazards with higher RPN values and greater impact on the process were loss of dose and tracking; the technical skills of operators and manual transcription of data were the most critical parameters. Problems related to operator skills are handled by defining targeted training programs, while other critical parameters can be mitigated with the use of continuous control systems. The blood center management software was completed by a labeling system with forms designed to be in compliance with standards in force and by starting implementation of a cryopreservation management module. Copyright 2010 Elsevier Inc. All rights reserved.
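The RPN scoring used in this FMECA/HACCP integration is simply the product of severity, occurrence, and detectability scores, with the highest-RPN hazards addressed first. The sketch below illustrates the computation; the hazard names echo the abstract, but the scores and scales are invented for illustration and are not from the paper.

```python
def rpn(severity, occurrence, detectability):
    """Risk Priority Number: the product of the three scores (here on
    1-10 scales) used in failure mode, effect, and criticality
    analysis (FMECA)."""
    return severity * occurrence * detectability

# Hypothetical hazards and scores for illustration only.
hazards = {
    "loss of dose": rpn(9, 4, 6),
    "tracking error": rpn(8, 5, 5),
    "labelling slip": rpn(6, 3, 3),
}
# Rank hazards so corrective actions target the highest RPN first.
ranked = sorted(hazards, key=hazards.get, reverse=True)
print(ranked[0])  # loss of dose
```

Because RPN multiplies ordinal scales, two hazards with equal RPN can have very different risk profiles, which is why such rankings are usually reviewed alongside the raw severity score.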

  4. Microbiological performance of Hazard Analysis Critical Control Point (HACCP)-based food safety management systems: A case of Nile perch processing company

    NARCIS (Netherlands)

    Kussaga, J.B.; Luning, P.A.; Tiisekwa, B.P.M.; Jacxsens, L.

    2017-01-01

    This study aimed at giving insight into microbiological safety output of a Hazard Analysis Critical Control Point (HACCP)-based Food Safety Management System (FSMS) of a Nile perch exporting company by using a combined assessment, This study aimed at giving insight into microbiological safety output

  5. Critical point analysis of phase envelope diagram

    Energy Technology Data Exchange (ETDEWEB)

    Soetikno, Darmadi; Siagian, Ucok W. R. [Department of Petroleum Engineering, Institut Teknologi Bandung, Jl. Ganesha 10, Bandung 40132 (Indonesia); Kusdiantara, Rudy, E-mail: rkusdiantara@s.itb.ac.id; Puspita, Dila, E-mail: rkusdiantara@s.itb.ac.id; Sidarto, Kuntjoro A., E-mail: rkusdiantara@s.itb.ac.id; Soewono, Edy; Gunawan, Agus Y. [Department of Mathematics, Institut Teknologi Bandung, Jl. Ganesha 10, Bandung 40132 (Indonesia)

    2014-03-24

    Phase diagram or phase envelope is a relation between temperature and pressure that shows the condition of equilibria between the different phases of chemical compounds, mixtures of compounds, and solutions. Phase diagram is an important issue in chemical thermodynamics and hydrocarbon reservoirs. It is very useful for process simulation, hydrocarbon reactor design, and petroleum engineering studies. It is constructed from the bubble line, dew line, and critical point. Bubble line and dew line are composed of bubble points and dew points, respectively. Bubble point is the first point at which the gas is formed when a liquid is heated. Meanwhile, dew point is the first point where the liquid is formed when the gas is cooled. Critical point is the point where all of the properties of gases and liquids are equal, such as temperature, pressure, amount of substance, and others. Critical point is very useful in fuel processing and dissolution of certain chemicals. Here in this paper, we will show the critical point analytically. Then, it is compared with numerical calculations of the Peng-Robinson equation using the Newton-Raphson method. As case studies, several hydrocarbon mixtures are simulated using Matlab.
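
    The Newton-Raphson step the abstract mentions can be sketched on the Peng-Robinson equation of state written as a cubic in the compressibility factor Z. The dimensionless parameters A and B below are illustrative placeholders, not values for any mixture studied in the paper.

```python
# Peng-Robinson cubic in Z: Z^3 - (1-B)Z^2 + (A-3B^2-2B)Z - (AB-B^2-B^3) = 0.
# A and B are hypothetical dimensionless EOS parameters for illustration.
A, B = 0.5, 0.1

def f(Z):
    return Z**3 - (1 - B) * Z**2 + (A - 3 * B**2 - 2 * B) * Z - (A * B - B**2 - B**3)

def df(Z):
    return 3 * Z**2 - 2 * (1 - B) * Z + (A - 3 * B**2 - 2 * B)

def newton(z0, tol=1e-12, max_iter=100):
    """Plain Newton-Raphson iteration on f, starting from z0."""
    z = z0
    for _ in range(max_iter):
        step = f(z) / df(z)
        z -= step
        if abs(step) < tol:
            return z
    raise RuntimeError("Newton-Raphson did not converge")

z_root = newton(1.0)   # start from the ideal-gas value Z = 1
```

    Starting from Z = 1 the iteration converges to the single real root of this particular cubic; in a full phase-envelope calculation one would repeat this across compositions, temperatures, and pressures.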

  6. Critical point analysis of phase envelope diagram

    International Nuclear Information System (INIS)

    Soetikno, Darmadi; Siagian, Ucok W. R.; Kusdiantara, Rudy; Puspita, Dila; Sidarto, Kuntjoro A.; Soewono, Edy; Gunawan, Agus Y.

    2014-01-01

    Phase diagram or phase envelope is a relation between temperature and pressure that shows the condition of equilibria between the different phases of chemical compounds, mixtures of compounds, and solutions. Phase diagram is an important issue in chemical thermodynamics and hydrocarbon reservoirs. It is very useful for process simulation, hydrocarbon reactor design, and petroleum engineering studies. It is constructed from the bubble line, dew line, and critical point. Bubble line and dew line are composed of bubble points and dew points, respectively. Bubble point is the first point at which the gas is formed when a liquid is heated. Meanwhile, dew point is the first point where the liquid is formed when the gas is cooled. Critical point is the point where all of the properties of gases and liquids are equal, such as temperature, pressure, amount of substance, and others. Critical point is very useful in fuel processing and dissolution of certain chemicals. Here in this paper, we will show the critical point analytically. Then, it is compared with numerical calculations of the Peng-Robinson equation using the Newton-Raphson method. As case studies, several hydrocarbon mixtures are simulated using Matlab.

  7. Some properties of point processes in statistical optics

    International Nuclear Information System (INIS)

    Picinbono, B.; Bendjaballah, C.

    2010-01-01

    The analysis of the statistical properties of the point process (PP) of photon detection times can be used to determine whether or not an optical field is classical, in the sense that its statistical description does not require the methods of quantum optics. This determination is, however, more difficult than ordinarily admitted and the first aim of this paper is to illustrate this point by using some results of the PP theory. For example, it is well known that the analysis of the photodetection of classical fields exhibits the so-called bunching effect. But this property alone cannot be used to decide the nature of a given optical field. Indeed, we have presented examples of point processes for which a bunching effect appears and yet they cannot be obtained from a classical field. These examples are illustrated by computer simulations. Similarly, it is often admitted that for fields with very low light intensity the bunching or antibunching can be described by using the statistical properties of the distance between successive events of the point process, which simplifies the experimental procedure. We have shown that, while this property is valid for classical PPs, it has no reason to be true for nonclassical PPs, and we have presented some examples of this situation also illustrated by computer simulations.

  8. Some probabilistic properties of fractional point processes

    KAUST Repository

    Garra, Roberto

    2017-05-16

    In this article, the first hitting times of generalized Poisson processes N^f(t), related to Bernstein functions f, are studied. For the space-fractional Poisson processes N^α(t), t > 0 (corresponding to f = x^α), the hitting probabilities P{T_k^α < ∞} are explicitly obtained and analyzed. The processes N^f(t) are time-changed Poisson processes N(H^f(t)) with subordinators H^f(t), and here we study N(∑_{j=1}^n H^{f_j}(t)) and obtain probabilistic features of these extended counting processes. A section of the paper is devoted to processes of the form N(G_{H,ν}(t)), where G_{H,ν}(t) are generalized grey Brownian motions. This involves the theory of time-dependent fractional operators of the McBride form. While the time-fractional Poisson process is a renewal process, we prove that the space-time Poisson process is no longer a renewal process.

  9. A logistic regression estimating function for spatial Gibbs point processes

    DEFF Research Database (Denmark)

    Baddeley, Adrian; Coeurjolly, Jean-François; Rubak, Ege

    We propose a computationally efficient logistic regression estimating function for spatial Gibbs point processes. The sample points for the logistic regression consist of the observed point pattern together with a random pattern of dummy points. The estimating function is closely related to the p...

  10. Some probabilistic properties of fractional point processes

    KAUST Repository

    Garra, Roberto; Orsingher, Enzo; Scavino, Marco

    2017-01-01

    P{T_k^α < ∞} are explicitly obtained and analyzed. The processes N^f(t) are time-changed Poisson processes N(H^f(t)) with subordinators H^f(t), and here we study N(∑_{j=1}^n H^{f_j}(t)) and obtain probabilistic features

  11. Intensity-dependent point spread image processing

    International Nuclear Information System (INIS)

    Cornsweet, T.N.; Yellott, J.I.

    1984-01-01

    There is ample anatomical, physiological and psychophysical evidence that the mammalian retina contains networks that mediate interactions among neighboring receptors, resulting in interesting transformations between input images and their corresponding neural output patterns. The almost universally accepted view is that the principal form of interaction involves lateral inhibition, resulting in an output pattern that is the convolution of the input with a "Mexican hat" or difference-of-Gaussians spread function, having a positive center and a negative surround. A closely related process is widely applied in digital image processing, and in photography as "unsharp masking". The authors show that a simple and fundamentally different process, involving no inhibitory or subtractive terms, can also account for the physiological and psychophysical findings that have been attributed to lateral inhibition. This process also results in a number of fundamental effects that occur in mammalian vision and that would be of considerable significance in robotic vision, but which cannot be explained by lateral inhibitory interaction.
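
    The difference-of-Gaussians convolution that this abstract contrasts its model against is easy to illustrate. A minimal 1-D NumPy sketch (kernel size and sigmas are arbitrary choices, not values from the paper) shows the characteristic overshoot/undershoot a "Mexican hat" filter produces at an edge:

```python
import numpy as np

def dog_kernel(size=21, sigma_c=1.0, sigma_s=3.0):
    """Difference-of-Gaussians ('Mexican hat'): a narrow positive centre
    Gaussian minus a broader negative surround Gaussian."""
    x = np.arange(size) - size // 2
    g = lambda s: np.exp(-x**2 / (2 * s**2)) / (s * np.sqrt(2 * np.pi))
    return g(sigma_c) - g(sigma_s)

# A 1-D step edge: DoG filtering yields the edge overshoot/undershoot
# classically attributed to lateral inhibition.
signal = np.r_[np.zeros(50), np.ones(50)]
response = np.convolve(signal, dog_kernel(), mode="same")
```

    Far from the edge the response is near zero (the kernel integrates to roughly zero), while at the edge it swings positive then negative.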

  12. A case study on point process modelling in disease mapping

    DEFF Research Database (Denmark)

    Møller, Jesper; Waagepetersen, Rasmus Plenge; Benes, Viktor

    2005-01-01

    of the risk on the covariates. Instead of using the common areal level approaches, we base the analysis on a Bayesian approach for a log Gaussian Cox point process with covariates. Posterior characteristics for a discretized version of the log Gaussian Cox process are computed using Markov chain Monte Carlo methods. A particular problem which is thoroughly discussed is to determine a model for the background population density. The risk map shows a clear dependency with the population intensity models, and the basic model which is adopted for the population intensity determines what covariates influence the risk of TBE. Model validation is based on the posterior predictive distribution of various summary statistics.
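
    The discretized log Gaussian Cox process underlying this analysis can be sketched in a few lines. The toy below is 1-D with an exponential covariance and no covariates, purely to show the construction (Gaussian field, exponentiated to an intensity, Poisson counts per cell); the paper's model is 2-D with covariates and MCMC-based posterior inference.

```python
import numpy as np

rng = np.random.default_rng(0)

# Discretised 1-D log Gaussian Cox process sketch (illustrative only).
n = 100
x = np.linspace(0, 1, n)

# Exponential covariance; Cholesky factor to draw the latent Gaussian field.
cov = np.exp(-np.abs(x[:, None] - x[None, :]) / 0.1)
L = np.linalg.cholesky(cov + 1e-8 * np.eye(n))
gauss = L @ rng.standard_normal(n)

intensity = np.exp(1.0 + gauss)        # log-linear intensity surface
counts = rng.poisson(intensity / n)    # Poisson counts per grid cell
```

    Conditional on the latent field the counts are independent Poisson, which is exactly the structure MCMC exploits when sampling the posterior of the discretized field.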

  13. Statistical properties of several models of fractional random point processes

    Science.gov (United States)

    Bendjaballah, C.

    2011-08-01

    Statistical properties of several models of fractional random point processes have been analyzed from the counting and time interval statistics points of view. Based on the criterion of the reduced variance, it is seen that such processes exhibit nonclassical properties. The conditions for these processes to be treated as conditional Poisson processes are examined. Numerical simulations illustrate part of the theoretical calculations.
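
    The "criterion of the reduced variance" mentioned here is the Fano factor Var(N)/E(N) of the counts: it equals 1 for Poissonian statistics and falls below 1 for sub-Poissonian (nonclassical) counting. A small Monte Carlo illustration, with a binomial count standing in for a sub-Poissonian source (the distributions are illustrative, not the paper's fractional models):

```python
import numpy as np

rng = np.random.default_rng(42)

def fano(counts):
    """Fano factor (reduced variance): Var(N)/E(N).
    1 for Poisson counting; < 1 signals sub-Poissonian statistics."""
    return counts.var() / counts.mean()

poisson_counts = rng.poisson(10.0, size=100_000)
binomial_counts = rng.binomial(20, 0.5, size=100_000)  # Fano = 1 - p = 0.5

print(fano(poisson_counts))   # close to 1
print(fano(binomial_counts))  # close to 0.5
```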

  14. Microbial profile and critical control points during processing of 'robo ...

    African Journals Online (AJOL)

    Microbial profile and critical control points during processing of 'robo' snack from ... the relevant critical control points especially in relation to raw materials and ... to the quality of the various raw ingredients used were the roasting using earthen

  15. Point processes and the position distribution of infinite boson systems

    International Nuclear Information System (INIS)

    Fichtner, K.H.; Freudenberg, W.

    1987-01-01

    It is shown that to each locally normal state of a boson system one can associate a point process that can be interpreted as the position distribution of the state. The point process contains all information one can get by position measurements and is determined by the latter. On the other hand, to each so-called Σ/sup c/-point process Q they relate a locally normal state with position distribution Q

  16. Simple computation of reaction–diffusion processes on point clouds

    KAUST Repository

    Macdonald, Colin B.; Merriman, Barry; Ruuth, Steven J.

    2013-01-01

    The study of reaction-diffusion processes is much more complicated on general curved surfaces than on standard Cartesian coordinate spaces. Here we show how to formulate and solve systems of reaction-diffusion equations on surfaces in an extremely simple way, using only the standard Cartesian form of differential operators, and a discrete unorganized point set to represent the surface. Our method decouples surface geometry from the underlying differential operators. As a consequence, it becomes possible to formulate and solve rather general reaction-diffusion equations on general surfaces without having to consider the complexities of differential geometry or sophisticated numerical analysis. To illustrate the generality of the method, computations for surface diffusion, pattern formation, excitable media, and bulk-surface coupling are provided for a variety of complex point cloud surfaces.

  17. Simple computation of reaction–diffusion processes on point clouds

    KAUST Repository

    Macdonald, Colin B.

    2013-05-20

    The study of reaction-diffusion processes is much more complicated on general curved surfaces than on standard Cartesian coordinate spaces. Here we show how to formulate and solve systems of reaction-diffusion equations on surfaces in an extremely simple way, using only the standard Cartesian form of differential operators, and a discrete unorganized point set to represent the surface. Our method decouples surface geometry from the underlying differential operators. As a consequence, it becomes possible to formulate and solve rather general reaction-diffusion equations on general surfaces without having to consider the complexities of differential geometry or sophisticated numerical analysis. To illustrate the generality of the method, computations for surface diffusion, pattern formation, excitable media, and bulk-surface coupling are provided for a variety of complex point cloud surfaces.

  18. Self-exciting point process in modeling earthquake occurrences

    International Nuclear Information System (INIS)

    Pratiwi, H.; Slamet, I.; Respatiwulan; Saputro, D. R. S.

    2017-01-01

    In this paper, we present a procedure for modeling earthquakes based on a spatial-temporal point process. The magnitude distribution is expressed as truncated exponential, and the event frequency is modeled with a spatial-temporal point process that is characterized uniquely by its associated conditional intensity process. The earthquakes can be regarded as point patterns that have a temporal clustering feature, so we use a self-exciting point process for modeling the conditional intensity function. The choice of main shocks is conducted via the window algorithm of Gardner and Knopoff, and the model can be fitted by the maximum likelihood method for three random variables. (paper)
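
    A self-exciting (Hawkes) conditional intensity of the kind used here can be simulated with Ogata's thinning algorithm. The sketch below uses an exponential excitation kernel, lambda(t) = mu + alpha * sum_i exp(-beta (t - t_i)); the parameter values are illustrative, not fitted to any earthquake catalogue.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_hawkes(mu, alpha, beta, t_max, rng):
    """Ogata thinning for a Hawkes process with exponential kernel.
    Between events the intensity is non-increasing, so the intensity at the
    current time bounds the intensity until the next event."""
    events = []
    t, lam_bar = 0.0, mu
    while True:
        t += rng.exponential(1.0 / lam_bar)      # candidate from the bound
        if t > t_max:
            return np.array(events)
        lam_t = mu + alpha * np.exp(-beta * (t - np.array(events))).sum()
        if rng.uniform() * lam_bar <= lam_t:     # thinning: accept candidate
            events.append(t)
            lam_t += alpha                       # intensity jumps at an event
        lam_bar = lam_t                          # tighten bound, continue

times = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.2, t_max=200.0, rng=rng)
```

    With alpha/beta < 1 the process is subcritical and the long-run rate is mu / (1 - alpha/beta); the simulated event times cluster visibly after each accepted point.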

  19. Interior point algorithms theory and analysis

    CERN Document Server

    Ye, Yinyu

    2011-01-01

    The first comprehensive review of the theory and practice of one of today's most powerful optimization techniques. The explosive growth of research into and development of interior point algorithms over the past two decades has significantly improved the complexity of linear programming and yielded some of today's most sophisticated computing techniques. This book offers a comprehensive and thorough treatment of the theory, analysis, and implementation of this powerful computational tool. Interior Point Algorithms provides detailed coverage of all basic and advanced aspects of the subject.

  20. A Marked Point Process Framework for Extracellular Electrical Potentials

    Directory of Open Access Journals (Sweden)

    Carlos A. Loza

    2017-12-01

    Full Text Available Neuromodulations are an important component of extracellular electrical potentials (EEP, such as the Electroencephalogram (EEG, Electrocorticogram (ECoG and Local Field Potentials (LFP. This spatially and temporally organized multi-frequency transient (phasic activity reflects the multiscale spatiotemporal synchronization of neuronal populations in response to external stimuli or internal physiological processes. We propose a novel generative statistical model of a single EEP channel, where the collected signal is regarded as the noisy addition of reoccurring, multi-frequency phasic events over time. One of the main advantages of the proposed framework is the exceptional temporal resolution in the time location of the EEP phasic events, e.g., up to the sampling period utilized in the data collection. Therefore, this allows for the first time a description of neuromodulation in EEPs as a Marked Point Process (MPP, represented by their amplitude, center frequency, duration, and time of occurrence. The generative model for the multi-frequency phasic events exploits sparseness and involves a shift-invariant implementation of the clustering technique known as k-means. The cost function incorporates a robust estimation component based on correntropy to mitigate the outliers caused by the inherent noise in the EEP. Lastly, the background EEP activity is explicitly modeled as the non-sparse component of the collected signal to further improve the delineation of the multi-frequency phasic events in time. The framework is validated using two publicly available datasets: the DREAMS sleep spindles database and one of the Brain-Computer Interface (BCI competition datasets. The results achieve benchmark performance and provide novel quantitative descriptions based on power, event rates and timing in order to assess behavioral correlates beyond the classical power spectrum-based analysis. This opens the possibility for a unifying point process framework of

  1. Non-parametric Bayesian inference for inhomogeneous Markov point processes

    DEFF Research Database (Denmark)

    Berthelsen, Kasper Klitgaard; Møller, Jesper; Johansen, Per Michael

    is a shot noise process, and the interaction function for a pair of points depends only on the distance between the two points and is a piecewise linear function modelled by a marked Poisson process. Simulation of the resulting posterior using a Metropolis-Hastings algorithm in the "conventional" way...

  2. A tutorial on Palm distributions for spatial point processes

    DEFF Research Database (Denmark)

    Coeurjolly, Jean-Francois; Møller, Jesper; Waagepetersen, Rasmus Plenge

    2017-01-01

    This tutorial provides an introduction to Palm distributions for spatial point processes. Initially, in the context of finite point processes, we give an explicit definition of Palm distributions in terms of their density functions. Then we review Palm distributions in the general case. Finally, we...

  3. Statistical representation of a spray as a point process

    International Nuclear Information System (INIS)

    Subramaniam, S.

    2000-01-01

    The statistical representation of a spray as a finite point process is investigated. One objective is to develop a better understanding of how single-point statistical information contained in descriptions such as the droplet distribution function (ddf), relates to the probability density functions (pdfs) associated with the droplets themselves. Single-point statistical information contained in the droplet distribution function (ddf) is shown to be related to a sequence of single surrogate-droplet pdfs, which are in general different from the physical single-droplet pdfs. It is shown that the ddf contains less information than the fundamental single-point statistical representation of the spray, which is also described. The analysis shows which events associated with the ensemble of spray droplets can be characterized by the ddf, and which cannot. The implications of these findings for the ddf approach to spray modeling are discussed. The results of this study also have important consequences for the initialization and evolution of direct numerical simulations (DNS) of multiphase flows, which are usually initialized on the basis of single-point statistics such as the droplet number density in physical space. If multiphase DNS are initialized in this way, this implies that even the initial representation contains certain implicit assumptions concerning the complete ensemble of realizations, which are invalid for general multiphase flows. Also the evolution of a DNS initialized in this manner is shown to be valid only if an as yet unproven commutation hypothesis holds true. Therefore, it is questionable to what extent DNS that are initialized in this manner constitute a direct simulation of the physical droplets. Implications of these findings for large eddy simulations of multiphase flows are also discussed. (c) 2000 American Institute of Physics

  4. Critical Control Points in the Processing of Cassava Tuber for Ighu ...

    African Journals Online (AJOL)

    Determination of the critical control points in the processing of cassava tuber into Ighu was carried out. The critical control points were determined according to the Codex guidelines for the application of the HACCP system by conducting hazard analysis. Hazard analysis involved proper examination of each processing step ...

  5. SHAPE FROM TEXTURE USING LOCALLY SCALED POINT PROCESSES

    Directory of Open Access Journals (Sweden)

    Eva-Maria Didden

    2015-09-01

    Full Text Available Shape from texture refers to the extraction of 3D information from 2D images with irregular texture. This paper introduces a statistical framework to learn shape from texture where convex texture elements in a 2D image are represented through a point process. In a first step, the 2D image is preprocessed to generate a probability map corresponding to an estimate of the unnormalized intensity of the latent point process underlying the texture elements. The latent point process is subsequently inferred from the probability map in a non-parametric, model free manner. Finally, the 3D information is extracted from the point pattern by applying a locally scaled point process model where the local scaling function represents the deformation caused by the projection of a 3D surface onto a 2D image.

  6. Tipping point analysis of ocean acoustic noise

    Science.gov (United States)

    Livina, Valerie N.; Brouwer, Albert; Harris, Peter; Wang, Lian; Sotirakopoulos, Kostas; Robinson, Stephen

    2018-02-01

    We apply tipping point analysis to a large record of ocean acoustic data to identify the main components of the acoustic dynamical system and study possible bifurcations and transitions of the system. The analysis is based on a statistical physics framework with stochastic modelling, where we represent the observed data as a composition of deterministic and stochastic components estimated from the data using time-series techniques. We analyse long-term and seasonal trends, system states and acoustic fluctuations to reconstruct a one-dimensional stochastic equation to approximate the acoustic dynamical system. We apply potential analysis to acoustic fluctuations and detect several changes in the system states in the past 14 years. These are most likely caused by climatic phenomena. We analyse trends in sound pressure level within different frequency bands and hypothesize a possible anthropogenic impact on the acoustic environment. The tipping point analysis framework provides insight into the structure of the acoustic data and helps identify its dynamic phenomena, correctly reproducing the probability distribution and scaling properties (power-law correlations) of the time series.
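
    The potential analysis step used here can be sketched with a toy surrogate: the empirical potential is proportional to the negative log of the data density, and each potential well corresponds to one system state. The two-state series below is invented purely to show the construction; it is not the ocean acoustic data.

```python
import numpy as np

rng = np.random.default_rng(7)

# Surrogate two-state series (a bimodal mixture) standing in for fluctuations.
data = np.r_[rng.normal(-1.0, 0.3, 5000), rng.normal(1.0, 0.3, 5000)]

# Empirical potential U(x) proportional to -log p(x), from a histogram density.
hist, edges = np.histogram(data, bins=60, density=True)
centres = 0.5 * (edges[:-1] + edges[1:])
occupied = hist > 0
xs, potential = centres[occupied], -np.log(hist[occupied])

# Wells near the two states are deep; the barrier between them is high.
well_left = potential[np.argmin(np.abs(xs + 1.0))]
well_right = potential[np.argmin(np.abs(xs - 1.0))]
barrier = potential[np.argmin(np.abs(xs))]
```

    Tracking how the number and depth of such wells change over sliding windows is what reveals transitions between system states in the full framework.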

  7. Tipping point analysis of ocean acoustic noise

    Directory of Open Access Journals (Sweden)

    V. N. Livina

    2018-02-01

    Full Text Available We apply tipping point analysis to a large record of ocean acoustic data to identify the main components of the acoustic dynamical system and study possible bifurcations and transitions of the system. The analysis is based on a statistical physics framework with stochastic modelling, where we represent the observed data as a composition of deterministic and stochastic components estimated from the data using time-series techniques. We analyse long-term and seasonal trends, system states and acoustic fluctuations to reconstruct a one-dimensional stochastic equation to approximate the acoustic dynamical system. We apply potential analysis to acoustic fluctuations and detect several changes in the system states in the past 14 years. These are most likely caused by climatic phenomena. We analyse trends in sound pressure level within different frequency bands and hypothesize a possible anthropogenic impact on the acoustic environment. The tipping point analysis framework provides insight into the structure of the acoustic data and helps identify its dynamic phenomena, correctly reproducing the probability distribution and scaling properties (power-law correlations) of the time series.

  8. Fourier analysis and stochastic processes

    CERN Document Server

    Brémaud, Pierre

    2014-01-01

    This work is unique as it provides a uniform treatment of the Fourier theories of functions (Fourier transforms and series, z-transforms), finite measures (characteristic functions, convergence in distribution), and stochastic processes (including arma series and point processes). It emphasises the links between these three themes. The chapter on the Fourier theory of point processes and signals structured by point processes is a novel addition to the literature on Fourier analysis of stochastic processes. It also connects the theory with recent lines of research such as biological spike signals and ultrawide-band communications. Although the treatment is mathematically rigorous, the convivial style makes the book accessible to a large audience. In particular, it will be interesting to anyone working in electrical engineering and communications, biology (point process signals) and econometrics (arma models). A careful review of the prerequisites (integration and probability theory in the appendix, Hilbert spa...

  9. Benchmarking of radiological departments. Starting point for successful process optimization

    International Nuclear Information System (INIS)

    Busch, Hans-Peter

    2010-01-01

    Continuous optimization of the process of organization and medical treatment is part of the successful management of radiological departments. The focus of this optimization can be cost units such as CT and MRI or the radiological parts of total patient treatment. Key performance indicators for process optimization are cost-effectiveness, service quality and quality of medical treatment. The potential for improvements can be seen by comparison (benchmark) with other hospitals and radiological departments. Clear definitions of key data and criteria are absolutely necessary for comparability. There is currently little information in the literature regarding the methodology and application of benchmarks especially from the perspective of radiological departments and case-based lump sums, even though benchmarking has frequently been applied to radiological departments by hospital management. The aim of this article is to describe and discuss systematic benchmarking as an effective starting point for successful process optimization. This includes the description of the methodology, recommendation of key parameters and discussion of the potential for cost-effectiveness analysis. The main focus of this article is cost-effectiveness (efficiency and effectiveness) with respect to cost units and treatment processes. (orig.)

  10. Mechanistic spatio-temporal point process models for marked point processes, with a view to forest stand data

    DEFF Research Database (Denmark)

    Møller, Jesper; Ghorbani, Mohammad; Rubak, Ege Holger

    We show how a spatial point process, where to each point there is associated a random quantitative mark, can be identified with a spatio-temporal point process specified by a conditional intensity function. For instance, the points can be tree locations, the marks can express the size of trees......, and the conditional intensity function can describe the distribution of a tree (i.e., its location and size) conditionally on the larger trees. This enables us to construct parametric statistical models which are easily interpretable and where likelihood-based inference is tractable. In particular, we consider maximum...

  11. Music analysis and point-set compression

    DEFF Research Database (Denmark)

    Meredith, David

    A musical analysis represents a particular way of understanding certain aspects of the structure of a piece of music. The quality of an analysis can be evaluated to some extent by the degree to which knowledge of it improves performance on tasks such as mistake spotting, memorising a piece...... as the minimum description length principle and relates closely to certain ideas in the theory of Kolmogorov complexity. Inspired by this general principle, the hypothesis explored in this paper is that the best ways of understanding (or explanations for) a piece of music are those that are represented...... by the shortest possible descriptions of the piece. With this in mind, two compression algorithms are presented, COSIATEC and SIATECCompress. Each of these algorithms takes as input an in extenso description of a piece of music as a set of points in pitch-time space representing notes. Each algorithm...
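
    The compression algorithms named here (COSIATEC, SIATECCompress) are built on maximal translatable patterns: for a point set D and a translation vector d, MTP(d) = {p in D : p + d is in D}. A toy sketch of that core computation on a hypothetical six-note (onset, pitch) set, invented for illustration and not taken from the paper:

```python
from collections import defaultdict

# Hypothetical (onset, pitch) points: two ascending three-note figures.
points = {(0, 60), (1, 62), (2, 64), (4, 67), (5, 69), (6, 71)}

# MTP(d) = {p in D : p + d in D}: all points translatable by vector d.
mtps = defaultdict(set)
for p in points:
    for q in points:
        if p != q:
            d = (q[0] - p[0], q[1] - p[1])
            mtps[d].add(p)

largest = max(len(s) for s in mtps.values())
```

    The full algorithms go further, choosing MTPs whose pattern-plus-translations description is shorter than listing the covered points, but the difference-vector table above is the starting point.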

  12. Numerical analysis on pump turbine runaway points

    International Nuclear Information System (INIS)

    Guo, L; Liu, J T; Wang, L Q; Jiao, L; Li, Z F

    2012-01-01

    To research the character of pump turbine runaway points with different guide vane openings, a hydraulic model was established based on a pumped storage power station. The RNG k-ε model and the SIMPLEC algorithm were used to simulate the internal flow fields. The result of the simulation was compared with the test data, and good agreement was obtained between the experimental data and the CFD result. Based on this model, internal flow analysis was carried out. The results show that when the pump turbine ran at the runaway speed, many vortices appeared in the flow passage of the runner. These vortices could always be observed even if the guide vane opening changes. That is an important source of energy loss in the runaway condition. Pressures on the two sides of the runner blades were almost the same, so the runner power is very low. The high speed induced a large centrifugal force, and the small guide vane opening gave the water velocity a large tangential component, so an obvious water ring could be observed between the runner blades and guide vanes in small guide vane opening conditions. That ring disappeared when the opening was larger than 20°. These conclusions can provide a theoretical basis for the analysis and simulation of pump turbine runaway points.

  13. Multivariate Product-Shot-noise Cox Point Process Models

    DEFF Research Database (Denmark)

    Jalilian, Abdollah; Guan, Yongtao; Mateu, Jorge

    We introduce a new multivariate product-shot-noise Cox process which is useful for modeling multi-species spatial point patterns with clustering intra-specific interactions and neutral, negative or positive inter-specific interactions. The auto and cross pair correlation functions of the process can be obtained in closed analytical forms and approximate simulation of the process is straightforward. We use the proposed process to model interactions within and among five tree species in the Barro Colorado Island plot.

  14. PROCESSING UAV AND LIDAR POINT CLOUDS IN GRASS GIS

    Directory of Open Access Journals (Sweden)

    V. Petras

    2016-06-01

    Full Text Available Today’s methods of acquiring Earth surface data, namely lidar and unmanned aerial vehicle (UAV imagery, non-selectively collect or generate large amounts of points. Point clouds from different sources vary in their properties such as number of returns, density, or quality. We present a set of tools with applications for different types of point clouds obtained by a lidar scanner, the structure from motion technique (SfM, and a low-cost 3D scanner. To take advantage of the vertical structure of multiple return lidar point clouds, we demonstrate tools to process them using 3D raster techniques which allow, for example, the development of custom vegetation classification methods. Dense point clouds obtained from UAV imagery, often containing redundant points, can be decimated using various techniques before further processing. We implemented and compared several decimation techniques in regard to their performance and the final digital surface model (DSM. Finally, we will describe the processing of a point cloud from a low-cost 3D scanner, namely Microsoft Kinect, and its application for interaction with physical models. All the presented tools are open source and integrated in GRASS GIS, a multi-purpose open source GIS with remote sensing capabilities. The tools integrate with other open source projects, specifically the Point Data Abstraction Library (PDAL, the Point Cloud Library (PCL, and the OpenKinect libfreenect2 library, to benefit from the open source point cloud ecosystem. The implementation in GRASS GIS ensures long term maintenance and reproducibility by the scientific community but also by the original authors themselves.
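
    One of the simplest decimation techniques of the kind compared here is grid-based thinning: keep one point per occupied 2-D cell. A NumPy sketch on a synthetic cloud (this is a generic illustration, not the GRASS GIS implementation):

```python
import numpy as np

rng = np.random.default_rng(3)

def grid_decimate(points, cell):
    """Keep the first point falling in each occupied x-y grid cell -- a
    simple grid-based decimation for dense UAV/SfM clouds."""
    keys = np.floor(points[:, :2] / cell).astype(int)
    _, idx = np.unique(keys, axis=0, return_index=True)
    return points[np.sort(idx)]

cloud = rng.uniform(0, 10, size=(10_000, 3))   # synthetic x, y, z points
thinned = grid_decimate(cloud, cell=1.0)       # one point per 1x1 cell
```

    Variants differ in which representative is kept per cell (first, lowest, mean), which in turn changes the resulting digital surface model, as the comparison in the paper discusses.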

  15. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Science.gov (United States)

    2010-04-01

    ... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF... Provisions § 123.6 Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan. (a) Hazard... fish or fishery product being processed in the absence of those controls. (b) The HACCP plan. Every...

  16. Dew point vs bubble point : a misunderstood constraint on gravity drainage processes

    Energy Technology Data Exchange (ETDEWEB)

    Nenninger, J. [N-Solv Corp., Calgary, AB (Canada); Gunnewiek, L. [Hatch Ltd., Mississauga, ON (Canada)

    2009-07-01

    This study demonstrated that gravity drainage processes that use blended fluids such as solvents have an inherently unstable material balance due to differences between dew point and bubble point compositions. The instability can lead to the accumulation of volatile components within the chamber, and impair mass and heat transfer processes. Case studies were used to demonstrate the large temperature gradients within the vapour chamber caused by temperature differences between the bubble point and dew point for blended fluids. A review of published data showed that many experiments on in-situ processes do not account for unstable material balances caused by a lack of steam trap control. A study of temperature profiles during steam assisted gravity drainage (SAGD) studies showed significant temperature depressions caused by methane accumulations at the outside perimeter of the steam chamber. It was demonstrated that the condensation of large volumes of purified solvents provided an efficient mechanism for the removal of methane from the chamber. It was concluded that gravity drainage processes can be optimized by using pure propane during the injection process. 22 refs., 1 tab., 18 figs.
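The dew-point/bubble-point distinction can be made concrete with ideal Raoult's-law formulas: at fixed temperature, the bubble pressure of a mixture exceeds its dew pressure, and in that gap vapour and liquid of the same overall composition coexist. The saturation pressures below are illustrative placeholders, not data for the solvent/bitumen systems discussed above:

```python
def bubble_pressure(x, psat):
    """Bubble-point pressure of an ideal liquid of composition x:
    P_bub = sum_i x_i * Psat_i (Raoult's law)."""
    return sum(xi * pi for xi, pi in zip(x, psat))

def dew_pressure(y, psat):
    """Dew-point pressure of an ideal vapour of composition y:
    1 / P_dew = sum_i y_i / Psat_i."""
    return 1.0 / sum(yi / pi for yi, pi in zip(y, psat))

# Hypothetical binary mixture: a volatile light component and a heavier
# solvent; saturation pressures (kPa) are illustrative placeholders.
z = [0.3, 0.7]
psat = [5000.0, 200.0]
p_bub = bubble_pressure(z, psat)
p_dew = dew_pressure(z, psat)
print(f"bubble point: {p_bub:.0f} kPa, dew point: {p_dew:.0f} kPa")
```

The wide window between the two pressures is consistent with the accumulation mechanism described above: a volatile component can persist as vapour well below the blend's bubble pressure.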

  17. A CASE STUDY ON POINT PROCESS MODELLING IN DISEASE MAPPING

    Directory of Open Access Journals (Sweden)

    Viktor Beneš

    2011-05-01

    Full Text Available We consider a data set of locations where people in Central Bohemia have been infected by tick-borne encephalitis (TBE), and where population census data and covariates concerning vegetation and altitude are available. The aims are to estimate the risk map of the disease and to study the dependence of the risk on the covariates. Instead of using the common area-level approaches, we base the analysis on a Bayesian approach for a log Gaussian Cox point process with covariates. Posterior characteristics for a discretized version of the log Gaussian Cox process are computed using Markov chain Monte Carlo methods. A particular problem which is thoroughly discussed is how to determine a model for the background population density. The risk map depends clearly on the population intensity model, and the basic model adopted for the population intensity determines which covariates influence the risk of TBE. Model validation is based on the posterior predictive distribution of various summary statistics.
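The log Gaussian Cox process at the heart of this analysis is straightforward to simulate on a grid: draw a Gaussian field, exponentiate it to obtain the intensity, and sample Poisson counts per cell. The sketch below uses illustrative parameters and omits the covariates of the TBE study:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20                                    # n x n grid of cells on the unit square
xs = (np.arange(n) + 0.5) / n
gx, gy = np.meshgrid(xs, xs)
coords = np.column_stack([gx.ravel(), gy.ravel()])

# Latent Gaussian field with exponential covariance (illustrative parameters)
sigma2, scale = 1.0, 0.2
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
cov = sigma2 * np.exp(-d / scale)
field = np.linalg.cholesky(cov + 1e-8 * np.eye(n * n)) @ rng.standard_normal(n * n)

# Log-linear intensity log(lambda(u)) = beta0 + Z(u); no covariate terms here
beta0 = 5.0
lam = np.exp(beta0 + field)               # intensity per unit area
counts = rng.poisson(lam / (n * n))       # expected count per cell = lambda * cell area
print("total simulated cases:", int(counts.sum()))
```

In the Bayesian analysis above this generative direction is inverted: the counts are observed and MCMC explores the posterior over the latent field and regression parameters.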

  18. A MARKED POINT PROCESS MODEL FOR VEHICLE DETECTION IN AERIAL LIDAR POINT CLOUDS

    Directory of Open Access Journals (Sweden)

    A. Börcs

    2012-07-01

    Full Text Available In this paper we present an automated method for vehicle detection in LiDAR point clouds of crowded urban areas collected from an aerial platform. We assume that the input cloud is unordered, but that it contains additional intensity and return number information, which are jointly exploited by the proposed solution. Firstly, the 3-D point set is segmented into ground, vehicle, building roof, vegetation and clutter classes. Then the points, with their corresponding class labels and intensity values, are projected onto the ground plane, where the optimal vehicle configuration is described by a Marked Point Process (MPP) model of 2-D rectangles. Finally, the Multiple Birth and Death algorithm is utilized to find the configuration with the highest confidence.

  19. ALTERNATIVE METHODOLOGIES FOR THE ESTIMATION OF LOCAL POINT DENSITY INDEX: MOVING TOWARDS ADAPTIVE LIDAR DATA PROCESSING

    Directory of Open Access Journals (Sweden)

    Z. Lari

    2012-07-01

    Full Text Available Over the past few years, LiDAR systems have been established as a leading technology for the acquisition of high density point clouds over physical surfaces. These point clouds are then processed for the extraction of geo-spatial information. Local point density is one of the most important properties of the point cloud: it strongly affects the performance of data processing techniques and the quality of the information extracted from these data. Therefore, it is necessary to define a standard methodology for the estimation of local point density indices to be considered for the precise processing of LiDAR data. Current definitions of local point density indices, which only consider the 2D neighbourhood of individual points, are not appropriate for 3D LiDAR data and cannot be applied to laser scans from different platforms. In order to resolve the drawbacks of these methods, this paper proposes several approaches for the estimation of the local point density index which take into account the 3D relationship among the points and the physical properties of the surfaces they belong to. In the simplest approach, an approximate value of the local point density for each point is defined while considering the 3D relationship among the points. In the other approaches, the local point density is estimated by considering the 3D neighbourhood of the point in question and the physical properties of the surface which encloses this point. The physical properties of the surfaces enclosing the LiDAR points are assessed through eigenvalue analysis of the 3D neighbourhood of individual points and adaptive cylinder methods. This paper discusses these approaches and highlights their impact on various LiDAR data processing activities (i.e., neighbourhood definition, region growing, segmentation, boundary detection, and classification). Experimental results from airborne and terrestrial LiDAR data verify the efficacy of considering local point density variation for
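A minimal version of the simplest approach above, a local point density index from the 3D neighbourhood of each point, can be sketched as follows. The brute-force neighbour search and synthetic cloud are illustrative; the adaptive-cylinder and eigenvalue refinements of the paper are omitted:

```python
import numpy as np

def local_density_3d(points, k=10):
    """Local point density from the ball reaching the k-th nearest neighbour:
    density = k / ((4/3) * pi * r_k**3)."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    d.sort(axis=1)                  # column 0 is the point itself (distance 0)
    r = d[:, k]                     # distance to the k-th neighbour
    return k / ((4.0 / 3.0) * np.pi * r ** 3)

rng = np.random.default_rng(2)
pts = rng.uniform(0, 1, size=(500, 3))   # ~500 points per unit volume
dens = local_density_3d(pts)
print("median density estimate:", float(np.median(dens)))
```

Points near the boundary of the cloud report lower densities because their k-th-neighbour ball extends into empty space, which is one of the effects adaptive definitions try to correct.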

  20. Multiplicative point process as a model of trading activity

    Science.gov (United States)

    Gontis, V.; Kaulakys, B.

    2004-11-01

    Analysis of signals consisting of a sequence of pulses shows that an inherent origin of 1/f noise is Brownian fluctuation of the average interevent time between subsequent pulses of the pulse sequence. In this paper, we generalize the model of interevent time to reproduce a variety of self-affine time series exhibiting a power spectral density S(f) scaling as a power of the frequency f. Furthermore, we analyze the relation between the power-law correlations and the origin of the power-law probability distribution of the signal intensity. We introduce a stochastic multiplicative model for the time intervals between point events and analyze the statistical properties of the signal analytically and numerically. Such a model system exhibits a power-law spectral density S(f) ∼ 1/f^β for various values of β, including β = 1/2, 1 and 3/2. Explicit expressions for the power spectra in the low-frequency limit and for the distribution density of the interevent time are obtained. The counting statistics of the events are analyzed analytically and numerically as well. The specific interest of our analysis is related to the financial markets, where long-range correlations of price fluctuations largely depend on the number of transactions. We analyze the spectral density and counting statistics of the number of transactions. The model reproduces the spectral properties of real markets and explains the mechanism behind the power-law distribution of trading activity. The study provides evidence that the statistical properties of financial markets are enclosed in the statistics of the time intervals between trades. A multiplicative point process serves as a consistent model generating these statistics.
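A multiplicative interevent-time recursion of the general form used in this line of work can be sketched numerically; the parameter values and the simple clamping of the boundaries below are illustrative choices, not the paper's fitted model:

```python
import numpy as np

rng = np.random.default_rng(3)

# Multiplicative stochastic model for the interevent times tau_k:
#   tau_{k+1} = tau_k + gamma * tau_k**(2*mu - 1) + sigma * tau_k**mu * eps_k
# (parameter values are illustrative, not fitted to market data)
gamma, sigma, mu = 0.0004, 0.025, 0.5
tau_min, tau_max = 1e-3, 1.0    # clamping stands in for reflecting boundaries

n = 20000
tau = np.empty(n)
tau[0] = 0.1
for k in range(n - 1):
    step = gamma * tau[k] ** (2 * mu - 1) + sigma * tau[k] ** mu * rng.standard_normal()
    tau[k + 1] = min(max(tau[k] + step, tau_min), tau_max)

events = np.cumsum(tau)          # event times of the resulting point process
print("mean interevent time:", round(float(tau.mean()), 4))
```

Counting events of `events` in fixed windows and computing the periodogram of the counts is the numerical route to the S(f) ∼ 1/f^β behaviour discussed above.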

  1. Corner-point criterion for assessing nonlinear image processing imagers

    Science.gov (United States)

    Landeau, Stéphane; Pigois, Laurent; Foing, Jean-Paul; Deshors, Gilles; Swiathy, Greggory

    2017-10-01

    Range performance modeling of optronics imagers attempts to characterize the ability to resolve details in the image. Today, digital image processing is systematically used in conjunction with the optoelectronic system to correct its defects or to exploit tiny detection signals to increase performance. In order to characterize this processing, which has adaptive and non-linear properties, it becomes necessary to stimulate the imagers with test patterns whose properties are similar to those of actual scene images in terms of dynamic range, contours, texture and singular points. This paper presents an approach based on a Corner-Point (CP) resolution criterion, derived from the Probability of Correct Resolution (PCR) of binary fractal patterns. The fundamental principle lies in the faithful perception of the CP direction of the one minority pixel value among the majority value of a 2×2 pixel block. The evaluation procedure treats the actual image as its multi-resolution CP transformation, which takes the role of Ground Truth (GT). After a spatial registration between the degraded image and the original one, the degradation is statistically measured by comparing the GT with the CP transformation of the degraded image, in terms of localized PCR at the region of interest. The paper defines this CP criterion and presents the developed evaluation techniques, such as the measurement of the number of CPs resolved on the target, and the CP transformation and its inverse, which make it possible to reconstruct an image of the perceived CPs. The criterion is then compared with the standard Johnson criterion in the case of linear blur and noise degradation. The evaluation of an imaging system integrating an image display and visual perception is considered, by proposing an analysis scheme combining two methods: a CP measurement for the highly non-linear part (imaging) with a real-signature test target, and conventional methods for the more linear part (displaying). 
The application to
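The 2×2 minority-pixel rule at the heart of the CP criterion can be sketched as follows. This toy detector only locates candidate corner points; it omits the multi-resolution transform and the probabilistic PCR scoring of the full published criterion:

```python
import numpy as np

def corner_points(img):
    """Scan all 2x2 blocks of a binary image; where exactly one pixel holds
    the minority value, record that pixel's position (the CP direction)."""
    h, w = img.shape
    out = []
    for i in range(h - 1):
        for j in range(w - 1):
            block = img[i:i + 2, j:j + 2]
            s = int(block.sum())
            if s in (1, 3):                     # one minority pixel among four
                minority = 1 if s == 1 else 0
                di, dj = np.argwhere(block == minority)[0]
                out.append((i + int(di), j + int(dj)))
    return out

img = np.array([[0, 0, 1],
                [0, 1, 1],
                [0, 0, 1]], dtype=int)
cps = corner_points(img)
print(cps)
```

Comparing the CPs found in a degraded image against those of the original, after registration, is the counting step that the localized PCR statistic formalizes.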

  2. Pointo - a Low Cost Solution to Point Cloud Processing

    Science.gov (United States)

    Houshiar, H.; Winkler, S.

    2017-11-01

    With advances in technology, access to data, especially 3D point cloud data, becomes more and more an everyday task. 3D point clouds are usually captured with very expensive tools such as 3D laser scanners, or with very time-consuming methods such as photogrammetry. Most of the available software for 3D point cloud processing is designed for experts and specialists in this field and usually comes as large packages containing a variety of methods and tools. This results in software that is very expensive to acquire and also very difficult to use, the difficulty being caused by the complicated user interfaces required to accommodate a large list of features. The aim of these complex packages is to provide a powerful tool for a specific group of specialists, but they are not necessarily required by the majority of the upcoming average users of point clouds. In addition to their complexity and high cost, these packages generally rely on expensive, modern hardware and are only compatible with one specific operating system. Many point cloud customers are not point cloud processing experts and are not willing to bear the high acquisition costs of such software and hardware. In this paper we introduce a solution for low-cost point cloud processing. Our approach is designed to accommodate the needs of the average point cloud user. To reduce the cost and complexity of the software, our approach focuses on one functionality at a time, in contrast with most available software and tools that aim to solve as many problems as possible at the same time. Our simple and user-oriented design improves the user experience and enables us to optimize our methods to create efficient software. In this paper we introduce the Pointo family, a series of connected programs that provide easy-to-use tools with a simple design for different point cloud processing requirements. 
PointoVIEWER and PointoCAD are introduced as the first components of the Pointo family to provide a

  3. Numerical analysis of the heat and mass transfer processes in selected M-Cycle heat exchangers for the dew point evaporative cooling

    International Nuclear Information System (INIS)

    Pandelidis, Demis; Anisimov, Sergey

    2015-01-01

    Highlights: • The comparative numerical study of the eight M-Cycle heat exchangers was presented. • The mathematical model is compared against the experimental data. • The results show, that the original M-Cycle heat and mass exchanger can be improved. • The effectiveness of the heat and mass exchangers depends strongly on the inlet air parameters. - Abstract: This paper investigates a mathematical simulation of heat and mass transfer in eight different types of the Maisotsenko Cycle (M-Cycle) heat and mass exchangers (HMXs) used for indirect evaporative air cooling. A two-dimensional heat and mass transfer model is developed to perform the thermal calculations of the indirect evaporative cooling process and quantifying the overall performance. The mathematical model was validated against experimental data. A numerical simulation reveals many unique features of the considered HMXs, enabling an accurate prediction of their performance. Results of the model allow for comparison of the analyzed devices in order to improve the performance of the original HMX
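As a minimal illustration of how such exchangers are typically benchmarked, the wet-bulb effectiveness of an indirect evaporative cooler is a one-line computation; the temperatures below are illustrative, not the paper's data:

```python
# Wet-bulb effectiveness, a standard performance index for indirect
# evaporative coolers such as the M-Cycle HMXs compared in the paper:
#   eps_wb = (T_dry_in - T_dry_out) / (T_dry_in - T_wetbulb_in)
t_in, t_out, t_wb = 34.0, 21.0, 18.0   # inlet dry-bulb, outlet, inlet wet-bulb (deg C)
eps_wb = (t_in - t_out) / (t_in - t_wb)
print(f"wet-bulb effectiveness: {eps_wb:.2f}")
```

M-Cycle exchangers are notable for reaching wet-bulb effectiveness above 1, which is why a dew-point effectiveness (with the inlet dew-point temperature in the denominator) is often reported alongside it.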

  4. Investigation of Random Switching Driven by a Poisson Point Process

    DEFF Research Database (Denmark)

    Simonsen, Maria; Schiøler, Henrik; Leth, John-Josef

    2015-01-01

    This paper investigates the switching mechanism of a two-dimensional switched system, when the switching events are generated by a Poisson point process. A model, in the shape of a stochastic process, for such a system is derived and the distribution of the trajectory's position is developed...... together with marginal density functions for the coordinate functions. Furthermore, the joint probability distribution is given explicitly....
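A Poisson point process of switching events is easy to simulate because its interarrival times are i.i.d. exponential. The sketch below toggles a hypothetical two-mode system at each event; the dynamics are illustrative, not the system analysed in the paper:

```python
import numpy as np

rng = np.random.default_rng(4)

# Events of a homogeneous Poisson point process with rate lam on [0, T]:
# interarrival times are i.i.d. exponential with mean 1/lam.
lam, T = 2.0, 50.0
arrivals = np.cumsum(rng.exponential(1.0 / lam, size=int(3 * lam * T)))
switch_times = arrivals[arrivals < T]

def mode_at(t):
    """Mode of a two-mode switched system that toggles at every event
    (illustrative dynamics, not the system studied in the paper)."""
    return int(np.searchsorted(switch_times, t)) % 2

print("switches in [0, T]:", len(switch_times), "| mode at T/2:", mode_at(T / 2))
```

Sampling the trajectory of a concrete switched dynamic at many such random switching sequences is the Monte Carlo counterpart of the distributional results derived in the paper.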

  5. On estimation of the intensity function of a point process

    NARCIS (Netherlands)

    Lieshout, van M.N.M.

    2010-01-01

    Abstract. Estimation of the intensity function of spatial point processes is a fundamental problem. In this paper, we interpret the Delaunay tessellation field estimator recently introduced by Schaap and Van de Weygaert as an adaptive kernel estimator and give explicit expressions for the mean and
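For contrast with the adaptive Delaunay tessellation field estimator discussed above, a fixed-bandwidth kernel estimator of the intensity function can be sketched as follows, on synthetic data with an illustrative bandwidth:

```python
import numpy as np

def kernel_intensity(points, grid, bw=0.1):
    """Fixed-bandwidth Gaussian kernel estimate of a point process intensity
    on the unit square. The Delaunay tessellation field estimator is instead
    adaptive: its effective bandwidth follows the local point configuration
    (this sketch does not)."""
    d2 = ((grid[:, None, :] - points[None, :, :]) ** 2).sum(axis=-1)
    k = np.exp(-d2 / (2 * bw ** 2)) / (2 * np.pi * bw ** 2)
    return k.sum(axis=1)

rng = np.random.default_rng(5)
pts = rng.uniform(0, 1, size=(300, 2))        # homogeneous with intensity 300
xs = np.linspace(0.05, 0.95, 10)
grid = np.array([(x, y) for x in xs for y in xs])
lam_hat = kernel_intensity(pts, grid)
print("mean estimated intensity:", round(float(lam_hat.mean()), 1))
```

The systematic shortfall of the estimate near the boundary (kernel mass leaking outside the window) is one of the biases that adaptive estimators and edge corrections address.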

  6. Spatio-temporal point process filtering methods with an application

    Czech Academy of Sciences Publication Activity Database

    Frcalová, B.; Beneš, V.; Klement, Daniel

    2010-01-01

    Roč. 21, 3-4 (2010), s. 240-252 ISSN 1180-4009 R&D Projects: GA AV ČR(CZ) IAA101120604 Institutional research plan: CEZ:AV0Z50110509 Keywords : cox point process * filtering * spatio-temporal modelling * spike Subject RIV: BA - General Mathematics Impact factor: 0.750, year: 2010

  7. A case study on point process modelling in disease mapping

    Czech Academy of Sciences Publication Activity Database

    Beneš, Viktor; Bodlák, M.; Moller, J.; Waagepetersen, R.

    2005-01-01

    Roč. 24, č. 3 (2005), s. 159-168 ISSN 1580-3139 R&D Projects: GA MŠk 0021620839; GA ČR GA201/03/0946 Institutional research plan: CEZ:AV0Z10750506 Keywords : log Gaussian Cox point process * Bayesian estimation Subject RIV: BB - Applied Statistics, Operational Research

  8. A J–function for inhomogeneous point processes

    NARCIS (Netherlands)

    M.N.M. van Lieshout (Marie-Colette)

    2010-01-01

    We propose new summary statistics for intensity-reweighted moment stationary point processes that generalise the well-known J-, empty space, and nearest-neighbour distance distribution functions, represent them in terms of generating functionals and conditional intensities, and relate

  9. Microbial profile and critical control points during processing of 'robo ...

    African Journals Online (AJOL)

    STORAGESEVER

    2009-05-18

    ... frying, surface fat draining, open-air cooling, and holding/packaging in polyethylene films during sales and distribution. The product was, however, classified under category III with respect to risk and the significance of monitoring and evaluation of quality using the hazard analysis critical control point.

  10. Impedance analysis of acupuncture points and pathways

    International Nuclear Information System (INIS)

    Teplan, Michal; Kukucka, Marek; Ondrejkovicová, Alena

    2011-01-01

    Investigation of the impedance characteristics of acupuncture points from the acoustic to the radio frequency range is addressed. Discernment and localization of acupuncture points in an initial single-subject study was unsuccessfully attempted by the impedance map technique. Vector impedance analyses determined possible resonant zones in the MHz region.

  11. Maintenance Process Strategic Analysis

    Science.gov (United States)

    Jasiulewicz-Kaczmarek, M.; Stachowiak, A.

    2016-08-01

    The performance and competitiveness of manufacturing companies depend on the availability, reliability and productivity of their production facilities. Low productivity, downtime, and poor machine performance are often linked to inadequate plant maintenance, which in turn can lead to reduced production levels, increasing costs, lost market opportunities, and lower profits. These pressures have given firms worldwide the motivation to explore and embrace proactive maintenance strategies over the traditional reactive firefighting methods. The traditional view of maintenance has shifted into an overall view that encompasses Overall Equipment Efficiency, Stakeholder Management and Life Cycle Assessment. From a practical point of view, this requires changes in the approach to maintenance represented by managers and changes in the actions performed within the maintenance area. Managers have to understand that maintenance is not only about repairs and conservation of machines and devices, but also about actions striving for more efficient resource management and care for the safety and health of employees. The purpose of the work is to present a strategic analysis based on SWOT analysis to identify the opportunities and strengths of the maintenance process, to benefit from them as much as possible, as well as to identify weaknesses and threats, so that they can be eliminated or minimized.

  12. Shot-noise-weighted processes : a new family of spatial point processes

    NARCIS (Netherlands)

    M.N.M. van Lieshout (Marie-Colette); I.S. Molchanov (Ilya)

    1995-01-01

    The paper suggests a new family of spatial point process distributions. They are defined by means of densities with respect to the Poisson point process within a bounded set. These densities are given in terms of a functional of the shot-noise process with a given influence

  13. Two-step estimation for inhomogeneous spatial point processes

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus; Guan, Yongtao

    This paper is concerned with parameter estimation for inhomogeneous spatial point processes with a regression model for the intensity function and tractable second order properties (K-function). Regression parameters are estimated using a Poisson likelihood score estimating function and in a second...... step minimum contrast estimation is applied for the residual clustering parameters. Asymptotic normality of parameter estimates is established under certain mixing conditions and we exemplify how the results may be applied in ecological studies of rain forests....

  14. Music analysis and point-set compression

    DEFF Research Database (Denmark)

    Meredith, David

    2015-01-01

    COSIATEC, SIATECCompress and Forth’s algorithm are point-set compression algorithms developed for discovering repeated patterns in music, such as themes and motives that would be of interest to a music analyst. To investigate their effectiveness and versatility, these algorithms were evaluated...... on three analytical tasks that depend on the discovery of repeated patterns: classifying folk song melodies into tune families, discovering themes and sections in polyphonic music, and discovering subject and countersubject entries in fugues. Each algorithm computes a compressed encoding of a point......-set representation of a musical object in the form of a list of compact patterns, each pattern being given with a set of vectors indicating its occurrences. However, the algorithms adopt different strategies in their attempts to discover encodings that maximize compression.The best-performing algorithm on the folk...
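The SIA family of algorithms underlying COSIATEC and SIATECCompress is built on maximal translatable patterns (MTPs): for each translation vector, the set of dataset points that can be translated by that vector onto other dataset points. A minimal sketch on a toy (onset, pitch) point set, omitting the occurrence-finding and compression stages of the full algorithms:

```python
def mtps(points):
    """Maximal translatable patterns: for each translation vector v, the
    set of dataset points p such that p + v is also in the dataset
    (the core step of SIA-family algorithms; sketch only)."""
    table = {}
    for p in points:
        for q in points:
            v = (q[0] - p[0], q[1] - p[1])
            if v > (0, 0):            # consider each vector in one direction only
                table.setdefault(v, []).append(p)
    return table

# Toy "score": (onset, pitch) pairs containing a repeated two-note motif
pts = [(0, 60), (1, 62), (4, 60), (5, 62)]
patterns = mtps(pts)
best = max(patterns.items(), key=lambda kv: len(kv[1]))
print("vector", best[0], "pattern", best[1])
```

Here the vector (4, 0) maps the two-note motif onto its repetition four time units later; choosing such large, compact MTPs is what drives the compression the algorithms maximize.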

  15. Spatial Mixture Modelling for Unobserved Point Processes: Examples in Immunofluorescence Histology.

    Science.gov (United States)

    Ji, Chunlin; Merl, Daniel; Kepler, Thomas B; West, Mike

    2009-12-04

    We discuss Bayesian modelling and computational methods in analysis of indirectly observed spatial point processes. The context involves noisy measurements on an underlying point process that provide indirect and noisy data on locations of point outcomes. We are interested in problems in which the spatial intensity function may be highly heterogenous, and so is modelled via flexible nonparametric Bayesian mixture models. Analysis aims to estimate the underlying intensity function and the abundance of realized but unobserved points. Our motivating applications involve immunological studies of multiple fluorescent intensity images in sections of lymphatic tissue where the point processes represent geographical configurations of cells. We are interested in estimating intensity functions and cell abundance for each of a series of such data sets to facilitate comparisons of outcomes at different times and with respect to differing experimental conditions. The analysis is heavily computational, utilizing recently introduced MCMC approaches for spatial point process mixtures and extending them to the broader new context here of unobserved outcomes. Further, our example applications are problems in which the individual objects of interest are not simply points, but rather small groups of pixels; this implies a need to work at an aggregate pixel region level and we develop the resulting novel methodology for this. Two examples with immunofluorescence histology data demonstrate the models and computational methodology.

  16. Point Cloud Analysis for Conservation and Enhancement of Modernist Architecture

    Science.gov (United States)

    Balzani, M.; Maietti, F.; Mugayar Kühl, B.

    2017-02-01

    Documentation of cultural assets through improved acquisition processes for advanced 3D modelling is one of the main challenges to be faced in order to address, through digital representation, advanced analysis of the shape, appearance and conservation condition of cultural heritage. 3D modelling can open new avenues in the way tangible cultural heritage is studied, visualized, curated, displayed and monitored, improving key features such as the analysis and visualization of material degradation and state of conservation. An applied research project focused on the analysis of surface specifications and material properties by means of a 3D laser scanner survey has been developed within the Digital Preservation project for the FAUUSP building, Faculdade de Arquitetura e Urbanismo da Universidade de São Paulo, Brazil. The integrated 3D survey has been performed by the DIAPReM Center of the Department of Architecture of the University of Ferrara in cooperation with the FAUUSP. The 3D survey has allowed the realization of a point cloud model of the external surfaces as the basis for investigating in detail the formal characteristics, geometric textures and surface features. The digital geometric model was also the basis for processing the intensity values acquired by the laser scanning instrument; this method of analysis was an essential complement to the macroscopic investigations in order to manage additional information related to surface characteristics displayable on the point cloud.

  17. Efficient point cloud data processing in shipbuilding: Reformative component extraction method and registration method

    Directory of Open Access Journals (Sweden)

    Jingyu Sun

    2014-07-01

    Full Text Available To survive in the current shipbuilding industry, it is of vital importance for shipyards to have the accuracy of ship components evaluated efficiently during most of the manufacturing steps. Evaluating component accuracy by comparing each component’s point cloud data, scanned by laser scanners, with the ship’s design data in CAD format cannot be done efficiently when (1) the components extracted from the point cloud data contain irregular obstacles, or when (2) the registration of the two data sets has no clear direction setting. This paper presents reformative point cloud data processing methods to solve these problems. K-d tree construction of the point cloud data speeds up the neighbor search for each point. A region growing method performed on the neighbor points of a seed point extracts the continuous part of the component, while curved surface fitting and B-spline curve fitting at the edge of the continuous part recognize neighboring domains of the same component divided by obstacles’ shadows. The ICP (Iterative Closest Point) algorithm registers the two sets of data after the proper registration direction is decided by principal component analysis. In experiments conducted at the shipyard, 200 curved shell plates are extracted from the scanned point cloud data and registered against the designed CAD data using the proposed methods for an accuracy evaluation. Results show that the proposed methods support accuracy-evaluation-oriented point cloud data processing efficiently in practice.
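The registration-direction step above relies on principal component analysis of the point cloud: the eigenvectors of the covariance matrix give a canonical orientation that can seed ICP. A generic sketch on a synthetic plate-like cloud, not the shipyard implementation:

```python
import numpy as np

def principal_axes(points):
    """Eigen-decomposition of the point cloud covariance matrix; the
    eigenvectors provide a canonical orientation usable as the initial
    direction for an ICP registration."""
    centered = points - points.mean(axis=0)
    cov = centered.T @ centered / len(points)
    evals, evecs = np.linalg.eigh(cov)
    order = np.argsort(evals)[::-1]           # largest variance first
    return evals[order], evecs[:, order]

rng = np.random.default_rng(6)
# Synthetic "shell plate": long in x, thin in z
pts = rng.normal(size=(1000, 3)) * np.array([10.0, 2.0, 0.1])
evals, axes = principal_axes(pts)
print("dominant axis:", np.round(np.abs(axes[:, 0]), 2))
```

Aligning the principal axes of the scanned plate with those of the CAD model gives ICP a starting pose close to the optimum, which is what the proper-direction decision above provides.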

  18. Badge Office Process Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Haurykiewicz, John Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Dinehart, Timothy Grant [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Parker, Robert Young [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-12

    The purpose of this process analysis was to analyze the Badge Offices’ current processes from a systems perspective and consider ways of pursuing objectives set forth by SEC-PS, namely increased customer flow (throughput) and reduced customer wait times. Information for the analysis was gathered for the project primarily through Badge Office Subject Matter Experts (SMEs), and in-person observation of prevailing processes. Using the information gathered, a process simulation model was constructed to represent current operations and allow assessment of potential process changes relative to factors mentioned previously. The overall purpose of the analysis was to provide SEC-PS management with information and recommendations to serve as a basis for additional focused study and areas for potential process improvements in the future.

  19. Energy risk management through self-exciting marked point process

    International Nuclear Information System (INIS)

    Herrera, Rodrigo

    2013-01-01

    Crude oil is a dynamically traded commodity that affects many economies. We propose a collection of marked self-exciting point processes with dependent arrival rates for extreme events in oil markets and related risk measures. The models treat the times between extreme events in oil markets as a stochastic process. The main advantage of this approach is its capability to capture the short-, medium- and long-term behavior of extremes without involving an arbitrary stochastic volatility model or a prefiltration of the data, as is common in extreme value theory applications. We make use of the proposed model in order to obtain an improved estimate for the Value at Risk in oil markets. Empirical findings suggest that the reliability and stability of Value at Risk estimates improve as a result of the finer modeling approach. This is supported by an empirical application to the representative West Texas Intermediate (WTI) and Brent crude oil markets. - Highlights: • We propose marked self-exciting point processes for extreme events in oil markets. • This approach captures the short- and long-term behavior of extremes. • We improve the estimates for the VaR in the WTI and Brent crude oil markets
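A self-exciting (Hawkes) point process with exponentially decaying arrival rate, the unmarked core of the models above, can be simulated by Ogata's thinning algorithm. The parameters below are illustrative, not values fitted to WTI or Brent data:

```python
import numpy as np

def simulate_hawkes(mu, alpha, beta, T, rng):
    """Ogata thinning for a Hawkes process with conditional intensity
    lambda(t) = mu + sum_{t_i < t} alpha * exp(-beta * (t - t_i))."""
    events, t = [], 0.0
    while True:
        # The current intensity bounds lambda until the next event,
        # because the excitation only decays between events.
        lam_bar = mu + sum(alpha * np.exp(-beta * (t - s)) for s in events)
        t += rng.exponential(1.0 / lam_bar)
        if t >= T:
            return np.array(events)
        lam_t = mu + sum(alpha * np.exp(-beta * (t - s)) for s in events)
        if rng.uniform() * lam_bar <= lam_t:
            events.append(t)

rng = np.random.default_rng(7)
ev = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.2, T=100.0, rng=rng)
print("events:", len(ev), "| mean rate:", round(len(ev) / 100.0, 2))
```

With alpha/beta < 1 the process is stationary with mean rate mu/(1 - alpha/beta); the clustered event times it produces are the qualitative signature the paper exploits for extreme oil-price moves.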

  20. Weak convergence of marked point processes generated by crossings of multivariate jump processes

    DEFF Research Database (Denmark)

    Tamborrino, Massimiliano; Sacerdote, Laura; Jacobsen, Martin

    2014-01-01

    We consider the multivariate point process determined by the crossing times of the components of a multivariate jump process through a multivariate boundary, assuming to reset each component to an initial value after its boundary crossing. We prove that this point process converges weakly...... process converging to a multivariate Ornstein–Uhlenbeck process is discussed as a guideline for applying diffusion limits for jump processes. We apply our theoretical findings to neural network modeling. The proposed model gives a mathematical foundation to the generalization of the class of Leaky...

  1. Variational approach for spatial point process intensity estimation

    DEFF Research Database (Denmark)

    Coeurjolly, Jean-Francois; Møller, Jesper

    is assumed to be of log-linear form β+θ⊤z(u) where z is a spatial covariate function and the focus is on estimating θ. The variational estimator is very simple to implement and quicker than alternative estimation procedures. We establish its strong consistency and asymptotic normality. We also discuss its...... finite-sample properties in comparison with the maximum first order composite likelihood estimator when considering various inhomogeneous spatial point process models and dimensions as well as settings where z is completely or only partially known....

  2. Two-step estimation for inhomogeneous spatial point processes

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus; Guan, Yongtao

    2009-01-01

    The paper is concerned with parameter estimation for inhomogeneous spatial point processes with a regression model for the intensity function and tractable second-order properties (K-function). Regression parameters are estimated by using a Poisson likelihood score estimating function and in the ...... and in the second step minimum contrast estimation is applied for the residual clustering parameters. Asymptotic normality of parameter estimates is established under certain mixing conditions and we exemplify how the results may be applied in ecological studies of rainforests....

  3. Multiple Monte Carlo Testing with Applications in Spatial Point Processes

    DEFF Research Database (Denmark)

    Mrkvička, Tomáš; Myllymäki, Mari; Hahn, Ute

    with a function as the test statistic, 3) several Monte Carlo tests with functions as test statistics. The rank test has correct (global) type I error in each case and it is accompanied with a p-value and with a graphical interpretation which shows which subtest or which distances of the used test function......(s) lead to the rejection at the prescribed significance level of the test. Examples of null hypothesis from point process and random set statistics are used to demonstrate the strength of the rank envelope test. The examples include goodness-of-fit test with several test functions, goodness-of-fit test...

  4. Fixed point theory, variational analysis, and optimization

    CERN Document Server

    Al-Mezel, Saleh Abdullah R; Ansari, Qamrul Hasan

    2015-01-01

    "There is a real need for this book. It is useful for people who work in areas of nonlinear analysis, optimization theory, variational inequalities, and mathematical economics." - Nan-Jing Huang, Sichuan University, Chengdu, People's Republic of China

  5. CLINSULF sub-dew-point process for sulphur recovery

    Energy Technology Data Exchange (ETDEWEB)

    Heisel, M.; Marold, F.

    1988-01-01

    In a 2-reactor system, the CLINSULF process allows very high sulphur recovery rates. When operated at 100 °C at the outlet, i.e. below the sulphur solidification point, a sulphur recovery rate of more than 99.2% was achieved in a 2-reactor series. Assuming a 70% sulphur recovery in an upstream Claus furnace plus sulphur condenser, an overall sulphur recovery of more than 99.8% results for the 2-reactor system. This is approximately 2% higher than in conventional Claus plus SDP units, which mostly consist of 4 reactors or more. This means that the CLINSULF SSP process promises to be an improvement both with respect to efficiency and to low investment cost.

  6. Self-Exciting Point Process Modeling of Conversation Event Sequences

    Science.gov (United States)

    Masuda, Naoki; Takaguchi, Taro; Sato, Nobuo; Yano, Kazuo

    Self-exciting processes of Hawkes type have been used to model various phenomena including earthquakes, neural activities, and views of online videos. Studies of temporal networks have revealed that sequences of social interevent times for individuals are highly bursty. We examine some basic properties of event sequences generated by the Hawkes self-exciting process to show that it generates bursty interevent times for a wide parameter range. Then, we fit the model to data on conversation sequences recorded in company offices in Japan. In this way, we can estimate the relative magnitude of the self-excitement, its temporal decay, and the base event rate independent of the self-excitation. These variables depend strongly on the individual. We also point out an important limitation of the Hawkes model: the correlation in the interevent times and the burstiness cannot be independently modulated.
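
    The bursty-interevent-time behaviour described in this record can be reproduced with a short simulation. The sketch below is an illustration, not the authors' code: it simulates a univariate Hawkes process with an exponential kernel via Ogata's thinning algorithm and computes the burstiness coefficient B = (σ − m)/(σ + m) of the interevent times. The parameter values `mu`, `alpha` and `beta` are arbitrary choices with branching ratio alpha/beta < 1 so the process is stationary.

```python
import numpy as np

def simulate_hawkes(mu, alpha, beta, t_max, rng):
    """Simulate a univariate Hawkes process with conditional intensity
    lambda(t) = mu + alpha * sum_{t_i < t} exp(-beta * (t - t_i))
    using Ogata's thinning algorithm."""
    events = []
    t = 0.0
    while True:
        # the intensity just after t bounds the intensity until the next event
        ts = np.array(events)
        lam_bar = mu + alpha * np.exp(-beta * (t - ts)).sum()
        t += rng.exponential(1.0 / lam_bar)
        if t >= t_max:
            return np.array(events)
        lam_t = mu + alpha * np.exp(-beta * (t - ts)).sum()
        if rng.uniform() <= lam_t / lam_bar:  # accept with prob lambda(t)/lam_bar
            events.append(t)

rng = np.random.default_rng(0)
events = simulate_hawkes(mu=0.2, alpha=0.8, beta=1.0, t_max=500.0, rng=rng)
iet = np.diff(events)
# burstiness B = (sigma - m)/(sigma + m) of the interevent times; B > 0 is bursty
B = (iet.std() - iet.mean()) / (iet.std() + iet.mean())
```

    For a Poisson process B is close to zero, while the self-excitation pushes B above zero, matching the burstiness the abstract reports.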

  7. Imitation learning of Non-Linear Point-to-Point Robot Motions using Dirichlet Processes

    DEFF Research Database (Denmark)

    Krüger, Volker; Tikhanoff, Vadim; Natale, Lorenzo

    2012-01-01

    In this paper we discuss the use of the infinite Gaussian mixture model and Dirichlet processes for learning robot movements from demonstrations. The starting point of this work is an earlier paper where the authors learn a non-linear dynamic robot movement model from a small number of observations....... The model in that work is learned using a classical finite Gaussian mixture model (FGMM) where the Gaussian mixtures are appropriately constrained. The problem with this approach is that one needs to make a good guess for how many mixtures the FGMM should use. In this work, we generalize this approach...... our algorithm on the same data that was used in [5], where the authors use motion capture devices to record the demonstrations. As further validation we test our approach on novel data acquired on our iCub in a different demonstration scenario in which the robot is physically driven by the human...

  8. Process energy analysis

    International Nuclear Information System (INIS)

    Kaiser, V.

    1993-01-01

    In Chapter 2 process energy cost analysis for chemical processing is treated in a general way, independent of the specific form of energy and power production. Especially, energy data collection and data treatment, energy accounting (metering, balance setting), specific energy input, and utility energy costs and prices are discussed. (R.P.) 14 refs., 4 figs., 16 tabs

  9. Integrated modeling and analysis methodology for precision pointing applications

    Science.gov (United States)

    Gutierrez, Homero L.

    2002-07-01

    Space-based optical systems that perform tasks such as laser communications, Earth imaging, and astronomical observations require precise line-of-sight (LOS) pointing. A general approach is described for integrated modeling and analysis of these types of systems within the MATLAB/Simulink environment. The approach can be applied during all stages of program development, from early conceptual design studies to hardware implementation phases. The main objective is to predict the dynamic pointing performance subject to anticipated disturbances and noise sources. Secondary objectives include assessing the control stability, levying subsystem requirements, supporting pointing error budgets, and performing trade studies. The integrated model resides in Simulink, and several MATLAB graphical user interfaces (GUIs) allow the user to configure the model, select analysis options, run analyses, and process the results. A convenient parameter naming and storage scheme, as well as model conditioning and reduction tools and run-time enhancements, are incorporated into the framework. This enables the proposed architecture to accommodate models of realistic complexity.

  10. Point Cluster Analysis Using a 3D Voronoi Diagram with Applications in Point Cloud Segmentation

    Directory of Open Access Journals (Sweden)

    Shen Ying

    2015-08-01

    Three-dimensional (3D) point analysis and visualization is one of the most effective methods of point cluster detection and segmentation in geospatial datasets. However, serious scattering and clotting characteristics interfere with the visual detection of 3D point clusters. To overcome this problem, this study proposes the use of 3D Voronoi diagrams to analyze and visualize 3D points instead of the original data item. The proposed algorithm computes the cluster of 3D points by applying a set of 3D Voronoi cells to describe and quantify 3D points. The decompositions of point cloud of 3D models are guided by the 3D Voronoi cell parameters. The parameter values are mapped from the Voronoi cells to 3D points to show the spatial pattern and relationships; thus, a 3D point cluster pattern can be highlighted and easily recognized. To capture different cluster patterns, continuous progressive clusters and segmentations are tested. The 3D spatial relationship is shown to facilitate cluster detection. Furthermore, the generated segmentations of real 3D data cases are exploited to demonstrate the feasibility of our approach in detecting different spatial clusters for continuous point cloud segmentation.
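
    The Voronoi-cell-parameter idea can be sketched with SciPy: each point's 3D Voronoi cell volume serves as the per-point parameter, and small cells flag dense clusters. The data below are synthetic and the 40-point cluster / 40-point background split is an illustrative assumption, not the paper's method or datasets.

```python
import numpy as np
from scipy.spatial import Voronoi, ConvexHull

rng = np.random.default_rng(1)
# hypothetical data: a tight cluster embedded in a sparse uniform background
cluster = rng.normal(loc=0.5, scale=0.02, size=(40, 3))
background = rng.uniform(0.0, 1.0, size=(40, 3))
pts = np.vstack([cluster, background])

vor = Voronoi(pts)

def cell_volume(vor, i):
    """Volume of the Voronoi cell of point i (inf if the cell is unbounded)."""
    region = vor.regions[vor.point_region[i]]
    if len(region) == 0 or -1 in region:
        return np.inf
    # a Voronoi cell is convex, so its volume is the hull of its vertices
    return ConvexHull(vor.vertices[region]).volume

volumes = np.array([cell_volume(vor, i) for i in range(len(pts))])
# cluster membership is indicated by small cell volumes: rank the points
smallest = np.argsort(volumes)[:40]
```

    In practice the cell parameters would be mapped back onto the 3D points for visualization, as the abstract describes; here the smallest-volume cells overwhelmingly coincide with the dense synthetic cluster.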

  11. Tipping point analysis of a large ocean ambient sound record

    Science.gov (United States)

    Livina, Valerie N.; Harris, Peter; Brower, Albert; Wang, Lian; Sotirakopoulos, Kostas; Robinson, Stephen

    2017-04-01

    We study a long (2003-2015), high-resolution (250 Hz) sound pressure record provided by the Comprehensive Nuclear-Test-Ban Treaty Organisation (CTBTO) from the hydro-acoustic station Cape Leeuwin (Australia). We transform the hydrophone waveforms into five bands of 10-min-average sound pressure levels (including the third-octave band) and apply tipping point analysis techniques [1-3]. We report the results of the analysis of fluctuations and trends in the data and discuss the Big Data challenges in processing this record, including handling data segments of large size and possible HPC solutions. References: [1] Livina et al., GRL 2007; [2] Livina et al., Climate of the Past 2010; [3] Livina et al., Chaos 2015.

  12. Chemical process hazards analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-02-01

    The Office of Worker Health and Safety (EH-5), under the Assistant Secretary for the Environment, Safety and Health of the US Department of Energy (DOE), has published two handbooks for use by DOE contractors managing facilities and processes covered by the Occupational Safety and Health Administration (OSHA) Rule for Process Safety Management of Highly Hazardous Chemicals (29 CFR 1910.119), herein referred to as the PSM Rule. The PSM Rule contains an integrated set of chemical process safety management elements designed to prevent chemical releases that can lead to catastrophic fires, explosions, or toxic exposures. The purpose of the two handbooks, "Process Safety Management for Highly Hazardous Chemicals" and "Chemical Process Hazards Analysis," is to facilitate implementation of the provisions of the PSM Rule within the DOE. The purpose of this handbook, "Chemical Process Hazards Analysis," is to facilitate, within the DOE, the performance of chemical process hazards analyses (PrHAs) as required under the PSM Rule. It provides basic information for the performance of PrHAs, and should not be considered a complete resource on PrHA methods. Likewise, to determine if a facility is covered by the PSM Rule, the reader should refer to the handbook "Process Safety Management for Highly Hazardous Chemicals" (DOE-HDBK-1101-96). Promulgation of the PSM Rule has heightened the awareness of chemical safety management issues within the DOE. This handbook is intended for use by DOE facilities and processes covered by the PSM Rule to facilitate contractor implementation of the PrHA element of the PSM Rule. However, contractors whose facilities and processes are not covered by the PSM Rule may also use this handbook as a basis for conducting process hazards analyses as part of their good management practices. This handbook explains the minimum requirements for PrHAs outlined in the PSM Rule. Nowhere have requirements been added beyond what is specifically required by the rule.

  13. Mean-field inference of Hawkes point processes

    International Nuclear Information System (INIS)

    Bacry, Emmanuel; Gaïffas, Stéphane; Mastromatteo, Iacopo; Muzy, Jean-François

    2016-01-01

    We propose a fast and efficient estimation method that is able to accurately recover the parameters of a d-dimensional Hawkes point process from a set of observations. We exploit a mean-field approximation that is valid when the fluctuations of the stochastic intensity are small. We show that this is notably the case when interactions are sufficiently weak, when the dimension of the system is high, or when the fluctuations are self-averaging due to the large number of past events they involve. In such a regime the estimation of a Hawkes process can be mapped onto a least-squares problem for which we provide an analytic solution. Though this estimator is biased, we show that its precision can be comparable to that of the maximum likelihood estimator, while its computation speed is improved considerably. We give a theoretical control on the accuracy of our new approach and illustrate its efficiency using synthetic datasets, in order to assess the statistical estimation error of the parameters.

  14. Effect of processing conditions on oil point pressure of moringa oleifera seed.

    Science.gov (United States)

    Aviara, N A; Musa, W B; Owolarafe, O K; Ogunsina, B S; Oluwole, F A

    2015-07-01

    Seed oil expression is an important economic venture in rural Nigeria. The traditional techniques of carrying out the operation are not only energy-sapping and time-consuming but also wasteful. In order to reduce the tedium involved in the expression of oil from Moringa oleifera seed and to develop efficient equipment for carrying out the operation, the oil point pressure of the seed was determined under different processing conditions using a laboratory press. The processing conditions employed were moisture content (4.78, 6.00, 8.00 and 10.00% wet basis), heating temperature (50, 70, 85 and 100 °C) and heating time (15, 20, 25 and 30 min). Results showed that the oil point pressure increased with increase in seed moisture content, but decreased with increase in heating temperature and heating time within the above ranges. The highest oil point pressure of 1.1239 MPa was obtained at 10.00% moisture content, 50 °C heating temperature and 15 min heating time. The lowest oil point pressure, 0.3164 MPa, occurred at 4.78% moisture content, 100 °C heating temperature and 30 min heating time. Analysis of variance (ANOVA) showed that all the processing variables and their interactions had a significant effect on the oil point pressure of Moringa oleifera seed at the 1% level of significance. This was further demonstrated using response surface methodology (RSM). Tukey's test and Duncan's multiple range analysis successfully separated the means, and a multiple regression equation was used to express the relationship between the oil point pressure of Moringa oleifera seed and its moisture content, processing temperature, heating time and their interactions. The model yielded coefficients that enabled the oil point pressure of the seed to be predicted with a very high coefficient of determination.
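
    The kind of multiple regression with interaction terms described in this record can be illustrated by ordinary least squares on synthetic data. Everything below is invented for the sketch: the coefficients, noise level and sample size are not the paper's fitted values, only the predictor ranges follow the abstract.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
# hypothetical predictors spanning the ranges reported in the abstract
mc = rng.uniform(4.78, 10.0, n)    # moisture content, % wet basis
temp = rng.uniform(50.0, 100.0, n) # heating temperature, deg C
time = rng.uniform(15.0, 30.0, n)  # heating time, min

# synthetic response with main effects and pairwise interactions (made-up betas)
true_beta = np.array([1.0, 0.05, -0.004, -0.01, -0.0002, 0.0001, -0.00005])
X = np.column_stack([np.ones(n), mc, temp, time, mc * temp, mc * time, temp * time])
p = X @ true_beta + rng.normal(0.0, 0.01, n)  # oil point pressure, MPa

# ordinary least squares fit of the interaction model
beta_hat, *_ = np.linalg.lstsq(X, p, rcond=None)
r2 = 1.0 - np.sum((p - X @ beta_hat) ** 2) / np.sum((p - p.mean()) ** 2)
```

    With a low noise level the coefficient of determination is close to one, mirroring the "very high coefficient of determination" the abstract reports for the fitted model.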

  15. Single Point Vulnerability Analysis of Automatic Seismic Trip System

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Seo Bin; Chung, Soon Il; Lee, Yong Suk [FNC Technology Co., Yongin (Korea, Republic of); Choi, Byung Pil [KHNP CRI, Daejeon (Korea, Republic of)

    2016-10-15

    Single Point Vulnerability (SPV) analysis is a process used to identify individual equipment whose failure alone will result in a reactor trip, turbine generator failure, or a power reduction of more than 50%. The Automatic Seismic Trip System (ASTS) is a newly installed system to ensure the safety of the plant when an earthquake occurs. Since this system directly shuts down the reactor, the failure or malfunction of its components can cause a reactor trip more frequently than other systems. Therefore, an SPV analysis of the ASTS is necessary to maintain its essential performance. To select SPV equipment of the ASTS, failure mode and effect analysis (FMEA) and fault tree analysis (FTA) were performed. The D/O, D/I and A/I cards, the seismic sensor, and the trip relay have an effect on the reactor trip, but a single failure of any of them will not cause a reactor trip. In conclusion, the ASTS is excluded as an SPV. These results can be utilized as basis data for ways to enhance facility reliability, such as design modification and improvement of preventive maintenance procedures.

  16. Single Point Vulnerability Analysis of Automatic Seismic Trip System

    International Nuclear Information System (INIS)

    Oh, Seo Bin; Chung, Soon Il; Lee, Yong Suk; Choi, Byung Pil

    2016-01-01

    Single Point Vulnerability (SPV) analysis is a process used to identify individual equipment whose failure alone will result in a reactor trip, turbine generator failure, or a power reduction of more than 50%. The Automatic Seismic Trip System (ASTS) is a newly installed system to ensure the safety of the plant when an earthquake occurs. Since this system directly shuts down the reactor, the failure or malfunction of its components can cause a reactor trip more frequently than other systems. Therefore, an SPV analysis of the ASTS is necessary to maintain its essential performance. To select SPV equipment of the ASTS, failure mode and effect analysis (FMEA) and fault tree analysis (FTA) were performed. The D/O, D/I and A/I cards, the seismic sensor, and the trip relay have an effect on the reactor trip, but a single failure of any of them will not cause a reactor trip. In conclusion, the ASTS is excluded as an SPV. These results can be utilized as basis data for ways to enhance facility reliability, such as design modification and improvement of preventive maintenance procedures.

  17. A scalable and multi-purpose point cloud server (PCS) for easier and faster point cloud data management and processing

    Science.gov (United States)

    Cura, Rémi; Perret, Julien; Paparoditis, Nicolas

    2017-05-01

    In addition to more traditional geographical data such as images (rasters) and vectors, point cloud data are becoming increasingly available. Such data are appreciated for their precision and true three-dimensional (3D) nature. However, managing point clouds can be difficult due to scaling problems and the specificities of this data type. Several methods exist but are usually fairly specialised and solve only one aspect of the management problem. In this work, we propose a comprehensive and efficient point cloud management system based on a database server that works on groups of points (patches) rather than individual points. This system is specifically designed to cover the basic needs of point cloud users: fast loading, compressed storage, powerful patch and point filtering, easy data access and exporting, and integrated processing. Moreover, the proposed system fully integrates metadata (like sensor position) and can use point clouds jointly with other geospatial data, such as images, vectors, topology and other point clouds. Point cloud (parallel) processing can be done in-base with fast prototyping capabilities. Lastly, the system is built on open-source technologies; therefore it can be easily extended and customised. We test the proposed system with several billion points obtained from Lidar (aerial and terrestrial) and stereo-vision. We demonstrate loading speeds in the ~50 million pts/h per process range, transparent-for-the-user compression at ratios greater than 2:1 to 4:1, patch filtering in the 0.1 to 1 s range, and output in the 0.1 million pts/s per process range, along with classical processing methods, such as object detection.

  18. Analysis of Stress Updates in the Material-point Method

    DEFF Research Database (Denmark)

    Andersen, Søren; Andersen, Lars

    2009-01-01

    The material-point method (MPM) is a new numerical method for analysis of large strain engineering problems. The MPM applies a dual formulation, where the state of the problem (mass, stress, strain, velocity etc.) is tracked using a finite set of material points while the governing equations...... are solved on a background computational grid. Several references state, that one of the main advantages of the material-point method is the easy application of complicated material behaviour as the constitutive response is updated individually for each material point. However, as discussed here, the MPM way...

  19. Washing and chilling as critical control points in pork slaughter hazard analysis and critical control point (HACCP) systems.

    Science.gov (United States)

    Bolton, D J; Pearce, R A; Sheridan, J J; Blair, I S; McDowell, D A; Harrington, D

    2002-01-01

    The aim of this research was to examine the effects of preslaughter washing, pre-evisceration washing, final carcass washing and chilling on final carcass quality and to evaluate these operations as possible critical control points (CCPs) within a pork slaughter hazard analysis and critical control point (HACCP) system. This study estimated bacterial numbers (total viable counts) and the incidence of Salmonella at three surface locations (ham, belly and neck) on 60 animals/carcasses processed through a small commercial pork abattoir (80 pigs d(-1)). Significant reductions (P ...... HACCP in pork slaughter plants. This research will provide a sound scientific basis on which to develop and implement effective HACCP in pork abattoirs.

  20. Efficient LIDAR Point Cloud Data Managing and Processing in a Hadoop-Based Distributed Framework

    Science.gov (United States)

    Wang, C.; Hu, F.; Sha, D.; Han, X.

    2017-10-01

    Light Detection and Ranging (LiDAR) is one of the most promising technologies in surveying and mapping, city management, forestry, object recognition, computer vision engineering and other fields. However, it is challenging to efficiently store, query and analyze high-resolution 3D LiDAR data due to its volume and complexity. In order to improve the productivity of LiDAR data processing, this study proposes a Hadoop-based framework to efficiently manage and process LiDAR data in a distributed and parallel manner, taking advantage of Hadoop's storage and computing ability. At the same time, the Point Cloud Library (PCL), an open-source project for 2D/3D image and point cloud processing, is integrated with HDFS and MapReduce to run the LiDAR data analysis algorithms provided by PCL in a parallel fashion. The experiment results show that the proposed framework can efficiently manage and process big LiDAR data.

  1. Discrete Approximations of Determinantal Point Processes on Continuous Spaces: Tree Representations and Tail Triviality

    Science.gov (United States)

    Osada, Hirofumi; Osada, Shota

    2018-01-01

    We prove tail triviality of determinantal point processes μ on continuous spaces. Tail triviality has been proved for such processes only on discrete spaces, and hence we have generalized the result to continuous spaces. To do this, we construct tree representations, that is, discrete approximations of determinantal point processes enjoying a determinantal structure. There are many interesting examples of determinantal point processes on continuous spaces such as zero points of the hyperbolic Gaussian analytic function with Bergman kernel, and the thermodynamic limit of eigenvalues of Gaussian random matrices for Sine_2 , Airy_2 , Bessel_2 , and Ginibre point processes. Our main theorem proves all these point processes are tail trivial.

  2. Digital analyzer for point processes based on first-in-first-out memories

    Science.gov (United States)

    Basano, Lorenzo; Ottonello, Pasquale; Schiavi, Enore

    1992-06-01

    We present an entirely new version of a multipurpose instrument designed for the statistical analysis of point processes, especially those characterized by high bunching. A long sequence of pulses can be recorded in the RAM bank of a personal computer via a suitably designed front end which employs a pair of first-in-first-out (FIFO) memories; these allow one to build an analyzer that, besides being simpler from the electronic point of view, is capable of sustaining much higher intensity fluctuations of the point process. The overflow risk of the device is evaluated by treating the FIFO pair as a queueing system. The apparatus was tested using both a deterministic signal and a sequence of photoelectrons obtained from laser light scattered by random surfaces.
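
    The queueing view of the FIFO front end described in this record can be caricatured as a toy simulation: pulses arrive in bursts, the read-out side drains the buffer at a fixed rate, and a pulse is lost whenever the buffer is full. All parameters below (burst statistics, service rate, depths) are hypothetical, not those of the instrument.

```python
import numpy as np

def fifo_overflow_fraction(arrivals, service_rate, depth):
    """Fraction of pulses lost in a finite FIFO that is drained at a
    constant service_rate (pulses per unit time)."""
    occupancy, last_t, lost = 0.0, 0.0, 0
    for t in arrivals:
        # the read-out side empties the buffer between arrivals
        occupancy = max(0.0, occupancy - service_rate * (t - last_t))
        last_t = t
        if occupancy + 1.0 > depth:
            lost += 1          # buffer full: the incoming pulse is dropped
        else:
            occupancy += 1.0
    return lost / len(arrivals)

rng = np.random.default_rng(3)
# hypothetical bunched input: Poisson-distributed bursts of geometric size
burst_times = np.cumsum(rng.exponential(1.0, size=200))
arrivals = np.sort(np.concatenate(
    [t + rng.uniform(0.0, 0.01, size=rng.geometric(0.3)) for t in burst_times]))
loss_shallow = fifo_overflow_fraction(arrivals, service_rate=5.0, depth=2)
loss_deep = fifo_overflow_fraction(arrivals, service_rate=5.0, depth=64)
```

    A deeper FIFO absorbs the bursts and loses far fewer pulses, which is why the FIFO depth governs the overflow risk for highly bunched point processes.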

  3. Linear and quadratic models of point process systems: contributions of patterned input to output.

    Science.gov (United States)

    Lindsay, K A; Rosenberg, J R

    2012-08-01

    In the 1880s Volterra characterised a nonlinear system using a functional series connecting continuous input and continuous output. Norbert Wiener, in the 1940s, circumvented problems associated with the application of Volterra series to physical problems by deriving from it a new series of terms that are mutually uncorrelated with respect to Gaussian processes. Subsequently, Brillinger, in the 1970s, introduced a point-process analogue of Volterra's series connecting point-process inputs to the instantaneous rate of point-process output. We derive here a new series from this analogue in which the terms are mutually uncorrelated with respect to Poisson processes. This new series expresses how patterned input in a spike train, represented by third-order cross-cumulants, is converted into the instantaneous rate of an output point process. Given experimental records of suitable duration, the contribution of arbitrary patterned input to an output process can, in principle, be determined. Solutions for linear and quadratic point-process models with one and two inputs and a single output are investigated. Our theoretical results are applied to isolated muscle spindle data in which the spike trains from the primary and secondary endings of the same muscle spindle are recorded in response to stimulation of one and then two static fusimotor axons, in the absence and presence of a random length change imposed on the parent muscle. For a fixed mean rate of input spikes, the analysis of the experimental data makes explicit which patterns of two input spikes contribute to an output spike. Copyright © 2012 Elsevier Ltd. All rights reserved.

  4. Equivalence of functional limit theorems for stationary point processes and their Palm distributions

    NARCIS (Netherlands)

    Nieuwenhuis, G.

    1989-01-01

    Let P be the distribution of a stationary point process on the real line and let P0 be its Palm distribution. In this paper we consider two types of functional limit theorems, those in terms of the number of points of the point process in (0, t] and those in terms of the location of the nth point

  5. Discussion of "Modern statistics for spatial point processes"

    DEFF Research Database (Denmark)

    Jensen, Eva Bjørn Vedel; Prokesová, Michaela; Hellmund, Gunnar

    2007-01-01

    ABSTRACT. The paper ‘Modern statistics for spatial point processes’ by Jesper Møller and Rasmus P. Waagepetersen is based on a special invited lecture given by the authors at the 21st Nordic Conference on Mathematical Statistics, held at Rebild, Denmark, in June 2006. At the conference, Antti...

  6. A Bayesian MCMC method for point process models with intractable normalising constants

    DEFF Research Database (Denmark)

    Berthelsen, Kasper Klitgaard; Møller, Jesper

    2004-01-01

    to simulate from the "unknown distribution", perfect simulation algorithms become useful. We illustrate the method in cases where the likelihood is given by a Markov point process model. Particularly, we consider semi-parametric Bayesian inference in connection to both inhomogeneous Markov point process models...... and pairwise interaction point processes....

  7. Fast Change Point Detection for Electricity Market Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Berkeley, UC; Gu, William; Choi, Jaesik; Gu, Ming; Simon, Horst; Wu, Kesheng

    2013-08-25

    Electricity is a vital part of our daily life; therefore it is important to avoid irregularities such as the California Electricity Crisis of 2000 and 2001. In this work, we seek to predict anomalies using advanced machine learning algorithms. These algorithms are effective, but computationally expensive, especially if we plan to apply them to hourly electricity market data covering a number of years. To address this challenge, we significantly accelerate the computation of the Gaussian Process (GP) for time series data. In the context of a Change Point Detection (CPD) algorithm, we reduce its computational complexity from O(n^5) to O(n^2). Our efficient algorithm makes it possible to compute the change points using the hourly price data from the California Electricity Crisis. By comparing the detected change points with known events, we show that the CPD algorithm is indeed effective in detecting signals preceding major events.
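
    For intuition about what a change point detector computes, here is a much simpler cost-based detector than the paper's GP method (this sketch is not the authors' algorithm, and the data are synthetic): choose the split that minimizes the summed squared error of a piecewise-constant fit, evaluated in O(n) overall via prefix sums.

```python
import numpy as np

def best_change_point(x):
    """Single change point by minimizing the summed squared error of a
    piecewise-constant fit; prefix sums make the scan over splits O(n)."""
    n = len(x)
    s, s2 = np.cumsum(x), np.cumsum(x ** 2)
    best_tau, best_cost = None, np.inf
    for tau in range(1, n):
        # within-segment sums of squares around each segment's own mean
        left = s2[tau - 1] - s[tau - 1] ** 2 / tau
        right = (s2[-1] - s2[tau - 1]) - (s[-1] - s[tau - 1]) ** 2 / (n - tau)
        if left + right < best_cost:
            best_cost, best_tau = left + right, tau
    return best_tau

rng = np.random.default_rng(4)
# synthetic series with a known mean shift at index 60
x = np.concatenate([rng.normal(0.0, 0.5, 60), rng.normal(3.0, 0.5, 40)])
tau = best_change_point(x)
```

    On a clear mean shift the minimizer lands at the true break; the GP-based CPD of the abstract plays the same role but scores candidate splits with a time series model rather than segment means.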

  8. INHOMOGENEITY IN SPATIAL COX POINT PROCESSES – LOCATION DEPENDENT THINNING IS NOT THE ONLY OPTION

    Directory of Open Access Journals (Sweden)

    Michaela Prokešová

    2010-11-01

    In the literature on point processes, by far the most popular option for introducing inhomogeneity into a point process model is location-dependent thinning (resulting in a second-order intensity-reweighted stationary point process). This produces a very tractable model, and several fast estimation procedures are available. Nevertheless, this model dilutes the interaction (or the geometrical structure) of the original homogeneous model in a particular way. For Markov point processes, several alternative inhomogeneous models have been suggested and investigated in the literature. This is not the case for Cox point processes, the canonical models for clustered point patterns. In this contribution we discuss several other ways to define inhomogeneous Cox point process models that result in point patterns with different types of geometric structure. We further investigate possible parameter estimation procedures for such models.

  9. Marked point process for modelling seismic activity (case study in Sumatra and Java)

    Science.gov (United States)

    Pratiwi, Hasih; Sulistya Rini, Lia; Wayan Mangku, I.

    2018-05-01

    Earthquake occurrence is a natural phenomenon that is random and irregular in space and time. To date, forecasting the occurrence of an earthquake at a given location remains difficult, so earthquake forecasting methodology continues to be developed from both the seismological and the stochastic perspective. To explain such random natural phenomena, both in space and time, a point process approach can be used. There are two types of point processes: temporal point processes and spatial point processes. A temporal point process relates to events observed over time as a sequence, whereas a spatial point process describes the locations of objects in two- or three-dimensional space. The points of a point process can be labelled with additional information called marks. A marked point process can be considered as a pair (x, m), where x is the location of a point and m is the mark attached to it. This study models a marked point process indexed by time on earthquake data from Sumatra Island and Java Island. The model can be used to analyse seismic activity through its intensity function, conditional on the history of the process up to time t. Based on data obtained from the U.S. Geological Survey from 1973 to 2017 with a magnitude threshold of 5, we obtained maximum likelihood estimates for the parameters of the intensity function. The estimated model parameters indicate that seismic activity on Sumatra Island is greater than on Java Island.

  10. Lasso and probabilistic inequalities for multivariate point processes

    DEFF Research Database (Denmark)

    Hansen, Niels Richard; Reynaud-Bouret, Patricia; Rivoirard, Vincent

    2015-01-01

    Due to its low computational cost, Lasso is an attractive regularization method for high-dimensional statistical settings. In this paper, we consider multivariate counting processes depending on an unknown function parameter to be estimated by linear combinations of a fixed dictionary. To select...... for multivariate Hawkes processes are proven, which allows us to check these assumptions by considering general dictionaries based on histograms, Fourier or wavelet bases. Motivated by problems of neuronal activity inference, we finally carry out a simulation study for multivariate Hawkes processes and compare our...... methodology with the adaptive Lasso procedure proposed by Zou in (J. Amer. Statist. Assoc. 101 (2006) 1418–1429). We observe an excellent behavior of our procedure. We rely on theoretical aspects for the essential question of tuning our methodology. Unlike adaptive Lasso of (J. Amer. Statist. Assoc. 101 (2006...
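
    The idea of selecting dictionary coefficients for a self-exciting counting process by ℓ1 penalization can be caricatured with a plain Lasso (not the adaptive, data-driven-weighted penalty of the paper) on a dictionary of lagged bin counts. The count model and all parameter values below are invented for the sketch.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(6)
# hypothetical self-exciting count series: the rate of bin t depends on bin t-1
n, mu, a = 5000, 1.0, 0.5
counts = np.zeros(n)
for t in range(1, n):
    counts[t] = rng.poisson(mu + a * counts[t - 1])

# dictionary of 10 lagged-count predictors; the Lasso should pick out lag 1 only
lags = 10
X = np.column_stack([counts[lags - k - 1 : n - k - 1] for k in range(lags)])
y = counts[lags:]
coef = Lasso(alpha=0.01).fit(X, y).coef_
```

    The fitted coefficient vector is sparse: the lag-1 weight is large (near the true excitation a = 0.5) while the remaining lags are shrunk towards zero, which is the selection behaviour the abstract's oracle inequalities are about.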

  11. Stability Analysis of Periodic Systems by Truncated Point Mappings

    Science.gov (United States)

    Guttalu, R. S.; Flashner, H.

    1996-01-01

    An approach is presented for deriving analytical stability and bifurcation conditions for systems with periodically varying coefficients. The method is based on a point mapping (period-to-period mapping) representation of the system's dynamics. An algorithm is employed to obtain an analytical expression for the point mapping and its dependence on the system's parameters. The algorithm is devised to derive the coefficients of a multinomial expansion of the point mapping up to an arbitrary order in terms of the state variables and of the parameters. Analytical stability and bifurcation conditions are then formulated and expressed as functional relations between the parameters. To demonstrate the application of the method, the parametric stability of Mathieu's equation and of a two-degree-of-freedom system are investigated. The results obtained by the proposed approach are compared to those obtained by perturbation analysis and by direct integration, which we consider to be the "exact solution". It is shown that, unlike perturbation analysis, the proposed method provides a very accurate solution even for large values of the parameters. If an expansion of the point mapping in terms of a small parameter is performed, the method is equivalent to perturbation analysis. Moreover, it is demonstrated that the method can be easily applied to multiple-degree-of-freedom systems using the same framework. This feature is an important advantage since most of the existing analysis methods apply mainly to single-degree-of-freedom systems and their extension to higher dimensions is difficult and computationally cumbersome.
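
    The period-to-period map and its stability criterion can be checked numerically for Mathieu's equation y'' + (a - 2q cos 2t) y = 0: integrating over one period T = π of the coefficient yields the monodromy (point-mapping) matrix M, and solutions are bounded when |tr M| < 2. This numerical sketch is not the paper's analytical multinomial expansion, and the parameter values a = 2, q = 0.2 are an illustrative choice in a known stable region.

```python
import numpy as np
from scipy.integrate import solve_ivp

def monodromy(a, q):
    """Monodromy matrix of Mathieu's equation y'' + (a - 2 q cos 2t) y = 0
    over one period T = pi of the periodic coefficient."""
    def rhs(t, y):
        return [y[1], -(a - 2.0 * q * np.cos(2.0 * t)) * y[0]]
    cols = []
    # propagate the two basis initial conditions over one period
    for y0 in ([1.0, 0.0], [0.0, 1.0]):
        sol = solve_ivp(rhs, (0.0, np.pi), y0, rtol=1e-10, atol=1e-12)
        cols.append(sol.y[:, -1])
    return np.array(cols).T

# stability of the periodic point mapping: |trace M| < 2  <=>  bounded motion
M = monodromy(a=2.0, q=0.2)
stable = abs(np.trace(M)) < 2.0
```

    For q = 0 the equation reduces to y'' + a y = 0, whose monodromy trace is 2 cos(√a π), so the routine can be validated against this closed form before sweeping the (a, q) plane for stability boundaries.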

  12. Point of Care Testing Services Delivery: Policy Analysis using a ...

    African Journals Online (AJOL)

    Annals of Biomedical Sciences ... The service providers (hospital management) and the testing personnel are faced with the task of trying to explain these problems. Objective of the study: To carry out a critical policy analysis of the problems of point of care testing, with the aim of identifying the causes of these problems and ...

  13. Modelling financial high frequency data using point processes

    DEFF Research Database (Denmark)

    Hautsch, Nikolaus; Bauwens, Luc

    In this chapter written for a forthcoming Handbook of Financial Time Series to be published by Springer-Verlag, we review the econometric literature on dynamic duration and intensity processes applied to high frequency financial data, which was boosted by the work of Engle and Russell (1997...

  14. Lasso and probabilistic inequalities for multivariate point processes

    OpenAIRE

    Hansen, Niels Richard; Reynaud-Bouret, Patricia; Rivoirard, Vincent

    2012-01-01

    Due to its low computational cost, Lasso is an attractive regularization method for high-dimensional statistical settings. In this paper, we consider multivariate counting processes depending on an unknown function parameter to be estimated by linear combinations of a fixed dictionary. To select coefficients, we propose an adaptive $\\ell_{1}$-penalization methodology, where data-driven weights of the penalty are derived from new Bernstein type inequalities for martingales. Oracle inequalities...
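The weighted l1 penalty at the heart of this methodology can be illustrated with a small proximal-gradient (ISTA) solver on synthetic regression data. This is only a structural sketch: the weights below are placeholders standing in for the paper's data-driven Bernstein-type weights, and the counting-process likelihood is replaced by least squares.

```python
import numpy as np

def weighted_lasso_ista(X, y, weights, lam=0.1, n_iter=500):
    """Proximal-gradient (ISTA) solver for
    min_b  0.5*||y - X b||^2 / n + lam * sum_j weights[j] * |b_j|.
    The per-coefficient weights mimic an adaptive / data-driven penalty."""
    n, p = X.shape
    L = np.linalg.norm(X, 2) ** 2 / n          # Lipschitz constant of the gradient
    b = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y) / n
        z = b - grad / L
        thresh = lam * weights / L
        b = np.sign(z) * np.maximum(np.abs(z) - thresh, 0.0)  # soft threshold
    return b

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
true_b = np.zeros(10)
true_b[:3] = [2.0, -1.5, 1.0]                  # sparse ground truth
y = X @ true_b + 0.1 * rng.standard_normal(200)

b_hat = weighted_lasso_ista(X, y, np.ones(10), lam=0.05)
print(np.round(b_hat, 2))
```

With uniform weights this reduces to the ordinary Lasso; plugging in coefficient-specific weights gives the adaptive variant compared against in the paper.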

  15. The S-Process Branching-Point at 205Pb

    Science.gov (United States)

    Tonchev, Anton; Tsoneva, N.; Bhatia, C.; Arnold, C. W.; Goriely, S.; Hammond, S. L.; Kelley, J. H.; Kwan, E.; Lenske, H.; Piekarewicz, J.; Raut, R.; Rusev, G.; Shizuma, T.; Tornow, W.

    2017-09-01

    Accurate neutron-capture cross sections for radioactive nuclei near the line of beta stability are crucial for understanding s-process nucleosynthesis. However, neutron-capture cross sections for short-lived radionuclides are difficult to measure because the measurements require both highly radioactive samples and intense neutron sources. We consider photon scattering using monoenergetic and 100% linearly polarized photon beams to obtain the photoabsorption cross section on 206Pb below the neutron separation energy. This observable becomes an essential ingredient in the Hauser-Feshbach statistical model for calculations of capture cross sections on 205Pb. The newly obtained photoabsorption information is also used to estimate the Maxwellian-averaged radiative cross section of 205Pb(n,γ)206Pb at 30 keV. The astrophysical impact of this measurement on s-process nucleosynthesis will be discussed. This work was performed under the auspices of US DOE by LLNL under Contract DE-AC52-07NA27344.

  16. Analysis of hygienic critical control points in boar semen production.

    Science.gov (United States)

    Schulze, M; Ammon, C; Rüdiger, K; Jung, M; Grobbel, M

    2015-02-01

    The present study addresses the microbiological results of a quality control audit in artificial insemination (AI) boar studs in Germany and Austria. The raw and processed semen of 344 boars in 24 AI boar studs were analyzed. Bacteria were found in 26% (88 of 344) of the extended ejaculates and 66.7% (18 of 24) of the boar studs. The bacterial species found in the AI dose were not cultured from the respective raw semen in 95.5% (84 of 88) of the positive samples. These data, together with the fact that in most cases all the samples from one stud were contaminated with identical bacteria (species and resistance profile), indicate contamination during processing. Microbiological investigations of the equipment and the laboratory environment during semen processing in 21 AI boar studs revealed nine hygienic critical control points (HCCP), which were addressed after the first audit. On the basis of the analysis of the contamination rates of the ejaculate samples, improvements in the hygiene status were already present in the second audit (P = 0.0343, F-test). Significant differences were observed for heating cabinets (improvement, P = 0.0388) and manual operating elements (improvement, P = 0.0002). The odds ratio of finding contaminated ejaculates in the first and second audit was 1.68 (with the 95% confidence interval ranging from 1.04 to 2.69). Furthermore, an overall good hygienic status was shown for extenders, the inner face of dilution tank lids, dyes, and ultrapure water treatment plants. Among the nine HCCP considered, the most heavily contaminated samples, as assessed by the median scores throughout all the studs, were found in the sinks and/or drains. High numbers (>10³ colony-forming units/cm²) of bacteria were found in the heating cabinets, ejaculate transfer, manual operating elements, and laboratory surfaces. In conclusion, the present study emphasizes the need for both training of the laboratory staff in monitoring HCCP in routine semen

  17. Hazard analysis and critical control point (HACCP) history and conceptual overview.

    Science.gov (United States)

    Hulebak, Karen L; Schlosser, Wayne

    2002-06-01

    Hazard Analysis and Critical Control Point (HACCP) is a system that enables the production of safe meat and poultry products through the thorough analysis of production processes: the identification of all hazards that are likely to occur in the production establishment; the identification of the critical points in the process at which these hazards may be introduced into the product and therefore should be controlled; the establishment of critical limits for control at those points; the verification of these prescribed steps; and the methods by which the processing establishment and the regulatory authority can monitor how well process control through the HACCP plan is working. The history of the development of HACCP is reviewed, and examples of practical applications of HACCP are described.

  18. Point process analyses of variations in smoking rate by setting, mood, gender, and dependence

    Science.gov (United States)

    Shiffman, Saul; Rathbun, Stephen L.

    2010-01-01

    The immediate emotional and situational antecedents of ad libitum smoking are still not well understood. We re-analyzed data from Ecological Momentary Assessment using novel point-process analyses, to assess how craving, mood, and social setting influence smoking rate, as well as assessing the moderating effects of gender and nicotine dependence. 304 smokers recorded craving, mood, and social setting using electronic diaries when smoking and at random nonsmoking times over 16 days of smoking. Point-process analysis, which makes use of the known random sampling scheme for momentary variables, examined main effects of setting and interactions with gender and dependence. Increased craving was associated with higher rates of smoking, particularly among women. Negative affect was not associated with smoking rate, even in interaction with arousal, but restlessness was associated with substantially higher smoking rates. Women's smoking tended to be less affected by negative affect. Nicotine dependence had little moderating effect on situational influences. Smoking rates were higher when smokers were alone or with others smoking, and smoking restrictions reduced smoking rates. However, the presence of others smoking undermined the effects of restrictions. The more sensitive point-process analyses confirmed earlier findings, including the surprising conclusion that negative affect by itself was not related to smoking rates. Contrary to hypothesis, men's and not women's smoking was influenced by negative affect. Both smoking restrictions and the presence of others who are not smoking suppress smoking, but others’ smoking undermines the effects of restrictions. Point-process analyses of EMA data can bring out even small influences on smoking rate. PMID:21480683

  19. Implementation of 5S tools as a starting point in business process reengineering

    Directory of Open Access Journals (Sweden)

    Vorkapić Miloš

    2017-01-01

    The paper deals with the analysis of elements which represent a starting point in the implementation of business process reengineering. We have used Lean tools through the analysis of the 5S model in our research. On the example of the finalization of a finished transmitter in IHMT-CMT production, 5S tools were implemented with a focus on quality elements, although the theory shows that BPR and TQM are two opposite activities in an enterprise. We wanted to emphasize the significance of employees' self-discipline, which helps the process of product finalization to develop on time and without waste and losses. In addition, the employees keep their workplace clean, tidy and functional.

  20. Spectrography analysis of stainless steel by the point to point technique

    International Nuclear Information System (INIS)

    Bona, A.

    1986-01-01

    A method is presented for the determination of the elements Ni, Cr, Mn, Si, Mo, Nb, Cu, Co and V in stainless steel by emission spectrographic analysis using high-voltage spark sources. The 'point-to-point' technique is employed. The experimental parameters were optimized as a compromise between detection sensitivity and measurement precision. The parameters investigated were the high-voltage capacitance, the inductance, the analytical and auxiliary gaps, the pre-burn spark period and the exposure time. The edge shape of the counter electrodes and the type of polishing and diameter variation of the stainless steel electrodes were evaluated in preliminary assays. In addition, the degradation of the chemical power of the developer was also investigated. Counter electrodes of graphite, copper, aluminium and iron were employed, and the counter electrode itself was used as an internal standard. In the case of graphite counter electrodes, the iron lines were employed as the internal standard. Relative errors were the criteria for evaluation of these experiments. National Bureau of Standards certified reference stainless steel standards and Eletrometal Acos Finos S.A. samples (certified by the supplier) were employed for constructing the calibration systems and analytical curves. The best results were obtained using the conventional graphite counter electrodes. The inaccuracy and imprecision of the proposed method varied from 2% to 15% and from 1% to 9%, respectively. The present technique was compared to other instrumental techniques such as inductively coupled plasma, X-ray fluorescence and neutron activation analysis. The advantages and disadvantages in each case are discussed. (author) [pt

  1. Risk-analysis of global climate tipping points

    Energy Technology Data Exchange (ETDEWEB)

    Frieler, Katja; Meinshausen, Malte; Braun, N [Potsdam Institute for Climate Impact Research e.V., Potsdam (Germany). PRIMAP Research Group; and others

    2012-09-15

    There are many elements of the Earth system that are expected to change gradually with increasing global warming. Such changes might prove to be reversible after global warming returns to lower levels. But there are others that have the potential to show threshold behavior. This means that these changes would imply a transition between qualitatively disparate states, which can be triggered by only small shifts in the background climate (2). These changes are often expected not to be reversible by returning to the current level of warming. The reason is that many of them are characterized by self-amplifying processes that could lead to a new internally stable state which is qualitatively different from before. Several elements of the climate system have already been identified as potential tipping elements. This group contains the mass losses of the Greenland and West Antarctic ice sheets, the decline of the Arctic summer sea ice, different monsoon systems, the degradation of coral reefs, the dieback of the Amazon rainforest, the thawing of the permafrost regions, as well as the release of methane hydrates (3). Crucially, these tipping elements have regional- to global-scale effects on human society, biodiversity and/or ecosystem services. Several examples may have a discernible effect on global climate through a large-scale positive feedback; this means they would further amplify human-induced climate change. These tipping elements pose risks comparable to risks found in other fields of human activity: high-impact events that have at least a few percent chance of occurring classify as high-risk events. In many of these examples adaptation options are limited, and prevention of occurrence may be a more viable strategy. Therefore, a better understanding of the processes driving tipping points is essential. There might be other tipping elements, even more critical, that are not yet identified. These may also lie within our socio-economic systems that are

  2. Screw compressor analysis from a vibration point-of-view

    Science.gov (United States)

    Hübel, D.; Žitek, P.

    2017-09-01

    Vibrations are a very typical feature of all compressors and are given great attention in the industry. The reason for this interest is primarily the negative influence that it can have on both the operating staff and the entire machine's service life. The purpose of this work is to describe the methodology of screw compressor analysis from a vibration point-of-view. This analysis is an essential part of the design of vibro-diagnostics of screw compressors with regard to their service life.

  3. Evaluation of in-line Raman data for end-point determination of a coating process: Comparison of Science-Based Calibration, PLS-regression and univariate data analysis.

    Science.gov (United States)

    Barimani, Shirin; Kleinebudde, Peter

    2017-10-01

    A multivariate analysis method, Science-Based Calibration (SBC), was used for the first time for endpoint determination of a tablet coating process using Raman data. Two types of tablet cores, placebo and caffeine cores, received a coating suspension comprising a polyvinyl alcohol-polyethylene glycol graft-copolymer and titanium dioxide up to a maximum coating thickness of 80 µm. Raman spectroscopy was used as the in-line PAT tool. The spectra were acquired every minute and correlated to the amount of applied aqueous coating suspension. SBC was compared to another well-known multivariate analysis method, Partial Least Squares regression (PLS), and a simpler approach, Univariate Data Analysis (UVDA). All developed calibration models had coefficient of determination values (R²) higher than 0.99. The coating endpoints could be predicted with root mean square errors of prediction (RMSEP) of less than 3.1% of the applied coating suspensions. Compared to PLS and UVDA, SBC proved to be an alternative multivariate calibration method with high predictive power. Copyright © 2017 Elsevier B.V. All rights reserved.
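The univariate-versus-multivariate comparison can be mimicked on synthetic spectra. Everything below is hypothetical (a single Gaussian coating band plus noise), and ordinary least squares stands in for PLS/SBC; the point is only how both calibration styles reach R² > 0.99 on a clean linear signal.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical in-line data: 60 time points of a 30-channel Raman window;
# the coating band grows linearly with the amount of applied suspension.
coating = np.linspace(0.0, 100.0, 60)                      # % of target amount
band = np.exp(-0.5 * ((np.arange(30) - 12) / 3.0) ** 2)    # coating peak shape
spectra = np.outer(coating, band) + 0.5 * rng.standard_normal((60, 30))

# Univariate data analysis (UVDA): regress on the single strongest channel.
ch = band.argmax()
slope, intercept = np.polyfit(spectra[:, ch], coating, 1)
pred_uv = slope * spectra[:, ch] + intercept

# Multivariate calibration: least squares on the full spectrum, a simple
# stand-in for PLS/SBC (both project onto a signal-related subspace).
D = np.c_[spectra, np.ones(60)]
coef, *_ = np.linalg.lstsq(D, coating, rcond=None)
pred_mv = D @ coef

def r2(y, yhat):
    return 1.0 - np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2)

print(round(r2(coating, pred_uv), 4), round(r2(coating, pred_mv), 4))
```

Because the univariate predictor is contained in the multivariate model's span, the multivariate in-sample fit can never be worse; the practical trade-offs (robustness, interpretability) are what the study evaluates.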

  4. Growth Curve Analysis and Change-Points Detection in Extremes

    KAUST Repository

    Meng, Rui

    2016-05-15

    The thesis consists of two coherent projects. The first project presents the results of evaluating salinity tolerance in barley using growth curve analysis, where different growth trajectories are observed within barley families. The study of salinity tolerance in plants is crucial to understanding plant growth and productivity. Because fully-automated smarthouses with conveyor systems allow non-destructive and high-throughput phenotyping of large numbers of plants, it is now possible to apply advanced statistical tools to analyze daily measurements and to study salinity tolerance. To compare different growth patterns of barley varieties, we use functional data analysis techniques to analyze the daily projected shoot areas. In particular, we apply the curve registration method to align all the curves from the same barley family in order to summarize the family-wise features. We also illustrate how to use statistical modeling to account for spatial variation in microclimate in smarthouses and for temporal variation across runs, which is crucial for identifying traits of the barley varieties. In our analysis, we show that the concentrations of sodium and potassium in leaves are negatively correlated, and their interactions are associated with the degree of salinity tolerance. The second project studies change-point detection methods in extremes when multiple time series data are available. Motivated by the scientific question of whether the chances of experiencing extreme weather differ across the seasons of a year, we develop a change-point detection model to study changes in extremes or in the tail of a distribution. Most existing models identify seasons from multiple yearly time series assuming that a season or a change-point location remains exactly the same across years. In this work, we propose a random effect model that allows the change-point to vary from year to year, following a given distribution. Both parametric and nonparametric methods are developed

  5. EFFICIENT LIDAR POINT CLOUD DATA MANAGING AND PROCESSING IN A HADOOP-BASED DISTRIBUTED FRAMEWORK

    Directory of Open Access Journals (Sweden)

    C. Wang

    2017-10-01

    Light Detection and Ranging (LiDAR) is one of the most promising technologies in surveying and mapping, city management, forestry, object recognition, computer vision engineering and other fields. However, it is challenging to efficiently store, query and analyze the high-resolution 3D LiDAR data due to its volume and complexity. In order to improve the productivity of LiDAR data processing, this study proposes a Hadoop-based framework to efficiently manage and process LiDAR data in a distributed and parallel manner, which takes advantage of Hadoop's storage and computing ability. At the same time, the Point Cloud Library (PCL), an open-source project for 2D/3D image and point cloud processing, is integrated with HDFS and MapReduce to run the LiDAR data analysis algorithms provided by PCL in a parallel fashion. The experiment results show that the proposed framework can efficiently manage and process big LiDAR data.

  6. Development and evaluation of spatial point process models for epidermal nerve fibers.

    Science.gov (United States)

    Olsbo, Viktor; Myllymäki, Mari; Waller, Lance A; Särkkä, Aila

    2013-06-01

    We propose two spatial point process models for the spatial structure of epidermal nerve fibers (ENFs) across human skin. The models derive from two point processes, Φb and Φe, describing the locations of the base and end points of the fibers. Each point of Φe (the end point process) is connected to a unique point in Φb (the base point process). In the first model, both Φe and Φb are Poisson processes, yielding a null model of uniform coverage of the skin by end points and general baseline results and reference values for moments of key physiologic indicators. The second model provides a mechanistic model to generate end points for each base, and we model the branching structure more directly by defining Φe as a cluster process conditioned on the realization of Φb as its parent points. In both cases, we derive distributional properties for observable quantities of direct interest to neurologists such as the number of fibers per base, and the direction and range of fibers on the skin. We contrast both models by fitting them to data from skin blister biopsy images of ENFs and provide inference regarding physiological properties of ENFs. Copyright © 2013 Elsevier Inc. All rights reserved.
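The two-process construction (a base point process Φb, with end points clustered around each base) is straightforward to simulate. A hedged numpy sketch with made-up intensity values, not the parameters fitted in the paper:

```python
import numpy as np

rng = np.random.default_rng(2)

# Minimal sketch: base points form a homogeneous Poisson process on a
# 1 x 1 skin patch; each base sprouts a Poisson number of end points
# displaced isotropically (a cluster process).  All values are assumed.
lam_base = 50.0        # expected bases per unit area
mean_fibers = 3.0      # expected end points (fibers) per base
sigma = 0.02           # spread of end points around their base

n_base = rng.poisson(lam_base)
bases = rng.random((n_base, 2))
fibers_per_base = rng.poisson(mean_fibers, n_base)
parents = np.repeat(bases, fibers_per_base, axis=0)
ends = parents + sigma * rng.standard_normal((fibers_per_base.sum(), 2))

# Observable summaries of the kind the paper derives distributions for:
ranges = np.linalg.norm(ends - parents, axis=1)   # base-to-end distances
print(n_base, round(float(fibers_per_base.mean()), 2),
      round(float(ranges.mean()), 4))
```

For an isotropic Gaussian displacement the mean range is sigma*sqrt(pi/2), about 0.025 here, which the simulated average should track.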

  7. A note on the statistical analysis of point judgment matrices

    Directory of Open Access Journals (Sweden)

    MG Kabera

    2013-06-01

    The Analytic Hierarchy Process is a multicriteria decision making technique developed by Saaty in the 1970s. The core of the approach is the pairwise comparison of objects according to a single criterion using a 9-point ratio scale and the estimation of weights associated with these objects based on the resultant judgment matrix. In the present paper some statistical approaches to extracting the weights of objects from a judgment matrix are reviewed, and new ideas rooted in the traditional method of paired comparisons are introduced.
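Saaty's classical (non-statistical) baseline extracts the weights as the principal eigenvector of the judgment matrix. A small numpy example with an illustrative 3x3 matrix of 9-point-scale judgments (not data from the paper):

```python
import numpy as np

# Illustrative reciprocal judgment matrix: object 1 is judged 3x as
# important as object 2 and 5x as important as object 3, etc.
A = np.array([[1.0,   3.0, 5.0],
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)              # principal (Perron) eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                             # normalised priority weights

# Consistency index CI = (lambda_max - n) / (n - 1); CI near 0 means the
# judgments are nearly consistent.
n = A.shape[0]
CI = (eigvals[k].real - n) / (n - 1)
print(np.round(w, 3), round(CI, 4))
```

The statistical approaches reviewed in the paper replace this deterministic eigenvector extraction with models for the random variation in the judgments.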

  8. Percolation analysis for cosmic web with discrete points

    Science.gov (United States)

    Zhang, Jiajun; Cheng, Dalong; Chu, Ming-Chung

    2018-01-01

    Percolation analysis has long been used to quantify the connectivity of the cosmic web. Most of the previous work is based on density fields on grids. By smoothing into fields, we lose information about galaxy properties like shape or luminosity. The lack of mathematical modeling also limits our understanding of the percolation analysis. To overcome these difficulties, we have studied percolation analysis based on discrete points. Using a friends-of-friends (FoF) algorithm, we generate the S-bb relation between the fractional mass of the largest connected group (S) and the FoF linking length (bb). We propose a new model, the probability cloud cluster expansion theory, to relate the S-bb relation with correlation functions. We show that the S-bb relation reflects a combination of all orders of correlation functions. Using N-body simulation, we find that the S-bb relation is robust against redshift distortion and incompleteness in observation. From the Bolshoi simulation, with halo abundance matching (HAM), we have generated a mock galaxy catalog. Good matching of the projected two-point correlation function with observation is confirmed. However, comparing the mock catalog with the latest galaxy catalog from the Sloan Digital Sky Survey (SDSS) Data Release (DR) 12, we have found significant differences in their S-bb relations. This indicates that the mock galaxy catalog cannot accurately retain correlation functions of higher order than the two-point correlation function, which reveals a limit of the HAM method. As a new measurement, the S-bb relation is applicable to a wide range of data types, fast to compute, robust against redshift distortion and incompleteness, and contains information on all orders of correlation functions.
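The S versus linking-length construction can be reproduced in miniature: link any two points closer than b with a friends-of-friends pass (union-find) and record the fractional size S of the largest group. A brute-force numpy sketch on uniform random points, not simulation data:

```python
import numpy as np

def largest_group_fraction(points, b):
    """Friends-of-friends: link pairs closer than b; return S, the fraction
    of points in the largest connected group (union-find with path halving)."""
    n = len(points)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i in range(n):                     # brute force; fine for small n
        for j in range(i + 1, n):
            if np.linalg.norm(points[i] - points[j]) < b:
                ri, rj = find(i), find(j)
                if ri != rj:
                    parent[ri] = rj
    sizes = np.bincount([find(i) for i in range(n)])
    return sizes.max() / n

rng = np.random.default_rng(3)
pts = rng.random((300, 2))                 # Poisson-like points, unit square
for b in (0.02, 0.06, 0.12):
    print(b, round(largest_group_fraction(pts, b), 3))
```

S is monotone in b by construction; sweeping b traces the percolation transition from isolated points to one spanning group.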

  9. Distinguishing different types of inhomogeneity in Neyman-Scott point processes

    Czech Academy of Sciences Publication Activity Database

    Mrkvička, Tomáš

    2014-01-01

    Roč. 16, č. 2 (2014), s. 385-395 ISSN 1387-5841 Institutional support: RVO:60077344 Keywords : clustering * growing clusters * inhomogeneous cluster centers * inhomogeneous point process * location dependent scaling * Neyman-Scott point process Subject RIV: BA - General Mathematics Impact factor: 0.913, year: 2014

  10. The importance of topographically corrected null models for analyzing ecological point processes.

    Science.gov (United States)

    McDowall, Philip; Lynch, Heather J

    2017-07-01

    Analyses of point process patterns and related techniques (e.g., MaxEnt) make use of the expected number of occurrences per unit area and second-order statistics based on the distance between occurrences. Ecologists working with point process data often assume that points exist on a two-dimensional x-y plane or within a three-dimensional volume, when in fact many observed point patterns are generated on a two-dimensional surface existing within three-dimensional space. For many surfaces, however, such as the topography of landscapes, the projection from the surface to the x-y plane preserves neither area nor distance. As such, when these point patterns are implicitly projected to and analyzed in the x-y plane, our expectations of the point pattern's statistical properties may not be met. When used in hypothesis testing, we find that the failure to account for the topography of the generating surface may bias statistical tests that incorrectly identify clustering and, furthermore, may bias coefficients in inhomogeneous point process models that incorporate slope as a covariate. We demonstrate the circumstances under which this bias is significant, and present simple methods that allow point processes to be simulated with corrections for topography. These point patterns can then be used to generate "topographically corrected" null models against which observed point processes can be compared. © 2017 by the Ecological Society of America.
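The geometric core of the argument, that projecting a sloped surface to the x-y plane shrinks areas and therefore distorts the implied intensity, can be shown in a few lines. A toy numpy illustration (a planar slope and a constant-gradient DEM; not the paper's simulation machinery):

```python
import numpy as np

# On a planar slope of angle theta, projected x-y area underestimates true
# surface area by a factor cos(theta); a null model homogeneous on the
# *surface* therefore needs 1/cos(theta) more points per projected unit area.
theta = np.deg2rad(30.0)
surface_per_projected = 1.0 / np.cos(theta)
print(round(surface_per_projected, 3))   # 1.155

# On a gridded elevation model z(x, y) the local area correction factor is
# sqrt(1 + zx^2 + zy^2); here a constant-gradient surface with slope 0.5.
x = y = np.linspace(0.0, 1.0, 101)
X, Y = np.meshgrid(x, y)
Z = 0.5 * X
zy, zx = np.gradient(Z, y, x)            # derivatives along rows (y), cols (x)
corr = np.sqrt(1.0 + zx**2 + zy**2)
print(round(float(corr.mean()), 4))      # sqrt(1.25), about 1.118
```

Simulating the "topographically corrected" null models amounts to thinning or weighting simulated points by this local factor before comparing against the observed pattern.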

  11. Classification of Phase Transitions by Microcanonical Inflection-Point Analysis

    Science.gov (United States)

    Qi, Kai; Bachmann, Michael

    2018-05-01

    By means of the principle of minimal sensitivity we generalize the microcanonical inflection-point analysis method by probing derivatives of the microcanonical entropy for signals of transitions in complex systems. A strategy of systematically identifying and locating independent and dependent phase transitions of any order is proposed. The power of the generalized method is demonstrated in applications to the ferromagnetic Ising model and a coarse-grained model for polymer adsorption onto a substrate. The results shed new light on the intrinsic phase structure of systems with cooperative behavior.
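The idea of reading transitions off derivatives of the microcanonical entropy can be caricatured numerically: for a synthetic S(E) with a localized feature, beta(E) = dS/dE develops a local extremum, which is the inflection-point signal in S. A toy sketch, not the paper's generalized method:

```python
import numpy as np

# Synthetic microcanonical entropy: a smooth background plus a localized
# bump that mimics a transition region around E = 1 (illustrative only).
E = np.linspace(0.1, 2.0, 2001)
S = np.log(E) + 0.1 * np.exp(-((E - 1.0) / 0.1) ** 2)

beta = np.gradient(S, E)          # inverse microcanonical temperature dS/dE

# Inflection point of S(E): local extremum of beta(E) near the feature.
mask = (E > 0.8) & (E < 1.2)
E_star = E[mask][np.argmax(beta[mask])]
print(round(float(E_star), 3))
```

The generalized method of the paper probes higher derivatives of S in the same spirit to classify dependent and independent transitions of any order.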

  12. Relatively Inexact Proximal Point Algorithm and Linear Convergence Analysis

    Directory of Open Access Journals (Sweden)

    Ram U. Verma

    2009-01-01

    Based on a notion of relatively maximal (m)-relaxed monotonicity, the approximation solvability of a general class of inclusion problems is discussed, while generalizing Rockafellar's theorem (1976) on linear convergence using the proximal point algorithm in a real Hilbert space setting. Convergence analysis based on this new model is simpler and more compact than that of the celebrated technique of Rockafellar, in which the Lipschitz continuity at 0 of the inverse of the set-valued mapping is applied. Furthermore, it can be used to generalize the Yosida approximation, which, in turn, can be applied to first-order evolution equations as well as evolution inclusions.
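For a strongly convex quadratic the proximal point iteration can be written exactly, which makes the Rockafellar-style linear convergence visible in a few lines. A numpy sketch with illustrative A, b and proximal parameter c:

```python
import numpy as np

# Exact proximal point algorithm for f(x) = 0.5*x'Ax - b'x, whose resolvent
# is a linear solve: x_{k+1} = (I + c*A)^{-1}(x_k + c*b).  Errors contract
# by the factor 1/(1 + c*mu), mu the smallest eigenvalue of A.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = np.linalg.solve(A, b)           # unique minimizer

c = 1.0                                  # proximal parameter
x = np.zeros(2)
errs = []
for _ in range(20):
    x = np.linalg.solve(np.eye(2) + c * A, x + c * b)
    errs.append(np.linalg.norm(x - x_star))

mu = min(np.linalg.eigvalsh(A))
print(round(errs[-1] / errs[-2], 4), round(1.0 / (1.0 + c * mu), 4))
```

The two printed numbers agree: the observed per-step error ratio settles at the theoretical linear rate 1/(1 + c*mu).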

  13. Analysis on Single Point Vulnerabilities of Plant Control System

    International Nuclear Information System (INIS)

    Chi, Moon Goo; Lee, Eun Chan; Bae, Yeon Kyoung

    2011-01-01

    The Plant Control System (PCS) is a system that controls pumps, valves, dampers, etc. in nuclear power plants with an OPR-1000 design. When there is a failure or spurious actuation of the critical components in the PCS, it can result in unexpected plant trips or transients. From this viewpoint, single point vulnerabilities are evaluated in detail using failure mode effect analyses (FMEA) and fault tree analyses (FTA). This evaluation demonstrates that the PCS has many vulnerable components and the analysis results are provided for OPR-1000 plants for reliability improvements that can reduce their vulnerabilities

  14. Analysis on Single Point Vulnerabilities of Plant Control System

    Energy Technology Data Exchange (ETDEWEB)

    Chi, Moon Goo; Lee, Eun Chan; Bae, Yeon Kyoung [Korea Hydro and Nuclear Power Co., Daejeon (Korea, Republic of)

    2011-08-15

    The Plant Control System (PCS) is a system that controls pumps, valves, dampers, etc. in nuclear power plants with an OPR-1000 design. When there is a failure or spurious actuation of the critical components in the PCS, it can result in unexpected plant trips or transients. From this viewpoint, single point vulnerabilities are evaluated in detail using failure mode effect analyses (FMEA) and fault tree analyses (FTA). This evaluation demonstrates that the PCS has many vulnerable components and the analysis results are provided for OPR-1000 plants for reliability improvements that can reduce their vulnerabilities.

  15. Analysis of a simple pendulum driven at its suspension point

    International Nuclear Information System (INIS)

    Yoshida, S; Findley, T

    2005-01-01

    To familiarize undergraduate students with the dynamics of a damped driven harmonic oscillator, a simple pendulum was set up and driven at its suspension point under different damping conditions. From the time domain analysis, the decay constant was estimated and used to predict the frequency response. The simple pendulum was then driven at a series of frequencies near the resonance. By measuring the maximum amplitude at each driving frequency, the frequency response was determined. With one free parameter, which was determined under the first damping condition, the predicted frequency responses showed good agreement with the measured frequency responses under all damping conditions
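The predicted frequency response referred to above is the standard amplitude curve of a damped driven oscillator. A short numpy sketch with illustrative values for the natural frequency and decay constant (not the measured pendulum's):

```python
import numpy as np

# Steady-state amplitude of a damped, driven harmonic oscillator:
# A(w) = F0 / sqrt((w0^2 - w^2)^2 + (2*gamma*w)^2).
w0 = 2 * np.pi * 1.0        # natural frequency for a ~1 Hz pendulum (assumed)
gamma = 0.1                 # decay constant from the ring-down fit (assumed)
F0 = 1.0                    # drive strength, arbitrary units

w = np.linspace(0.8 * w0, 1.2 * w0, 2001)
A = F0 / np.sqrt((w0**2 - w**2) ** 2 + (2 * gamma * w) ** 2)

w_res = w[np.argmax(A)]     # resonance peak, near sqrt(w0^2 - 2*gamma^2)
print(round(w_res / w0, 4), round(A.max() * 2 * gamma * w0, 4))
```

For light damping the peak amplitude is close to F0/(2*gamma*w0), which is why the second printed quantity is close to 1; fitting gamma from the ring-down fixes the whole curve up to the overall drive scale, as in the experiment.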

  16. Assessment of Peer Mediation Process from Conflicting Students’ Point of Views

    Directory of Open Access Journals (Sweden)

    Fulya TÜRK

    2016-12-01

    The purpose of this study was to analyze the peer mediation process applied in a high school from the conflicting students' points of view. This research was carried out in a high school in Denizli. After ten sessions of training in peer mediation, peer mediators mediated their peers' real conflicts. In the research, 41 students (28 girls, 13 boys) who got help at least once were interviewed as parties to a conflict. Through semi-structured interviews with the conflicting students, the mediation process was evaluated from the students' points of view. Eight questions were asked of the conflicting parties. The verbal data obtained from the interviews were analyzed using content analysis. When the conflicting students' opinions and experiences of peer mediation were analyzed, it was seen that they were satisfied with the process, they resolved their conflicts in a constructive and peaceful way, and their friendships continued as before. All of these results also indicate that peer mediation is an effective method of resolving student conflicts constructively

  17. An introduction to nonlinear analysis and fixed point theory

    CERN Document Server

    Pathak, Hemant Kumar

    2018-01-01

    This book systematically introduces the theory of nonlinear analysis, providing an overview of topics such as geometry of Banach spaces, differential calculus in Banach spaces, monotone operators, and fixed point theorems. It also discusses degree theory, nonlinear matrix equations, control theory, differential and integral equations, and inclusions. The book presents surjectivity theorems, variational inequalities, stochastic game theory and mathematical biology, along with a large number of applications of these theories in various other disciplines. Nonlinear analysis is characterised by its applications in numerous interdisciplinary fields, ranging from engineering to space science, hydromechanics to astrophysics, chemistry to biology, theoretical mechanics to biomechanics and economics to stochastic game theory. Organised into ten chapters, the book shows the elegance of the subject and its deep-rooted concepts and techniques, which provide the tools for developing more realistic and accurate models for ...

  18. Uncertainty analysis of point by point sampling complex surfaces using touch probe CMMs

    DEFF Research Database (Denmark)

    Barini, Emanuele; Tosello, Guido; De Chiffre, Leonardo

    2007-01-01

    The paper describes a study concerning point by point scanning of complex surfaces using tactile CMMs. A four factors-two level full factorial experiment was carried out, involving measurements on a complex surface configuration item comprising a sphere, a cylinder and a cone, combined in a singl...

  19. Throughput analysis of point-to-multi-point hybrid FSO/RF network

    KAUST Repository

    Rakia, Tamer; Gebali, Fayez; Yang, Hong-Chuan; Alouini, Mohamed-Slim

    2017-01-01

    This paper presents and analyzes a point-to-multi-point (P2MP) network that uses a number of free-space optical (FSO) links for data transmission from the central node to the different remote nodes. A common backup radio-frequency (RF) link is used

  20. Farmer cooperatives in the food economy of Western Europe: an analysis from the Marketing point of view

    NARCIS (Netherlands)

    Meulenberg, M.T.G.

    1979-01-01

    This paper is concerned with an analysis of farmer cooperatives in Western Europe from the marketing point of view. The analysis is restricted to marketing and processing cooperatives. First some basic characteristics of farmer cooperatives are discussed from a systems point of view. Afterwards

  1. Novel evaluation metrics for sparse spatio-temporal point process hotspot predictions - a crime case study

    OpenAIRE

    Adepeju, M.; Rosser, G.; Cheng, T.

    2016-01-01

    Many physical and sociological processes are represented as discrete events in time and space. These spatio-temporal point processes are often sparse, meaning that they cannot be aggregated and treated with conventional regression models. Models based on the point process framework may be employed instead for prediction purposes. Evaluating the predictive performance of these models poses a unique challenge, as the same sparseness prevents the use of popular measures such as the root mean squ...

  2. Spatially explicit spectral analysis of point clouds and geospatial data

    Science.gov (United States)

    Buscombe, Daniel D.

    2015-01-01

    The increasing use of spatially explicit analyses of high-resolution spatially distributed data (imagery and point clouds) for the purposes of characterising spatial heterogeneity in geophysical phenomena necessitates the development of custom analytical and computational tools. In recent years, such analyses have become the basis of, for example, automated texture characterisation and segmentation, roughness and grain size calculation, and feature detection and classification, from a variety of data types. In this work, much use has been made of statistical descriptors of localised spatial variations in amplitude variance (roughness), however the horizontal scale (wavelength) and spacing of roughness elements is rarely considered. This is despite the fact that the ratio of characteristic vertical to horizontal scales is not constant and can yield important information about physical scaling relationships. Spectral analysis is a hitherto under-utilised but powerful means to acquire statistical information about relevant amplitude and wavelength scales, simultaneously and with computational efficiency. Further, quantifying spatially distributed data in the frequency domain lends itself to the development of stochastic models for probing the underlying mechanisms which govern the spatial distribution of geological and geophysical phenomena. The software package PySESA (Python program for Spatially Explicit Spectral Analysis) has been developed for generic analyses of spatially distributed data in both the spatial and frequency domains. Developed predominantly in Python, it accesses libraries written in Cython and C++ for efficiency. It is open source and modular, therefore readily incorporated into, and combined with, other data analysis tools and frameworks with particular utility for supporting research in the fields of geomorphology, geophysics, hydrography, photogrammetry and remote sensing. The analytical and computational structure of the toolbox is

  3. Spatially explicit spectral analysis of point clouds and geospatial data

    Science.gov (United States)

    Buscombe, Daniel

    2016-01-01

    The increasing use of spatially explicit analyses of high-resolution spatially distributed data (imagery and point clouds) for the purposes of characterising spatial heterogeneity in geophysical phenomena necessitates the development of custom analytical and computational tools. In recent years, such analyses have become the basis of, for example, automated texture characterisation and segmentation, roughness and grain size calculation, and feature detection and classification, from a variety of data types. In this work, much use has been made of statistical descriptors of localised spatial variations in amplitude variance (roughness), however the horizontal scale (wavelength) and spacing of roughness elements is rarely considered. This is despite the fact that the ratio of characteristic vertical to horizontal scales is not constant and can yield important information about physical scaling relationships. Spectral analysis is a hitherto under-utilised but powerful means to acquire statistical information about relevant amplitude and wavelength scales, simultaneously and with computational efficiency. Further, quantifying spatially distributed data in the frequency domain lends itself to the development of stochastic models for probing the underlying mechanisms which govern the spatial distribution of geological and geophysical phenomena. The software package PySESA (Python program for Spatially Explicit Spectral Analysis) has been developed for generic analyses of spatially distributed data in both the spatial and frequency domains. Developed predominantly in Python, it accesses libraries written in Cython and C++ for efficiency. It is open source and modular, therefore readily incorporated into, and combined with, other data analysis tools and frameworks with particular utility for supporting research in the fields of geomorphology, geophysics, hydrography, photogrammetry and remote sensing. 
The analytical and computational structure of the toolbox is described
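The roughness-spectrum idea described in these two records can be illustrated with a short, generic sketch in plain NumPy (an illustration of the technique, not PySESA's actual API): grid a surface, take a windowed 2-D FFT, radially average the power, and read off the dominant roughness wavelength.

```python
import numpy as np

def radial_power_spectrum(z, dx):
    """Radially averaged power spectrum of a gridded surface z (grid spacing dx)."""
    ny, nx = z.shape
    win = np.hanning(ny)[:, None] * np.hanning(nx)[None, :]  # taper to reduce leakage
    Z = np.fft.fftshift(np.fft.fft2((z - z.mean()) * win))
    power = np.abs(Z) ** 2
    ky = np.fft.fftshift(np.fft.fftfreq(ny, d=dx))           # cycles per unit length
    kx = np.fft.fftshift(np.fft.fftfreq(nx, d=dx))
    kr = np.hypot(*np.meshgrid(ky, kx, indexing="ij"))       # radial wavenumber
    bins = np.linspace(0, kr.max(), 50)
    idx = np.digitize(kr.ravel(), bins)
    radial = np.array([power.ravel()[idx == i].mean() if np.any(idx == i) else 0.0
                       for i in range(1, len(bins))])
    centers = 0.5 * (bins[:-1] + bins[1:])
    return centers, radial

# Synthetic surface: dominant roughness wavelength of 10 m plus noise
dx = 1.0
x, y = np.meshgrid(np.arange(128) * dx, np.arange(128) * dx)
z = np.sin(2 * np.pi * x / 10.0) + 0.2 * np.random.default_rng(0).standard_normal(x.shape)
k, p = radial_power_spectrum(z, dx)
k_peak = k[1:][np.argmax(p[1:])]   # skip the near-DC bin
print(f"dominant wavelength ~ {1.0 / k_peak:.1f} m")
```

The peak of the radially averaged spectrum recovers the characteristic horizontal scale, which is exactly the wavelength information the abstracts note is rarely considered alongside amplitude variance.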

  4. Hygienic-sanitary working practices and implementation of a Hazard Analysis and Critical Control Point (HACCP) plan in lobster processing industries Condições higiênico-sanitárias e implementação do plano de Análise de Perigos e Pontos Críticos de Controle (APPCC) em indústrias processadoras de lagosta

    Directory of Open Access Journals (Sweden)

    Cristina Farias da Fonseca

    2013-03-01

    This study aimed to verify the hygienic-sanitary working practices and to create and implement a Hazard Analysis and Critical Control Point (HACCP) plan in two lobster processing industries in Pernambuco State, Brazil. The industries studied process frozen whole lobsters, frozen whole cooked lobsters, and frozen lobster tails for exportation. The application of the hygienic-sanitary checklist in the industries analyzed achieved conformity rates over 96% for the aspects evaluated. The use of the HACCP plan resulted in the detection of two critical control points (CCPs), the receiving and classification steps, in the processing of frozen lobster and frozen lobster tails; an additional CCP was detected during the cooking step of processing of the whole frozen cooked lobster. The proper implementation of the HACCP plan in the lobster processing industries studied proved to be the safest and most cost-effective method for monitoring hazards at each CCP.

  5. Edit distance for marked point processes revisited: An implementation by binary integer programming

    Energy Technology Data Exchange (ETDEWEB)

    Hirata, Yoshito; Aihara, Kazuyuki [Institute of Industrial Science, The University of Tokyo, 4-6-1 Komaba, Meguro-ku, Tokyo 153-8505 (Japan)

    2015-12-15

    We implement the edit distance for marked point processes [Suzuki et al., Int. J. Bifurcation Chaos 20, 3699–3708 (2010)] as a binary integer program. Compared with the previous implementation using minimum cost perfect matching, the proposed implementation has two advantages: first, by using the proposed implementation, we can apply a wide variety of software and hardware, even spin glasses and coherent Ising machines, to calculate the edit distance for marked point processes; second, the proposed implementation runs faster than the previous implementation when the difference between the numbers of events in two time windows for a marked point process is large.
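For intuition, the matching-based implementation the authors compare against can be sketched for the simplest unmarked case as an assignment problem (a generic illustration with hypothetical unit costs, not Suzuki et al.'s exact parameterization): matched events pay a time-shift cost, unmatched events pay an insertion/deletion penalty, and dummy-to-dummy pairs are free.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def edit_distance(t1, t2, shift_cost=1.0, indel_cost=1.0):
    """Edit distance between two event-time sequences: matched events pay
    shift_cost * |dt| (capped at 2 * indel_cost, so a match is never worse
    than deleting one event and inserting the other); unmatched events pay
    indel_cost each via dummy partners."""
    n, m = len(t1), len(t2)
    C = np.zeros((n + m, m + n))
    for i, a in enumerate(t1):
        for j, b in enumerate(t2):
            C[i, j] = min(shift_cost * abs(a - b), 2 * indel_cost)
    C[:n, m:] = indel_cost        # event in t1 matched to a dummy (deletion)
    C[n:, :m] = indel_cost        # event in t2 matched to a dummy (insertion)
    rows, cols = linear_sum_assignment(C)
    return C[rows, cols].sum()

a, b = [0.1, 0.5, 0.9], [0.12, 0.55]
print(edit_distance(a, b))   # two small shifts (0.02 + 0.05) plus one deletion
```

The binary-integer-program formulation of the paper generalizes exactly this objective, which is what lets it be handed to generic 0-1 solvers rather than a dedicated matching routine.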

  6. STRUCTURE LINE DETECTION FROM LIDAR POINT CLOUDS USING TOPOLOGICAL ELEVATION ANALYSIS

    Directory of Open Access Journals (Sweden)

    C. Y. Lo

    2012-07-01

    Airborne LIDAR point clouds, which have considerable points on object surfaces, are essential to building modeling. In the last two decades, studies have developed different approaches to identify structure lines using two main approaches: data-driven and model-driven. These studies have shown that automatic modeling processes depend on certain considerations, such as the thresholds used, initial values, designed formulas, and predefined cues. Following the development of laser scanning systems, scanning rates have increased and can provide point clouds with higher point density. Therefore, this study proposes using topological elevation analysis (TEA) to detect structure lines instead of threshold-dependent concepts and predefined constraints. This analysis contains two parts: data pre-processing and structure line detection. To preserve the original elevation information, a pseudo-grid for generating digital surface models is produced during the first part. The highest point in each grid cell is set as the elevation value, and its original three-dimensional position is preserved. In the second part, using TEA, the structure lines are identified based on the topology of local elevation changes in two directions. Because structure lines can contain certain geometric properties, their locations have small relief in the radial direction and steep elevation changes in the circular direction. Following the proposed approach, TEA can be used to determine 3D line information without selecting thresholds. For validation, the TEA results are compared with those of the region growing approach. The results indicate that the proposed method can produce structure lines using dense point clouds.
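The pre-processing step described above (keep the highest return per grid cell while preserving its original 3-D position) can be sketched in a few lines; the function name and cell size are illustrative, not taken from the paper.

```python
import numpy as np

def pseudo_grid_dsm(points, cell=1.0):
    """Keep the highest LIDAR return in each grid cell, preserving its
    original 3-D position (the TEA pre-processing idea)."""
    xyz = np.asarray(points, dtype=float)
    ij = np.floor(xyz[:, :2] / cell).astype(int)
    ij -= ij.min(axis=0)                     # shift grid indices to start at 0
    grid = {}
    for key, p in zip(map(tuple, ij), xyz):
        if key not in grid or p[2] > grid[key][2]:
            grid[key] = p                    # highest point wins the cell
    return grid

pts = [(0.2, 0.3, 5.0), (0.8, 0.1, 7.5), (1.4, 0.2, 3.0)]
dsm = pseudo_grid_dsm(pts, cell=1.0)
print(dsm[(0, 0)])   # the 7.5 m return wins cell (0, 0)
```

Storing the winning point itself, rather than only its elevation, is what lets the later TEA stage recover 3D line positions without interpolation artefacts.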

  7. Accuracy of multi-point boundary crossing time analysis

    Directory of Open Access Journals (Sweden)

    J. Vogt

    2011-12-01

    Recent multi-spacecraft studies of solar wind discontinuity crossings using the timing (boundary plane triangulation) method gave boundary parameter estimates that are significantly different from those of the well-established single-spacecraft minimum variance analysis (MVA) technique. A large survey of directional discontinuities in Cluster data turned out to be particularly inconsistent in the sense that multi-point timing analyses did not identify any rotational discontinuities (RDs) whereas the MVA results of the individual spacecraft suggested that RDs form the majority of events. To make multi-spacecraft studies of discontinuity crossings more conclusive, the present report addresses the accuracy of the timing approach to boundary parameter estimation. Our error analysis is based on the reciprocal vector formalism and takes into account uncertainties both in crossing times and in the spacecraft positions. A rigorous error estimation scheme is presented for the general case of correlated crossing time errors and arbitrary spacecraft configurations. Crossing time error covariances are determined through cross correlation analyses of the residuals. The principal influence of the spacecraft array geometry on the accuracy of the timing method is illustrated using error formulas for the simplified case of mutually uncorrelated and identical errors at different spacecraft. The full error analysis procedure is demonstrated for a solar wind discontinuity as observed by the Cluster FGM instrument.
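The timing (boundary plane triangulation) method itself reduces to a small linear solve: for a planar boundary with unit normal n moving at speed V, the relative positions and crossing-time delays of four spacecraft determine the slowness vector m = n/V. A minimal sketch with made-up geometry (illustrative numbers, not actual Cluster data):

```python
import numpy as np

# Four spacecraft positions (km) and boundary crossing times (s); illustrative only.
r = np.array([[0.0,   0.0,   0.0],
              [100.0, 0.0,   0.0],
              [0.0,   100.0, 0.0],
              [0.0,   0.0,   100.0]])
t = np.array([0.0, 1.0, 2.0, 0.5])

# Planar boundary: (r_a - r_1) . m = t_a - t_1, with slowness vector m = n / V
dr = r[1:] - r[0]
dt = t[1:] - t[0]
m = np.linalg.solve(dr, dt)
V = 1.0 / np.linalg.norm(m)     # boundary speed along its normal, km/s
n = m * V                       # unit normal vector
print(f"normal = {n}, speed = {V:.2f} km/s")
```

The error analysis in the record above quantifies how uncertainties in t and r propagate through exactly this solve, and how the conditioning of dr (the spacecraft array geometry) amplifies them.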

  8. Material-Point-Method Analysis of Collapsing Slopes

    DEFF Research Database (Denmark)

    Andersen, Søren; Andersen, Lars

    2009-01-01

    To understand the dynamic evolution of landslides and predict their physical extent, a computational model is required that is capable of analysing complex material behaviour as well as large strains and deformations. Here, a model is presented based on the so-called generalised-interpolation material-point method. Further, a deformed material description is introduced, based on time integration of the deformation gradient and utilising Gauss quadrature over the volume associated with each material point. The method has been implemented in a Fortran code and employed for the analysis of a landslide that took place during...

  9. Semantic multimedia analysis and processing

    CERN Document Server

    Spyrou, Evaggelos; Mylonas, Phivos

    2014-01-01

    Broad in scope, Semantic Multimedia Analysis and Processing provides a complete reference of techniques, algorithms, and solutions for the design and the implementation of contemporary multimedia systems. Offering a balanced, global look at the latest advances in semantic indexing, retrieval, analysis, and processing of multimedia, the book features the contributions of renowned researchers from around the world. Its contents are based on four fundamental thematic pillars: 1) information and content retrieval, 2) semantic knowledge exploitation paradigms, 3) multimedia personalization, and 4)

  10. System implementation of hazard analysis and critical control points (HACCP) in a nitrogen production plant

    International Nuclear Information System (INIS)

    Barrantes Salazar, Alexandra

    2014-01-01

    A hazard analysis and critical control points (HACCP) system was deployed in a liquid nitrogen production plant. The main motivation is that nitrogen has become a complement to food packaging, used to increase shelf life or to provide a surface that protects the product from manipulation. The methodology adopted was an analysis of critical control points for the nitrogen production plant, which required knowledge of both the standard and the production process, as well as on-site verification. In addition, all materials and/or processing units found in contact with the raw material or the product under study were evaluated, so that the intrinsic risks of each were detected from the physical, chemical and biological points of view, according to the origin or source of pollution. For each risk found, the probability of occurrence was evaluated according to its frequency and severity; once these variables were determined, the type of risk was defined. Cases presenting a greater or critical risk were subjected to a decision tree, which concluded that no critical control points needed to be designated; nevertheless, maximum permitted limits were established for each of them. Each result is supported by literature or scientific references of reliable provenance. Overall, the material matrix and the process matrix were found to contain no critical control points, so the project concludes with the analysis, without generating a monitoring and verification system. It is suggested to extend this project to cover the packaging system for gaseous nitrogen, since the study was limited to liquid nitrogen; moreover, liquid nitrogen production is a 100% automated, closed process in which the introduction of contaminants is very limited, unlike the gaseous nitrogen process. (author)

  11. Profitability Analysis of Soybean Oil Processes.

    Science.gov (United States)

    Cheng, Ming-Hsun; Rosentrater, Kurt A

    2017-10-07

    Soybean oil production is the basic process for soybean applications. Cash flow analysis is used to estimate the profitability of a manufacturing venture. Besides capital investments, operating costs, and revenues, the interest rate is a further factor in estimating the net present value (NPV), break-even points, and payback time, which are benchmarks for profitability evaluation. A positive NPV and a reasonable payback time represent a profitable process and provide an acceptable projection for real operation. Additionally, the capacity of the process is another critical factor. The extruding-expelling process and hexane extraction are the two typical approaches used in industry. When annual oil production capacities are larger than 12 and 173 million kg, respectively, these two processes are profitable. The solvent-free approach, known as the enzyme-assisted aqueous extraction process (EAEP), is profitable when the capacity is larger than 17 million kg of annual oil production.
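The NPV and payback benchmarks used in this kind of cash flow analysis can be sketched in a few lines; the capital outlay, annual revenue, and 7% rate below are hypothetical, not the paper's figures.

```python
def npv(rate, cashflows):
    """Net present value: cashflows[0] at time zero, then one per year."""
    return sum(cf / (1 + rate) ** k for k, cf in enumerate(cashflows))

def payback_period(cashflows):
    """First year in which cumulative (undiscounted) cash flow turns positive."""
    total = 0.0
    for year, cf in enumerate(cashflows):
        total += cf
        if total >= 0:
            return year
    return None  # venture never pays back within the horizon

# Hypothetical plant: $20M capital outlay, $4.5M net cash inflow per year, 10 years
flows = [-20.0] + [4.5] * 10
print(f"NPV at 7%: {npv(0.07, flows):.2f} M$")
print(f"payback: {payback_period(flows)} years")
```

A positive NPV at the chosen interest rate together with a payback well inside the project horizon is precisely the "profitable process" criterion the abstract describes; rerunning with smaller annual flows shows how the break-even capacity emerges.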

  12. Profitability Analysis of Soybean Oil Processes

    Directory of Open Access Journals (Sweden)

    Ming-Hsun Cheng

    2017-10-01

    Soybean oil production is the basic process for soybean applications. Cash flow analysis is used to estimate the profitability of a manufacturing venture. Besides capital investments, operating costs, and revenues, the interest rate is a further factor in estimating the net present value (NPV), break-even points, and payback time, which are benchmarks for profitability evaluation. A positive NPV and a reasonable payback time represent a profitable process and provide an acceptable projection for real operation. Additionally, the capacity of the process is another critical factor. The extruding-expelling process and hexane extraction are the two typical approaches used in industry. When annual oil production capacities are larger than 12 and 173 million kg, respectively, these two processes are profitable. The solvent-free approach, known as the enzyme-assisted aqueous extraction process (EAEP), is profitable when the capacity is larger than 17 million kg of annual oil production.

  13. The application of hazard analysis and critical control points and risk management in the preparation of anti-cancer drugs.

    Science.gov (United States)

    Bonan, Brigitte; Martelli, Nicolas; Berhoune, Malik; Maestroni, Marie-Laure; Havard, Laurent; Prognon, Patrice

    2009-02-01

    To apply the Hazard Analysis and Critical Control Points method to the preparation of anti-cancer drugs. To identify critical control points in our cancer chemotherapy process and to propose control measures and corrective actions to manage these processes. The Hazard Analysis and Critical Control Points application began in January 2004 in our centralized chemotherapy compounding unit. From October 2004 to August 2005, monitoring of the process nonconformities was performed to assess the method. According to the Hazard Analysis and Critical Control Points method, a multidisciplinary team was formed to describe and assess the cancer chemotherapy process. This team listed all of the critical points and calculated their risk indexes according to their frequency of occurrence, their severity and their detectability. The team defined monitoring, control measures and corrective actions for each identified risk. Finally, over a 10-month period, pharmacists reported each non-conformity of the process in a follow-up document. Our team described 11 steps in the cancer chemotherapy process. The team identified 39 critical control points, including 11 of higher importance with a high-risk index. Over 10 months, 16,647 preparations were performed; 1225 nonconformities were reported during this same period. The Hazard Analysis and Critical Control Points method is relevant when it is used to target a specific process such as the preparation of anti-cancer drugs. This method helped us to focus on the production steps, which can have a critical influence on product quality, and led us to improve our process.
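The risk-index calculation described above (frequency of occurrence × severity × detectability, as in an FMEA-style risk priority number) can be sketched as follows; the step names, scores, and threshold are illustrative stand-ins, not the paper's actual data.

```python
# Hedged sketch of RPN-style risk ranking: each step is scored 1-5 on
# frequency, severity, and detectability; higher products rank first.
steps = {
    "dose calculation check": (3, 5, 4),   # (frequency, severity, detectability)
    "aseptic compounding":    (2, 5, 2),
    "label verification":     (4, 3, 1),
}

THRESHOLD = 30   # hypothetical cutoff for "higher importance" control points
rpns = {name: f * s * d for name, (f, s, d) in steps.items()}
for name, rpn in sorted(rpns.items(), key=lambda kv: -kv[1]):
    flag = "HIGH" if rpn >= THRESHOLD else "ok"
    print(f"{name:25s} RPN={rpn:3d} {flag}")
```

Ranking the 39 identified points by such an index is what lets a team single out the 11 "higher importance" points for the tightest monitoring and corrective-action loops.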

  14. The cylindrical K-function and Poisson line cluster point processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Safavimanesh, Farzaneh; Rasmussen, Jakob G.

    ... Poisson line cluster point processes, is also introduced. Parameter estimation based on moment methods or Bayesian inference for this model is discussed when the underlying Poisson line process and the cluster memberships are treated as hidden processes. To illustrate the methodologies, we analyze two...

  15. Bridging the gap between a stationary point process and its Palm distribution

    NARCIS (Netherlands)

    Nieuwenhuis, G.

    1994-01-01

    In the context of stationary point processes measurements are usually made from a time point chosen at random or from an occurrence chosen at random. That is, either the stationary distribution P or its Palm distribution P° is the ruling probability measure. In this paper an approach is presented to

  16. PENERAPAN SISTEM HAZARD ANALYSIS CRITICAL CONTROL POINT (HACCP) PADA PROSES PEMBUATAN KERIPIK TEMPE

    Directory of Open Access Journals (Sweden)

    Rahmi Yuniarti

    2015-06-01

    Malang is one of the industrial centers of tempe chips. To maintain quality and food safety, an analysis is required to identify hazards during the production process. This study was conducted to identify the hazards during the production process of tempe chips and to provide recommendations for developing a HACCP system. The production process of tempe chips runs from slicing the tempe, moving it to the kitchen, coating it with flour dough, frying it in the pan, and draining it, to packaging and storing it. There are three types of potential hazard (biological, physical, and chemical) during the production process. CCP identification found three processes with a critical control point: slicing the tempe, immersing the tempe in the flour mixture, and draining. Recommendations for the development of the HACCP system concern employee hygiene, supporting equipment, 5-S analysis, and the production layout.

  17. A critical analysis of the tender points in fibromyalgia.

    Science.gov (United States)

    Harden, R Norman; Revivo, Gadi; Song, Sharon; Nampiaparampil, Devi; Golden, Gary; Kirincic, Marie; Houle, Timothy T

    2007-03-01

    To pilot methodologies designed to critically assess the American College of Rheumatology's (ACR) diagnostic criteria for fibromyalgia. Prospective, psychophysical testing. An urban teaching hospital. Twenty-five patients with fibromyalgia and 31 healthy controls (convenience sample). Pressure pain threshold was determined at the 18 ACR tender points and five sham points using an algometer (dolorimeter). The patients' "algometric total scores" (sums of the patients' average pain thresholds at the 18 tender points) were derived, as well as pain thresholds across sham points. The "algometric total score" could differentiate patients with fibromyalgia from normals with an accuracy of 85.7% (P < […]). Although pain thresholds differed between sham points and ACR tender points, sham points also could be used for diagnosis (85.7%; Ps < […]), though performance was not tested vs other painful conditions. The points specified by the ACR were only modestly superior to sham points in making the diagnosis. Most importantly, this pilot suggests single points, smaller groups of points, or sham points may be as effective in diagnosing fibromyalgia as the use of all 18 points, and suggests methodologies to definitively test that hypothesis.

  18. Process and results of analytical framework and typology development for POINT

    DEFF Research Database (Denmark)

    Gudmundsson, Henrik; Lehtonen, Markku; Bauler, Tom

    2009-01-01

    POINT is a project about how indicators are used in practice; to what extent and in what way indicators actually influence, support, or hinder policy and decision making processes, and what could be done to enhance the positive role of indicators in such processes. The project needs an analytical framework for this empirical work. This report documents the development of that framework: a set of core concepts and associated typologies, a series of proposed analytic schemes, and a number of research propositions and questions for the subsequent empirical work in POINT.

  19. Analysis of Multicomponent Adsorption Close to a Dew Point

    DEFF Research Database (Denmark)

    Shapiro, Alexander; Stenby, Erling Halfdan

    1998-01-01

    We develop the potential theory of multicomponent adsorption close to a dew point. The approach is based on an asymptotic adsorption equation (AAE) which is valid in a vicinity of the dew point. By this equation the thickness of the liquid film is expressed through thermodynamic characteristics ... and the direct calculations, even if the mixture is not close to a dew point. Key Words: adsorption; potential theory; multicomponent; dew point.

  20. Throughput analysis of point-to-multi-point hybrid FSO/RF network

    KAUST Repository

    Rakia, Tamer

    2017-07-31

    This paper presents and analyzes a point-to-multi-point (P2MP) network that uses a number of free-space optical (FSO) links for data transmission from the central node to the different remote nodes. A common backup radio-frequency (RF) link is used by the central node for data transmission to any remote node in case of the failure of any one of FSO links. We develop a cross-layer Markov chain model to study the throughput from central node to a tagged remote node. Numerical examples are presented to compare the performance of the proposed P2MP hybrid FSO/RF network with that of a P2MP FSO-only network and show that the P2MP Hybrid FSO/RF network achieves considerable performance improvement over the P2MP FSO-only network.
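A heavily simplified version of the availability idea behind such a model: treat a single FSO link as a two-state Markov chain and compute the stationary up-probability and the mean throughput of a tagged node whose RF backup is always available on FSO outage. The transition probabilities and rates are illustrative; the paper's actual cross-layer chain also captures contention among remote nodes for the single shared backup.

```python
import numpy as np

# Two-state link model: state 0 = FSO up, state 1 = FSO down (hypothetical
# per-slot probabilities, not the paper's parameters).
p, q = 0.05, 0.40            # P(up -> down), P(down -> up)
P = np.array([[1 - p, p],
              [q, 1 - q]])

# Stationary distribution: solve pi P = pi together with sum(pi) = 1
A = np.vstack([P.T - np.eye(2), np.ones(2)])
b = np.array([0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

R_FSO, R_RF = 10.0, 2.0      # link rates in Gb/s (illustrative)
throughput = pi[0] * R_FSO + pi[1] * R_RF   # backup assumed always free
print(f"P(up) = {pi[0]:.3f}, mean throughput = {throughput:.2f} Gb/s")
```

Even this caricature shows the qualitative gain of the hybrid scheme: without the RF backup the second term vanishes, so the throughput advantage over FSO-only grows with the link outage probability.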

  1. Software quality testing process analysis

    OpenAIRE

    Mera Paz, Julián

    2016-01-01

    Introduction: This article is the result of reading, review, analysis of books, magazines and articles well known for their scientific and research quality, which have addressed the software quality testing process. The author, based on his work experience in software development companies, teaching and other areas, has compiled and selected information to argue and substantiate the importance of the software quality testing process. Methodology: the existing literature on the software qualit...

  2. Characterization results and Markov chain Monte Carlo algorithms including exact simulation for some spatial point processes

    DEFF Research Database (Denmark)

    Häggström, Olle; Lieshout, Marie-Colette van; Møller, Jesper

    1999-01-01

    The area-interaction process and the continuum random-cluster model are characterized in terms of certain functional forms of their respective conditional intensities. In certain cases, these two point process models can be derived from a bivariate point process model which in many respects...... is simpler to analyse and simulate. Using this correspondence we devise a two-component Gibbs sampler, which can be used for fast and exact simulation by extending the recent ideas of Propp and Wilson. We further introduce a Swendsen-Wang type algorithm. The relevance of the results within spatial statistics...

  3. SINGLE TREE DETECTION FROM AIRBORNE LASER SCANNING DATA USING A MARKED POINT PROCESS BASED METHOD

    Directory of Open Access Journals (Sweden)

    J. Zhang

    2013-05-01

    Tree detection and reconstruction is of great interest in large-scale city modelling. In this paper, we present a marked point process model to detect single trees from airborne laser scanning (ALS) data. We consider single trees in the ALS-recovered canopy height model (CHM) as a realization of a point process of circles. Unlike a traditional marked point process, we sample the model in a constrained configuration space by making use of image processing techniques. A Gibbs energy is defined on the model, containing a data term which judges the fitness of the model with respect to the data, and a prior term which incorporates prior knowledge of object layouts. We search for the optimal configuration through a steepest gradient descent algorithm. The presented hybrid framework was tested on three forest plots, and experiments show the effectiveness of the proposed method.

  4. On the stability and dynamics of stochastic spiking neuron models: Nonlinear Hawkes process and point process GLMs.

    Science.gov (United States)

    Gerhard, Felipe; Deger, Moritz; Truccolo, Wilson

    2017-02-01

    Point process generalized linear models (PP-GLMs) provide an important statistical framework for modeling spiking activity in single-neurons and neuronal networks. Stochastic stability is essential when sampling from these models, as done in computational neuroscience to analyze statistical properties of neuronal dynamics and in neuro-engineering to implement closed-loop applications. Here we show, however, that despite passing common goodness-of-fit tests, PP-GLMs estimated from data are often unstable, leading to divergent firing rates. The inclusion of absolute refractory periods is not a satisfactory solution since the activity then typically settles into unphysiological rates. To address these issues, we derive a framework for determining the existence and stability of fixed points of the expected conditional intensity function (CIF) for general PP-GLMs. Specifically, in nonlinear Hawkes PP-GLMs, the CIF is expressed as a function of the previous spike history and exogenous inputs. We use a mean-field quasi-renewal (QR) approximation that decomposes spike history effects into the contribution of the last spike and an average of the CIF over all spike histories prior to the last spike. Fixed points for stationary rates are derived as self-consistent solutions of integral equations. Bifurcation analysis and the number of fixed points predict that the original models can show stable, divergent, and metastable (fragile) dynamics. For fragile models, fluctuations of the single-neuron dynamics predict expected divergence times after which rates approach unphysiologically high values. This metric can be used to estimate the probability of rates to remain physiological for given time periods, e.g., for simulation purposes. We demonstrate the use of the stability framework using simulated single-neuron examples and neurophysiological recordings. Finally, we show how to adapt PP-GLM estimation procedures to guarantee model stability. Overall, our results provide a
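The fixed-point condition for the expected stationary rate can be illustrated with a far simpler mean-field caricature than the paper's quasi-renewal approximation: a one-neuron exponential-link model whose rate must satisfy r = exp(μ + H·r), where H is the integrated spike-history kernel (illustrative parameters, not a fitted PP-GLM).

```python
import numpy as np

# Self-consistent stationary rate of a nonlinear Hawkes caricature:
# r = phi(mu + H * r), with H the integral of the history kernel.
mu, H = -1.0, 0.8
phi = np.exp                       # exponential link, common in PP-GLMs

def fixed_point(r0, iters=200):
    """Naive fixed-point iteration of the self-consistency map."""
    r = r0
    for _ in range(iters):
        r = phi(mu + H * r)
    return r

r_star = fixed_point(0.1)
# Local stability: the map is a contraction where |phi'(mu + H r*) * H| < 1
slope = phi(mu + H * r_star) * H   # derivative of exp is exp itself
print(f"fixed point r* = {r_star:.4f}, stable: {abs(slope) < 1}")
```

For these parameters the map also has a second, larger fixed point where the slope exceeds one; iterating from a large initial rate diverges, mirroring the stable/divergent/metastable regimes the bifurcation analysis above classifies.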

  5. Pressure Points in Reading Comprehension: A Quantile Multiple Regression Analysis

    Science.gov (United States)

    Logan, Jessica

    2017-01-01

    The goal of this study was to examine how selected pressure points or areas of vulnerability are related to individual differences in reading comprehension and whether the importance of these pressure points varies as a function of the level of children's reading comprehension. A sample of 245 third-grade children were given an assessment battery…

  6. Interevent Time Distribution of Renewal Point Process, Case Study: Extreme Rainfall in South Sulawesi

    Science.gov (United States)

    Sunusi, Nurtiti

    2018-03-01

    The study of the time distribution of occurrences of extreme rain phenomena plays a very important role in the analysis and weather forecast in an area. The timing of extreme rainfall is difficult to predict because its occurrence is random. This paper aims to determine the interevent time distribution of extreme rain events and the minimum waiting time until the occurrence of the next extreme event through a point process approach. The phenomenon of extreme rain events over a given period of time follows a renewal process in which the time between events is a random variable τ. The distribution of the random variable τ is assumed to be Pareto, Log Normal, or Gamma. To estimate the model parameters, a moment method is used. Consider R_t, the time elapsed since the last extreme rain event at a location; if there are no extreme rain events up to t_0, there will be an opportunity for an extreme rainfall event in (t_0, t_0 + δt_0). Furthermore, from the three models reviewed, the minimum waiting time until the next extreme rainfall is determined. The results show that the Log Normal model is better than the Pareto and Gamma models for predicting the next extreme rainfall in South Sulawesi, while the Pareto model cannot be used.
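The moment-method fits mentioned above can be sketched as follows, using the standard moment-matching formulas for the three candidate distributions (the synthetic interevent times below are illustrative, not the South Sulawesi data).

```python
import numpy as np

rng = np.random.default_rng(42)
tau = rng.gamma(shape=2.0, scale=5.0, size=5000)   # synthetic interevent times

m, v = tau.mean(), tau.var()

# Method-of-moments estimates for the three candidate models
k_hat, theta_hat = m**2 / v, v / m                  # Gamma(k, theta)
sigma2 = np.log(1 + v / m**2)                       # LogNormal(mu, sigma)
mu_hat = np.log(m) - sigma2 / 2
alpha_hat = 1 + np.sqrt(1 + m**2 / v)               # Pareto(alpha, x_m), alpha > 2
xm_hat = m * (alpha_hat - 1) / alpha_hat

print(f"Gamma:     k={k_hat:.2f}, theta={theta_hat:.2f}")
print(f"LogNormal: mu={mu_hat:.2f}, sigma={np.sqrt(sigma2):.2f}")
print(f"Pareto:    alpha={alpha_hat:.2f}, x_m={xm_hat:.2f}")
```

With all three models fitted from the same first two moments, the survival function of the best-fitting model then yields the waiting-time statement: given no event up to t_0, the conditional probability of an event in (t_0, t_0 + δt_0).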

  7. The neutron capture cross section of the ${s}$-process branch point isotope $^{63}$Ni

    CERN Multimedia

    Neutron capture nucleosynthesis in massive stars plays an important role in Galactic chemical evolution as well as for the analysis of abundance patterns in very old metal-poor halo stars. The so-called weak ${s}$-process component, which is responsible for most of the ${s}$ abundances between Fe and Sr, turned out to be very sensitive to the stellar neutron capture cross sections in this mass region and, in particular, of isotopes near the seed distribution around Fe. In this context, the unstable isotope $^{63}$Ni is of particular interest because it represents the first branching point in the reaction path of the ${s}$-process. We propose to measure this cross section at n_TOF from thermal energies up to 500 keV, covering the entire range of astrophysical interest. These data are needed to replace uncertain theoretical predictions by first experimental information to understand the consequences of the $^{63}$Ni branching for the abundance pattern of the subsequent isotopes, especially for $^{63}$Cu and $^{...

  8. Exergy analysis of the LFC process

    International Nuclear Information System (INIS)

    Li, Qingsong; Lin, Yuankui

    2016-01-01

    Highlights: • Mengdong lignite was upgraded by the liquids from coal (LFC) process at laboratory scale. • True boiling point distillation of tar was performed. • Based on experimental data, the LFC process was simulated in Aspen Plus. • Amounts of exergy destruction and efficiencies of blocks were calculated. • Potential measures for improving the LFC process are suggested. - Abstract: Liquids from coal (LFC) is a pyrolysis technology for upgrading lignite. LFC is close to viability as a large-scale commercial technology and is strongly promoted by the Chinese government. This paper presents an exergy analysis of the LFC process producing semicoke and tar, simulated in Aspen Plus. The simulation included the drying unit, pyrolysis unit, tar recovery unit and combustion unit. To obtain the data required for the simulation, Mengdong lignite was upgraded using a laboratory-scale experimental facility based on LFC technology. True boiling point distillation of tar was performed. Based on thermodynamic data obtained from the simulation, chemical exergy and physical exergy were determined for the process streams and exergy destruction was calculated. The exergy budget of the LFC process is presented as a Grassmann flow diagram. The overall exergy efficiency was 76.81%, with the combustion unit causing the highest exergy destruction. The study found that the overall exergy efficiency can be increased by reducing moisture in the lignite and making full use of the physical exergy of the pyrolysates. A feasible method for making full use of the physical exergy of semicoke was suggested.

  9. From point process observations to collective neural dynamics: Nonlinear Hawkes process GLMs, low-dimensional dynamics and coarse graining.

    Science.gov (United States)

    Truccolo, Wilson

    2016-11-01

    This review presents a perspective on capturing collective dynamics in recorded neuronal ensembles based on multivariate point process models, inference of low-dimensional dynamics and coarse graining of spatiotemporal measurements. A general probabilistic framework for continuous time point processes is reviewed, with an emphasis on multivariate nonlinear Hawkes processes with exogenous inputs. A point process generalized linear model (PP-GLM) framework for the estimation of discrete time multivariate nonlinear Hawkes processes is described. The approach is illustrated with the modeling of collective dynamics in neocortical neuronal ensembles recorded in human and non-human primates, and prediction of single-neuron spiking. A complementary approach to capture collective dynamics based on low-dimensional dynamics ("order parameters") inferred via latent state-space models with point process observations is presented. The approach is illustrated by inferring and decoding low-dimensional dynamics in primate motor cortex during naturalistic reach and grasp movements. Finally, we briefly review hypothesis tests based on conditional inference and spatiotemporal coarse graining for assessing collective dynamics in recorded neuronal ensembles. Published by Elsevier Ltd.
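In the discrete-time PP-GLM framework mentioned above, the per-bin spike probability is obtained by passing a baseline plus filtered spike history through a nonlinearity. A hedged sketch of the generative side of such a single-neuron model is shown below; the exponential link and Bernoulli bin approximation are standard choices, but the kernel shape, baseline and bin size here are illustrative assumptions, not models from the review.

```python
import numpy as np

def simulate_ppglm(beta0, history_filter, n_bins, dt=0.001, seed=1):
    """Simulate a single-neuron discrete-time PP-GLM (nonlinear Hawkes process
    with exponential link); returns a binary spike train of length n_bins."""
    rng = np.random.default_rng(seed)
    p = len(history_filter)
    spikes = np.zeros(n_bins, dtype=int)
    for t in range(n_bins):
        past = spikes[max(0, t - p):t][::-1]          # most recent bin first
        drive = beta0 + np.dot(history_filter[:len(past)], past)
        lam = np.exp(drive)                           # conditional intensity (Hz)
        p_spike = 1.0 - np.exp(-lam * dt)             # Bernoulli approximation
        spikes[t] = rng.random() < p_spike
    return spikes

# Assumed illustrative inhibitory history kernel: mimics refractoriness and
# keeps the firing rate stable below the 20 Hz baseline.
h = -3.0 * np.exp(-np.arange(50) / 10.0)
train = simulate_ppglm(beta0=np.log(20.0), history_filter=h, n_bins=20_000)
rate_hz = train.sum() / (20_000 * 0.001)
```

Fitting the same model to data amounts to maximizing the Bernoulli/Poisson likelihood of `spikes` given the lagged spike-history covariates, i.e., an ordinary GLM regression.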

  10. Determination of the impact of RGB points cloud attribute quality on color-based segmentation process

    Directory of Open Access Journals (Sweden)

    Bartłomiej Kraszewski

    2015-06-01

    Full Text Available The article presents the results of research on the effect that the radiometric quality of point cloud RGB attributes has on color-based segmentation. In the research, a point cloud with a resolution of 5 mm, obtained from a FARO Photon 120 scanner, described a fragment of an office room, and color images were taken with various digital cameras: a Nikon D3X SLR, a Canon D200 SLR integrated with the laser scanner, a Panasonic TZ-30 compact camera and a mobile phone camera. Color information from the images was spatially related to the point cloud in FARO Scene software. The color-based segmentation of the test data was performed with a developed application named "RGB Segmentation". The application was based on the public Point Cloud Library (PCL) and allowed subsets of points fulfilling the segmentation criteria to be extracted from the source point cloud using the region growing method. Using the developed application, the segmentation of four test point clouds containing different RGB attributes from the various images was performed. The segmentation process was evaluated by comparing segments acquired using the developed application with segments extracted manually by an operator. The following items were compared: the number of obtained segments, the number of correctly identified objects and the correctness of the segmentation process. The best segmentation correctness and the most identified objects were obtained using the data with RGB attributes from the Nikon D3X images. Based on the results, it was found that the quality of the RGB attributes of the point cloud had an impact only on the number of identified objects. For the correctness of the segmentation, as well as its error, no apparent relationship between the quality of the color information and the result of the process was found. Keywords: terrestrial laser scanning, color-based segmentation, RGB attribute, region growing method, digital images, point cloud
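The region growing principle used by the "RGB Segmentation" application can be sketched in a simplified 2D form. The snippet below is a pure-NumPy illustration of the idea, not the authors' PCL-based implementation: starting from a seed, 4-connected neighbours are absorbed while their color stays within a Euclidean RGB tolerance of the running segment mean. The tolerance value and the two-patch test image are assumptions for illustration.

```python
import numpy as np
from collections import deque

def region_growing(rgb, seed, color_tol):
    """Grow a segment from `seed` over an (H, W, 3) RGB array, adding
    4-connected neighbours whose color lies within `color_tol` (Euclidean
    distance in RGB space) of the running segment mean."""
    h, w, _ = rgb.shape
    mask = np.zeros((h, w), dtype=bool)
    mask[seed] = True
    queue = deque([seed])
    mean = rgb[seed].astype(float)
    count = 1
    while queue:
        y, x = queue.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if 0 <= ny < h and 0 <= nx < w and not mask[ny, nx]:
                if np.linalg.norm(rgb[ny, nx] - mean) <= color_tol:
                    mask[ny, nx] = True
                    queue.append((ny, nx))
                    # incremental update of the segment's mean color
                    mean = (mean * count + rgb[ny, nx]) / (count + 1)
                    count += 1
    return mask

# Two flat color patches: the grown region should stop at the color edge.
img = np.zeros((20, 20, 3))
img[:, :10] = (200.0, 30.0, 30.0)   # red patch
img[:, 10:] = (30.0, 30.0, 200.0)   # blue patch
segment = region_growing(img, seed=(5, 5), color_tol=40.0)
```

The same logic extends to a colored point cloud by replacing pixel adjacency with a k-nearest-neighbour graph over the 3D points.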

  11. Probabilistic safety assessment and optimal control of hazardous technological systems. A marked point process approach

    International Nuclear Information System (INIS)

    Holmberg, J.

    1997-04-01

    The thesis models risk management as an optimal control problem for a stochastic process. The approach classifies the decisions made by management into three categories according to the control methods of a point process: (1) planned process lifetime, (2) modification of the design, and (3) operational decisions. The approach is used for the optimization of plant shutdown criteria and surveillance test strategies of a hypothetical nuclear power plant.

  12. Probabilistic safety assessment and optimal control of hazardous technological systems. A marked point process approach

    Energy Technology Data Exchange (ETDEWEB)

    Holmberg, J [VTT Automation, Espoo (Finland)

    1997-04-01

    The thesis models risk management as an optimal control problem for a stochastic process. The approach classifies the decisions made by management into three categories according to the control methods of a point process: (1) planned process lifetime, (2) modification of the design, and (3) operational decisions. The approach is used for the optimization of plant shutdown criteria and surveillance test strategies of a hypothetical nuclear power plant. 62 refs. The thesis also includes five previous publications by the author.

  13. Apparatus and method for implementing power saving techniques when processing floating point values

    Science.gov (United States)

    Kim, Young Moon; Park, Sang Phill

    2017-10-03

    An apparatus and method are described for reducing power when reading and writing graphics data. For example, one embodiment of an apparatus comprises: a graphics processor unit (GPU) to process graphics data including floating point data; a set of registers, at least one of the registers of the set partitioned to store the floating point data; and encode/decode logic to reduce a number of binary 1 values being read from the at least one register by causing a specified set of bit positions within the floating point data to be read out as 0s rather than 1s.

  14. Material-Point Method Analysis of Bending in Elastic Beams

    DEFF Research Database (Denmark)

    Andersen, Søren Mikkel; Andersen, Lars

    2007-01-01

    The aim of this paper is to test different types of spatial interpolation for the material-point method. The interpolations include quadratic elements and cubic splines. A brief introduction to the material-point method is given. Simple linear-elastic problems are tested, including the classical...... cantilevered beam problem. As shown in the paper, the use of negative shape functions is not consistent with the material-point method in its current form, necessitating other types of interpolation such as cubic splines in order to obtain smoother representations of field quantities. It is shown...

  15. Using a Virtual Experiment to Analyze Infiltration Process from Point to Grid-cell Size Scale

    Science.gov (United States)

    Barrios, M. I.

    2013-12-01

    Hydrological science requires the emergence of a consistent theoretical corpus describing the relationships between the dominant physical processes at different spatial and temporal scales. However, the strong spatial heterogeneities and non-linearities of these processes make the development of multiscale conceptualizations difficult. Therefore, an understanding of scaling is a key issue in advancing this science. This work focuses on the use of virtual experiments to address the scaling of vertical infiltration from a physically based model at point scale to a simplified, physically meaningful modeling approach at grid-cell scale. Compared with field experimentation, numerical simulations have the advantage of dealing with a wide range of boundary and initial conditions. The aim of the work was to show the utility of numerical simulations in discovering relationships between the hydrological parameters at both scales, and to use this synthetic experience as a medium for teaching the complex nature of this hydrological process. The Green-Ampt model was used to represent vertical infiltration at point scale, and a conceptual storage model was employed to simulate the infiltration process at grid-cell scale. Lognormal and beta probability distribution functions were assumed to represent the heterogeneity of soil hydraulic parameters at point scale. The linkages between point-scale and grid-cell-scale parameters were established by inverse simulations based on the mass balance equation and averaging of the flow at point scale. Results have shown numerical stability issues under particular conditions, have revealed the complex nature of the non-linear relationships between the models' parameters at both scales, and indicate that the parameterization of point-scale processes at the coarser scale is governed by the amplification of non-linear effects. The findings of these simulations have been used by students to identify potential research questions on scale issues
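The Green-Ampt point-scale model mentioned above defines cumulative infiltration F(t) implicitly through F − ψΔθ·ln(1 + F/(ψΔθ)) = Kt, where K is the saturated hydraulic conductivity, ψ the wetting-front suction head and Δθ the moisture deficit. A minimal sketch of solving this equation by fixed-point iteration is shown below; the parameter values are textbook-style illustrative assumptions, not data from the study.

```python
import numpy as np

def green_ampt_F(t, K, psi, dtheta, tol=1e-10, max_iter=1000):
    """Cumulative Green-Ampt infiltration F(t), solving
    F - psi*dtheta*ln(1 + F/(psi*dtheta)) = K*t by fixed-point iteration."""
    a = psi * dtheta
    F = max(K * t, tol)  # initial guess
    for _ in range(max_iter):
        F_new = K * t + a * np.log(1.0 + F / a)  # contraction mapping in F
        if abs(F_new - F) < tol:
            return F_new
        F = F_new
    return F

# Assumed illustrative soil parameters: K = 0.65 cm/h, psi = 16.7 cm, dtheta = 0.34
F1 = green_ampt_F(t=1.0, K=0.65, psi=16.7, dtheta=0.34)
# Infiltration rate follows from f = K * (1 + psi*dtheta / F)
f1 = 0.65 * (1.0 + 16.7 * 0.34 / F1)
```

Sampling ψ and Δθ from the lognormal and beta distributions named in the abstract and averaging many such point-scale solutions is the kind of numerical experiment that links the two scales.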

  16. Putting to point the production process of iodine-131 by dry distillation (Preoperational tests)

    International Nuclear Information System (INIS)

    Alanis M, J.

    2002-12-01

    With the purpose of fine-tuning the production process of 131 I, one objective of carrying out the operational tests of the iodine-131 production process was to verify the operation of each of the following components: the heating systems, the vacuum system, the mechanical system and the peripheral equipment that form part of the iodine-131 production process. Another objective was to establish the optimal parameters to be applied in each process during the production of iodine-131. It is necessary to point out that this objective is very important, since the components of the equipment are new and their behaviour during the process differs from that of the equipment on which the experimental studies were carried out. (Author)

  17. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 Hazard Analysis and Critical Control Point (HACCP... SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS General Provisions § 120.8 Hazard Analysis and Critical Control Point (HACCP) plan. (a) HACCP plan. Each...

  18. The application of prototype point processes for the summary and description of California wildfires

    Science.gov (United States)

    Nichols, K.; Schoenberg, F.P.; Keeley, J.E.; Bray, A.; Diez, D.

    2011-01-01

    A method for summarizing repeated realizations of a space-time marked point process, known as prototyping, is discussed and applied to catalogues of wildfires in California. Prototype summaries are constructed for varying time intervals using California wildfire data from 1990 to 2006. Previous work on prototypes for temporal and space-time point processes is extended here to include methods for computing prototypes with marks and the incorporation of prototype summaries into hierarchical clustering algorithms, the latter of which is used to delineate fire seasons in California. Other results include summaries of patterns in the spatial-temporal distribution of wildfires within each wildfire season. © 2011 Blackwell Publishing Ltd.

  19. Mass measurement on the rp-process waiting point {sup 72}Kr

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez, D. [Gesellschaft fuer Schwerionenforschung mbH, Darmstadt (Germany); Kolhinen, V.S. [Jyvaeskylae Univ. (Finland); Audi, G. [CSNSM-IN2P3-Centre National de la Recherche Scientifique (CNRS), 91 - Orsay (FR)] [and others

    2004-06-01

    The mass of {sup 72}Kr, one of the three major waiting points in the astrophysical rp-process, was measured for the first time with the Penning trap mass spectrometer ISOLTRAP. The measurement yielded a relative mass uncertainty of {delta}m/m=1.2 x 10{sup -7} ({delta}m=8 keV). Other Kr isotopes, also needed for astrophysical calculations, were measured with more than an order of magnitude improved accuracy. We use the ISOLTRAP masses of {sup 72-74}Kr to reanalyze the role of the {sup 72}Kr waiting point in the rp-process during X-ray bursts. (orig.)

  20. Standard hazard analysis, critical control point and hotel management

    Directory of Open Access Journals (Sweden)

    Vujačić Vesna

    2017-01-01

    Full Text Available Tourism is a dynamic category which is continuously evolving in the world. The specificities that have to be respected in execution, relative to the rest of the food industry, arise from the main differences that exist in catering: the food serving procedure, numerous complex recipes and production technologies, staff fluctuation, and old equipment. For an effective and permanent implementation, it is very important that the HACCP concept is built on a serious base; in this case, the base is represented by the people handling the food. This paper presents the international ISO standards, the concept of HACCP and the importance of its application in the tourism and hospitality industry. The HACCP concept is a food safety management system based on the analysis and control of biological, chemical and physical hazards in the entire process, from raw material production, procurement and handling, to manufacturing, distribution and consumption of the finished product. The aim of this paper is to present the importance of applying the HACCP concept in tourism and hotel management as a recognizable international standard.

  1. Material-Point Analysis of Large-Strain Problems

    DEFF Research Database (Denmark)

    Andersen, Søren

    The aim of this thesis is to apply and improve the material-point method for modelling of geotechnical problems. One of the geotechnical phenomena that is a subject of active research is the study of landslides. A large amount of research is focused on determining when slopes become unstable. Hence......, it is possible to predict if a certain slope is stable using commercial finite element or finite difference software such as PLAXIS, ABAQUS or FLAC. However, the dynamics during a landslide are less explored. The material-point method (MPM) is a novel numerical method aimed at analysing problems involving...... materials subjected to large strains in a dynamical time–space domain. This thesis explores the material-point method with the specific aim of improving the performance for geotechnical problems. Large-strain geotechnical problems such as landslides pose a major challenge to model numerically. Employing...

  2. Preliminary hazards analysis -- vitrification process

    International Nuclear Information System (INIS)

    Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P.

    1994-06-01

    This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to performance of the Final Safety Analysis Report performed during the facility's construction and testing. It should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry, et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment

  3. Preliminary hazards analysis -- vitrification process

    Energy Technology Data Exchange (ETDEWEB)

    Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P. [Science Applications International Corp., Pleasanton, CA (United States)

    1994-06-01

    This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to performance of the Final Safety Analysis Report performed during the facility's construction and testing. It should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry, et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment.

  4. Risk analysis: opening the process

    International Nuclear Information System (INIS)

    Hubert, Ph.; Mays, C.

    1998-01-01

    This conference on risk analysis took place in Paris, 11-14 October 1999. Over 200 papers were presented in the following seven sessions: perception; environment and health; persuasive risks; objects and products; personal and collective involvement; assessment and valuation; management. A rational approach to risk analysis has been developed over the last three decades. Techniques for risk assessment have been thoroughly enhanced, risk management approaches have been developed, decision making processes have been clarified, and the social dimensions of risk perception and management have been investigated. Nevertheless, this construction is being challenged by recent events which reveal how deficits in stakeholder involvement, openness and democratic procedures can undermine risk management actions. Indeed, most components of the global risk analysis process may be radically called into question. Food safety has lately been a prominent issue, but new debates are appearing, or old debates are being revisited, in the domains of public health, consumer product safety, waste management, environmental risks, nuclear installations, automobile safety and pollution. To meet the growing pressures for efficiency, openness, accountability, and multi-partner communication in risk analysis, institutional changes are underway in many European countries. However, the need for stakeholders to develop better insight into the process may lead to an evolution of all the components of risk analysis, even in its most 'technical' steps. For stakeholders of different professional backgrounds, political projects, and responsibilities, risk identification procedures must be rendered understandable, and quantitative risk assessment must be intelligible and accommodated in action proposals, ranging from countermeasures to educational programs to insurance mechanisms. Management formats must be open to local and political input and other types of operational feedback. (authors)

  5. End point detection in ion milling processes by sputter-induced optical emission spectroscopy

    International Nuclear Information System (INIS)

    Lu, C.; Dorian, M.; Tabei, M.; Elsea, A.

    1984-01-01

    The characteristic optical emission from the sputtered material during ion milling processes can provide an unambiguous indication of the presence of the specific etched species. By monitoring the intensity of a representative emission line, the etching process can be precisely terminated at an interface. Enhancement of the etching end point is possible by using a dual-channel photodetection system operating in a ratio or difference mode. The installation of the optical detection system to an existing etching chamber has been greatly facilitated by the use of optical fibers. Using a commercial ion milling system, experimental data for a number of etching processes have been obtained. The result demonstrates that sputter-induced optical emission spectroscopy offers many advantages over other techniques in detecting the etching end point of ion milling processes
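The ratio-mode endpoint detection described above can be sketched as follows: the emission line of the etched film is divided by a reference channel, smoothed, and the etch is terminated when the ratio falls below a threshold. The synthetic traces, threshold and smoothing window below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def detect_endpoint(signal_a, signal_b, threshold, window=5):
    """Return the first sample index at which the smoothed channel ratio
    A/B drops below `threshold`, or None if no endpoint is reached."""
    ratio = np.asarray(signal_a, float) / np.asarray(signal_b, float)
    kernel = np.ones(window) / window
    smooth = np.convolve(ratio, kernel, mode="valid")  # moving average vs. noise
    below = np.flatnonzero(smooth < threshold)
    # align the reported index with the last sample of the averaging window
    return int(below[0]) + window - 1 if below.size else None

# Synthetic traces: the film's emission line fades as the etch reaches the
# interface (around sample 120), while the reference line stays constant.
t = np.arange(200)
film = 100.0 * np.exp(-np.clip(t - 120, 0, None) / 10.0) + 5.0
substrate = np.full_like(t, 50.0, dtype=float)
endpoint = detect_endpoint(film, substrate, threshold=0.5)
```

Difference-mode detection would replace the division by a subtraction of the two channels; the thresholding logic is otherwise identical.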

  6. Material-point Method Analysis of Bending in Elastic Beams

    DEFF Research Database (Denmark)

    Andersen, Søren Mikkel; Andersen, Lars

    The aim of this paper is to test different types of spatial interpolation for the material-point method. The interpolations include quadratic elements and cubic splines. A brief introduction to the material-point method is given. Simple linear-elastic problems are tested, including the classical...... cantilevered beam problem. As shown in the paper, the use of negative shape functions is not consistent with the material-point method in its current form, necessitating other types of interpolation such as cubic splines in order to obtain smoother representations of field quantities. It is shown...

  7. Simultaneous reconstruction of multiple depth images without off-focus points in integral imaging using a graphics processing unit.

    Science.gov (United States)

    Yi, Faliu; Lee, Jieun; Moon, Inkyu

    2014-05-01

    The reconstruction of multiple depth images with a ray back-propagation algorithm in three-dimensional (3D) computational integral imaging is computationally burdensome. Further, a reconstructed depth image consists of a focus and an off-focus area. Focus areas are 3D points on the surface of an object that are located at the reconstructed depth, while off-focus areas include 3D points in free-space that do not belong to any object surface in 3D space. Generally, without being removed, the presence of an off-focus area would adversely affect the high-level analysis of a 3D object, including its classification, recognition, and tracking. Here, we use a graphics processing unit (GPU) that supports parallel processing with multiple processors to simultaneously reconstruct multiple depth images using a lookup table containing the shifted values along the x and y directions for each elemental image in a given depth range. Moreover, each 3D point on a depth image can be measured by analyzing its statistical variance with its corresponding samples, which are captured by the two-dimensional (2D) elemental images. These statistical variances can be used to classify depth image pixels as either focus or off-focus points. At this stage, the measurement of focus and off-focus points in multiple depth images is also implemented in parallel on a GPU. Our proposed method is conducted based on the assumption that there is no occlusion of the 3D object during the capture stage of the integral imaging process. Experimental results have demonstrated that this method is capable of removing off-focus points in the reconstructed depth image. The results also showed that using a GPU to remove the off-focus points could greatly improve the overall computational speed compared with using a CPU.
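The variance-based focus classification described above can be sketched in a simplified form, ignoring the GPU parallelization and the lookup-table reconstruction: for each reconstructed 3D point, the intensities of its corresponding pixels across the elemental images are collected, and a low variance marks a surface (focus) point while a high variance marks a free-space (off-focus) point. The array shapes, threshold and synthetic data are illustrative assumptions.

```python
import numpy as np

def classify_focus(samples, threshold):
    """Classify each reconstructed point as in-focus (True) or off-focus (False)
    from the variance of its intensity samples across the elemental images.

    samples: (n_points, n_elemental_images) array of corresponding pixel values.
    In-focus surface points project to consistent intensities (low variance);
    off-focus free-space points mix unrelated pixels (high variance)."""
    variances = np.var(samples, axis=1)
    return variances < threshold, variances

rng = np.random.default_rng(42)
surface = 0.6 + 0.01 * rng.standard_normal((100, 25))  # consistent across views
freespace = rng.random((100, 25))                      # unrelated pixel values
samples = np.vstack([surface, freespace])
mask, var = classify_focus(samples, threshold=0.01)
```

Because each point's variance is independent of every other point's, this step is embarrassingly parallel, which is what makes the GPU implementation in the paper effective.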

  8. ON THE ESTIMATION OF DISTANCE DISTRIBUTION FUNCTIONS FOR POINT PROCESSES AND RANDOM SETS

    Directory of Open Access Journals (Sweden)

    Dietrich Stoyan

    2011-05-01

    Full Text Available This paper discusses various estimators for the nearest neighbour distance distribution function D of a stationary point process and for the quadratic contact distribution function Hq of a stationary random closed set. It recommends the use of Hanisch's estimator of D, which is of Horvitz-Thompson type, and the minus-sampling estimator of Hq. This recommendation is based on simulations for Poisson processes and Boolean models.
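Hanisch's estimator is a weighted Horvitz-Thompson construction; the simpler minus-sampling (border-correction) idea recommended above for Hq can be sketched for the nearest-neighbour distance distribution as follows: at each distance r, only points lying farther than r from the window boundary contribute, so their nearest neighbour cannot be hidden by edge effects. The sketch and its Poisson check, D(r) = 1 − exp(−λπr²), use assumed illustrative parameters.

```python
import numpy as np

def nn_dist_minus_sampling(points, r_grid, window=1.0):
    """Border-corrected (minus-sampling) estimate of the nearest-neighbour
    distance distribution D(r) for a point pattern in [0, window]^2."""
    pts = np.asarray(points)
    diff = pts[:, None, :] - pts[None, :, :]
    d = np.sqrt((diff ** 2).sum(-1))
    np.fill_diagonal(d, np.inf)
    nn = d.min(axis=1)                                  # nearest-neighbour distances
    border = np.minimum(pts, window - pts).min(axis=1)  # distance to boundary
    est = []
    for r in r_grid:
        inside = border >= r                            # minus-sampled points only
        est.append(np.mean(nn[inside] <= r) if inside.any() else np.nan)
    return np.array(est)

# Poisson benchmark: for intensity lam, D(r) = 1 - exp(-lam * pi * r^2)
rng = np.random.default_rng(3)
lam = 200
pts = rng.random((rng.poisson(lam), 2))
r = np.array([0.02, 0.04, 0.06])
D_hat = nn_dist_minus_sampling(pts, r)
D_theory = 1.0 - np.exp(-lam * np.pi * r ** 2)
```

Minus-sampling discards data near the boundary, which is why the Hanisch estimator, reweighting points by their observable region instead of discarding them, is generally more efficient.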

  9. Kaplan-Meier estimators of distance distributions for spatial point processes

    NARCIS (Netherlands)

    Baddeley, A.J.; Gill, R.D.

    1997-01-01

    When a spatial point process is observed through a bounded window, edge effects hamper the estimation of characteristics such as the empty space function $F$, the nearest neighbour distance distribution $G$, and the reduced second order moment function $K$. Here we propose and study product-limit

  10. Two step estimation for Neyman-Scott point process with inhomogeneous cluster centers

    Czech Academy of Sciences Publication Activity Database

    Mrkvička, T.; Muška, Milan; Kubečka, Jan

    2014-01-01

    Roč. 24, č. 1 (2014), s. 91-100 ISSN 0960-3174 R&D Projects: GA ČR(CZ) GA206/07/1392 Institutional support: RVO:60077344 Keywords : bayesian method * clustering * inhomogeneous point process Subject RIV: EH - Ecology, Behaviour Impact factor: 1.623, year: 2014

  11. Dense range images from sparse point clouds using multi-scale processing

    NARCIS (Netherlands)

    Do, Q.L.; Ma, L.; With, de P.H.N.

    2013-01-01

    Multi-modal data processing based on visual and depth/range images has become relevant in computer vision for 3D reconstruction applications such as city modeling, robot navigation, etc. In this paper, we generate high-accuracy dense range images from sparse point clouds to facilitate such

  12. Fast covariance estimation for innovations computed from a spatial Gibbs point process

    DEFF Research Database (Denmark)

    Coeurjolly, Jean-Francois; Rubak, Ege

    In this paper, we derive an exact formula for the covariance of two innovations computed from a spatial Gibbs point process and suggest a fast method for estimating this covariance. We show how this methodology can be used to estimate the asymptotic covariance matrix of the maximum pseudo...

  13. A Systematic Approach to Process Evaluation in the Central Oklahoma Turning Point (COTP) Partnership

    Science.gov (United States)

    Tolma, Eleni L.; Cheney, Marshall K.; Chrislip, David D.; Blankenship, Derek; Troup, Pam; Hann, Neil

    2011-01-01

    Formation is an important stage of partnership development. Purpose: To describe the systematic approach to process evaluation of a Turning Point initiative in central Oklahoma during the formation stage. The nine-month collaborative effort aimed to develop an action plan to promote health. Methods: A sound planning framework was used in the…

  14. Can the Hazard Assessment and Critical Control Points (HACCP) system be used to design process-based hygiene concepts?

    Science.gov (United States)

    Hübner, N-O; Fleßa, S; Haak, J; Wilke, F; Hübner, C; Dahms, C; Hoffmann, W; Kramer, A

    2011-01-01

    Recently, the HACCP (Hazard Analysis and Critical Control Points) concept was proposed as a possible way to implement process-based hygiene concepts in clinical practice, but the extent to which this food safety concept can be transferred to the health care setting is unclear. We therefore discuss possible ways to translate the principles of HACCP to health care settings. While a direct implementation of food processing concepts into health care is not very likely to be feasible and will probably not readily yield the intended results, the underlying principles of process orientation, in-process safety control and hazard-analysis-based countermeasures are transferable to clinical settings. In model projects the proposed concepts should be implemented, monitored, and evaluated under real-world conditions.

  15. Analysis of Spatial Interpolation in the Material-Point Method

    DEFF Research Database (Denmark)

    Andersen, Søren; Andersen, Lars

    2010-01-01

    are obtained using quadratic elements. It is shown that for more complex problems, the use of partially negative shape functions is inconsistent with the material-point method in its current form, necessitating other types of interpolation such as cubic splines in order to obtain smoother representations...

  16. Ergonomic analysis of radiopharmaceuticals samples preparation process

    International Nuclear Information System (INIS)

    Gomes, Luciene Betzler C.; Santos, Isaac Luquetti dos; Fonseca, Antonio Carlos C. da; Pellini, Marcos Pinto; Rebelo, Ana Maria

    2005-01-01

    The doses of radioisotopes to be administered to patients for diagnostic or therapeutic purposes are prepared in the radiopharmacy sector. The preparation process adopts techniques aimed at reducing the exposure time of the professionals and the absorption of excessive doses by patients. The ergonomic analysis of this process contributes to the prevention of occupational illnesses and of accident risks during routine work, providing well-being and security to the users involved and conferring an adequate working standard on the process. In this context, studies that analyse the factors pointing towards the solution of problems, and that establish proposals to minimise the risks of the activities, are clearly relevant. Through a methodology based on the concepts of ergonomics, the aim is to improve effectiveness and quality and to reduce the difficulties experienced by the workers. The prescribed work, established through norms and codified procedures, is compared with the work actually carried out (the real work), with a focus on the activities. The objective of this work is to discuss an ergonomic analysis of the radioisotope sample preparation process at the Setor de Radiofarmacia do Hospital Universitario Clementino Fraga Filho da Universidade Federal do Rio de Janeiro (UFRJ). (author)

  17. Pasteurised milk and implementation of HACCP (Hazard Analysis Critical Control Point)

    Directory of Open Access Journals (Sweden)

    T.B Murdiati

    2004-10-01

    The purpose of pasteurisation is to destroy pathogenic bacteria without affecting the taste, flavour, and nutritional value of milk. A study on the implementation of HACCP (Hazard Analysis Critical Control Point) in producing pasteurised milk was carried out in four processing units, one in Jakarta, two in Bandung and one in Bogor. The critical control points in the production line were identified. Milk samples were collected from the critical points and analysed for the total number of microbes. Antibiotic residues were detected in raw milk. The study indicated that one unit in Bandung and one unit in Jakarta produced pasteurised milk with a lower number of microbes than the other units, due to better management and control applied along the production chain. Penicillin residue was detected in the raw milk used by the unit in Bogor. Six critical points, and the hazards that might arise at those points, were identified, as well as how to prevent the hazards. A quality assurance system such as HACCP would be able to produce high-quality, safe pasteurised milk, and should be implemented gradually.

  18. A Combined Control Chart for Identifying Out-of-Control Points in Multivariate Processes

    Directory of Open Access Journals (Sweden)

    Marroquín–Prado E.

    2010-10-01

    The Hotelling's T2 control chart is widely used to identify out-of-control signals in multivariate processes. However, this chart is not sensitive to small shifts in the process mean vector. In this work we propose a control chart to identify out-of-control signals. The proposed chart is a combination of Hotelling's T2 chart, the M chart proposed by Hayter et al. (1994), and a new chart based on principal components. The combination of these charts identifies any type and size of change in the process mean vector. Using simulation and the average run length (ARL), the performance of the proposed control chart is evaluated. The ARL is the average number of in-control points before an out-of-control point is detected. The results of the simulation show that the proposed chart is more sensitive than each of the three charts individually.
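
    The Hotelling T² statistic at the core of the charts above can be sketched in a few lines. This is a generic illustration, not code from the paper; the in-control mean and covariance are assumed known here (illustrative values), whereas in practice they are estimated from a reference sample.

```python
import numpy as np

def hotelling_t2(x, mean, cov):
    """T^2 = (x - mean)' * inv(cov) * (x - mean) for one observation."""
    d = np.asarray(x, dtype=float) - np.asarray(mean, dtype=float)
    return float(d @ np.linalg.solve(cov, d))

# Assumed in-control parameters (illustrative values only)
mu = np.array([0.0, 0.0])
S = np.array([[1.0, 0.3],
              [0.3, 1.0]])

t2 = hotelling_t2([2.5, -1.0], mu, S)
# A point signals out-of-control when t2 exceeds the chart's control
# limit (e.g. a chi-square quantile when the parameters are known).
```

    Note that this single statistic is exactly why small mean shifts are hard to detect: a small deviation d gives a small quadratic form, which motivates combining T² with other charts as the record describes.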

  19. Steam generators secondary side chemical cleaning at Point Lepreau using the Siemens high temperature process

    International Nuclear Information System (INIS)

    Verma, K.; MacNeil, C.; Odar, S.

    1996-01-01

    The secondary sides of all four steam generators at the Point Lepreau Nuclear Generating Station were cleaned during the 1995 annual outage run-down using the Siemens high temperature chemical cleaning process. Traditionally, all secondary side chemical cleaning exercises in CANDU as well as other nuclear power stations in North America have been conducted using a process developed in conjunction with the Electric Power Research Institute (EPRI). The Siemens high temperature process was applied for the first time in North America at the Point Lepreau Nuclear Generating Station (PLGS). The paper discusses experiences related to the pre- and post-award chemical cleaning activities, the chemical cleaning application, post-cleaning inspection results and waste handling activities. (author)

  20. Unified analysis of preconditioning methods for saddle point matrices

    Czech Academy of Sciences Publication Activity Database

    Axelsson, Owe

    2015-01-01

    Roč. 22, č. 2 (2015), s. 233-253 ISSN 1070-5325 R&D Projects: GA MŠk ED1.1.00/02.0070 Institutional support: RVO:68145535 Keywords : saddle point problems * preconditioning * spectral properties Subject RIV: BA - General Mathematics Impact factor: 1.431, year: 2015 http://onlinelibrary.wiley.com/doi/10.1002/nla.1947/pdf

  1. Bayesian inference for multivariate point processes observed at sparsely distributed times

    DEFF Research Database (Denmark)

    Rasmussen, Jakob Gulddahl; Møller, Jesper; Aukema, B.H.

    We consider statistical and computational aspects of simulation-based Bayesian inference for a multivariate point process which is only observed at sparsely distributed times. For specificity we consider a particular data set which has earlier been analyzed by a discrete time model involving unknown...... normalizing constants. We discuss the advantages and disadvantages of using continuous time processes compared to discrete time processes in the setting of the present paper as well as other spatial-temporal situations. Keywords: Bark beetle, conditional intensity, forest entomology, Markov chain Monte Carlo...

  2. The Oil Point Method - A tool for indicative environmental evaluation in material and process selection

    DEFF Research Database (Denmark)

    Bey, Niki

    2000-01-01

    to three essential assessment steps, the method enables rough environmental evaluations and supports in this way material- and process-related decision-making in the early stages of design. In its overall structure, the Oil Point Method is related to Life Cycle Assessment - except for two main differences...... of environmental evaluation and only approximate information about the product and its life cycle. This dissertation addresses this challenge in presenting a method, which is tailored to these requirements of designers - the Oil Point Method (OPM). In providing environmental key information and confining itself...

  3. Microbiological analysis of critical points in the chicken industry

    Directory of Open Access Journals (Sweden)

    Rogério Luis Cansian

    2005-05-01

    This work focused on identifying microbial contamination in the scalding, asepsis and chilling (chiller) processes of chicken, and in the fresh chicken sausages produced from them. Samples were collected at a poultry slaughterhouse on seven dates and analysed in triplicate. Salmonella was identified in two scald water samples but was absent from the chiller water and from the final product, which can be explained by the addition of chlorine and the reduction of the water temperature. The MPN of fecal coliforms (Escherichia coli) ranged from < 1 to 11/ml in the scald water and from < 1 to 64/ml in the chiller water; although within acceptable standards, these values show a tendency to increase in the chiller, due mainly to the evisceration process. Aeromonas counts ranged from 5 to 3.5x10¹ CFU/ml in the scald water and from 9 to 3.7x10² CFU/ml in the chiller water; this increase is probably because Aeromonas is psychrophilic, and also due to the removal of the viscera. Analyses of the chicken sausage showed increased Aeromonas counts, of up to 2.5x10³ CFU/g. This growth trend in the final product, together with the capacity of Aeromonas to cause infections, demonstrates the need to include Aeromonas in the microbiological evaluation of foods.

  4. Image processing and analysis software development

    International Nuclear Information System (INIS)

    Shahnaz, R.

    1999-01-01

    The work presented in this project is aimed at developing a software package, 'IMAGE GALLERY', to investigate various image processing and analysis techniques. The work was divided into two parts, namely image processing techniques and pattern recognition, the latter comprising character and face recognition. Various image enhancement techniques, including negative imaging, contrast stretching, dynamic range compression, neon, diffuse, emboss etc., have been studied. Segmentation techniques including point detection, line detection and edge detection have been studied, and some of the smoothing and sharpening filters have been investigated. All these imaging techniques have been implemented in a window-based computer program written in Visual Basic. Neural network techniques based on the perceptron model have been applied for face and character recognition. (author)

  5. Analysis method of beam pointing stability based on optical transmission matrix

    Science.gov (United States)

    Wang, Chuanchuan; Huang, PingXian; Li, Xiaotong; Cen, Zhaofen

    2016-10-01

    Many factors affect the beam pointing stability of an optical system; among them, element tolerance is one of the most important and common. In some large laser systems it makes the final micro beam spots deviate noticeably on the image plane, so an effective and accurate theoretical analysis of element tolerance is essential. To make the analysis of beam pointing stability convenient and theoretical, we consider the transmission of a single chief ray, rather than the whole beam, as an approximation of the overall spot deviation. Using optical matrices, we also simplify the complex process of light transmission to a multiplication of matrices. In this way we can set up an element tolerance model, namely a mathematical expression for the spot deviation of an optical system with element tolerances, and thus realize a quantitative theoretical analysis of beam pointing stability. In the second half of the paper, we design an experiment to obtain the spot deviation caused by element tolerance in a multipass optical system; we then adjust the tolerance step by step, compare the results with the data obtained from the tolerance model, and finally verify the correctness of the tolerance model.
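
    The reduction of light transmission to matrix multiplication described above is the standard ray-transfer (ABCD) formalism. A minimal sketch follows; the distances, focal length and tilt error are purely illustrative values, not data from the paper.

```python
import numpy as np

def free_space(d):
    """Ray-transfer matrix for propagation over distance d."""
    return np.array([[1.0, d], [0.0, 1.0]])

def thin_lens(f):
    """Ray-transfer matrix for a thin lens of focal length f."""
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

# Chief ray state: (height y, angle u). A tilted element adds a small
# angular error delta_u to the ray -- a stand-in for an element tolerance.
ray = np.array([0.0, 0.0])
delta_u = 1e-4  # 0.1 mrad tilt error (assumed)
ray_perturbed = ray + np.array([0.0, delta_u])

# Ray passes through 100 units of free space, a lens, then 200 units;
# the matrix product is written right-to-left in propagation order.
system = free_space(200.0) @ thin_lens(100.0) @ free_space(100.0)
spot_shift = (system @ ray_perturbed - system @ ray)[0]  # deviation on image plane
```

    Because the system is linear, the spot deviation from several toleranced elements is just the sum of each perturbation propagated through the remaining matrices, which is what makes the quantitative tolerance analysis tractable.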

  6. Super-Relaxed (η)-Proximal Point Algorithms, Relaxed (η)-Proximal Point Algorithms, Linear Convergence Analysis, and Nonlinear Variational Inclusions

    Directory of Open Access Journals (Sweden)

    Agarwal, Ravi P.

    2009-01-01

    We glance at recent advances in the general theory of maximal (set-valued) monotone mappings and their role in examining convex programming and the closely related field of nonlinear variational inequalities. We focus mostly on applications of the super-relaxed (η)-proximal point algorithm to the context of solving a class of nonlinear variational inclusion problems, based on the notion of maximal (η)-monotonicity. Investigations highlighted in this communication are greatly influenced by the celebrated work of Rockafellar (1976), while others have played a significant part as well in generalizing the proximal point algorithm considered by Rockafellar (1976) to the case of the relaxed proximal point algorithm by Eckstein and Bertsekas (1992). Even for the linear convergence analysis of the overrelaxed (or super-relaxed) (η)-proximal point algorithm, the fundamental model for Rockafellar's case does the job. Furthermore, we attempt to explore possibilities of generalizing the Yosida regularization/approximation in light of maximal (η)-monotonicity, and then apply it to first-order evolution equations/inclusions.
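
    The relaxed proximal point iteration in the Eckstein-Bertsekas form mentioned above is short enough to sketch. This toy example uses a simple affine monotone operator with a closed-form resolvent; it illustrates the classical scheme only, not the (η)-monotone generality of the paper.

```python
def relaxed_ppa(resolvent, x0, rho=1.5, c=1.0, iters=50):
    """Relaxed proximal point algorithm:
        x_{k+1} = (1 - rho) * x_k + rho * J_c(x_k),
    where J_c = (I + c*T)^{-1} is the resolvent of a monotone operator T
    and rho in (0, 2) is the relaxation parameter."""
    x = x0
    for _ in range(iters):
        x = (1.0 - rho) * x + rho * resolvent(x, c)
    return x

# Toy monotone operator T(x) = 3*(x - 2), whose zero is x = 2.
# Its resolvent solves y = x + 3c*(x - 2)  =>  x = (y + 6c) / (1 + 3c).
resolvent = lambda y, c: (y + 6.0 * c) / (1.0 + 3.0 * c)

x_star = relaxed_ppa(resolvent, x0=10.0, rho=1.5, c=1.0)  # converges to 2.0
```

    Setting rho = 1 recovers Rockafellar's original proximal point algorithm; over-relaxation (rho > 1) is what the "super-relaxed" variants in the record generalize.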

  7. Prospects for direct neutron capture measurements on s-process branching point isotopes

    Energy Technology Data Exchange (ETDEWEB)

    Guerrero, C.; Lerendegui-Marco, J.; Quesada, J.M. [Universidad de Sevilla, Dept. de Fisica Atomica, Molecular y Nuclear, Sevilla (Spain); Domingo-Pardo, C. [CSIC-Universidad de Valencia, Instituto de Fisica Corpuscular, Valencia (Spain); Kaeppeler, F. [Karlsruhe Institute of Technology, Institut fuer Kernphysik, Karlsruhe (Germany); Palomo, F.R. [Universidad de Sevilla, Dept. de Ingenieria Electronica, Sevilla (Spain); Reifarth, R. [Goethe-Universitaet Frankfurt am Main, Frankfurt am Main (Germany)

    2017-05-15

    The neutron capture cross sections of several unstable key isotopes acting as branching points in the s-process are crucial for stellar nucleosynthesis studies, but they are very challenging to measure directly due to the difficult production of sufficient sample material, the high activity of the resulting samples, and the actual (n, γ) measurement, where high neutron fluxes and effective background rejection capabilities are required. At present there are about 21 relevant s-process branching point isotopes whose cross section could not be measured yet over the neutron energy range of interest for astrophysics. However, the situation is changing with some very recent developments and upcoming technologies. This work introduces three techniques that will change the current paradigm in the field: the use of γ-ray imaging techniques in (n, γ) experiments, the production of moderated neutron beams using high-power lasers, and double capture experiments in Maxwellian neutron beams. (orig.)

  8. Instantaneous nonlinear assessment of complex cardiovascular dynamics by Laguerre-Volterra point process models.

    Science.gov (United States)

    Valenza, Gaetano; Citi, Luca; Barbieri, Riccardo

    2013-01-01

    We report an exemplary study of instantaneous assessment of cardiovascular dynamics performed using point-process nonlinear models based on the Laguerre expansion of the linear and nonlinear Wiener-Volterra kernels. As quantifiers, instantaneous measures such as high-order spectral features and Lyapunov exponents can be estimated from a quadratic and cubic autoregressive formulation of the model's first-order moment, respectively. Here, these measures are evaluated on heartbeat series from 16 healthy subjects and 14 patients with Congestive Heart Failure (CHF). Data were gathered from the online repository PhysioBank, which has been taken as a landmark for testing nonlinear indices. Results show that the proposed nonlinear Laguerre-Volterra point-process methods are able to track the nonlinear and complex cardiovascular dynamics, distinguishing significantly between CHF and healthy heartbeat series.

  9. MODELLING AND SIMULATION OF A NEUROPHYSIOLOGICAL EXPERIMENT BY SPATIO-TEMPORAL POINT PROCESSES

    Directory of Open Access Journals (Sweden)

    Viktor Beneš

    2011-05-01

    We present a stochastic model of an experiment monitoring the spiking activity of a place cell in the hippocampus of an experimental animal moving in an arena. A doubly stochastic spatio-temporal point process is used to model and quantify overdispersion. The stochastic intensity is modelled by a Lévy-based random field, while the animal path is simplified to a discrete random walk. In a simulation study, a previously suggested method is used first. Then it is shown that a solution of the filtering problem yields the desired inference for the random intensity. Two approaches are suggested, and the new one, based on a finite point process density, is applied. Using Markov chain Monte Carlo we obtain numerical results from the simulated model. The methodology is discussed.
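
    Conditionally on its intensity, a doubly stochastic (Cox) process such as the one above is an inhomogeneous Poisson process, which can be simulated by Lewis-Shedler thinning. A minimal sketch with a made-up, place-field-like intensity (not the Lévy-based random field of the paper):

```python
import math
import random

def thinning(rate, rate_max, t_end, seed=42):
    """Lewis-Shedler thinning: simulate an inhomogeneous Poisson process
    with intensity rate(t) <= rate_max on [0, t_end]."""
    rng = random.Random(seed)
    t, events = 0.0, []
    while True:
        t += rng.expovariate(rate_max)           # candidate from dominating process
        if t > t_end:
            return events
        if rng.random() < rate(t) / rate_max:    # accept with prob rate(t)/rate_max
            events.append(t)

# Assumed intensity: firing peaks when the animal crosses the place field at t = 5
rate = lambda t: 5.0 + 20.0 * math.exp(-((t - 5.0) ** 2) / 2.0)
spikes = thinning(rate, rate_max=25.0, t_end=10.0)
```

    A Cox process simulation would first draw the random intensity field, then run this thinning step conditionally on the realized intensity.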

  10. Low dose response analysis through a cytogenetic end-point

    International Nuclear Information System (INIS)

    Bojtor, I.; Koeteles, G.J.

    1998-01-01

    The effects of low doses were studied on human lymphocytes from various individuals. The frequency of micronuclei in cytokinesis-blocked cultured lymphocytes was taken as the end-point. The probability distribution of the radiation-induced increment was statistically proved to be asymmetric when the blood samples had been irradiated with X-ray doses of 0.01-0.05 Gy, similarly to that in the unirradiated control population. On the contrary, at or above 1 Gy only the corresponding normal curve could be accepted, reflecting an approximately symmetrical scatter of the increments about their mean value. It was found that the slope as well as the closeness of correlation of the variables changed considerably when lower and lower dose ranges were selected. Below approximately 0.2 Gy, the absorbed dose and the increment were even found to be unrelated.

  11. A random point process model for the score in sport matches

    Czech Academy of Sciences Publication Activity Database

    Volf, Petr

    2009-01-01

    Roč. 20, č. 2 (2009), s. 121-131 ISSN 1471-678X R&D Projects: GA AV ČR(CZ) IAA101120604 Institutional research plan: CEZ:AV0Z10750506 Keywords : sport statistics * scoring intensity * Cox’s regression model Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2009/SI/volf-a random point process model for the score in sport matches.pdf
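
    The simplest version of a point process model for match scores treats each team's goals as a homogeneous Poisson process over the match; the final score is then the pair of event counts. A toy sketch with illustrative intensities (the paper's model additionally lets the scoring intensity vary with time and covariates, in the spirit of Cox regression):

```python
import random

def simulate_match(goals_home, goals_away, minutes=90.0, seed=7):
    """Simulate goal times as two homogeneous Poisson processes whose
    expected per-match totals are goals_home and goals_away; return the score."""
    rng = random.Random(seed)

    def goal_times(mean_goals):
        t, times = 0.0, []
        while True:
            t += rng.expovariate(mean_goals / minutes)  # inter-goal waiting time
            if t > minutes:
                return times
            times.append(t)

    return len(goal_times(goals_home)), len(goal_times(goals_away))

home, away = simulate_match(1.6, 1.1)  # assumed scoring intensities per match
```

    Replacing the constant rate by a time-varying intensity function turns this into the inhomogeneous model the record studies.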

  12. A business process model as a starting point for tight cooperation among organizations

    Directory of Open Access Journals (Sweden)

    O. Mysliveček

    2006-01-01

    Outsourcing and other kinds of tight cooperation among organizations are more and more necessary for success in all markets (markets for high-technology products are particularly affected). It is therefore important for companies to be able to set up all kinds of cooperation effectively. A business process model (BPM) is a suitable starting point for this future cooperation. In this paper, the process of setting up such cooperation is outlined, as well as why it is important for business success.

  13. Weak interaction rates for Kr and Sr waiting-point nuclei under rp-process conditions

    International Nuclear Information System (INIS)

    Sarriguren, P.

    2009-01-01

    Weak interaction rates are studied in neutron deficient Kr and Sr waiting-point isotopes in ranges of densities and temperatures relevant for the rp process. The nuclear structure is described within a microscopic model (deformed QRPA) that reproduces not only the half-lives but also the Gamow-Teller strength distributions recently measured. The various sensitivities of the decay rates to both density and temperature are discussed. Continuum electron capture is shown to contribute significantly to the weak rates at rp-process conditions.

  14. FINANCIAL ANALYSIS FROM AN ACCOUNTING POINT OF VIEW

    Directory of Open Access Journals (Sweden)

    Mihaela Ungureanu

    2013-03-01

    Despite developments that tend to loosen the relationship between financial analysis and accounting, the information provided by the latter remains irreplaceable as a foundation for financial diagnostic approaches. An efficient information system can provide relevant indicators to users based on accurate and real information, and financial analysis results are based on a diagnosis of return and risk. The aim of this article is to present, primarily, the origin and evolution of the relationship between financial analysis and accounting, and the fundamental role which accounting holds, through the information it produces, in analysts' work. The research method used is bibliographic, studying relevant books and articles in the domain. The literature does not provide concrete answers to this problem; resolutions are expected especially from practitioners.

  15. Analysis of an economic order quantity and reorder point inventory ...

    African Journals Online (AJOL)

    SIBA group has been faced with an ineffective forecasting method that has led to multiple product stock-outs. The issue has caused sales loss as well as profit loss. This research goes through the process of analyzing the company's current forecasting model and recommending an inventory control model to help her ...
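
    The economic order quantity and reorder point models named in the title have standard closed forms. A minimal sketch with illustrative numbers (not data from SIBA group):

```python
import math

def eoq(annual_demand, order_cost, holding_cost):
    """Economic order quantity: Q* = sqrt(2 * D * S / H), where D is annual
    demand, S the fixed cost per order, H the holding cost per unit per year."""
    return math.sqrt(2.0 * annual_demand * order_cost / holding_cost)

def reorder_point(daily_demand, lead_time_days, safety_stock=0.0):
    """Reorder when inventory falls to expected lead-time demand plus safety stock."""
    return daily_demand * lead_time_days + safety_stock

q = eoq(annual_demand=1200, order_cost=50.0, holding_cost=6.0)          # about 141 units
r = reorder_point(daily_demand=4.0, lead_time_days=7, safety_stock=10.0)  # 38 units
```

    The safety stock term is where the forecasting model matters: it is usually sized from the variability of demand over the lead time, which is exactly what an ineffective forecast gets wrong.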

  16. Spatial Distribution Characteristics of Healthcare Facilities in Nanjing: Network Point Pattern Analysis and Correlation Analysis

    Directory of Open Access Journals (Sweden)

    Jianhua Ni

    2016-08-01

    The spatial distribution of urban service facilities is largely constrained by the road network. In this study, network point pattern analysis and correlation analysis were used to analyze the relationship between the road network and the distribution of healthcare facilities. The weighted network kernel density estimation method proposed in this study identifies significant differences between the areas outside and inside the Ming city wall. The results of network K-function analysis show that private hospitals are more evenly distributed than public hospitals, and pharmacy stores tend to cluster around hospitals along the road network. After computing the correlation between the different categories of hospitals and street centrality, we find that the distribution of these hospitals correlates highly with street centrality, and that the correlations are higher for private and small hospitals than for public and large hospitals. The comprehensive analysis results could help examine the reasonableness of the existing urban healthcare facility distribution and optimize the location of new healthcare facilities.

  17. RISK ANALYSIS IN MILK PROCESSING

    Directory of Open Access Journals (Sweden)

    I. PIRVUTOIU

    2008-05-01

    This paper aimed to evaluate bankruptcy risk using the score method based on Canon and Holder's model. The data were collected from the balance sheet and profit and loss account for the period 2005-2007, recorded by a meat processing plant (Rador Commercial Company). The study has shown the financial situation of the company and the level of the main financial ratios underlying the calculation of the Z score function value in the three years. The low values of the Z score function recorded every year reflect that the company is still facing bankruptcy. However, the worst situation was recorded in the years 2005 and 2006, when the bankruptcy risk ranged between 70-80%. In the year 2007, the bankruptcy risk was lower, ranging between 50-70%, as the Z function recorded a value lower than 4. For meat processing companies such an analysis is compulsory at present, as long as the business environment in our country is very risky.

  18. PRECLOSURE CRITICALITY ANALYSIS PROCESS REPORT

    International Nuclear Information System (INIS)

    Danise, A.E.

    2004-01-01

    This report describes a process for performing preclosure criticality analyses for a repository at Yucca Mountain, Nevada. These analyses will be performed from the time of receipt of fissile material until permanent closure of the repository (the preclosure period). The process describes how criticality safety analyses will be performed for various configurations of waste, in or out of waste packages, that could occur during preclosure as a result of normal operations or event sequences. The criticality safety analysis considers those event sequences resulting in unanticipated moderation, loss of neutron absorber, geometric changes, or administrative errors in waste form placement (loading) of the waste package. The report proposes a criticality analysis process for preclosure that allows a consistent transition from preclosure to postclosure, thereby possibly reducing potential cost increases and delays in the licensing of Yucca Mountain. The proposed approach provides the advantage of using a parallel regulatory framework for the evaluation of preclosure and postclosure performance and is consistent with the U.S. Nuclear Regulatory Commission's approach of supporting risk-informed, performance-based regulation for fuel cycle facilities, the ''Yucca Mountain Review Plan, Final Report'', and 10 CFR Part 63. The criticality-related criteria for ensuring subcriticality are also described, as well as which guidance documents will be utilized. Preclosure operations and facilities have significant similarities to existing facilities and operations currently regulated by the U.S. Nuclear Regulatory Commission; therefore, the design approach for preclosure criticality safety will be dictated by existing regulatory requirements, while using a risk-informed approach with burnup credit for in-package operations.

  19. Human detection and motion analysis at security points

    Science.gov (United States)

    Ozer, I. Burak; Lv, Tiehan; Wolf, Wayne H.

    2003-08-01

    This paper presents a real-time video surveillance system for the recognition of specific human activities. Specifically, the proposed automatic motion analysis is used as an on-line alarm system to detect abnormal situations in a campus environment. A smart multi-camera system developed at Princeton University is extended for use in smart environments in which the camera detects the presence of multiple persons as well as their gestures and their interaction in real-time.

  20. PARALLEL PROCESSING OF BIG POINT CLOUDS USING Z-ORDER-BASED PARTITIONING

    Directory of Open Access Journals (Sweden)

    C. Alis

    2016-06-01

    As laser scanning technology improves and costs are coming down, the amount of point cloud data being generated can be prohibitively difficult and expensive to process on a single machine. This data explosion is not only limited to point cloud data. Voluminous amounts of high-dimensionality and quickly accumulating data, collectively known as Big Data, such as those generated by social media, Internet of Things devices and commercial transactions, are becoming more prevalent as well. New computing paradigms and frameworks are being developed to efficiently handle the processing of Big Data, many of which utilize a compute cluster composed of several commodity grade machines to process chunks of data in parallel. A central concept in many of these frameworks is data locality. By its nature, Big Data is large enough that the entire dataset would not fit on the memory and hard drives of a single node, hence replicating the entire dataset to each worker node is impractical. The data must then be partitioned across worker nodes in a manner that minimises data transfer across the network. This is a challenge for point cloud data because there exist different ways to partition data and they may require data transfer. We propose a partitioning based on Z-order, which is a form of locality-sensitive hashing. The Z-order or Morton code is computed by dividing each dimension to form a grid, then interleaving the binary representations of the dimensions. For example, the Z-order code for the grid square with coordinates (x = 1 = 01₂, y = 3 = 11₂) is 1011₂ = 11. The number of points in each partition is controlled by the number of bits per dimension: the more bits, the fewer the points. The number of bits per dimension also controls the level of detail, with more bits yielding finer partitioning. We present this partitioning method by implementing it on Apache Spark and investigating how different parameters affect the accuracy and running time of the k nearest…

  1. Parallel Processing of Big Point Clouds Using Z-Order Partitioning

    Science.gov (United States)

    Alis, C.; Boehm, J.; Liu, K.

    2016-06-01

    As laser scanning technology improves and costs are coming down, the amount of point cloud data being generated can be prohibitively difficult and expensive to process on a single machine. This data explosion is not only limited to point cloud data. Voluminous amounts of high-dimensionality and quickly accumulating data, collectively known as Big Data, such as those generated by social media, Internet of Things devices and commercial transactions, are becoming more prevalent as well. New computing paradigms and frameworks are being developed to efficiently handle the processing of Big Data, many of which utilize a compute cluster composed of several commodity grade machines to process chunks of data in parallel. A central concept in many of these frameworks is data locality. By its nature, Big Data is large enough that the entire dataset would not fit on the memory and hard drives of a single node, hence replicating the entire dataset to each worker node is impractical. The data must then be partitioned across worker nodes in a manner that minimises data transfer across the network. This is a challenge for point cloud data because there exist different ways to partition data and they may require data transfer. We propose a partitioning based on Z-order, which is a form of locality-sensitive hashing. The Z-order or Morton code is computed by dividing each dimension to form a grid, then interleaving the binary representations of the dimensions. For example, the Z-order code for the grid square with coordinates (x = 1 = 01₂, y = 3 = 11₂) is 1011₂ = 11. The number of points in each partition is controlled by the number of bits per dimension: the more bits, the fewer the points. The number of bits per dimension also controls the level of detail, with more bits yielding finer partitioning.
We present this partitioning method by implementing it on Apache Spark and investigating how different parameters affect the accuracy and running time of the k nearest neighbour algorithm.
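
    The bit interleaving described in both records above can be written directly; a minimal sketch that reproduces the abstracts' own example (x = 1, y = 3 gives code 11):

```python
def morton2(x: int, y: int, bits: int = 16) -> int:
    """Z-order (Morton) code of a 2-D grid cell: interleave the bits of
    the coordinates, x on even bit positions, y on odd bit positions."""
    code = 0
    for i in range(bits):
        code |= ((x >> i) & 1) << (2 * i)       # x bit i -> even position 2i
        code |= ((y >> i) & 1) << (2 * i + 1)   # y bit i -> odd position 2i+1
    return code

print(morton2(1, 3))  # → 11 (binary 1011)
```

    Used as a partition key, cells whose codes share a long prefix are spatially close, which is why dropping low-order bits (fewer bits per dimension) yields coarser partitions with more points each.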

  2. Developing a Business Intelligence Process for a Training Module in SharePoint 2010

    Science.gov (United States)

    Schmidtchen, Bryce; Solano, Wanda M.; Albasini, Colby

    2015-01-01

    Prior to this project, training information for the employees of the National Center for Critical Information Processing and Storage (NCCIPS) was stored in an array of unrelated spreadsheets and SharePoint lists that had to be manually updated. By developing a content management system through a web application platform named SharePoint, this training system is now highly automated and provides a much less labor-intensive method of storing training data and scheduling training courses. The system was developed by using SharePoint Designer and laying out the data structure for the interaction between different lists of data about the employees. The automation of data population inside the lists was accomplished by implementing SharePoint workflows, which essentially lay out the logic for how data is connected and calculated between certain lists. The resulting training system is constructed from a combination of five lists of data, with a single list acting as the user-friendly interface. This interface is populated with the courses required for each employee and includes past and future information about course requirements. The employees of NCCIPS now have the ability to view, log, and schedule their training information and courses with much more ease. This system will relieve a significant amount of manual input and serve as a powerful informational resource for the employees of NCCIPS in the future.

  3. Genetic interaction analysis of point mutations enables interrogation of gene function at a residue-level resolution

    Science.gov (United States)

    Braberg, Hannes; Moehle, Erica A.; Shales, Michael; Guthrie, Christine; Krogan, Nevan J.

    2014-01-01

    We have achieved a residue-level resolution of genetic interaction mapping – a technique that measures how the function of one gene is affected by the alteration of a second gene – by analyzing point mutations. Here, we describe how to interpret point mutant genetic interactions, and outline key applications for the approach, including interrogation of protein interaction interfaces and active sites, and examination of post-translational modifications. Genetic interaction analysis has proven effective for characterizing cellular processes; however, to date, systematic high-throughput genetic interaction screens have relied on gene deletions or knockdowns, which limits the resolution of gene function analysis and poses problems for multifunctional genes. Our point mutant approach addresses these issues, and further provides a tool for in vivo structure-function analysis that complements traditional biophysical methods. We also discuss the potential for genetic interaction mapping of point mutations in human cells and its application to personalized medicine. PMID:24842270

  4. Monte Carlo point process estimation of electromyographic envelopes from motor cortical spikes for brain-machine interfaces

    Science.gov (United States)

    Liao, Yuxi; She, Xiwei; Wang, Yiwen; Zhang, Shaomin; Zhang, Qiaosheng; Zheng, Xiaoxiang; Principe, Jose C.

    2015-12-01

    Objective. Representation of movement in the motor cortex (M1) has been widely studied in brain-machine interfaces (BMIs). The electromyogram (EMG) has greater bandwidth than conventional kinematic variables (such as position and velocity) and is functionally related to the discharge of cortical neurons. As the stochastic information of EMG is derived from the explicit spike time structure, point process (PP) methods are a good candidate for decoding EMG directly from neural spike trains. Previous studies usually assume linear or exponential tuning curves between neural firing and EMG, which may not be true. Approach. In our analysis, we estimate the tuning curves in a data-driven way and find both functional-excitatory and functional-inhibitory neurons, which are widely found across a rat's motor cortex. To accurately decode EMG envelopes from M1 neural spike trains, the Monte Carlo point process (MCPP) method is implemented based on such nonlinear tuning properties. Main results. Better reconstruction of EMG signals is shown at baseline and at extreme high peaks, as our method better preserves the nonlinearity of the neural tuning during decoding. The MCPP improves the prediction accuracy (normalized mean squared error, NMSE) by 57% and 66% on average compared with the adaptive point process filter using linear and exponential tuning curves, respectively, over all 112 data segments across six rats. Compared to a Wiener filter using spike rates with an optimal window size of 50 ms, MCPP decoding of EMG from a point process improves the NMSE by 59% on average. Significance. These results suggest that neural tuning is constantly changing during task execution; therefore, spike timing methodologies and estimation of appropriate tuning curves are needed for better EMG decoding in motor BMIs.

  5. The Melting Point of Palladium Using Miniature Fixed Points of Different Ceramic Materials: Part II—Analysis of Melting Curves and Long-Term Investigation

    Science.gov (United States)

    Edler, F.; Huang, K.

    2016-12-01

    Fifteen miniature fixed-point cells made of three different ceramic crucible materials (Al2O3, ZrO2, and Al2O3(86 %)+ZrO2(14 %)) were filled with pure palladium and used to calibrate type B thermocouples (Pt30 %Rh/Pt6 %Rh). A critical issue when using miniature fixed points containing small amounts of fixed-point material is the analysis of the melting curves, which are characterized by significant slopes during the melting process compared with the flat melting plateaus obtainable with conventional fixed-point cells. The method of the extrapolated starting point temperature, using a straight-line approximation of the melting plateau, was applied to analyze the melting curves. This method allowed an unambiguous determination of an electromotive force (emf) assignable as the melting temperature. The strict consideration of two constraints resulted in a unique, repeatable and objective method to determine the emf at the melting temperature within an uncertainty of about 0.1 μV. The lifetime and long-term stability of the miniature fixed points were investigated by performing more than 100 melt/freeze cycles for each crucible of the different ceramic materials. No failure of the crucibles occurred, indicating excellent mechanical stability of the investigated miniature cells. The consequent limitation of heating rates to values below ±3.5 K·min⁻¹ above 1100 °C and the carefully and completely filled crucibles (the liquid palladium occupies the whole volume of the crucible) are the reasons for successfully preventing the crucibles from breaking. The thermal stability of the melting temperature of palladium was excellent when using the crucibles made of Al2O3(86 %)+ZrO2(14 %) and ZrO2. Emf drifts over the total duration of the long-term investigation were below a temperature equivalent of about 0.1 K to 0.2 K.
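
    The extrapolated starting-point method amounts to fitting a straight line to the sloped melting plateau and reading off the emf at the observed onset of melting. A rough numerical sketch follows; the function name, the plateau-selection mask and the synthetic data are illustrative assumptions, and the authors' actual procedure additionally enforces the two constraints mentioned above:

```python
import numpy as np

def extrapolated_start_emf(t, emf, plateau_mask, t_start):
    """Fit a straight line to the sloped melting plateau and extrapolate
    it back to the observed start of melting, t_start; the returned emf
    is taken as the melting-point emf."""
    slope, intercept = np.polyfit(t[plateau_mask], emf[plateau_mask], 1)
    return slope * t_start + intercept

# synthetic melting curve: the emf climbs with a slight slope during melting
t = np.linspace(0.0, 10.0, 101)
emf = 10.0 + 0.02 * t                     # arbitrary units
plateau = (t >= 2.0) & (t <= 8.0)         # samples treated as the plateau
melt_emf = extrapolated_start_emf(t, emf, plateau, t_start=2.0)
assert abs(melt_emf - 10.04) < 1e-9
```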

  6. Optimization of the single point incremental forming process for titanium sheets by using response surface

    Directory of Open Access Journals (Sweden)

    Saidi Badreddine

    2016-01-01

    Full Text Available The single point incremental forming process is well known to be perfectly suited for prototyping and small series. One of its fields of applicability is the medical area, for the forming of titanium prostheses or titanium medical implants. However, this process is not yet very industrialized, mainly due to its geometrical inaccuracy and its inhomogeneous thickness distribution. Moreover, considerable forces can occur; they must be controlled in order to preserve the tooling. In this paper, a numerical approach is proposed in order to minimize the maximum force reached during the incremental forming of titanium sheets and to maximize the minimal thickness. A response surface methodology is used to find the optimal values of two input parameters of the process, the punch diameter and the vertical step size of the tool path.
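
    Response surface methodology of the kind described fits a low-order polynomial to a set of designed trials and locates its stationary point. Below is a minimal sketch with a full quadratic in the two process inputs (punch diameter and vertical step size); the synthetic data, names and the noiseless response are illustrative assumptions, not the paper's model:

```python
import numpy as np

def fit_quadratic_surface(d, s, f):
    """Least-squares fit of the full quadratic response surface
    f ~ b0 + b1*d + b2*s + b3*d**2 + b4*s**2 + b5*d*s."""
    X = np.column_stack([np.ones_like(d), d, s, d ** 2, s ** 2, d * s])
    beta, *_ = np.linalg.lstsq(X, f, rcond=None)
    return beta

def stationary_point(beta):
    """Candidate optimum of the fitted surface, where the gradient vanishes."""
    b0, b1, b2, b3, b4, b5 = beta
    hessian = np.array([[2.0 * b3, b5], [b5, 2.0 * b4]])
    return np.linalg.solve(hessian, -np.array([b1, b2]))

# synthetic trials with a known force minimum at d = 10, s = 0.5
rng = np.random.default_rng(0)
d = rng.uniform(6.0, 14.0, 40)            # punch diameter
s = rng.uniform(0.2, 0.8, 40)             # vertical step size
f = 3.0 + (d - 10.0) ** 2 + 50.0 * (s - 0.5) ** 2
opt = stationary_point(fit_quadratic_surface(d, s, f))
assert np.allclose(opt, [10.0, 0.5], atol=1e-6)
```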

  7. Marked point process framework for living probabilistic safety assessment and risk follow-up

    International Nuclear Information System (INIS)

    Arjas, Elja; Holmberg, Jan

    1995-01-01

    We construct a model for living probabilistic safety assessment (PSA) by applying the general framework of marked point processes. The framework provides a theoretically rigorous approach for considering risk follow-up of posterior hazards. In risk follow-up, the hazard of core damage is evaluated synthetically at time points in the past, by using some observed events as logged history and combining it with re-evaluated potential hazards. There are several alternatives for doing this, of which we consider three here, calling them the initiating event approach, the hazard rate approach, and the safety system approach. In addition, for comparison, we consider a core damage hazard arising in risk monitoring. Each of these four definitions draws attention to a particular aspect of risk assessment, and this is reflected in the behaviour of the consequent risk importance measures. Several alternative measures are again considered. The concepts and definitions are illustrated by a numerical example.

  8. Trends in business process analysis: from verification to process mining

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Cardoso, J.; Cordeiro, J.; Filipe, J.

    2007-01-01

    Business process analysis ranges from model verification at design-time to the monitoring of processes at runtime. Much progress has been achieved in process verification. Today we are able to verify the entire reference model of SAP without any problems. Moreover, more and more processes leave

  9. Analysis of web-based online services for GPS relative and precise point positioning techniques

    Directory of Open Access Journals (Sweden)

    Taylan Ocalan

    Nowadays, the Global Positioning System (GPS) is used effectively in several engineering applications for survey purposes by multiple disciplines. Web-based online services developed by several organizations, which are user-friendly, unlimited and mostly free, have become a significant alternative to high-cost scientific and commercial software for post-processing and analyzing GPS data. Centimeter (cm) or decimeter (dm) level accuracies, when desired, can easily be obtained through these services for engineering applications of differing quality requirements. In this paper, a test study was conducted on the ISKI-CORS network (Istanbul, Turkey) in order to carry out an accuracy analysis of the most widely used web-based online services around the world (namely OPUS, AUSPOS, SCOUT, CSRS-PPP, GAPS, APPS, magicGNSS). These services use relative and precise point positioning (PPP) solution approaches. In this test study, the coordinates of eight stations were estimated from a 24-hour GPS data set using both the online services and the Bernese 5.0 scientific GPS processing software, and the coordinate differences between the online services and the Bernese solution were computed. The evaluations showed that the individual differences were less than 10 mm for the relative online services and less than 20 mm for the precise point positioning services. The accuracy analysis was based on these coordinate differences and the standard deviations of the coordinates obtained with the different techniques, and the online services were then compared to each other. The results show that the position accuracies obtained by the associated online services are high enough for many engineering applications and geodetic analyses.

  10. Application of random-point processes to the detection of radiation sources

    International Nuclear Information System (INIS)

    Woods, J.W.

    1978-01-01

    In this report the mathematical theory of random-point processes is reviewed and it is shown how the theory can be used to obtain optimal solutions to the problem of detecting radiation sources. As noted, the theory also applies to image processing in low-light-level or low-count-rate situations. Paralleling Snyder's work, the theory is extended to the multichannel case of a continuous, two-dimensional (2-D) energy-time space. This extension essentially involves showing that the data are doubly stochastic Poisson (DSP) point processes in energy as well as time. Further, a new 2-D recursive formulation is presented for the radiation-detection problem, with large computational savings over nonrecursive techniques when the number of channels is large (greater than or equal to 30). Finally, some adaptive strategies for on-line 'learning' of unknown, time-varying signal and background-intensity parameters and statistics are presented and discussed. These adaptive procedures apply when a complete statistical description is not available a priori.

  11. Nuclear binding around the RP-process waiting points ⁶⁸Se and ⁷²Kr

    CERN Multimedia

    2002-01-01

    Encouraged by the success of mass determinations of nuclei close to the Z=N line performed at ISOLTRAP during the year 2000 and of the recent decay spectroscopy studies on neutron-deficient Kr isotopes (IS351 collaboration), we aim to measure masses and proton separation energies of the bottleneck nuclei defining the flow of the astrophysical rp-process beyond A ≈ 70. In detail, the program includes mass measurements of the rp-process waiting point nuclei ⁶⁸Se and ⁷²Kr and determination of proton separation energies of the proton-unbound ⁶⁹Br and ⁷³Rb via β-decays of ⁶⁹Kr and ⁷³Sr, respectively. The aim of the project is to complete the experimental database for astrophysical network calculations and for the liquid-drop type of mass models typically used in modelling the astrophysical rp-process in this region. The first beamtime is scheduled for August 2001, and the aim is to measure the absolute mass of the waiting-point nucleus ⁷²Kr.

  12. Development of Point Kernel Shielding Analysis Computer Program Implementing Recent Nuclear Data and Graphic User Interfaces

    International Nuclear Information System (INIS)

    Kang, Sang Ho; Lee, Seung Gi; Chung, Chan Young; Lee, Choon Sik; Lee, Jai Ki

    2001-01-01

    In order to comply with revised national regulations on radiological protection and to implement recent nuclear data and dose conversion factors, KOPEC developed a new point kernel gamma and beta ray shielding analysis computer program. This new code, named VisualShield, adopted mass attenuation coefficients and buildup factors from recent ANSI/ANS standards and flux-to-dose conversion factors from International Commission on Radiological Protection (ICRP) Publication 74 for estimation of the effective/equivalent dose recommended in ICRP 60. VisualShield utilizes graphical user interfaces and 3-D visualization of the geometric configuration for preparing input data sets and analyzing results, which leads users to error-free processing with visual effects. Code validation and data analysis were performed by comparing the results of various calculations to the outputs of previous programs such as MCNP 4B, ISOSHLD-II, QAD-CGGP, etc.
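
    At its core, the point kernel method reduces to a short formula: the uncollided flux from a point source, multiplied by a buildup factor and a flux-to-dose conversion factor. A minimal single-shield sketch is given below; note that VisualShield uses tabulated ANSI/ANS buildup factors and ICRP-74 conversion factors, whereas the constants here are placeholders:

```python
import math

def point_kernel_dose_rate(source, mu, r, buildup, flux_to_dose):
    """Illustrative point-kernel estimate for a single shield:
        D = C * B * S * exp(-mu * r) / (4 * pi * r**2)
    where S is the source strength, mu the attenuation coefficient,
    r the source-detector distance, B the buildup factor and C a
    flux-to-dose conversion factor."""
    uncollided_flux = source * math.exp(-mu * r) / (4.0 * math.pi * r ** 2)
    return flux_to_dose * buildup * uncollided_flux

# sanity check: with no attenuation and B = C = 1 the kernel reduces
# to the bare inverse-square flux S / (4 pi r^2)
assert abs(point_kernel_dose_rate(1.0e6, 0.0, 2.0, 1.0, 1.0)
           - 1.0e6 / (4.0 * math.pi * 4.0)) < 1e-9
```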

  13. The effect of starting point placement technique on thoracic transverse process strength: an ex vivo biomechanical study

    Directory of Open Access Journals (Sweden)

    Burton Douglas C

    2010-07-01

    Background: The use of thoracic pedicle screws in spinal deformity, trauma, and tumor reconstruction is becoming more common. Unsuccessful screw placement may require salvage techniques utilizing transverse process hooks. The effect of different starting point placement techniques on the strength of the transverse process has not previously been reported. The purpose of this paper is to determine the biomechanical properties of the thoracic transverse process following various pedicle screw starting point placement techniques. Methods: Forty-seven fresh-frozen human cadaveric thoracic vertebrae from T2 to T9 were disarticulated and matched by bone mineral density (BMD) and transverse process (TP) cross-sectional area. Specimens were randomized to one of four groups: A, control, and three others based on thoracic pedicle screw placement technique: B, straightforward; C, funnel; and D, in-out-in. Initial cortical bone removal for pedicle screw placement was made using a burr at the location on the transverse process or transverse process-laminar junction as published in the original description of each technique. The transverse process was tested by measuring load-to-failure, simulating a hook in compression mode. Analysis of covariance and Pearson correlation coefficients were used to examine the data. Results: Technique was a significant predictor of load-to-failure (P = 0.0007). The least squares (LS) mean load-to-failure of group A (control) was 377 N, group B (straightforward) 355 N, group C (funnel) 229 N, and group D (in-out-in) 301 N. Significant differences were noted between groups A and C, A and D, B and C, and C and D. BMD (0.925 g/cm2 [range, 0.624-1.301 g/cm2]) was also a significant predictor of load-to-failure for all specimens grouped together (P < 0.05). Level and side tested were not found to significantly correlate with load-to-failure. Conclusions: The residual coronal plane compressive strength of the thoracic transverse process

  14. Clusterless Decoding of Position From Multiunit Activity Using A Marked Point Process Filter

    Science.gov (United States)

    Deng, Xinyi; Liu, Daniel F.; Kay, Kenneth; Frank, Loren M.; Eden, Uri T.

    2016-01-01

    Point process filters have been applied successfully to decode neural signals and track neural dynamics. Traditionally, these methods assume that multiunit spiking activity has already been correctly spike-sorted. As a result, these methods are not appropriate for situations where sorting cannot be performed with high precision such as real-time decoding for brain-computer interfaces. As the unsupervised spike-sorting problem remains unsolved, we took an alternative approach that takes advantage of recent insights about clusterless decoding. Here we present a new point process decoding algorithm that does not require multiunit signals to be sorted into individual units. We use the theory of marked point processes to construct a function that characterizes the relationship between a covariate of interest (in this case, the location of a rat on a track) and features of the spike waveforms. In our example, we use tetrode recordings, and the marks represent a four-dimensional vector of the maximum amplitudes of the spike waveform on each of the four electrodes. In general, the marks may represent any features of the spike waveform. We then use Bayes’ rule to estimate spatial location from hippocampal neural activity. We validate our approach with a simulation study and with experimental data recorded in the hippocampus of a rat moving through a linear environment. Our decoding algorithm accurately reconstructs the rat’s position from unsorted multiunit spiking activity. We then compare the quality of our decoding algorithm to that of a traditional spike-sorting and decoding algorithm. Our analyses show that the proposed decoding algorithm performs equivalently or better than algorithms based on sorted single-unit activity. These results provide a path toward accurate real-time decoding of spiking patterns that could be used to carry out content-specific manipulations of population activity in hippocampus or elsewhere in the brain. PMID:25973549

  15. Visibility Analysis in a Point Cloud Based on the Medial Axis Transform

    NARCIS (Netherlands)

    Peters, R.; Ledoux, H.; Biljecki, F.

    2015-01-01

    Visibility analysis is an important application of 3D GIS data. Current approaches require 3D city models that are often derived from detailed aerial point clouds. We present an approach to visibility analysis that does not require a city model but works directly on the point cloud. Our approach is

  16. On the estimation of the spherical contact distribution Hs(y) for spatial point processes

    International Nuclear Information System (INIS)

    Doguwa, S.I.

    1990-08-01

    Ripley (1977, Journal of the Royal Statistical Society B, 39, 172-212) proposed an estimator for the spherical contact distribution Hs(y) of a spatial point process observed in a bounded planar region. However, this estimator is not defined for some distances of interest in this bounded region. A new estimator for Hs(y) is proposed for use with a regular grid of sampling locations. This new estimator is defined for all distances of interest. It also appears to have a smaller bias and a smaller mean squared error than the previously suggested alternative. (author). 11 refs, 4 figs, 1 tab
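
    For intuition, the spherical contact distribution can be estimated naively as the fraction of sampling locations whose empty-space distance is at most y. The sketch below ignores the edge effects that motivate the paper's improved estimator:

```python
import math

def spherical_contact_cdf(points, grid, y):
    """Naive estimate of the spherical contact distribution H_s(y):
    the fraction of sampling locations whose distance to the nearest
    point of the pattern is at most y (no edge correction)."""
    hits = 0
    for gx, gy in grid:
        nearest = min(math.hypot(gx - px, gy - py) for px, py in points)
        if nearest <= y:
            hits += 1
    return hits / len(grid)

# one point in the unit square, sampled on a regular 11 x 11 grid
points = [(0.5, 0.5)]
grid = [(i / 10, j / 10) for i in range(11) for j in range(11)]
# the farthest sampling location (a corner) is sqrt(0.5) away,
# so the estimate reaches 1 at that distance
assert spherical_contact_cdf(points, grid, math.sqrt(0.5) + 1e-9) == 1.0
```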

  17. Analysing the distribution of synaptic vesicles using a spatial point process model

    DEFF Research Database (Denmark)

    Khanmohammadi, Mahdieh; Waagepetersen, Rasmus; Nava, Nicoletta

    2014-01-01

    functionality by statistically modelling the distribution of the synaptic vesicles in two groups of rats: a control group subjected to sham stress and a stressed group subjected to a single acute foot-shock (FS)-stress episode. We hypothesize that the synaptic vesicles have different spatial distributions...... in the two groups. The spatial distributions are modelled using spatial point process models with an inhomogeneous conditional intensity and repulsive pairwise interactions. Our results verify the hypothesis that the two groups have different spatial distributions....

  18. Automated extraction and analysis of rock discontinuity characteristics from 3D point clouds

    Science.gov (United States)

    Bianchetti, Matteo; Villa, Alberto; Agliardi, Federico; Crosta, Giovanni B.

    2016-04-01

    A reliable characterization of fractured rock masses requires an exhaustive geometrical description of discontinuities, including orientation, spacing, and size. These are required to describe discontinuum rock mass structure, perform Discrete Fracture Network and DEM modelling, or provide input for rock mass classification or equivalent continuum estimate of rock mass properties. Although several advanced methodologies have been developed in the last decades, a complete characterization of discontinuity geometry in practice is still challenging, due to scale-dependent variability of fracture patterns and difficult accessibility to large outcrops. Recent advances in remote survey techniques, such as terrestrial laser scanning and digital photogrammetry, allow a fast and accurate acquisition of dense 3D point clouds, which promoted the development of several semi-automatic approaches to extract discontinuity features. Nevertheless, these often need user supervision on algorithm parameters which can be difficult to assess. To overcome this problem, we developed an original Matlab tool, allowing fast, fully automatic extraction and analysis of discontinuity features with no requirements on point cloud accuracy, density and homogeneity. The tool consists of a set of algorithms which: (i) process raw 3D point clouds, (ii) automatically characterize discontinuity sets, (iii) identify individual discontinuity surfaces, and (iv) analyse their spacing and persistence. The tool operates in either a supervised or unsupervised mode, starting from an automatic preliminary exploration data analysis. The identification and geometrical characterization of discontinuity features is divided in steps. First, coplanar surfaces are identified in the whole point cloud using K-Nearest Neighbor and Principal Component Analysis algorithms optimized on point cloud accuracy and specified typical facet size. 
Then, discontinuity set orientation is calculated using Kernel Density Estimation and
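
    The coplanarity test in step (i) is typically a local PCA: the covariance matrix of a point neighbourhood has one small eigenvalue when the points lie near a plane, and the corresponding eigenvector is the facet normal. A minimal sketch follows (an illustration of the general technique, not the authors' Matlab tool; the K-Nearest-Neighbor search that selects each neighbourhood is omitted):

```python
import numpy as np

def plane_normal(points):
    """PCA plane fit for a neighbourhood of 3D points: the eigenvector
    of the covariance matrix with the smallest eigenvalue is the facet
    normal; the eigenvalue itself measures out-of-plane scatter."""
    centred = points - points.mean(axis=0)
    cov = centred.T @ centred / len(points)
    eigval, eigvec = np.linalg.eigh(cov)   # eigenvalues in ascending order
    return eigvec[:, 0], eigval[0]

# synthetic neighbourhood lying exactly on the plane z = 0.2x + 0.1y
rng = np.random.default_rng(1)
xy = rng.uniform(-1.0, 1.0, (200, 2))
pts = np.column_stack([xy, 0.2 * xy[:, 0] + 0.1 * xy[:, 1]])
normal, scatter = plane_normal(pts)
expected = np.array([-0.2, -0.1, 1.0])
expected /= np.linalg.norm(expected)
assert scatter < 1e-12                       # points are coplanar
assert abs(abs(normal @ expected) - 1.0) < 1e-9
```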

  19. The Impact of the Delivery of Prepared Power Point Presentations on the Learning Process

    Directory of Open Access Journals (Sweden)

    Auksė Marmienė

    2011-04-01

    This article describes the process of the preparation and delivery of Power Point presentations and how it can be used by teachers as a resource for classroom teaching. The advantages of this classroom activity are also outlined, covering some of the problems and providing a few suggestions for dealing with those difficulties. The major objective of the present paper is to investigate the students' ability to choose the material and the content of Power Point presentations on professional topics via the Internet, as well as their ability to prepare and deliver the presentation in front of an audience. The factors which determine the choice of the presentation subject are also analysed in this paper. After the delivery, students were asked to self- and peer-assess the difficulties they faced in the preparation and performance of the presentations by writing reports. Learners' attitudes to the choice of the topic of Power Point presentations were surveyed by administering a self-assessment questionnaire.

  20. Analysis of tree stand horizontal structure using random point field methods

    Directory of Open Access Journals (Sweden)

    O. P. Sekretenko

    2015-06-01

    This paper uses a model-based approach to analyze the horizontal structure of forest stands. The main types of models of random point fields and the statistical procedures that can be used to analyze spatial patterns of trees in uneven- and even-aged stands are described. We show how modern methods of spatial statistics can be used to address one of the objectives of forestry: to clarify the laws of natural thinning of forest stands and the corresponding changes in their spatial structure over time. Studying natural forest thinning, we describe the consecutive stages of modeling: selection of the appropriate parametric model, parameter estimation and generation of point patterns in accordance with the selected model, selection of statistical functions to describe the horizontal structure of forest stands, and testing of statistical hypotheses. We show the possibilities of a specialized software package, spatstat, which is designed for the tasks of spatial statistics and provides software support for modern methods of spatial data analysis. We show that a model of stand thinning that does not consider inter-tree interaction can reproduce the size distribution of the trees properly, but the spatial pattern of the modeled stand is not quite consistent with observed data. Using data from three even-aged pine stands of 25, 55 and 90 years of age, we demonstrate that spatial point process models are useful for combining measurements from forest stands of different ages to study natural stand thinning.
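
    Among the statistical functions used for this kind of analysis is Ripley's K. A naive estimate without edge correction (spatstat's estimators add the corrections omitted here) simply counts close pairs and rescales by the intensity:

```python
import math

def ripley_k(points, r, area):
    """Naive Ripley's K estimate without edge correction: the mean
    number of further points within distance r of a typical point,
    divided by the intensity lambda = n / area."""
    n = len(points)
    lam = n / area
    close_pairs = sum(
        1
        for i, (xi, yi) in enumerate(points)
        for j, (xj, yj) in enumerate(points)
        if i != j and math.hypot(xi - xj, yi - yj) <= r
    )
    return close_pairs / (n * lam)

# a perfectly regular 5 x 5 unit-spaced grid of "trees" has no pairs
# closer than 1, so K(0.5) = 0 -- the signature of a regular pattern
trees = [(float(x), float(y)) for x in range(5) for y in range(5)]
assert ripley_k(trees, 0.5, 25.0) == 0.0
```

    Comparing such an estimate against the Poisson benchmark K(r) = πr² is how clustering or regularity of a stand is judged.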

  1. Miniaturization for Point-of-Care Analysis: Platform Technology for Almost Every Biomedical Assay.

    Science.gov (United States)

    Schumacher, Soeren; Sartorius, Dorian; Ehrentreich-Förster, Eva; Bier, Frank F

    2012-10-01

    Platform technologies for the changing needs of diagnostics are one of the main challenges in medical device technology. On the one hand, the demand for new and more versatile diagnostics is increasing due to deeper knowledge of biomarkers and their association with diseases. On the other hand, diagnostics will become decentralized, since decisions can be made faster, resulting in higher therapy success. Hence, new types of technologies have to be established that enable multiparameter analysis at the point of care. In this review-like article, a system called the Fraunhofer ivD-platform is introduced. It consists of a credit-card-sized cartridge with integrated reagents, sensors and pumps, and a read-out/processing unit. Within the cartridge, the assay runs fully automated within 15-20 minutes. Due to the open design of the platform, different analyses such as antibody, serological or DNA assays can be performed. Specific examples of these three assay types are given to show the broad applicability of the system.

  2. Process for quality assurance of welded joints for electrical resistance point welding

    International Nuclear Information System (INIS)

    Schaefer, R.; Singh, S.

    1977-01-01

    In order to guarantee the reproducibility of welded joints of even quality (above all in the metal working industry), it is proposed that before starting resistance point welding a preheating current be allowed to flow at the site of the weld. A given reduction of the total resistance at the weld site determines the moment at which the preheating current is switched over to the welding current; this value is predetermined empirically. Further possibilities of controlling the welding process are described, in which measurement of the thermal expansion of the parts is used. A standard welding time is given. The nominal course of electrode movement during the process can be predicted, and a running comparison of nominal and actual values can be carried out. (RW)
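
    The proposed switchover criterion can be sketched as a simple threshold on the monitored total resistance; the function name, the drop fraction and the sampled trace below are invented for illustration:

```python
def switch_to_weld(resistances, drop_fraction):
    """Return the sample index at which the total resistance has fallen
    by the preset fraction of its initial value, i.e. the moment the
    preheating current is switched over to the welding current."""
    threshold = resistances[0] * (1.0 - drop_fraction)
    for i, r in enumerate(resistances):
        if r <= threshold:
            return i
    return None  # preset drop never reached: keep preheating

# resistance falling as the joint heats up; switch at a 20 % drop
trace = [100.0, 97.0, 93.0, 88.0, 82.0, 79.0, 76.0]
assert switch_to_weld(trace, 0.20) == 5
```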

  3. The Purification Method of Matching Points Based on Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    DONG Yang

    2017-02-01

    The traditional purification method for matching points usually uses a small number of the points as initial input. Though this can meet most point-constraint requirements, the iterative purification solution easily falls into local extrema, which results in the loss of correct matching points. To solve this problem, we introduce a principal component analysis method that uses the whole point set as initial input. Through stepwise elimination of mismatched points and robust estimation, a more accurate global optimal solution can be obtained, which reduces the omission rate of correct matching points and thus achieves a better purification effect. Experimental results show that this method can obtain the global optimal solution under a certain initial false-matching rate, and can decrease or avoid the omission of correct matching points.
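
    One way to realize a whole-set, PCA-based purification is to project the displacement vectors of all matched pairs onto their principal axes and iteratively discard pairs with outlying scores. The sketch below is an interpretation of that idea, not the authors' exact algorithm; all names and thresholds are assumptions:

```python
import numpy as np

def purify_matches(src, dst, k=4.0, iters=10):
    """PCA-style purification over the whole match set: project the
    displacement vectors dst - src onto their principal axes and
    iteratively drop pairs whose score exceeds k standard deviations.
    Dropped pairs are re-scored each round, so they can be re-admitted."""
    disp = dst - src
    keep = np.ones(len(src), dtype=bool)
    for _ in range(iters):
        mean = disp[keep].mean(axis=0)
        _, _, vt = np.linalg.svd(disp[keep] - mean, full_matrices=False)
        scores = (disp - mean) @ vt.T            # principal-axis scores
        sigma = np.maximum(scores[keep].std(axis=0), 1e-12)
        new_keep = np.all(np.abs(scores) <= k * sigma, axis=1)
        if np.array_equal(new_keep, keep):
            break
        keep = new_keep
    return keep

# 50 consistent matches displaced by (5, 0), plus one gross mismatch
rng = np.random.default_rng(2)
src = rng.uniform(0.0, 100.0, (51, 2))
dst = src + np.array([5.0, 0.0]) + rng.normal(0.0, 0.05, (51, 2))
dst[50] += np.array([40.0, -30.0])               # the mismatched pair
keep = purify_matches(src, dst)
assert not keep[50]                              # mismatch rejected
```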

  4. Comparison of plastic strains on AA5052 by single point incremental forming process using digital image processing

    Energy Technology Data Exchange (ETDEWEB)

    Mugendiran, V.; Gnanavelbabu, A. [Anna University, Chennai, Tamilnadu (India)

    2017-06-15

    In this study, surface-based strain measurement was used to determine the formability of sheet metal. Strain measurement may employ manual calculation of plastic strains based on the reference circle and the deformed circle; the manual method has a greater margin of error in practical applications. In this paper, an attempt has been made to compare the formability obtained with three different approaches: the conventional method, the least-squares method and digital-image-based strain measurement. As the sheet metal formed by the single point incremental process deforms the etched circles into approximately elliptical shapes, image acquisition was done before and after forming. The plastic strains of the deformed circle grids are calculated with respect to the non-deformed reference, and the coordinates of the deformed circles are measured through various image processing steps. Finally, the strains obtained from the deformed circles are used to plot the forming limit diagram. To evaluate the accuracy of the system, the forming limit diagrams predicted by the conventional, least-squares and digital-image-based methods were compared. The conventional and least-squares methods show marginal error compared with the digital image processing method. Strain measurement based on image processing agrees well with the other methods and can be used to improve accuracy and reduce measurement error in the prediction of the forming limit diagram.
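
    The strain calculation underlying all three approaches is the same: the etched circle's deformation into an ellipse gives the major and minor true strains directly. A minimal sketch (function name and dimensions are illustrative):

```python
import math

def surface_true_strains(d0, major_axis, minor_axis):
    """Major and minor true (logarithmic) strains of an etched circle
    of initial diameter d0 deformed into an ellipse with the given
    axis lengths; these are the coordinates plotted on a forming
    limit diagram."""
    return math.log(major_axis / d0), math.log(minor_axis / d0)

# a 2.5 mm grid circle stretched into a 3.0 mm x 2.5 mm ellipse
e_major, e_minor = surface_true_strains(2.5, 3.0, 2.5)
assert abs(e_major - math.log(1.2)) < 1e-12
assert e_minor == 0.0
```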

  5. Comparison of plastic strains on AA5052 by single point incremental forming process using digital image processing

    International Nuclear Information System (INIS)

    Mugendiran, V.; Gnanavelbabu, A.

    2017-01-01

    In this study, surface-based strain measurement was used to determine the formability of sheet metal. Strain measurement may employ manual calculation of plastic strains based on the reference circle and the deformed circle; the manual method has a greater margin of error in practical applications. In this paper, an attempt has been made to compare the formability obtained with three different approaches: the conventional method, the least-squares method and digital-image-based strain measurement. As the sheet metal formed by the single point incremental process deforms the etched circles into approximately elliptical shapes, image acquisition was done before and after forming. The plastic strains of the deformed circle grids are calculated with respect to the non-deformed reference, and the coordinates of the deformed circles are measured through various image processing steps. Finally, the strains obtained from the deformed circles are used to plot the forming limit diagram. To evaluate the accuracy of the system, the forming limit diagrams predicted by the conventional, least-squares and digital-image-based methods were compared. The conventional and least-squares methods show marginal error compared with the digital image processing method. Strain measurement based on image processing agrees well with the other methods and can be used to improve accuracy and reduce measurement error in the prediction of the forming limit diagram.

  6. Partial wave analysis using graphics processing units

    Energy Technology Data Exchange (ETDEWEB)

    Berger, Niklaus; Liu Beijiang; Wang Jike, E-mail: nberger@ihep.ac.c [Institute of High Energy Physics, Chinese Academy of Sciences, 19B Yuquan Lu, Shijingshan, 100049 Beijing (China)

    2010-04-01

    Partial wave analysis is an important tool for determining resonance properties in hadron spectroscopy. For large data samples, however, the un-binned likelihood fits employed are computationally very expensive. At the Beijing Spectrometer (BES) III experiment, an increase in statistics of up to two orders of magnitude compared to earlier experiments is expected. In order to allow for a timely analysis of these datasets, additional computing power with short turnover times has to be made available. It turns out that graphics processing units (GPUs), originally developed for 3D computer games, have an architecture of massively parallel single instruction multiple data floating point units that is almost ideally suited to the algorithms employed in partial wave analysis. We have implemented a framework for tensor manipulation and partial wave fits called GPUPWA. The user writes a program in pure C++ whilst the GPUPWA classes handle computations on the GPU, memory transfers, caching and other technical details. In conjunction with a recent graphics processor, the framework provides a speed-up of the partial wave fit by more than two orders of magnitude compared to legacy FORTRAN code.
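    The data-parallel structure that makes GPUs attractive here is easy to see in code: every event's intensity is an independent coherent sum over waves. The numpy sketch below (not GPUPWA's actual API; the wave basis and coefficients are made-up placeholders) evaluates a toy un-binned negative log-likelihood for all events at once:

```python
import numpy as np

def negative_log_likelihood(amplitudes, phases, basis):
    """Toy un-binned NLL for a partial wave fit.

    basis: complex array of shape (n_events, n_waves) holding precomputed
    decay amplitudes A_k(event); in a real PWA these come from tensor
    algebra over the event kinematics.
    """
    coeffs = amplitudes * np.exp(1j * phases)   # (n_waves,) complex couplings
    total = basis @ coeffs                      # coherent sum, all events at once
    intensity = np.abs(total) ** 2              # |sum_k c_k A_k|^2 per event
    return -np.sum(np.log(intensity))

rng = np.random.default_rng(1)
basis = rng.normal(size=(1000, 3)) + 1j * rng.normal(size=(1000, 3))
nll = negative_log_likelihood(np.array([1.0, 0.5, 0.2]),
                              np.array([0.0, 1.2, -0.7]), basis)
```

    On a GPU each event maps naturally to one SIMD lane; the fit then minimizes this NLL over the amplitudes and phases.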

  7. Design of glass-ceramic complex microstructure with using onset point of crystallization in differential thermal analysis

    International Nuclear Information System (INIS)

    Hwang, Seongjin; Kim, Jinho; Shin, Hyo-Soon; Kim, Jong-Hee; Kim, Hyungsun

    2008-01-01

    Two types of frits with different compositions were used to develop a high strength substrate in electronic packaging using a low temperature co-fired ceramic process. In order to reveal the crystallization stage during heating to approximately 900 deg. C, a glass-ceramic consisting of the two types of frits, which had been crystallized to diopside and anorthite after firing, was tested at different mixing ratios of the frits. The exothermal peaks deconvoluted by a Gauss function in the differential thermal analysis curves were used to determine the onset point of crystallization of diopside or anorthite. The onset points of crystallization were affected by the mixing ratio of the frits, and the microstructure of the glass-ceramic depended on the onset point of crystallization. It was found that when multicrystalline phases appear in the microstructure, the resulting complex microstructure could be predicted from the onset point of crystallization obtained by differential thermal analysis

  8. Neutron capture at the s-process branching points $^{171}$Tm and $^{204}$Tl

    CERN Multimedia

    Branching points in the s-process are very special isotopes for which there is a competition between neutron capture and the subsequent β-decay in the chain producing the heavy elements beyond Fe. Typically, the knowledge of the associated capture cross sections is very poor, due to the difficulty of obtaining enough material of these radioactive isotopes and of measuring the cross section of a sample with an intrinsic activity; indeed, only 2 out of the 21 ${s}$-process branching points have ever been measured by the time-of-flight method. In this experiment we aim at measuring for the first time the capture cross sections of $^{171}$Tm and $^{204}$Tl, both of crucial importance for understanding the nucleosynthesis of heavy elements in AGB stars. The combination of the (n,$\\gamma$) measurements on $^{171}$Tm and $^{204}$Tl will allow one to accurately constrain the neutron density and the strength of the $^{13}$C(α,n) source in low mass AGB stars. Additionally, the cross section of $^{204}$Tl is also of cosmo-chrono...

  9. Detection of bursts in extracellular spike trains using hidden semi-Markov point process models.

    Science.gov (United States)

    Tokdar, Surya; Xi, Peiyi; Kelly, Ryan C; Kass, Robert E

    2010-08-01

    Neurons in vitro and in vivo have epochs of bursting or "up state" activity during which firing rates are dramatically elevated. Various methods of detecting bursts in extracellular spike trains have appeared in the literature, the most widely used apparently being Poisson Surprise (PS). A natural description of the phenomenon assumes (1) there are two hidden states, which we label "burst" and "non-burst," (2) the neuron evolves stochastically, switching at random between these two states, and (3) within each state the spike train follows a time-homogeneous point process. If in (2) the transitions from non-burst to burst and burst to non-burst states are memoryless, this becomes a hidden Markov model (HMM). For HMMs, the state dwell times follow exponential distributions and are highly irregular. Because observed bursting may in some cases be fairly regular, exhibiting inter-burst intervals with small variation, we relaxed this assumption. When more general probability distributions are used to describe the state transitions, the two-state point process model becomes a hidden semi-Markov model (HSMM). We developed an efficient Bayesian computational scheme to fit HSMMs to spike train data. Numerical simulations indicate the method can perform well, sometimes yielding very different results from those based on PS.
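    For reference, the Poisson Surprise statistic mentioned above can be sketched in a few lines. This assumes the usual Legendy-Salcman form S = -log P(at least n spikes in the interval) under a homogeneous Poisson baseline; the variable names are illustrative:

```python
import math

def poisson_surprise(n_spikes, duration, mean_rate):
    """Poisson Surprise S = -log P(>= n_spikes in `duration`) under a
    homogeneous Poisson process with rate `mean_rate`. A large S means
    the interval is too dense to be a chance fluctuation (a burst)."""
    mu = mean_rate * duration
    # P(X >= n) = 1 - P(X <= n-1), summed term by term
    p_lt = sum(math.exp(-mu) * mu**k / math.factorial(k)
               for k in range(n_spikes))
    p_ge = max(1.0 - p_lt, 1e-300)   # guard against underflow to zero
    return -math.log(p_ge)
```

    For example, 12 spikes in 0.1 s against a 20 Hz baseline (expected count 2) yields a much larger surprise than 3 spikes in the same window.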

  10. Mixed-Poisson Point Process with Partially-Observed Covariates: Ecological Momentary Assessment of Smoking.

    Science.gov (United States)

    Neustifter, Benjamin; Rathbun, Stephen L; Shiffman, Saul

    2012-01-01

    Ecological Momentary Assessment is an emerging method of data collection in behavioral research. It may be used to capture the times of repeated behavioral events on electronic devices, together with information on subjects' psychological states gathered through the electronic administration of questionnaires at times selected from a probability-based design as well as at the event times. A method for fitting a mixed Poisson point process model is proposed for assessing the impact of partially-observed, time-varying covariates on the timing of repeated behavioral events. A random frailty is included in the point-process intensity to describe variation among subjects in baseline rates of event occurrence. Covariate coefficients are estimated using estimating equations constructed by replacing the integrated intensity in the Poisson score equations with a design-unbiased estimator. An estimator is also proposed for the variance of the random frailties. Our estimators are robust in the sense that no model assumptions are made regarding the distribution of the time-varying covariates or the distribution of the random effects. However, subject effects are estimated under gamma frailties using an approximate hierarchical likelihood. The proposed approach is illustrated using smoking data.

  11. Students’ Algebraic Thinking Process in Context of Point and Line Properties

    Science.gov (United States)

    Nurrahmi, H.; Suryadi, D.; Fatimah, S.

    2017-09-01

    Learning of school algebra is often limited to symbols and operating procedures, so students are able to work on problems that only require the ability to operate on symbols but are unable to generalize a pattern, which is one part of algebraic thinking. The purpose of this study is to create a didactic design that facilitates students' algebraic thinking process through the generalization of patterns, especially in the context of the properties of points and lines. This study used a qualitative method and follows Didactical Design Research (DDR). The result is that students are able to make factual, contextual, and symbolic generalizations. This happens because the generalization arises from facts in local terms; the generalization then produces an algebraic formula that is described in the context and perspective of each student. After that, the formula uses the algebraic letter symbol in place of the symbol drawn from the students' own language. It can be concluded that the design has facilitated students' algebraic thinking process through the generalization of patterns, especially in the context of the properties of points and lines. The impact of this study is that the design can be used as an alternative teaching material in the learning of school algebra.

  12. The implementation of a Hazard Analysis and Critical Control Point management system in a peanut butter ice cream plant

    Directory of Open Access Journals (Sweden)

    Yu-Ting Hung

    2015-09-01

    Full Text Available To ensure the safety of peanut butter ice cream manufacture, a Hazard Analysis and Critical Control Point (HACCP) plan has been designed and applied to the production process. Potential biological, chemical, and physical hazards in each manufacturing procedure were identified. The critical control points for the peanut butter ice cream were then determined as the pasteurization and freezing processes. The establishment of a monitoring system, corrective actions, verification procedures, and documentation and record keeping were followed to complete the HACCP program. The results of this study indicate that implementing the HACCP system in food industries can effectively enhance food safety and quality while improving production management.

  13. Spectral Processing Analysis System (SPANS).

    Science.gov (United States)

    1980-11-01

    Approximately 750 pounds. Temperature Range: 60 - 80 degrees Fahrenheit. Humidity: 40 - 70 percent (relative). Duty Cycle: Continuous. Power Requirements: 5 wire, 3... displayed per display frame, local or absolute scaling, number of display points per line and waveform averaging. A typical display is shown in Figure 3... the waveform. In the case of white noise, a high degree of correlation is found at zero lag only, with the remaining lags showing little correlation.

  14. Topobathymetric LiDAR point cloud processing and landform classification in a tidal environment

    Science.gov (United States)

    Skovgaard Andersen, Mikkel; Al-Hamdani, Zyad; Steinbacher, Frank; Rolighed Larsen, Laurids; Brandbyge Ernstsen, Verner

    2017-04-01

    Historically it has been difficult to create high resolution Digital Elevation Models (DEMs) in land-water transition zones due to shallow water depth and often challenging environmental conditions. This gap of information has been reflected as a "white ribbon" with no data in the land-water transition zone. In recent years, the technology of airborne topobathymetric Light Detection and Ranging (LiDAR) has proven capable of filling the gap by simultaneously capturing topographic and bathymetric elevation information, using only a single green laser. We collected green LiDAR point cloud data in the Knudedyb tidal inlet system in the Danish Wadden Sea in spring 2014. Creating a DEM from a point cloud requires the general processing steps of data filtering, water surface detection and refraction correction. However, there is no transparent and reproducible method for processing green LiDAR data into a DEM, specifically regarding the procedure of water surface detection and modelling. We developed a step-by-step procedure for creating a DEM from raw green LiDAR point cloud data, including a procedure for making a Digital Water Surface Model (DWSM) (see Andersen et al., 2017). Two different classification analyses were applied to the high resolution DEM: a geomorphometric and a morphological classification, respectively. The classification methods were originally developed for a small test area, but in this work we have used them to classify the complete Knudedyb tidal inlet system. References Andersen MS, Gergely Á, Al-Hamdani Z, Steinbacher F, Larsen LR, Ernstsen VB (2017). Processing and performance of topobathymetric lidar data for geomorphometric and morphological classification in a high-energy tidal environment. Hydrol. Earth Syst. Sci., 21: 43-63, doi:10.5194/hess-21-43-2017. Acknowledgements This work was funded by the Danish Council for Independent Research | Natural Sciences through the project "Process-based understanding and
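    As an illustration of the refraction-correction step, the sketch below applies the standard Snell's-law geometry for a green-laser return below the water surface. This is a textbook correction, not the specific procedure of Andersen et al. (2017), and the refractive index value is an assumption:

```python
import math

N_WATER = 1.34  # refractive index of water for green (532 nm) light; assumed

def refraction_correct(apparent_range, incidence_deg, n=N_WATER):
    """Correct a below-surface LiDAR return for refraction at the
    air-water interface.

    apparent_range: slant distance below the water surface computed with
    the in-air speed of light (too long, since light slows in water).
    incidence_deg: angle of the incoming ray from the surface normal.
    Returns (horizontal_offset, depth) of the corrected point relative
    to the surface intersection.
    """
    theta_air = math.radians(incidence_deg)
    theta_water = math.asin(math.sin(theta_air) / n)  # Snell's law: ray bends
    true_range = apparent_range / n                   # travel-time correction
    return (true_range * math.sin(theta_water),
            true_range * math.cos(theta_water))
```

    At nadir the correction reduces to dividing the apparent depth by n; at oblique incidence the point also shifts horizontally toward the nadir line.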

  15. The application of quality risk management to the bacterial endotoxins test: use of hazard analysis and critical control points.

    Science.gov (United States)

    Annalaura, Carducci; Giulia, Davini; Stefano, Ceccanti

    2013-01-01

    Risk analysis is widely used in the pharmaceutical industry to manage production processes, validation activities, training, and other activities. Several methods of risk analysis are available (for example, failure mode and effects analysis, fault tree analysis), and one or more should be chosen and adapted to the specific field where they will be applied. Among the methods available, hazard analysis and critical control points (HACCP) is a methodology that has been applied since the 1960s, and whose areas of application have expanded over time from food to the pharmaceutical industry. It can be easily and successfully applied to several processes because its main feature is the identification, assessment, and control of hazards. It can be also integrated with other tools, such as fishbone diagram and flowcharting. The aim of this article is to show how HACCP can be used to manage an analytical process, propose how to conduct the necessary steps, and provide data templates necessary to document and useful to follow current good manufacturing practices. In the quality control process, risk analysis is a useful tool for enhancing the uniformity of technical choices and their documented rationale. Accordingly, it allows for more effective and economical laboratory management, is capable of increasing the reliability of analytical results, and enables auditors and authorities to better understand choices that have been made. In particular, this article shows how HACCP can be used to manage bacterial endotoxins testing and other analytical processes in a formal, clear, and detailed manner.

  16. Application of hazard analysis critical control points (HACCP) to organic chemical contaminants in food.

    Science.gov (United States)

    Ropkins, K; Beck, A J

    2002-03-01

    Hazard Analysis Critical Control Points (HACCP) is a systematic approach to the identification, assessment, and control of hazards that was developed as an effective alternative to conventional end-point analysis to control food safety. It has been described as the most effective means of controlling foodborne diseases, and its application to the control of microbiological hazards has been accepted internationally. By contrast, relatively little has been reported relating to the potential use of HACCP, or HACCP-like procedures, to control chemical contaminants of food. This article presents an overview of the implementation of HACCP and discusses its application to the control of organic chemical contaminants in the food chain. Although this is likely to result in many of the advantages previously identified for microbiological HACCP, that is, more effective, efficient, and economical hazard management, a number of areas are identified that require further research and development. These include: (1) a need to refine the methods of chemical contaminant identification and risk assessment employed, (2) develop more cost-effective monitoring and control methods for routine chemical contaminant surveillance of food, and (3) improve the effectiveness of process optimization for the control of chemical contaminants in food.

  17. Nuclear structure and weak rates of heavy waiting point nuclei under rp-process conditions

    Science.gov (United States)

    Nabi, Jameel-Un; Böyükata, Mahmut

    2017-01-01

    The structure and the weak interaction mediated rates of the heavy waiting point (WP) nuclei 80Zr, 84Mo, 88Ru, 92Pd and 96Cd along the N = Z line were studied within the interacting boson model-1 (IBM-1) and the proton-neutron quasi-particle random phase approximation (pn-QRPA). The energy levels of the N = Z WP nuclei were calculated by fitting the essential parameters of the IBM-1 Hamiltonian, and their geometric shapes were predicted by plotting potential energy surfaces (PESs). Half-lives, continuum electron capture rates, positron decay rates, electron capture cross sections of WP nuclei, energy rates of β-delayed protons and their emission probabilities were later calculated using the pn-QRPA. The calculated Gamow-Teller strength distributions were compared with previous calculations. We present positron decay and continuum electron capture rates on these WP nuclei under rp-process conditions using the same model. For the rp-process conditions, the calculated total weak rates are twice the Skyrme HF+BCS+QRPA rates for 80Zr; for the remaining nuclei the two calculations compare well. The electron capture rates are significant and compete well with the corresponding positron decay rates under rp-process conditions. The findings of the present study support the conclusion that electron capture rates form an integral part of the weak rates under rp-process conditions and play an important role in nuclear model calculations.

  18. Radial Basis Functional Model of Multi-Point Dieless Forming Process for Springback Reduction and Compensation

    Directory of Open Access Journals (Sweden)

    Misganaw Abebe

    2017-11-01

    Full Text Available Springback in multi-point dieless forming (MDF) is a common problem because of the small deformation and the blank-holder-free boundary condition. Numerical simulations are widely used in sheet metal forming to predict springback; however, the computational time of the numerical tools makes it costly to find the optimal process parameter values. This study proposes radial basis functions (RBFs) to replace the numerical simulation model, using statistical analyses based on a design of experiments (DOE). Punch holding time, blank thickness, and curvature radius are chosen as the effective process parameters for determining the springback. The Latin hypercube DOE method facilitates statistical analyses and the extraction of a prediction model in the experimental process parameter domain. A finite element (FE) simulation model is built in the ABAQUS commercial software to generate the springback responses of the training and testing samples. A genetic algorithm is applied to find the optimal values for reducing and compensating the induced springback for the different blank thicknesses using the developed RBF prediction model. Finally, the RBF numerical result is verified by comparing it with the FE simulation result for the optimal process parameters, and both results show that the springback deviation from the target shape is almost negligible.
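    A minimal sketch of the surrogate idea, assuming a Gaussian RBF interpolant over DOE samples (the kernel width and parameter names are illustrative, not the paper's exact settings):

```python
import numpy as np

def fit_rbf(X, y, eps=1.0):
    """Fit a Gaussian RBF interpolant through DOE samples.
    X: (n, d) process parameters (e.g. holding time, thickness, radius);
    y: (n,) springback responses from FE runs. Returns interpolation weights."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    Phi = np.exp(-(eps * d) ** 2)          # Gaussian kernel matrix
    return np.linalg.solve(Phi, y)

def predict_rbf(X_train, w, X_new, eps=1.0):
    """Evaluate the surrogate at new parameter points (cheap, unlike FE runs)."""
    d = np.linalg.norm(X_new[:, None, :] - X_train[None, :, :], axis=-1)
    return np.exp(-(eps * d) ** 2) @ w
```

    A genetic algorithm (or any optimizer) can then minimize `predict_rbf` over the parameter domain instead of calling the FE solver at every candidate.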

  19. Gaussian process regression analysis for functional data

    CERN Document Server

    Shi, Jian Qing

    2011-01-01

    Gaussian Process Regression Analysis for Functional Data presents nonparametric statistical methods for functional regression analysis, specifically the methods based on a Gaussian process prior in a functional space. The authors focus on problems involving functional response variables and mixed covariates of functional and scalar variables.Covering the basics of Gaussian process regression, the first several chapters discuss functional data analysis, theoretical aspects based on the asymptotic properties of Gaussian process regression models, and new methodological developments for high dime
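    The basic Gaussian process regression computation underlying such methods can be sketched as follows (standard RBF-kernel posterior for scalar inputs; hyperparameter values are placeholders):

```python
import numpy as np

def gp_posterior(X, y, X_star, length=1.0, sigma_f=1.0, sigma_n=0.1):
    """Posterior mean and variance of GP regression with an RBF kernel.
    X, y: training inputs/targets; X_star: query points (all 1-D arrays)."""
    def k(A, B):
        d2 = (A[:, None] - B[None, :]) ** 2
        return sigma_f**2 * np.exp(-0.5 * d2 / length**2)
    K = k(X, X) + sigma_n**2 * np.eye(len(X))   # noisy training covariance
    Ks = k(X_star, X)
    alpha = np.linalg.solve(K, y)
    mean = Ks @ alpha                            # posterior mean
    # posterior variance: diag(K** - Ks K^{-1} Ks^T)
    var = sigma_f**2 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
    return mean, var
```

    Functional-response models as in the book extend this idea by placing the GP prior over a function space with mixed functional and scalar covariates.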

  20. Analysis of multiparty mediation processes

    NARCIS (Netherlands)

    Vuković, Siniša

    2013-01-01

    Crucial challenges for multiparty mediation processes include the achievement of adequate cooperation among the mediators and consequent coordination of their activities in the mediation process. Existing literature goes only as far as to make it clear that successful mediation requires necessary

  1. Experimental analysis of armouring process

    Science.gov (United States)

    Lamberti, Alberto; Paris, Ennio

    Preliminary results from an experimental investigation of armouring processes are presented. In particular, the process of development and formation of the armour layer under different steady flow conditions has been analyzed in terms of grain size variations and the sediment transport rate associated with each size fraction.

  2. Dual keel Space Station payload pointing system design and analysis feasibility study

    Science.gov (United States)

    Smagala, Tom; Class, Brian F.; Bauer, Frank H.; Lebair, Deborah A.

    1988-01-01

    A Space Station attached Payload Pointing System (PPS) has been designed and analyzed. The PPS is responsible for maintaining fixed payload pointing in the presence of disturbance applied to the Space Station. The payload considered in this analysis is the Solar Optical Telescope. System performance is evaluated via digital time simulations by applying various disturbance forces to the Space Station. The PPS meets the Space Station articulated pointing requirement for all disturbances except Shuttle docking and some centrifuge cases.

  3. Predicting seizures in untreated temporal lobe epilepsy using point-process nonlinear models of heartbeat dynamics.

    Science.gov (United States)

    Valenza, G; Romigi, A; Citi, L; Placidi, F; Izzi, F; Albanese, M; Scilingo, E P; Marciani, M G; Duggento, A; Guerrisi, M; Toschi, N; Barbieri, R

    2016-08-01

    Symptoms of temporal lobe epilepsy (TLE) are frequently associated with autonomic dysregulation, whose underlying biological processes are thought to strongly contribute to sudden unexpected death in epilepsy (SUDEP). While abnormal cardiovascular patterns commonly occur during ictal events, putative patterns of autonomic cardiac effects during pre-ictal (PRE) periods (i.e. periods preceding seizures) are still unknown. In this study, we investigated TLE-related heart rate variability (HRV) through instantaneous, nonlinear estimates of cardiovascular oscillations during inter-ictal (INT) and PRE periods. ECG recordings from 12 patients with TLE were processed to extract standard HRV indices, as well as indices of instantaneous HRV complexity (dominant Lyapunov exponent and entropy) and higher-order statistics (bispectra) obtained through definition of inhomogeneous point-process nonlinear models, employing Volterra-Laguerre expansions of linear, quadratic, and cubic kernels. Experimental results demonstrate that the best INT vs. PRE classification performance (balanced accuracy: 73.91%) was achieved only when retaining the time-varying, nonlinear, and non-stationary structure of heartbeat dynamical features. The proposed approach opens novel important avenues in predicting ictal events using information gathered from cardiovascular signals exclusively.

  4. A customizable stochastic state point process filter (SSPPF) for neural spiking activity.

    Science.gov (United States)

    Xin, Yao; Li, Will X Y; Min, Biao; Han, Yan; Cheung, Ray C C

    2013-01-01

    Stochastic State Point Process Filter (SSPPF) is effective for adaptive signal processing. In particular, it has been successfully applied to neural signal coding/decoding in recent years. Recent work has proven its efficiency in non-parametric coefficient tracking in modeling of the mammalian nervous system. However, existing SSPPF has only been realized on commercial software platforms, which limits its computational capability. In this paper, the first hardware architecture of SSPPF has been designed and successfully implemented on a field-programmable gate array (FPGA), providing a more efficient means for coefficient tracking in a well-established generalized Laguerre-Volterra model for mammalian hippocampal spiking activity research. By exploiting the intrinsic parallelism of the FPGA, the proposed architecture is able to process matrices or vectors of arbitrary size, and is efficiently scalable. Experimental results show its superior performance compared to the software implementation, while maintaining the numerical precision. This architecture can also potentially be utilized in future hippocampal cognitive neural prosthesis design.

  5. Continuous quality improvement process pin-points delays, speeds STEMI patients to life-saving treatment.

    Science.gov (United States)

    2011-11-01

    Using a multidisciplinary team approach, the University of California, San Diego, Health System has been able to significantly reduce average door-to-balloon angioplasty times for patients with the most severe form of heart attacks, beating national recommendations by more than a third. The multidisciplinary team meets monthly to review all cases involving patients with ST-segment-elevation myocardial infarctions (STEMI) to see where process improvements can be made. Using this continuous quality improvement (CQI) process, the health system has reduced average door-to-balloon times from 120 minutes to less than 60 minutes, and administrators are now aiming for further progress. Among the improvements instituted by the multidisciplinary team are the implementation of a "greeter" with enough clinical expertise to quickly pick up on potential STEMI heart attacks as soon as patients walk into the ED, and the purchase of an electrocardiogram (EKG) machine so that evaluations can be done in the triage area. ED staff have prepared "STEMI" packets, including items such as special IV tubing and disposable leads, so that patients headed for the catheterization laboratory are prepared to undergo the procedure soon after arrival. All the clocks and devices used in the ED are synchronized so that analysts can later review how long it took to complete each step of the care process. Points of delay can then be targeted for improvement.

  6. Performance Analysis of a Maximum Power Point Tracking Technique using Silver Mean Method

    Directory of Open Access Journals (Sweden)

    Shobha Rani Depuru

    2018-01-01

    Full Text Available The proposed paper presents a simple and particularly efficacious Maximum Power Point Tracking (MPPT) algorithm based on the Silver Mean Method (SMM). This method operates by choosing a search interval from the P-V characteristics of the given solar array and converges to the MPP of the Solar Photo-Voltaic (SPV) system by shrinking its interval. After achieving the maximum power, the algorithm stops shrinking and maintains a constant voltage until the next interval is decided. The tracking capability, efficiency, and performance of the proposed algorithm are validated by simulation and experimental results with a 100 W solar panel under variable temperature and irradiance conditions. The results obtained confirm that even without any perturbation and observation process, the proposed method still outperforms the traditional perturb and observe (P&O) method by demonstrating far better steady state output, more accuracy and higher efficiency.
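    The abstract does not spell out the exact SMM update rule, but an interval-shrinking maximum search of the kind described can be sketched as follows, assuming interior probe points placed at the silver-ratio fraction 1/(1+√2) of the interval (an assumption, not the paper's stated rule):

```python
import math

RHO = 1.0 / (1.0 + math.sqrt(2.0))   # silver-ratio fraction, ~0.414 (assumed)

def section_search_max(power, v_lo, v_hi, tol=1e-3):
    """Shrink [v_lo, v_hi] around the maximum of a unimodal P-V curve.
    Two interior voltages are probed each step; the sub-interval that
    cannot contain the peak is discarded."""
    while v_hi - v_lo > tol:
        span = v_hi - v_lo
        v1, v2 = v_lo + RHO * span, v_hi - RHO * span
        if power(v1) > power(v2):
            v_hi = v2        # peak lies left of v2
        else:
            v_lo = v1        # peak lies right of v1
    return 0.5 * (v_lo + v_hi)

# toy P-V curve with a single maximum at 17.2 V
vmp = section_search_max(lambda v: -(v - 17.2)**2 + 100.0, 0.0, 22.0)
```

    Unlike P&O, no perturbation around the operating point is needed once the interval has collapsed; the controller simply holds the converged voltage.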

  7. Multivariate analysis and extraction of parameters in resistive RAMs using the Quantum Point Contact model

    Science.gov (United States)

    Roldán, J. B.; Miranda, E.; González-Cordero, G.; García-Fernández, P.; Romero-Zaliz, R.; González-Rodelas, P.; Aguilera, A. M.; González, M. B.; Jiménez-Molinos, F.

    2018-01-01

    A multivariate analysis of the parameters that characterize the reset process in Resistive Random Access Memory (RRAM) has been performed. The different correlations obtained can help to shed light on the current components that contribute to the Low Resistance State (LRS) of the technology considered. In addition, a screening method for the Quantum Point Contact (QPC) current component is presented. For this purpose, the second derivative of the current has been obtained using a novel numerical method which allows determining the QPC model parameters. Once the procedure is completed, a whole Resistive Switching (RS) series of thousands of curves is studied by means of a genetic algorithm. The extracted QPC parameter distributions are characterized in depth to obtain information about the filamentary pathways associated with the LRS in the low voltage conduction regime.
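    As a baseline for the second-derivative step (the paper uses its own novel numerical method, not shown here), plain central differences on a measured I-V curve look like this:

```python
import numpy as np

def second_derivative(v, i):
    """Second derivative d2I/dV2 of a measured I-V curve via two
    applications of central differences (np.gradient). Real RS data
    would need smoothing first; this is only the naive baseline."""
    return np.gradient(np.gradient(i, v), v)

# sanity check on an analytic curve: d2/dV2 of V^3 is 6V
v = np.linspace(0.0, 1.0, 201)
d2 = second_derivative(v, v**3)
```

    On noisy experimental curves, differentiating twice amplifies noise strongly, which is presumably why a more robust numerical method is needed before extracting the QPC parameters.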

  8. A Unified Point Process Probabilistic Framework to Assess Heartbeat Dynamics and Autonomic Cardiovascular Control

    Directory of Open Access Journals (Sweden)

    Zhe eChen

    2012-02-01

    Full Text Available In recent years, time-varying inhomogeneous point process models have been introduced for the assessment of instantaneous heartbeat dynamics as well as specific cardiovascular control mechanisms and hemodynamics. Assessment of the model's statistics is established through the Wiener-Volterra theory and a multivariate autoregressive (AR) structure. A variety of instantaneous cardiovascular metrics, such as heart rate (HR), heart rate variability (HRV), respiratory sinus arrhythmia (RSA), and baroreceptor-cardiac reflex (baroreflex) sensitivity (BRS), are derived within a parametric framework and instantaneously updated with adaptive and local maximum likelihood estimation algorithms. Inclusion of second order nonlinearities, with subsequent bispectral quantification in the frequency domain, further allows for the definition of instantaneous metrics of nonlinearity. Here we organize a comprehensive review of the devised methods as applied to experimental recordings from healthy subjects during propofol anesthesia. Collective results reveal interesting dynamic trends across the different pharmacological interventions operated within each anesthesia session, confirming the ability of the algorithm to track important changes in cardiorespiratory elicited interactions, and pointing at our mathematical approach as a promising monitoring tool for an accurate, noninvasive assessment in clinical practice.

  9. Catalysts macroporosity and their efficiency in sulphur sub-dew point Claus tail gas treating processes

    Energy Technology Data Exchange (ETDEWEB)

    Tsybulevski, A.M.; Pearson, M. [Alcoa Industrial Chemicals, 16010 Barker's Point Lane, Houston, TX (United States); Morgun, L.V.; Filatova, O.E. [All-Russian Research Institute of Natural Gases and Gas Technologies VNIIGAZ, Moscow (Russian Federation); Sharp, M. [Porocel Corporation, Westheimer, Houston, TX (United States)

    1996-10-08

    The efficiency of 4 samples of alumina catalyst has been studied experimentally in the course of Claus 'tail gas' treating processes at the sulphur sub-dew point (TGTP). The samples were characterized by the same chemical and crystallographic composition, the same volume of micropores, the same surface area and the same catalytic activity, but differed appreciably in the volume of macropores. An increase in the effective operation time of the catalysts before breakthrough of unrecoverable sulphur-containing compounds with increasing macropore volume has been established. A theoretical model of the TGTP has been considered, and it has been shown that the increase in the sulphur capacity of the catalysts with a larger volume of macropores is due to an increase in the catalysts' efficiency factor and a slower decrease in their diffusive permeability during the filling of micropores by sulphur.

  10. Quantification of annual wildfire risk; A spatio-temporal point process approach.

    Directory of Open Access Journals (Sweden)

    Paula Pereira

    2013-10-01

    Full Text Available Policy responses for local and global fire management depend heavily on a proper understanding of the fire extent as well as its spatio-temporal variation across any given study area. Annual fire risk maps are important tools for such policy responses, supporting strategic decisions such as the location-allocation of equipment and human resources. Here, we define risk of fire in the narrow sense as the probability of its occurrence without addressing the loss component. In this paper, we study the spatio-temporal point patterns of wildfires and model them by a log Gaussian Cox process. The mean of the predictive distribution of the random intensity function is used, in the narrow sense, as the annual fire risk map for the next year.
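    While fitting a log Gaussian Cox process is involved, turning a given intensity surface into simulated occurrence maps is straightforward via Lewis-Shedler thinning. The sketch below uses a made-up toy intensity on the unit square:

```python
import numpy as np

def simulate_inhomogeneous_poisson(intensity, lam_max, rng):
    """Lewis-Shedler thinning on the unit square: draw candidates from a
    homogeneous Poisson process with rate lam_max, then keep each with
    probability intensity(x, y) / lam_max."""
    n = rng.poisson(lam_max)                 # candidate count on [0,1]^2
    pts = rng.uniform(size=(n, 2))
    accept_prob = np.array([intensity(x, y) for x, y in pts]) / lam_max
    keep = rng.uniform(size=n) < accept_prob
    return pts[keep]

rng = np.random.default_rng(42)
# toy fire-risk surface: intensity increases toward the east of the map
pts = simulate_inhomogeneous_poisson(lambda x, y: 200.0 * x, 200.0, rng)
```

    In an LGCP, `intensity` would itself be a realization (or the predictive mean) of the exponentiated Gaussian random field.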

  11. The (n, $\alpha$) reaction in the s-process branching point $^{59}$Ni

    CERN Multimedia

    We propose to measure the $^{59}$Ni(n,$\alpha$)$^{56}$Fe cross section at the neutron time of flight (n_TOF) facility with a dedicated chemical vapor deposition (CVD) diamond detector. The (n,$\alpha$) reaction in radioactive $^{59}$Ni is of relevance in nuclear astrophysics, as $^{59}$Ni can be seen as a first branching point in the astrophysical s-process. Its relevance in nuclear technology is especially related to material embrittlement in stainless steel. There is a strong discrepancy between available experimental data and the evaluated nuclear data files for this isotope. The aim of the measurement is to clarify this disagreement. The clear energy separation of the reaction products of neutron-induced reactions in $^{59}$Ni makes it a very suitable candidate for a first cross section measurement with the CVD diamond detector, which should serve in the future for similar measurements at n_TOF.

  12. Formal analysis of design process dynamics

    NARCIS (Netherlands)

    Bosse, T.; Jonker, C.M.; Treur, J.

    2010-01-01

    This paper presents a formal analysis of design process dynamics. Such a formal analysis is a prerequisite to come to a formal theory of design and for the development of automated support for the dynamics of design processes. The analysis was geared toward the identification of dynamic design

  14. Digital image processing and analysis: human and computer vision applications with CVIPtools

    CERN Document Server

    Umbaugh, Scott E

    2010-01-01

    Section I, Introduction to Digital Image Processing and Analysis: Digital Image Processing and Analysis; Overview; Image Analysis and Computer Vision; Image Processing and Human Vision; Key Points; Exercises; References; Further Reading. Computer Imaging Systems: Imaging Systems Overview; Image Formation and Sensing; CVIPtools Software; Image Representation; Key Points; Exercises; Supplementary Exercises; References; Further Reading. Section II, Digital Image Analysis and Computer Vision: Introduction to Digital Image Analysis; Introduction; Preprocessing; Binary Image Analysis; Key Points; Exercises; Supplementary Exercises; References; Further Read

  15. Quantifying structural uncertainty on fault networks using a marked point process within a Bayesian framework

    Science.gov (United States)

    Aydin, Orhun; Caers, Jef Karel

    2017-08-01

    Faults are one of the building blocks for subsurface modeling studies. Incomplete observations of subsurface fault networks lead to uncertainty pertaining to the location, geometry and existence of faults. In practice, gaps in incomplete fault network observations are filled based on tectonic knowledge and the interpreter's intuition about fault relationships. Modeling fault network uncertainty with realistic models that represent tectonic knowledge is still a challenge. Although methods exist that address specific sources of fault network uncertainty and complexities of fault modeling, a unifying framework is still lacking. In this paper, we propose a rigorous approach to quantify fault network uncertainty. Fault pattern and intensity information are expressed by means of a marked point process, the marked Strauss point process. Fault network information is constrained to fault surface observations (complete or partial) within a Bayesian framework. A structural prior model is defined to quantitatively express fault patterns, geometries and relationships within the Bayesian framework. Structural relationships between faults, in particular fault abutting relations, are represented with a level-set based approach. A Markov chain Monte Carlo sampler is used to sample posterior fault network realizations that reflect tectonic knowledge and honor fault observations. We apply the methodology to a field study from the Nankai Trough and Kumano Basin. The target for uncertainty quantification is a deep site with attenuated seismic data, where faults are only partially visible and many are missing from the survey or interpretation. A structural prior model is built from shallow analog sites that are believed to have undergone tectonics similar to those at the site of study. Fault network uncertainty for the field is quantified with fault network realizations that are conditioned to structural rules, tectonic information and partially observed fault surfaces. We show the proposed

  16. Chosen Aspects of Modernization Processes in EU Countries and in Poland - Classical Point of View

    OpenAIRE

    Dworak Edyta; Malarska Anna

    2010-01-01

    The aim of this paper is an evaluation of changes in the sectoral structure of employment in EU countries over time. Against this background, changes in the Polish economy in the period 1997-2008 are highlighted. Classical tools of statistical analysis were used to illustrate and initially verify the three-sector theory of A. Fisher, C. Clark and J. Fourastié, oriented toward the evaluation of the modernization process of EU economies.

  17. Structured spatio-temporal shot-noise Cox point process models, with a view to modelling forest fires

    DEFF Research Database (Denmark)

    Møller, Jesper; Diaz-Avalos, Carlos

    Spatio-temporal Cox point process models with a multiplicative structure for the driving random intensity, incorporating covariate information into temporal and spatial components, and with a residual term modelled by a shot-noise process, are considered. Such models are flexible and tractable fo...... dataset consisting of 2796 days and 5834 spatial locations of fires. The model is compared with a spatio-temporal log-Gaussian Cox point process model, and likelihood-based methods are discussed to some extent....

  18. Homotopy analysis solutions of point kinetics equations with one delayed precursor group

    International Nuclear Information System (INIS)

    Zhu Qian; Luo Lei; Chen Zhiyun; Li Haofeng

    2010-01-01

    The homotopy analysis method is proposed to obtain series solutions of nonlinear differential equations. Here it was applied to the point kinetics equations with one delayed precursor group. Analytic solutions were obtained using the homotopy analysis method, and the algorithm was analysed. The results show that the computation time and precision of the algorithm meet engineering requirements. (authors)

  19. A Multi-Point Method Considering the Maximum Power Point Tracking Dynamic Process for Aerodynamic Optimization of Variable-Speed Wind Turbine Blades

    Directory of Open Access Journals (Sweden)

    Zhiqiang Yang

    2016-05-01

    Full Text Available Due to the dynamic process of maximum power point tracking (MPPT) caused by turbulence and large rotor inertia, variable-speed wind turbines (VSWTs) cannot maintain the optimal tip speed ratio (TSR) from the cut-in wind speed up to the rated speed. Therefore, in order to increase the total captured wind energy, the existing aerodynamic design for VSWT blades, which only focuses on performance improvement at a single TSR, needs to be improved to a multi-point design. In this paper, based on a closed-loop system of VSWTs, including turbulent wind, rotor, drive train and MPPT controller, the distribution of operational TSR and its description based on inflow wind energy are investigated. Moreover, a multi-point method considering the MPPT dynamic process for the aerodynamic optimization of VSWT blades is proposed. In the proposed method, the distribution of operational TSR is obtained through a dynamic simulation of the closed-loop system under a specific turbulent wind, and accordingly the multiple design TSRs and the corresponding weighting coefficients in the objective function are determined. Finally, using the blade of a National Renewable Energy Laboratory (NREL) 1.5 MW wind turbine as the baseline, the proposed method is compared with the conventional single-point optimization method using the commercial software Bladed. Simulation results verify the effectiveness of the proposed method.
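
    The weighting of multiple design TSRs described above can be sketched as a simple weighted-sum objective. Everything below is a hypothetical illustration (toy Cp curve, invented weights), not the paper's objective function:

```python
def multi_point_objective(power_coeff, design_tsrs, weights):
    """Weighted multi-point objective for blade optimization (illustrative sketch).

    power_coeff(tsr) returns the rotor power coefficient Cp at a tip speed
    ratio; weights (summing to 1) stand in for how much inflow wind energy the
    closed-loop MPPT simulation assigns to each design TSR.
    """
    if abs(sum(weights) - 1.0) > 1e-9:
        raise ValueError("weights must sum to 1")
    return sum(w * power_coeff(tsr) for tsr, w in zip(design_tsrs, weights))

def toy_cp(tsr):
    # Hypothetical Cp curve peaking at TSR 8.
    return max(0.0, 0.48 - 0.01 * (tsr - 8.0) ** 2)

# A single-point design would optimize Cp(8) alone; the multi-point objective
# also rewards performance at off-optimum TSRs visited during MPPT dynamics.
score = multi_point_objective(toy_cp, [6.5, 8.0, 9.5], [0.25, 0.5, 0.25])
```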

  20. Seafood safety: economics of hazard analysis and Critical Control Point (HACCP) programmes

    National Research Council Canada - National Science Library

    Cato, James C

    1998-01-01

    .... This document on economic issues associated with seafood safety was prepared to complement the work of the Service in seafood technology, plant sanitation and Hazard Analysis Critical Control Point (HACCP) implementation...

  1. Ergodic Capacity Analysis of Free-Space Optical Links with Nonzero Boresight Pointing Errors

    KAUST Repository

    Ansari, Imran Shafique; Alouini, Mohamed-Slim; Cheng, Julian

    2015-01-01

    A unified capacity analysis of a free-space optical (FSO) link that accounts for nonzero boresight pointing errors and both types of detection techniques (i.e. intensity modulation/direct detection as well as heterodyne detection) is addressed.

  2. Multivariate Analysis for the Processing of Signals

    Directory of Open Access Journals (Sweden)

    Beattie J.R.

    2014-01-01

    Full Text Available Real-world experiments are becoming increasingly complex, needing techniques capable of tracking this complexity. Signal-based measurements are often used to capture this complexity, where a signal is a record of a sample's response to a parameter (e.g. time, displacement, voltage, wavelength) that is varied over a range of values. In signals the responses at each value of the varied parameter are related to each other, depending on the composition or state of the sample being measured. Since signals contain multiple information points, they have rich information content but are generally complex to comprehend. Multivariate Analysis (MA) has profoundly transformed their analysis by allowing gross simplification of the tangled web of variation. In addition, MA has provided the advantage of being much more robust to the influence of noise than univariate methods of analysis. In recent years, there has been a growing awareness that the nature of multivariate methods allows exploitation of their benefits for purposes other than data analysis, such as pre-processing of signals with the aim of eliminating irrelevant variations prior to analysis of the signal of interest. It has been shown that exploiting multivariate data reduction in an appropriate way can allow high-fidelity denoising (removal of irreproducible non-signals), consistent and reproducible noise-insensitive correction of baseline distortions (removal of reproducible non-signals), accurate elimination of interfering signals (removal of reproducible but unwanted signals) and the standardisation of signal amplitude fluctuations. At present, the field is relatively small but the possibilities for much wider application are considerable. Where signal properties are suitable for MA (such as the signal being stationary along the x-axis), these signal-based corrections have the potential to be highly reproducible and highly adaptable, and are applicable in situations where the data is noisy or
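
    As a minimal, generic illustration of multivariate data reduction used for pre-processing (not a method taken from the text), the sketch below denoises a stack of signals by truncated-SVD (PCA) reconstruction; all names are hypothetical:

```python
import numpy as np

def pca_denoise(signals, n_components):
    """Denoise a stack of signals (one per row) by truncated-SVD reconstruction.

    The mean signal is removed, the centered stack is factorized, all but the
    leading n_components singular values are zeroed, and the stack is rebuilt.
    """
    mean = signals.mean(axis=0)
    u, s, vt = np.linalg.svd(signals - mean, full_matrices=False)
    s[n_components:] = 0.0  # discard low-variance (noise-dominated) directions
    return mean + (u * s) @ vt
```

    Keeping only the leading components retains the reproducible covariation across signals while discarding most of the uncorrelated noise.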

  3. Controlling organic chemical hazards in food manufacturing: a hazard analysis critical control points (HACCP) approach.

    Science.gov (United States)

    Ropkins, K; Beck, A J

    2002-08-01

    Hazard analysis by critical control points (HACCP) is a systematic approach to the identification, assessment and control of hazards. Effective HACCP requires the consideration of all hazards, i.e., chemical, microbiological and physical. However, to date most 'in-place' HACCP procedures have tended to focus on the control of microbiological and physical food hazards. In general, the chemical component of HACCP procedures is either ignored or limited to applied chemicals, e.g., food additives and pesticides. In this paper we discuss the application of HACCP to a broader range of chemical hazards, using organic chemical contaminants as examples, and the problems that are likely to arise in the food manufacturing sector. Chemical HACCP procedures are likely to deliver many of the advantages previously identified for microbiological HACCP procedures: they are more effective, efficient and economical than conventional end-point testing methods. However, the high costs of analytical monitoring of chemical contaminants and a limited understanding of formulation and process optimisation as means of controlling chemical contamination of foods are likely to prevent chemical HACCP becoming as effective as microbiological HACCP.

  4. Insights into mortality patterns and causes of death through a process point of view model.

    Science.gov (United States)

    Anderson, James J; Li, Ting; Sharrow, David J

    2017-02-01

    Process point of view (POV) models of mortality, such as the Strehler-Mildvan and stochastic vitality models, represent death in terms of the loss of survival capacity through challenges and dissipation. Drawing on the hallmarks of aging, we link these concepts to candidate biological mechanisms through a framework that defines death as challenges to vitality, where distal factors define the age-evolution of vitality and proximal factors define the probability distribution of challenges. To illustrate the process POV, we hypothesize that the immune system is a mortality nexus, characterized by two vitality streams: increasing vitality representing immune system development and immunosenescence representing vitality dissipation. Proximal challenges define three mortality partitions: juvenile and adult extrinsic mortalities and intrinsic adult mortality. Model parameters, generated from Swedish mortality data (1751-2010), exhibit biologically meaningful correspondences to economic, health and cause-of-death patterns. The model characterizes the twentieth-century epidemiological transition mainly as a reduction in extrinsic mortality resulting from a shift from high-magnitude disease challenges on individuals at all vitality levels to low-magnitude stress challenges on low-vitality individuals. Of secondary importance, intrinsic mortality was described by a gradual reduction in the rate of loss of vitality, presumably resulting from a reduction in the rate of immunosenescence. Extensions and limitations of a distal/proximal framework for characterizing more explicit causes of death, e.g. the young adult mortality hump or cancer in old age, are discussed.
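
    A toy simulation may help fix ideas: vitality declines deterministically (a distal factor) while challenges arrive randomly (proximal factors), and death occurs when a challenge exceeds the remaining vitality or vitality is exhausted. This is an illustrative caricature with invented parameters, not the Strehler-Mildvan model or the authors' fitted model:

```python
import random

def simulate_age_at_death(v0, loss_rate, challenge_rate, challenge_mean, rng, max_age=120.0):
    """Toy process-POV mortality draw: age at death for one individual.

    Distal side: vitality declines linearly from v0 at loss_rate per year.
    Proximal side: challenges arrive as a Poisson process (challenge_rate per
    year) with exponentially distributed magnitudes (mean challenge_mean) and
    kill when a magnitude exceeds the remaining vitality. If vitality reaches
    zero first, death is intrinsic.
    """
    t = 0.0
    while True:
        t += rng.expovariate(challenge_rate)         # waiting time to next challenge
        vitality = v0 - loss_rate * t
        if vitality <= 0.0:
            return min(v0 / loss_rate, max_age)      # intrinsic death: vitality exhausted
        if rng.expovariate(1.0 / challenge_mean) > vitality:
            return min(t, max_age)                   # extrinsic death: challenge won
```

    Raising the challenge rate shifts deaths earlier, mimicking a harsher extrinsic environment; lowering the loss rate postpones the intrinsic mode.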

  5. Mesh Processing in Medical Image Analysis

    DEFF Research Database (Denmark)

    The following topics are dealt with: mesh processing; medical image analysis; interactive freeform modeling; statistical shape analysis; clinical CT images; statistical surface recovery; automated segmentation; cerebral aneurysms; and real-time particle-based representation....

  6. Analysis of the resolution processes of three modeling tasks

    Directory of Open Access Journals (Sweden)

    Cèsar Gallart Palau

    2017-08-01

    Full Text Available In this paper we present a comparative analysis of the resolution process of three modeling tasks performed by secondary education students (13-14 years), designed from three different points of view: Model-Eliciting Activities, the LEMA project, and Realistic Mathematical Problems. The purpose of this analysis is to obtain a methodological characterization of the tasks in order to provide secondary education teachers with a proper selection and sequencing of tasks for implementation in the classroom.

  7. Change Analysis in Structural Laser Scanning Point Clouds: The Baseline Method.

    Science.gov (United States)

    Shen, Yueqian; Lindenbergh, Roderik; Wang, Jinhu

    2016-12-24

    A method is introduced for detecting changes from point clouds that avoids registration. For many applications, changes are detected between two scans of the same scene obtained at different times. Traditionally, these scans are aligned to a common coordinate system, which has the disadvantage that the registration step introduces additional errors. In addition, registration requires stable targets or features. To avoid these issues, we propose a change detection method based on so-called baselines. Baselines connect feature points within one scan. To analyze changes, baselines connecting corresponding points in two scans are compared. As feature points, either targets or virtual points corresponding to some reconstructable feature in the scene are used. The new method is implemented on two scans sampling a masonry laboratory building before and after seismic testing, which resulted in damage on the order of several centimeters. The centres of the bricks of the laboratory building are automatically extracted to serve as virtual points. Baselines connecting virtual points and/or target points are extracted and compared with respect to a suitable structural coordinate system. Changes detected from the baseline analysis are compared to a traditional cloud-to-cloud change analysis, demonstrating the potential of the new method for structural analysis.
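
    The registration-free idea lends itself to a compact sketch: baselines are distances between named feature points within a single scan, so comparing them across scans needs no common coordinate system. Labels, coordinates and the tolerance below are hypothetical:

```python
import itertools
import math

def baseline_changes(scan_a, scan_b, tolerance):
    """Compare baselines (distances between named feature points) of two scans.

    scan_a and scan_b map feature-point labels to (x, y, z) coordinates, each
    in its own arbitrary coordinate frame. Returns the baselines between
    shared points whose length changed by more than tolerance, with the
    signed length difference (scan_b minus scan_a).
    """
    shared = sorted(set(scan_a) & set(scan_b))
    changed = {}
    for p, q in itertools.combinations(shared, 2):
        d_a = math.dist(scan_a[p], scan_a[q])
        d_b = math.dist(scan_b[p], scan_b[q])
        if abs(d_b - d_a) > tolerance:
            changed[(p, q)] = d_b - d_a
    return changed
```

    Because only within-scan distances enter, a rigid shift or rotation of either scan leaves every baseline unchanged; only genuine deformation is reported.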

  8. Change Analysis in Structural Laser Scanning Point Clouds: The Baseline Method

    Directory of Open Access Journals (Sweden)

    Yueqian Shen

    2016-12-01

    Full Text Available A method is introduced for detecting changes from point clouds that avoids registration. For many applications, changes are detected between two scans of the same scene obtained at different times. Traditionally, these scans are aligned to a common coordinate system, which has the disadvantage that the registration step introduces additional errors. In addition, registration requires stable targets or features. To avoid these issues, we propose a change detection method based on so-called baselines. Baselines connect feature points within one scan. To analyze changes, baselines connecting corresponding points in two scans are compared. As feature points, either targets or virtual points corresponding to some reconstructable feature in the scene are used. The new method is implemented on two scans sampling a masonry laboratory building before and after seismic testing, which resulted in damage on the order of several centimeters. The centres of the bricks of the laboratory building are automatically extracted to serve as virtual points. Baselines connecting virtual points and/or target points are extracted and compared with respect to a suitable structural coordinate system. Changes detected from the baseline analysis are compared to a traditional cloud-to-cloud change analysis, demonstrating the potential of the new method for structural analysis.

  9. Process correlation analysis model for process improvement identification.

    Science.gov (United States)

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment identifying strengths and weaknesses, and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the basis. CMMI defines a set of process areas involved in software development and what is to be carried out in process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort required and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  10. Process Integration Analysis of an Industrial Hydrogen Production Process

    OpenAIRE

    Stolten, Detlef; Grube, Thomas; Tock, Laurence; Maréchal, François; Metzger, Christian; Arpentinier, Philippe

    2010-01-01

    The energy efficiency of an industrial hydrogen production process using steam methane reforming (SMR) combined with the water gas shift reaction (WGS) is analyzed using process integration techniques based on heat cascade calculation and pinch analysis with the aim of identifying potential measures to enhance the process performance. The challenge is to satisfy the high temperature heat demand of the SMR reaction by minimizing the consumption of natural gas to feed the combustion and to expl...

  11. Analysis of relationship between registration performance of point cloud statistical model and generation method of corresponding points

    International Nuclear Information System (INIS)

    Yamaoka, Naoto; Watanabe, Wataru; Hontani, Hidekata

    2010-01-01

    When constructing a statistical point cloud model, we usually need to calculate corresponding points, and the constructed statistical model will differ depending on the method used to calculate them. This article examines the effect of different corresponding-point calculation methods on statistical models of human organs. We validated the performance of the statistical models by registering them to an organ surface in a 3D medical image. We compare two methods for calculating corresponding points. The first, Generalized Multi-Dimensional Scaling (GMDS), determines the corresponding points from the shapes of two curved surfaces. The second, the entropy-based particle system, chooses corresponding points by statistically evaluating a number of curved surfaces. Using these methods we construct the statistical models, and with these models we conduct registration against the medical image. For the estimation we use non-parametric belief propagation, which estimates not only the position of the organ but also the probability density of the organ position. We evaluate how the two methods for calculating corresponding points affect the statistical model through the change in the probability density at each point. (author)
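
    Once corresponding points have been computed (by GMDS, the particle system, or any other method), constructing the statistical point cloud model itself is typically a PCA over the stacked coordinates. The following is a generic sketch of such a point distribution model, not the authors' implementation; all names are hypothetical:

```python
import numpy as np

def point_distribution_model(shapes, n_modes):
    """Build a point distribution model from shapes whose corresponding points
    are already established. shapes: array (n_shapes, n_points * dim), each
    row the concatenated coordinates of one training shape."""
    mean = shapes.mean(axis=0)
    u, s, vt = np.linalg.svd(shapes - mean, full_matrices=False)
    modes = vt[:n_modes]                             # orthonormal variation modes
    stdevs = s[:n_modes] / np.sqrt(len(shapes) - 1)  # per-mode standard deviations
    return mean, modes, stdevs

def synthesize(mean, modes, stdevs, b):
    """Generate a plausible shape from mode coefficients b, given in units of
    standard deviations along each mode."""
    return mean + (np.asarray(b) * stdevs) @ modes
```

    Different corresponding-point methods yield different training matrices, hence different means and modes, which is exactly the sensitivity the record investigates.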

  12. Preclosure Criticality Analysis Process Report

    International Nuclear Information System (INIS)

    Thomas, D.A.

    1999-01-01

    The design approach for criticality of the disposal container and waste package will be dictated by existing regulatory requirements. This conclusion is based on the fact that preclosure operations and facilities have significant similarities to existing facilities and operations currently regulated by the NRC. The major difference would be the use of a risk-informed approach with burnup credit. This approach could reduce licensing delays and costs of the repository. The probability of success for this proposed seamless licensing strategy is increased, since there is precedence of regulation (10 CFR Part 63 and NUREG 1520) and commercial precedence for allowing burnup credit at sites similar to Yucca Mountain during preclosure. While NUREG 1520 is not directly applicable to a facility for handling spent nuclear fuel, the risk-informed approach to criticality analysis in NUREG 1520 is considered indicative of how the NRC will approach risk-informed criticality analysis at spent fuel facilities in the future. The types of design basis events which must be considered during the criticality safety analysis portion of the Integrated Safety Analysis (ISA) are those events which result in unanticipated moderation, loss of neutron absorber, geometric changes in the critical system, or administrative errors in waste form placement (loading) of the disposal container. The specific events to be considered must be based on the review of the system's design, as discussed in Section 3.2. A transition of licensing approach (e.g., deterministic versus risk-informed, performance-based) is not obvious and will require analysis. For commercial spent nuclear fuel, the probability of interspersed moderation may be low enough to allow nearly the same Critical Limit for both preclosure and postclosure, though an administrative margin will be applied to preclosure and possibly not to postclosure. 
Similarly the Design Basis Events for the waste package may be incredible and therefore not

  14. Birth-death models and coalescent point processes: the shape and probability of reconstructed phylogenies.

    Science.gov (United States)

    Lambert, Amaury; Stadler, Tanja

    2013-12-01

    Forward-in-time models of diversification (i.e., speciation and extinction) produce phylogenetic trees that grow "vertically" as time goes by. Pruning the extinct lineages out of such trees leads to natural models for reconstructed trees (i.e., phylogenies of extant species). Alternatively, reconstructed trees can be modelled by coalescent point processes (CPPs), where trees grow "horizontally" by the sequential addition of vertical edges. Each new edge starts at some random speciation time and ends at the present time; speciation times are drawn from the same distribution independently. CPPs lead to extremely fast computation of tree likelihoods and simulation of reconstructed trees. Their topology always follows the uniform distribution on ranked tree shapes (URT). We characterize which forward-in-time models lead to URT reconstructed trees and among these, which lead to CPP reconstructed trees. We show that for any "asymmetric" diversification model in which speciation rates only depend on time and extinction rates only depend on time and on a non-heritable trait (e.g., age), the reconstructed tree is CPP, even if extant species are incompletely sampled. If rates additionally depend on the number of species, the reconstructed tree is (only) URT (but not CPP). We characterize the common distribution of speciation times in the CPP description, and discuss incomplete species sampling as well as three special model cases in detail: (1) the extinction rate does not depend on a trait; (2) rates do not depend on time; (3) mass extinctions may happen additionally at certain points in the past. Copyright © 2013 Elsevier Inc. All rights reserved.
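
    The CPP construction sketched above is compact enough to code: a reconstructed tree on n tips is fully determined by the n - 1 i.i.d. node depths between consecutive tips, and the ultrametric tree is recovered by repeatedly merging across the smallest remaining depth. The helper below is an illustrative reconstruction, not code from the paper:

```python
import random

def simulate_cpp_depths(n_tips, draw_speciation_time, rng):
    """Draw the i.i.d. node depths that define a CPP reconstructed tree."""
    return [draw_speciation_time(rng) for _ in range(n_tips - 1)]

def cpp_newick(depths):
    """Assemble the ultrametric tree encoded by CPP node depths into Newick.

    depths[i] is the coalescence time of consecutive tips i and i+1. The tree
    is recovered by repeatedly merging across the smallest remaining depth,
    akin to single-linkage clustering along the tip order.
    """
    nodes = [f"t{i}" for i in range(len(depths) + 1)]
    heights = [0.0] * len(nodes)          # current height of each subtree root
    gaps = list(depths)
    while gaps:
        i = min(range(len(gaps)), key=gaps.__getitem__)
        d = gaps.pop(i)
        merged = (f"({nodes[i]}:{d - heights[i]:g},"
                  f"{nodes[i + 1]}:{d - heights[i + 1]:g})")
        nodes[i:i + 2] = [merged]
        heights[i:i + 2] = [d]
    return nodes[0] + ";"
```

    Because each depth is an independent draw, simulating a reconstructed tree costs O(n) draws, which is the source of the "extremely fast" simulation noted in the abstract.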

  15. HACCP (Hazard Analysis and Critical Control Points) to guarantee safe water reuse and drinking water production--a case study.

    Science.gov (United States)

    Dewettinck, T; Van Houtte, E; Geenens, D; Van Hege, K; Verstraete, W

    2001-01-01

    To obtain a sustainable water catchment in the dune area of the Flemish west coast, the integration of treated domestic wastewater in the existing potable water production process is planned. The hygienic hazards associated with the introduction of treated domestic wastewater into the water cycle are well recognised. Therefore, the concept of HACCP (Hazard Analysis and Critical Control Points) was used to guarantee hygienically safe drinking water production. Taking into account the literature data on the removal efficiencies of the proposed advanced treatment steps with regard to enteric viruses and protozoa and after setting high quality limits based on the recent progress in quantitative risk assessment, the critical control points (CCPs) and points of attention (POAs) were identified. Based on the HACCP analysis a specific monitoring strategy was developed which focused on the control of these CCPs and POAs.

  16. Simultaneous colour visualizations of multiple ALS point cloud attributes for land cover and vegetation analysis

    Science.gov (United States)

    Zlinszky, András; Schroiff, Anke; Otepka, Johannes; Mandlburger, Gottfried; Pfeifer, Norbert

    2014-05-01

    LIDAR point clouds hold valuable information for land cover and vegetation analysis, not only in the spatial distribution of the points but also in their various attributes. However, LIDAR point clouds are rarely used for visual interpretation, since for most users the point cloud is difficult to interpret compared to passive optical imagery. Meanwhile, point cloud viewing software is available allowing interactive 3D interpretation, but typically only one attribute at a time. This results in a large number of points with the same colour, crowding the scene and often obscuring detail. We developed a scheme for mapping information from multiple LIDAR point attributes to the Red, Green, and Blue channels of a widely used LIDAR data format, which are otherwise mostly used to add information from imagery to create "photorealistic" point clouds. The possible combinations of parameters are therefore represented in a wide range of colours, but relative differences in individual parameter values of points can be well understood. The visualization was implemented in OPALS software, using a simple and robust batch script, and is viewer-independent since the information is stored in the point cloud data file itself. In our case, the following colour channel assignment delivered the best results: echo amplitude in the Red, echo width in the Green and normalized height above a Digital Terrain Model in the Blue channel. With correct parameter scaling (but completely without point classification), points belonging to asphalt and bare soil are dark red, low grassland and crop vegetation are bright red to yellow, shrubs and low trees are green and high trees are blue. Depending on roof material and DTM quality, buildings are shown from red through purple to dark blue. Erroneously high or low points, or points with incorrect amplitude or echo width, usually have colours contrasting with terrain or vegetation. This allows efficient visual interpretation of the point cloud in planar

  17. [Powdered infant formulae preparation guide for hospitals based on Hazard Analysis and Critical Control Points (HACCP) principles].

    Science.gov (United States)

    Vargas-Leguás, H; Rodríguez Garrido, V; Lorite Cuenca, R; Pérez-Portabella, C; Redecillas Ferreiro, S; Campins Martí, M

    2009-06-01

    This guide for the preparation of powdered infant formulae in hospital environments is a collaborative work between several hospital services and is based on national and European regulations, international experts meetings and the recommendations of scientific societies. This guide also uses the Hazard Analysis and Critical Control Point principles proposed by Codex Alimentarius and emphasises effective verifying measures, microbiological controls of the process and the corrective actions when monitoring indicates that a critical control point is not under control. It is a dynamic guide and specifies the evaluation procedures that allow it to be constantly adapted.

  18. APPLICABILITY ANALYSIS OF CLOTH SIMULATION FILTERING ALGORITHM FOR MOBILE LIDAR POINT CLOUD

    Directory of Open Access Journals (Sweden)

    S. Cai

    2018-04-01

    Full Text Available Classifying the original point clouds into ground and non-ground points is a key step in LiDAR (light detection and ranging) data post-processing. The cloth simulation filtering (CSF) algorithm, which is based on a physical process, has been validated to be an accurate, automatic and easy-to-use algorithm for airborne LiDAR point clouds. As a new technique of three-dimensional data collection, mobile laser scanning (MLS) has been gradually applied in various fields, such as reconstruction of digital terrain models (DTM), 3D building modeling, and forest inventory and management. Compared with airborne LiDAR point clouds, mobile LiDAR point clouds have different characteristics of point density, distribution and complexity. Some filtering algorithms for airborne LiDAR data have been applied directly to mobile LiDAR point clouds, but they did not give satisfactory results. In this paper, we explore the ability of the CSF algorithm for mobile LiDAR point clouds. Three samples with different terrain shapes are selected to test the performance of this algorithm, which yield total errors of 0.44 %, 0.77 % and 1.20 %, respectively. Additionally, a large-area dataset is also tested to further validate the effectiveness of this algorithm, and the results show that it can quickly and accurately separate point clouds into ground and non-ground points. In summary, this algorithm is efficient and reliable for mobile LiDAR point clouds.
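
    The total errors reported above are misclassification rates of the ground/non-ground labelling. A minimal sketch of how such filter errors are typically computed, using made-up labels rather than the paper's samples:

    ```python
    import numpy as np

    # Hypothetical reference labels vs. filter output (1 = ground, 0 = non-ground).
    reference = np.array([1, 1, 1, 0, 0, 0, 1, 0, 1, 1])
    predicted = np.array([1, 1, 0, 0, 0, 1, 1, 0, 1, 1])

    # Type I error: ground points rejected; Type II: non-ground accepted as ground.
    type1 = np.mean(predicted[reference == 1] == 0)
    type2 = np.mean(predicted[reference == 0] == 1)
    total_error = np.mean(predicted != reference)
    print(f"Type I {type1:.2%}, Type II {type2:.2%}, total {total_error:.2%}")
    ```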

  19. Application of statistical process control and process capability analysis procedures in orbiter processing activities at the Kennedy Space Center

    Science.gov (United States)

    Safford, Robert R.; Jackson, Andrew E.; Swart, William W.; Barth, Timothy S.

    1994-01-01

    Successful ground processing at KSC requires that flight hardware and ground support equipment conform to specifications at tens of thousands of checkpoints. Knowledge of conformance is an essential requirement for launch. That knowledge of conformance at every requisite point does not, however, enable identification of past problems with equipment, or potential problem areas. This paper describes how the introduction of Statistical Process Control and Process Capability Analysis identification procedures into existing shuttle processing procedures can enable identification of potential problem areas and candidates for improvements to increase processing performance measures. Results of a case study describing application of the analysis procedures to Thermal Protection System processing are used to illustrate the benefits of the approaches described in the paper.
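
    The core of the Statistical Process Control approach described above is a Shewhart-style control chart: limits are estimated from in-control baseline data and later observations outside them flag potential problem areas. A minimal sketch with hypothetical measurements (not KSC data):

    ```python
    import statistics

    # Hypothetical baseline processing-time measurements (hours) for one
    # checkpoint, collected while the process is believed to be in control.
    baseline = [4.1, 3.9, 4.0, 4.2, 3.8, 4.0, 4.1, 3.9, 4.0, 4.0]
    mean = statistics.fmean(baseline)
    sigma = statistics.stdev(baseline)
    ucl, lcl = mean + 3 * sigma, mean - 3 * sigma  # Shewhart 3-sigma limits

    # Flag later observations that fall outside the control limits.
    new_obs = [4.1, 6.0, 3.9]
    flags = [x for x in new_obs if not lcl <= x <= ucl]
    print(flags)
    ```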

  20. Root cause analysis with enriched process logs

    NARCIS (Netherlands)

    Suriadi, S.; Ouyang, C.; Aalst, van der W.M.P.; Hofstede, ter A.H.M.; La Rosa, M.; Soffer, P.

    2013-01-01

    In the field of process mining, the use of event logs for the purpose of root cause analysis is increasingly studied. In such an analysis, the availability of attributes/features that may explain the root cause of some phenomena is crucial. Currently, the process of obtaining these attributes from

  1. Approaching the r-process "waiting point" nuclei below $^{132}$Sn: quadrupole collectivity in $^{128}$Cd

    CERN Multimedia

    Reiter, P; Blazhev, A A; Nardelli, S; Voulot, D; Habs, D; Schwerdtfeger, W; Iwanicki, J S

    We propose to investigate the nucleus $^{128}$Cd neighbouring the r-process "waiting point" $^{130}$Cd. A possible explanation for the peak in the solar r-abundances at A $\approx$ 130 is a quenching of the N = 82 shell closure for spherical nuclei below $^{132}$Sn. This explanation seems to be in agreement with recent $\beta$-decay measurements performed at ISOLDE. In contrast to this picture, a beyond-mean-field approach would explain the anomaly in the excitation energy observed for $^{128}$Cd rather with a quite large quadrupole collectivity. Therefore, we propose to measure the reduced transition strength B(E2) between the ground state and the first excited 2$^{+}$ state in $^{128}$Cd, applying $\gamma$-spectroscopy with MINIBALL after "safe" Coulomb excitation of a post-accelerated beam obtained from REX-ISOLDE. Such a measurement became feasible only because of the source developments made in 2006 for experiment IS411, in particular the use of a heated quartz transfer line. The result from the proposed measure...

  2. Process-based coastal erosion modeling for Drew Point (North Slope, Alaska)

    Science.gov (United States)

    Ravens, Thomas M.; Jones, Benjamin M.; Zhang, Jinlin; Arp, Christopher D.; Schmutz, Joel A.

    2012-01-01

    A predictive, coastal erosion/shoreline change model has been developed for a small coastal segment near Drew Point, Beaufort Sea, Alaska. This coastal setting has experienced a dramatic increase in erosion since the early 2000s. The bluffs at this site are 3-4 m tall and consist of ice-wedge bounded blocks of fine-grained sediments cemented by ice-rich permafrost and capped with a thin organic layer. The bluffs are typically fronted by a narrow (∼5 m wide) beach or none at all. During a storm surge, the sea contacts the base of the bluff and a niche is formed through thermal and mechanical erosion. The niche grows both vertically and laterally and eventually undermines the bluff, leading to block failure or collapse. The fallen block is then eroded both thermally and mechanically by waves and currents, which must occur before a new niche-forming episode may begin. The erosion model explicitly accounts for and integrates a number of these processes including: (1) storm surge generation resulting from wind and atmospheric forcing, (2) erosional niche growth resulting from wave-induced turbulent heat transfer and sediment transport (using the Kobayashi niche erosion model), and (3) thermal and mechanical erosion of the fallen block. The model was calibrated with historic shoreline change data for one time period (1979-2002), and validated with a later time period (2002-2007).

  3. Generating Impact Maps from Automatically Detected Bomb Craters in Aerial Wartime Images Using Marked Point Processes

    Science.gov (United States)

    Kruse, Christian; Rottensteiner, Franz; Hoberg, Thorsten; Ziems, Marcel; Rebke, Julia; Heipke, Christian

    2018-04-01

    The aftermath of wartime attacks is often felt long after the war ended, as numerous unexploded bombs may still exist in the ground. Typically, such areas are documented in so-called impact maps which are based on the detection of bomb craters. This paper proposes a method for the automatic detection of bomb craters in aerial wartime images that were taken during the Second World War. The object model for the bomb craters is represented by ellipses. A probabilistic approach based on marked point processes determines the most likely configuration of objects within the scene. Adding and removing new objects to and from the current configuration, respectively, changing their positions and modifying the ellipse parameters randomly creates new object configurations. Each configuration is evaluated using an energy function. High gradient magnitudes along the border of the ellipse are favored and overlapping ellipses are penalized. Reversible Jump Markov Chain Monte Carlo sampling in combination with simulated annealing provides the global energy optimum, which describes the conformance with a predefined model. For generating the impact map a probability map is defined which is created from the automatic detections via kernel density estimation. By setting a threshold, areas around the detections are classified as contaminated or uncontaminated sites, respectively. Our results show the general potential of the method for the automatic detection of bomb craters and its automated generation of an impact map in a heterogeneous image stock.
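
    The energy function described above rewards strong gradient magnitudes along object borders and penalises overlapping objects. A toy sketch of such an energy, with circles standing in for the ellipse model and invented scores and penalty weight (the real method evaluates gradients on the image and samples configurations with RJMCMC):

    ```python
    import math
    from itertools import combinations

    # Hypothetical detections: (x, y, radius, border_gradient_score).
    config = [
        (10.0, 10.0, 3.0, 0.9),
        (30.0, 12.0, 4.0, 0.7),
        (12.0, 11.0, 3.5, 0.2),   # weak detection overlapping the first one
    ]

    def energy(config, overlap_penalty=1.0):
        # Data term: high gradient magnitude along the border lowers the energy.
        e = -sum(score for *_, score in config)
        # Prior term: penalise each overlapping pair of objects.
        for (x1, y1, r1, _), (x2, y2, r2, _) in combinations(config, 2):
            if math.hypot(x1 - x2, y1 - y2) < r1 + r2:
                e += overlap_penalty
        return e

    # Removing the weak, overlapping detection lowers the total energy,
    # so a sampler would tend to accept that removal.
    print(energy(config), energy(config[:2]))
    ```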

  4. A marked point process approach for identifying neural correlates of tics in Tourette Syndrome.

    Science.gov (United States)

    Loza, Carlos A; Shute, Jonathan B; Principe, Jose C; Okun, Michael S; Gunduz, Aysegul

    2017-07-01

    We propose a novel interpretation of local field potentials (LFP) based on a marked point process (MPP) framework that models relevant neuromodulations as shifted weighted versions of prototypical temporal patterns. Particularly, the MPP samples are categorized according to the well-known oscillatory rhythms of the brain in an effort to elucidate spectrally specific behavioral correlates. The result is a transient model for LFP. We exploit data-driven techniques to fully estimate the model parameters with the added feature of exceptional temporal resolution of the resulting events. We utilize the learned features in the alpha and beta bands to assess correlations to tic events in patients with Tourette Syndrome (TS). The final results show stronger coupling between LFP recorded from the centromedian-parafascicular complex of the thalamus and the tic marks, in comparison to electrocorticogram (ECoG) recordings from the hand area of the primary motor cortex (M1), in terms of the area under the receiver operating characteristic (ROC) curve (AUC).
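
    The AUC used above to quantify coupling can be computed directly from the Mann-Whitney statistic: the probability that a randomly chosen positive example scores higher than a randomly chosen negative one. A minimal sketch with hypothetical window scores (not the study's data):

    ```python
    def auc(scores_pos, scores_neg):
        """Area under the ROC curve via the Mann-Whitney U statistic."""
        wins = 0.0
        for p in scores_pos:
            for n in scores_neg:
                if p > n:
                    wins += 1.0
                elif p == n:
                    wins += 0.5
        return wins / (len(scores_pos) * len(scores_neg))

    # Hypothetical per-window feature magnitudes around tic marks (positive)
    # and away from them (negative).
    tic_windows = [0.9, 0.8, 0.75, 0.6]
    baseline_windows = [0.5, 0.4, 0.65, 0.3]
    print(auc(tic_windows, baseline_windows))
    ```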

  5. An Optimized Multicolor Point-Implicit Solver for Unstructured Grid Applications on Graphics Processing Units

    Science.gov (United States)

    Zubair, Mohammad; Nielsen, Eric; Luitjens, Justin; Hammond, Dana

    2016-01-01

    In the field of computational fluid dynamics, the Navier-Stokes equations are often solved using an unstructured-grid approach to accommodate geometric complexity. Implicit solution methodologies for such spatial discretizations generally require frequent solution of large tightly-coupled systems of block-sparse linear equations. The multicolor point-implicit solver used in the current work typically requires a significant fraction of the overall application run time. In this work, an efficient implementation of the solver for graphics processing units is proposed. Several factors present unique challenges to achieving an efficient implementation in this environment. These include the variable amount of parallelism available in different kernel calls, indirect memory access patterns, low arithmetic intensity, and the requirement to support variable block sizes. The solver is therefore reformulated to use standard sparse and dense Basic Linear Algebra Subprograms (BLAS) functions. However, numerical experiments show that the performance of the BLAS functions available in existing CUDA libraries is suboptimal for matrices representative of those encountered in actual simulations. Instead, optimized versions of these functions are developed. Depending on block size, the new implementations show performance gains of up to 7x over the existing CUDA library functions.
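
    The multicolor point-implicit idea can be sketched on a scalar (block size 1) model problem: unknowns are colored so that no two coupled unknowns share a color, and all points of one color can then be updated simultaneously. A red-black sweep on a 1D Poisson system, a stand-in for the block-sparse systems in the paper:

    ```python
    import numpy as np

    # 1D Poisson system A x = b with red-black coloring.
    n = 8
    A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    b = np.ones(n)

    x = np.zeros(n)
    colors = [np.arange(0, n, 2), np.arange(1, n, 2)]  # neighbours never share a color
    for sweep in range(200):
        for idx in colors:
            # All points of one color are mutually independent and could be
            # updated in parallel (the source of GPU parallelism here).
            x[idx] = (b[idx] - A[idx] @ x + A[idx, idx] * x[idx]) / A[idx, idx]

    residual = np.linalg.norm(b - A @ x)
    print(residual)
    ```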

  6. Plasmon point spread functions: How do we model plasmon-mediated emission processes?

    Science.gov (United States)

    Willets, Katherine A.

    2014-02-01

    A major challenge with studying plasmon-mediated emission events is the small size of plasmonic nanoparticles relative to the wavelength of light. Objects smaller than roughly half the wavelength of light will appear as diffraction-limited spots in far-field optical images, presenting a significant experimental challenge for studying plasmonic processes on the nanoscale. Super-resolution imaging has recently been applied to plasmonic nanosystems and allows plasmon-mediated emission to be resolved on the order of ~5 nm. In super-resolution imaging, a diffraction-limited spot is fit to some model function in order to calculate the position of the emission centroid, which represents the location of the emitter. However, the accuracy of the centroid position strongly depends on how well the fitting function describes the data. This Perspective discusses the commonly used two-dimensional Gaussian fitting function applied to super-resolution imaging of plasmon-mediated emission, then introduces an alternative model based on dipole point spread functions. The two fitting models are compared and contrasted for super-resolution imaging of nanoparticle scattering/luminescence, surface-enhanced Raman scattering, and surface-enhanced fluorescence.
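
    To illustrate the centroid-fitting step, here is a minimal fit for an idealized, noise-free 2D Gaussian spot (a stand-in for a diffraction-limited image; all parameters are invented). A noise-free Gaussian is quadratic in log-intensity, so a linear least-squares fit recovers the centroid; real data require nonlinear fitting and, as the Perspective argues, a model matched to the emission physics:

    ```python
    import numpy as np

    # Synthetic symmetric Gaussian spot with known emitter position (x0, y0).
    x0, y0, sigma, amp = 10.3, 9.6, 2.0, 1000.0
    y, x = np.mgrid[0:21, 0:21].astype(float)
    image = amp * np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma ** 2))

    # Fit log(I) = a + b*x + c*y + d*(x^2 + y^2); then centroid = (-b/2d, -c/2d).
    X = np.column_stack([np.ones(x.size), x.ravel(), y.ravel(),
                         x.ravel() ** 2 + y.ravel() ** 2])
    coef, *_ = np.linalg.lstsq(X, np.log(image.ravel()), rcond=None)
    cx, cy = -coef[1] / (2 * coef[3]), -coef[2] / (2 * coef[3])
    print(cx, cy)
    ```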

  7. Numerical analysis of slowest heating or cooling point in a canned food in oil

    Energy Technology Data Exchange (ETDEWEB)

    Hanzawa, T.; Wang, Q.; Suzuki, M.; Sakai, N. [Tokyo Univ. of Fisheries (Japan)

    1998-06-01

    In the sterilizing process of canned fish meat in oil, such as tunny, the slowest heating or cooling point is very important for determining the thermal process of the can. To obtain the slowest point, the temperature profiles in the solid food are estimated by numerical calculation from the fundamental equations at unsteady state, taking into account free convection in the space occupied by the oil. The positions of the slowest heating or cooling point in the canned food in oil are obtained accurately, and a correlative equation for the position is obtained numerically under various operating conditions. The calculated temperature profiles and the positions of both slowest points agree well with the experimental ones. 4 refs., 9 figs.

  8. Environmental protection standards - from the point of view of systems analysis

    Energy Technology Data Exchange (ETDEWEB)

    Becker, K

    1978-11-01

    A project of the International Institute of Applied Systems Analysis (IIASA) in Laxenburg castle near Vienna is reviewed where standards for environmental protection are interpreted from the point of view of systems analysis. Some examples are given to show how results are influenced not only by technical and economic factors but also by psychological and political factors.

  9. Environmental protection standards - from the point of view of systems analysis

    International Nuclear Information System (INIS)

    Becker, K.

    1978-01-01

    A project of the International Institute of Applied Systems Analysis (IIASA) in Laxenburg castle near Vienna is reviewed where standards for environmental protection are interpreted from the point of view of systems analysis. Some examples are given to show how results are influenced not only by technical and economic factors but also by psychological and political factors. (orig.) [de

  10. A core ontology for business process analysis

    NARCIS (Netherlands)

    Pedrinaci, C.; Domingue, J.; Alves De Medeiros, A.K.; Bechhofer, S.; Hauswirth, M.; Hoffmann, J.; Koubarakis, M.

    2008-01-01

    Business Process Management (BPM) aims at supporting the whole life-cycle necessary to deploy and maintain business processes in organisations. An important step of the BPM life-cycle is the analysis of the processes deployed in companies. However, the degree of automation currently achieved cannot

  11. Temperature calibration procedure for thin film substrates for thermo-ellipsometric analysis using melting point standards

    International Nuclear Information System (INIS)

    Kappert, Emiel J.; Raaijmakers, Michiel J.T.; Ogieglo, Wojciech; Nijmeijer, Arian; Huiskes, Cindy; Benes, Nieck E.

    2015-01-01

    Highlights: • Facile temperature calibration method for thermo-ellipsometric analysis. • The melting point of thin films of indium, lead, zinc, and water can be detected by ellipsometry. • In-situ calibration of ellipsometry hot stage, without using any external equipment. • High-accuracy temperature calibration (±1.3 °C). - Abstract: Precise and accurate temperature control is pertinent to studying thermally activated processes in thin films. Here, we present a calibration method for the substrate–film interface temperature using spectroscopic ellipsometry. The method is adapted from temperature calibration methods that are well developed for thermogravimetric analysis and differential scanning calorimetry instruments, and is based on probing a transition temperature. Indium, lead, and zinc could be spread on a substrate, and the phase transition of these metals could be detected by a change in the Ψ signal of the ellipsometer. For water, the phase transition could be detected by a loss of signal intensity as a result of light scattering by the ice crystals. The combined approach allowed for construction of a linear calibration curve with an accuracy of 1.3 °C or lower over the full temperature range

  12. Datum Feature Extraction and Deformation Analysis Method Based on Normal Vector of Point Cloud

    Science.gov (United States)

    Sun, W.; Wang, J.; Jin, F.; Liang, Z.; Yang, Y.

    2018-04-01

    In order to solve the problem of lacking an applicable analysis method when applying three-dimensional laser scanning technology to deformation monitoring, an efficient method for extracting datum features and analysing deformation based on point cloud normal vectors is proposed. Firstly, a kd-tree is used to establish the topological relation. Datum points are detected by tracking the normal vector of the point cloud, determined from the normal vectors of local planes. Then, cubic B-spline curve fitting is performed on the datum points. Finally, the datum elevation and the inclination angle of the radial point are calculated according to the fitted curve, and the deformation information is analyzed. The proposed approach was verified on a real large-scale tank data set captured with a terrestrial laser scanner in a chemical plant. The results show that the method can obtain the entire information of the monitored object quickly and comprehensively, and accurately reflect the datum feature deformation.
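
    Normal vectors of a point cloud are commonly estimated by PCA on a local neighbourhood: the normal of the best-fit plane is the eigenvector of the covariance matrix with the smallest eigenvalue. A minimal sketch on a synthetic noisy plane (not the tank data):

    ```python
    import numpy as np

    def plane_normal(points):
        """Normal of the best-fit local plane: eigenvector of the covariance
        matrix with the smallest eigenvalue (standard PCA normal estimation)."""
        centered = points - points.mean(axis=0)
        eigvals, eigvecs = np.linalg.eigh(centered.T @ centered)
        return eigvecs[:, 0]  # eigh returns eigenvalues in ascending order

    # Noisy samples of the plane z = 0; the true normal is (0, 0, +/-1).
    rng = np.random.default_rng(0)
    pts = np.column_stack([rng.uniform(-1, 1, 100),
                           rng.uniform(-1, 1, 100),
                           rng.normal(0, 0.01, 100)])
    n = plane_normal(pts)
    print(n)
    ```

    In practice the neighbourhood for each point would come from the kd-tree mentioned in the abstract.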

  13. Point process models for spatio-temporal distance sampling data from a large-scale survey of blue whales

    KAUST Repository

    Yuan, Yuan; Bachl, Fabian E.; Lindgren, Finn; Borchers, David L.; Illian, Janine B.; Buckland, Stephen T.; Rue, Haavard; Gerrodette, Tim

    2017-01-01

    Distance sampling is a widely used method for estimating wildlife population abundance. The fact that conventional distance sampling methods are partly design-based constrains the spatial resolution at which animal density can be estimated using these methods. Estimates are usually obtained at survey stratum level. For an endangered species such as the blue whale, it is desirable to estimate density and abundance at a finer spatial scale than stratum. Temporal variation in the spatial structure is also important. We formulate the process generating distance sampling data as a thinned spatial point process and propose model-based inference using a spatial log-Gaussian Cox process. The method adopts a flexible stochastic partial differential equation (SPDE) approach to model spatial structure in density that is not accounted for by explanatory variables, and integrated nested Laplace approximation (INLA) for Bayesian inference. It allows simultaneous fitting of detection and density models and permits prediction of density at an arbitrarily fine scale. We estimate blue whale density in the Eastern Tropical Pacific Ocean from thirteen shipboard surveys conducted over 22 years. We find that higher blue whale density is associated with colder sea surface temperatures in space, and although there is some positive association between density and mean annual temperature, our estimates are consistent with no trend in density across years. Our analysis also indicates that there is substantial spatially structured variation in density that is not explained by available covariates.
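
    The "thinned spatial point process" formulation can be illustrated with Lewis-Shedler thinning: simulate a homogeneous Poisson process at an upper-bound intensity, then retain each point with probability equal to the ratio of the true intensity to that bound. The intensity surface below is invented for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    lam_max = 200.0  # upper bound on the intensity over the unit square

    def intensity(x, y):
        """Hypothetical inhomogeneous intensity, increasing toward x = 1."""
        return 200.0 * x

    # Homogeneous Poisson candidates on the unit square (area = 1)...
    n = rng.poisson(lam_max)
    cand = rng.uniform(0, 1, size=(n, 2))
    # ...thinned with retention probability intensity / lam_max.
    keep = rng.uniform(0, 1, n) < intensity(cand[:, 0], cand[:, 1]) / lam_max
    points = cand[keep]

    # Expected number of retained points = integral of the intensity = 100.
    print(len(points))
    ```

    In distance sampling, an additional thinning by the detection probability links the observed points to the underlying density surface.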

  14. Point process models for spatio-temporal distance sampling data from a large-scale survey of blue whales

    KAUST Repository

    Yuan, Yuan

    2017-12-28

    Distance sampling is a widely used method for estimating wildlife population abundance. The fact that conventional distance sampling methods are partly design-based constrains the spatial resolution at which animal density can be estimated using these methods. Estimates are usually obtained at survey stratum level. For an endangered species such as the blue whale, it is desirable to estimate density and abundance at a finer spatial scale than stratum. Temporal variation in the spatial structure is also important. We formulate the process generating distance sampling data as a thinned spatial point process and propose model-based inference using a spatial log-Gaussian Cox process. The method adopts a flexible stochastic partial differential equation (SPDE) approach to model spatial structure in density that is not accounted for by explanatory variables, and integrated nested Laplace approximation (INLA) for Bayesian inference. It allows simultaneous fitting of detection and density models and permits prediction of density at an arbitrarily fine scale. We estimate blue whale density in the Eastern Tropical Pacific Ocean from thirteen shipboard surveys conducted over 22 years. We find that higher blue whale density is associated with colder sea surface temperatures in space, and although there is some positive association between density and mean annual temperature, our estimates are consistent with no trend in density across years. Our analysis also indicates that there is substantial spatially structured variation in density that is not explained by available covariates.

  15. ISRIA statement: ten-point guidelines for an effective process of research impact assessment.

    Science.gov (United States)

    Adam, Paula; Ovseiko, Pavel V; Grant, Jonathan; Graham, Kathryn E A; Boukhris, Omar F; Dowd, Anne-Maree; Balling, Gert V; Christensen, Rikke N; Pollitt, Alexandra; Taylor, Mark; Sued, Omar; Hinrichs-Krapels, Saba; Solans-Domènech, Maite; Chorzempa, Heidi

    2018-02-08

    As governments, funding agencies and research organisations worldwide seek to maximise both the financial and non-financial returns on investment in research, the way the research process is organised and funded is coming under increasing scrutiny. There are growing demands and aspirations to measure research impact (beyond academic publications), to understand how science works, and to optimise its societal and economic impact. In response, a multidisciplinary practice called research impact assessment is rapidly developing. Given that the practice is still in its formative stage, systematised recommendations or accepted standards for practitioners (such as funders and those responsible for managing research projects) across countries or disciplines to guide research impact assessment are not yet available. In this statement, we propose initial guidelines for a rigorous and effective process of research impact assessment applicable to all research disciplines and oriented towards practice. This statement systematises expert knowledge and practitioner experience from designing and delivering the International School on Research Impact Assessment (ISRIA). It brings together insights from over 450 experts and practitioners from 34 countries, who participated in the school during its 5-year run (from 2013 to 2017) and shares a set of core values from the school's learning programme. These insights are distilled into ten-point guidelines, which relate to (1) context, (2) purpose, (3) stakeholders' needs, (4) stakeholder engagement, (5) conceptual frameworks, (6) methods and data sources, (7) indicators and metrics, (8) ethics and conflicts of interest, (9) communication, and (10) community of practice. The guidelines can help practitioners improve and standardise the process of research impact assessment, but they are by no means exhaustive and require evaluation and continuous improvement.
The prima facie effectiveness of the guidelines is based on the systematised

  16. Analysis of point source size on measurement accuracy of lateral point-spread function of confocal Raman microscopy

    Science.gov (United States)

    Fu, Shihang; Zhang, Li; Hu, Yao; Ding, Xiang

    2018-01-01

    Confocal Raman Microscopy (CRM) has matured to become one of the most powerful instruments in analytical science because of its molecular sensitivity and high spatial resolution. Compared with conventional Raman Microscopy, CRM can perform three-dimensional mapping of tiny samples and has the advantage of high spatial resolution thanks to its unique pinhole. With the wide application of the instrument, there is a growing requirement for the evaluation of the imaging performance of the system. The point-spread function (PSF) is an important approach to evaluating the imaging capability of an optical instrument. Among the variety of PSF measurement methods, the point source method has been widely used because it is easy to operate and the measurement results approximate the true PSF. In the point source method, the point source size has a significant impact on the final measurement accuracy. In this paper, the influence of point source size on the measurement accuracy of the PSF is analyzed and verified experimentally. A theoretical model of the lateral PSF for CRM is established and the effect of point source size on the full-width at half maximum of the lateral PSF is simulated. For long-term preservation and measurement convenience, a PSF measurement phantom using polydimethylsiloxane resin, doped with different sizes of polystyrene microspheres, is designed. The PSFs of the CRM with different sizes of microspheres are measured and the results are compared with the simulation results. The results provide a guide for measuring the PSF of the CRM.
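
    Extracting the full-width at half maximum (FWHM) from a sampled lateral PSF profile can be sketched numerically. A Gaussian profile is used here as a stand-in for the true PSF, with an invented width, so the numeric estimate can be checked against the analytic Gaussian FWHM:

    ```python
    import numpy as np

    # Sampled lateral profile; sigma is illustrative (microns).
    sigma = 0.12
    r = np.linspace(-1.0, 1.0, 2001)
    profile = np.exp(-r ** 2 / (2 * sigma ** 2))

    # Numeric FWHM: width of the region at or above half the peak value.
    half = profile >= 0.5 * profile.max()
    fwhm = r[half][-1] - r[half][0]

    # Analytic FWHM of a Gaussian: 2 * sqrt(2 * ln 2) * sigma.
    analytic = 2 * np.sqrt(2 * np.log(2)) * sigma
    print(fwhm, analytic)
    ```

    For a measured spot, the finite point source size broadens this value, which is the effect the paper quantifies.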

  17. Nonlinear bending and collapse analysis of a poked cylinder and other point-loaded cylinders

    International Nuclear Information System (INIS)

    Sobel, L.H.

    1983-06-01

    This paper analyzes the geometrically nonlinear bending and collapse behavior of an elastic, simply supported cylindrical shell subjected to an inward-directed point load applied at midlength. The large displacement analysis results for this thin (R/t = 638) poked cylinder were obtained from the STAGSC-1 finite element computer program. STAGSC-1 results are also presented for two other point-loaded shell problems: a pinched cylinder (R/t = 100), and a venetian blind (R/t = 250)

  18. Numerical analysis of sandwich beam with corrugated core under three-point bending

    Energy Technology Data Exchange (ETDEWEB)

    Wittenbeck, Leszek [Poznan University of Technology, Institute of Mathematics Piotrowo Street No. 5, 60-965 Poznan (Poland); Grygorowicz, Magdalena; Paczos, Piotr [Poznan University of Technology, Institute of Applied Mechanics Jana Pawla II Street No. 24, 60-965 Poznan (Poland)

    2015-03-10

    The strength problem of a sandwich beam with a corrugated core under three-point bending is presented. The beams are made of steel and formed by three mutually orthogonal corrugated layers. The finite element analysis (FEA) of the sandwich beam is performed with the use of the FEM system ABAQUS. The relationship between the applied load and the deflection in three-point bending is considered.
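
    For orientation, the classical Euler-Bernoulli result for a simply supported beam under a central load gives the midspan deflection as delta = F L^3 / (48 E I); FEA results for sandwich beams are often compared against it. A quick check with illustrative values, not the sandwich-beam properties from the paper:

    ```python
    # Midspan deflection of a simply supported beam under a central load
    # (Euler-Bernoulli theory): delta = F * L**3 / (48 * E * I).
    F = 1000.0   # N, applied load (illustrative)
    L = 0.5      # m, support span (illustrative)
    E = 200e9    # Pa, Young's modulus of steel
    I = 8.0e-9   # m^4, second moment of area (illustrative)

    delta = F * L ** 3 / (48 * E * I)
    print(f"midspan deflection = {delta * 1000:.3f} mm")
    ```

    A sandwich beam with a compliant core deflects more than this estimate predicts, because core shear adds to the bending deflection.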

  19. Analysis of the flight dynamics of the Solar Maximum Mission (SMM) off-sun scientific pointing

    Science.gov (United States)

    Pitone, D. S.; Klein, J. R.; Twambly, B. J.

    1990-01-01

    Algorithms are presented which were created and implemented by the Goddard Space Flight Center's (GSFC's) Solar Maximum Mission (SMM) attitude operations team to support large-angle spacecraft pointing at scientific objectives. The mission objective of the post-repair SMM satellite was to study solar phenomena. However, because the scientific instruments, such as the Coronagraph/Polarimeter (CP) and the Hard X-ray Burst Spectrometer (HXRBS), were able to view objects other than the Sun, attitude operations support for attitude pointing at large angles from the nominal solar-pointing attitudes was required. Subsequently, attitude support for SMM was provided for scientific objectives such as Comet Halley, Supernova 1987A, Cygnus X-1, and the Crab Nebula. In addition, the analysis was extended to include the reverse problem, computing the right ascension and declination of a body given the off-Sun angles. This analysis led to the computation of the orbits of seven new solar comets seen in the field-of-view (FOV) of the CP. The activities necessary to meet these large-angle attitude-pointing sequences, such as slew sequence planning, viewing-period prediction, and tracking-bias computation are described. Analysis is presented for the computation of maneuvers and pointing parameters relative to the SMM-unique, Sun-centered reference frame. Finally, science data and independent attitude solutions are used to evaluate the large-angle pointing performance.
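
    The forward computation underlying off-Sun pointing is the great-circle angle between two celestial directions given by right ascension and declination. A minimal sketch of that spherical-trigonometry step (not the SMM operational code):

    ```python
    import math

    def off_sun_angle(ra1, dec1, ra2, dec2):
        """Great-circle angle (degrees) between two (RA, Dec) directions,
        e.g. a target and the Sun; all inputs in degrees."""
        ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
        cosang = (math.sin(dec1) * math.sin(dec2)
                  + math.cos(dec1) * math.cos(dec2) * math.cos(ra1 - ra2))
        return math.degrees(math.acos(max(-1.0, min(1.0, cosang))))

    # Sanity check: 90 degrees between the celestial pole and the equator.
    print(off_sun_angle(0.0, 90.0, 123.0, 0.0))
    ```

    The reverse problem mentioned in the abstract inverts this relation to recover RA and Dec from measured off-Sun angles.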

  20. Hazard analysis and critical control point to irradiated food in Brazil

    International Nuclear Information System (INIS)

    Boaratti, Maria de Fatima Guerra

    2004-01-01

    Food borne diseases, in particular gastro-intestinal infections, represent a very large group of pathologies with a strong negative impact on the health of the population because of their widespread nature. Little consideration is given to such conditions due to the fact that their symptoms are often moderate and self-limiting. This has led to a general underestimation of their importance, and consequently to incorrect practices during the preparation and preservation of food, resulting in the frequent occurrence of outbreaks involving groups of varying numbers of consumers. Despite substantial efforts in the avoidance of contamination, an upward trend in the number of outbreaks of food borne illnesses caused by non-spore forming pathogenic bacteria is reported in many countries. Good hygienic practices can reduce the level of contamination but the most important pathogens cannot presently be eliminated from most farms, nor is it possible to eliminate them by primary processing, particularly from those foods which are sold raw. Several decontamination methods exist but the most versatile treatment among them is the ionizing radiation procedure. HACCP (Hazard Analysis and Critical Control Point) is a management system in which food safety is addressed through the analysis and control of biological, chemical, and physical hazards from raw material production, procurement and handling, to manufacturing, distribution and consumption of the finished product. For successful implementation of a HACCP plan, management must be strongly committed to the HACCP concept. A firm commitment to HACCP by top management provides company employees with a sense of the importance of producing safe food. At the same time, it has to be always emphasized that, like other intervention strategies, irradiation must be applied as part of a total sanitation program.
The benefits of irradiation should never be considered as an excuse for poor quality or for poor handling and storage conditions.

  1. Point process models for localization and interdependence of punctate cellular structures.

    Science.gov (United States)

    Li, Ying; Majarian, Timothy D; Naik, Armaghan W; Johnson, Gregory R; Murphy, Robert F

    2016-07-01

    Accurate representations of cellular organization for multiple eukaryotic cell types are required for creating predictive models of dynamic cellular function. To this end, we have previously developed the CellOrganizer platform, an open source system for generative modeling of cellular components from microscopy images. CellOrganizer models capture the inherent heterogeneity in the spatial distribution, size, and quantity of different components among a cell population. Furthermore, CellOrganizer can generate quantitatively realistic synthetic images that reflect the underlying cell population. A current focus of the project is to model the complex, interdependent nature of organelle localization. We built upon previous work on developing multiple non-parametric models of organelles or structures that show punctate patterns. The previous models described the relationships between the subcellular localization of puncta and the positions of cell and nuclear membranes and microtubules. We extend these models to consider the relationship to the endoplasmic reticulum (ER), and to consider the relationship between the positions of different puncta of the same type. Our results do not suggest that the punctate patterns we examined are dependent on ER position or inter- and intra-class proximity. With these results, we built classifiers to update previous assignments of proteins to one of 11 patterns in three distinct cell lines. Our generative models demonstrate the ability to construct statistically accurate representations of puncta localization from simple cellular markers in distinct cell types, capturing the complex phenomena of cellular structure interaction with little human input. This protocol represents a novel approach to vesicular protein annotation, a field that is often neglected in high-throughput microscopy. These results suggest that spatial point process models provide useful insight with respect to the spatial dependence between cellular structures.
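The baseline against which such spatial dependence is judged is complete spatial randomness, i.e. a homogeneous Poisson point process. As a hypothetical, self-contained sketch (not the CellOrganizer code), one can simulate such a pattern on the unit square and compare its mean nearest-neighbour distance to the theoretical value 1/(2·sqrt(λ)):

```python
import math
import random

def rpoisson(lam, rng):
    # Knuth's algorithm: multiply uniforms until the product drops below e^{-lam}.
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def simulate_poisson_pattern(intensity, rng):
    # Homogeneous Poisson process on the unit square: the point count is
    # Poisson(intensity), and locations are i.i.d. uniform.
    n = rpoisson(intensity, rng)
    return [(rng.random(), rng.random()) for _ in range(n)]

def mean_nn_distance(points):
    # Mean nearest-neighbour distance; for a Poisson process of intensity
    # lam this is close to 1 / (2 * sqrt(lam)), up to edge effects.
    total = 0.0
    for i, (xi, yi) in enumerate(points):
        best = min(math.hypot(xi - xj, yi - yj)
                   for j, (xj, yj) in enumerate(points) if j != i)
        total += best
    return total / len(points)

rng = random.Random(42)
pts = simulate_poisson_pattern(200.0, rng)
print(len(pts), round(mean_nn_distance(pts), 4))
```

Departures of the observed nearest-neighbour statistic from this baseline are what point-process tests of clustering or inhibition quantify.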

  2. Moments analysis of concurrent Poisson processes

    International Nuclear Information System (INIS)

    McBeth, G.W.; Cross, P.

    1975-01-01

    A moments analysis of concurrent Poisson processes has been carried out. Equations are given which relate combinations of distribution moments to sums of products involving the number of counts associated with the processes and the mean rate of the processes. Elimination of background is discussed, and equations suitable for processing random radiation, parent-daughter pairs in the presence of background, and triple and double correlations in the presence of background are given. The theory of identification of the four principal radioactive series by moments analysis is discussed. (Auth.)
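The key property behind such a moments analysis is that counts from concurrent independent Poisson processes superpose into a single Poisson count whose rate is the sum of the individual rates, so the first two central moments coincide. A small illustrative simulation (not the authors' formulation) checks this:

```python
import math
import random

def rpoisson(lam, rng):
    # Knuth's algorithm for sampling a Poisson(lam) count.
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

rng = random.Random(1)
lam1, lam2, t = 3.0, 5.0, 1.0

# Counts from two concurrent (independent) Poisson processes observed together:
# the superposed count is itself Poisson with rate lam1 + lam2, so its mean
# and variance are both (lam1 + lam2) * t = 8.
counts = [rpoisson(lam1 * t, rng) + rpoisson(lam2 * t, rng) for _ in range(20000)]
mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / len(counts)
print(round(mean, 2), round(var, 2))
```

A mean noticeably different from the variance would signal a non-Poisson component, which is the kind of signature moment-based background elimination exploits.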

  3. Bubble point pressures of the selected model system for CatLiq® bio-oil process

    DEFF Research Database (Denmark)

    Toor, Saqib Sohail; Rosendahl, Lasse; Baig, Muhammad Noman

    2010-01-01

    The CatLiq® process is a second generation catalytic liquefaction process for the production of bio-oil from WDGS (Wet Distillers Grains with Solubles) at subcritical conditions (280-350 °C and 225-250 bar) in the presence of a homogeneous alkaline and a heterogeneous Zirconia catalyst. In this work, the bubble point pressures of a selected model mixture (CO2 + H2O + Ethanol + Acetic acid + Octanoic acid) were measured to investigate the phase boundaries of the CatLiq® process. The bubble points were measured in the JEFRI-DBR high pressure PVT phase behavior system. The experimental results...

  4. Structured Spatio-temporal shot-noise Cox point process models, with a view to modelling forest fires

    DEFF Research Database (Denmark)

    Møller, Jesper; Diaz-Avalos, Carlos

    2010-01-01

    Spatio-temporal Cox point process models with a multiplicative structure for the driving random intensity, incorporating covariate information into temporal and spatial components, and with a residual term modelled by a shot-noise process, are considered. Such models are flexible and tractable fo...... data set consisting of 2796 days and 5834 spatial locations of fires. The model is compared with a spatio-temporal log-Gaussian Cox point process model, and likelihood-based methods are discussed to some extent....

  5. Three-dimensional model analysis and processing

    CERN Document Server

    Yu, Faxin; Luo, Hao; Wang, Pinghui

    2011-01-01

    This book focuses on five hot research directions in 3D model analysis and processing in computer science:  compression, feature extraction, content-based retrieval, irreversible watermarking and reversible watermarking.

  6. Artificial intelligence applied to process signal analysis

    Science.gov (United States)

    Corsberg, Dan

    1988-01-01

    Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.

  7. A numerical analysis on forming limits during spiral and concentric single point incremental forming

    Science.gov (United States)

    Gipiela, M. L.; Amauri, V.; Nikhare, C.; Marcondes, P. V. P.

    2017-01-01

    Sheet metal forming is one of the major manufacturing industries, producing numerous parts for the aerospace, automotive and medical industries. Driven by high demand in the vehicle industry on one hand and environmental regulations requiring lower fuel consumption on the other, researchers are developing energy-efficient sheet metal forming methods to produce lightweight parts, instead of the conventional punch and die. One of the most recognized manufacturing processes in this category is Single Point Incremental Forming (SPIF). SPIF is a die-less sheet metal forming process in which a single-point tool incrementally forces one point of the sheet metal at a time into the plastic deformation zone. In the present work, the finite element method (FEM) is applied to analyze the forming limits of a high strength low alloy steel formed by single point incremental forming (SPIF) with spiral and concentric tool paths. SPIF numerical simulations were modeled with 24 and 29 mm cup depths, and the results were compared with Nakajima results obtained by experiments and FEM. It was found that the cup formed with the Nakajima tool failed at 24 mm, while cups formed by SPIF surpassed the limit at both depths with both profiles. It was also noticed that the strains achieved with the concentric profile are lower than those with the spiral profile.

  8. Summary of process research analysis efforts

    Science.gov (United States)

    Burger, D. R.

    1985-01-01

    A summary of solar-cell process research analysis efforts is presented. Process design and cell design are interactive efforts in which technology from integrated circuit processes and other processes is blended. The primary factors that control cell efficiency are: (1) the bulk parameters of the available sheet material, (2) the retention and enhancement of these bulk parameters, and (3) the cell design and the cost to produce versus the finished cell's performance. The process sequences need to be tailored to be compatible with the sheet form, the cell shape, and the processing equipment. New process options that require further evaluation and utilization are lasers, robotics, thermal pulse techniques, and new materials. There are numerous process control techniques that can be adapted and used to improve product uniformity and reduce costs. Two factors that can lead to longer-life modules are the use of solar cell diffusion barriers and improved encapsulation.

  9. Radar signal analysis and processing using Matlab

    CERN Document Server

    Mahafza, Bassem R

    2008-01-01

    Offering radar-related software for the analysis and design of radar waveform and signal processing, this book provides comprehensive coverage of radar signals and signal processing techniques and algorithms. It contains numerous graphical plots, common radar-related functions, table format outputs, and end-of-chapter problems. The complete set of MATLAB[registered] functions and routines are available for download online.

  10. Vygotsky's Analysis of Children's Meaning Making Processes

    Science.gov (United States)

    Mahn, Holbrook

    2012-01-01

    Vygotsky's work is extensive and covers many aspects of the development of children's meaning-making processes in social and cultural contexts. However, his main focus is on the examination of the unification of speaking and thinking processes. His investigation centers on the analysis of the entity created by this unification--an internal…

  11. The implementation of a Hazard Analysis and Critical Control Point management system in a peanut butter ice cream plant.

    Science.gov (United States)

    Hung, Yu-Ting; Liu, Chi-Te; Peng, I-Chen; Hsu, Chin; Yu, Roch-Chui; Cheng, Kuan-Chen

    2015-09-01

    To ensure the safety of the peanut butter ice cream manufacture, a Hazard Analysis and Critical Control Point (HACCP) plan has been designed and applied to the production process. Potential biological, chemical, and physical hazards in each manufacturing procedure were identified. Critical control points for the peanut butter ice cream were then determined as the pasteurization and freezing process. The establishment of a monitoring system, corrective actions, verification procedures, and documentation and record keeping were followed to complete the HACCP program. The results of this study indicate that implementing the HACCP system in food industries can effectively enhance food safety and quality while improving the production management. Copyright © 2015. Published by Elsevier B.V.
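A CCP such as pasteurization reduces, in software terms, to a monitoring rule with critical limits and a corrective action. The sketch below is purely illustrative; the temperature and holding-time limits are placeholder values, not the validated limits of the plant described above:

```python
def pasteurization_ccp_ok(temp_c, hold_s, limit_c=79.0, min_hold_s=25.0):
    # Returns (passed, action). The critical limits are hypothetical
    # placeholders; a real HACCP plan uses validated, documented limits.
    if temp_c >= limit_c and hold_s >= min_hold_s:
        return True, "record and release"
    return False, "hold batch and re-pasteurize"

# Monitoring two batches: one within limits, one below temperature.
print(pasteurization_ccp_ok(80.5, 30.0))  # passes
print(pasteurization_ccp_ok(76.0, 30.0))  # triggers corrective action
```

The record-keeping and verification steps of the HACCP program then amount to logging every such check and periodically auditing the log.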

  12. The study and analysis of point-to-point vibration isolation and its utility to seismic base isolator

    International Nuclear Information System (INIS)

    Mehboob, M.; Qureshi, A.S.

    2001-01-01

    This paper presents a systematic approach to piecewise vibration isolation, generally termed point-to-point vibration isolation, and its broad-spectrum utility for economic seismic base isolation. Transfer curves for Coulomb-damped (i.e. softening-damper) flexible mountings are presented, and the utility has been proved equally good for both rigidly and elastically coupled damping. It is clearly shown that very close solutions are easily obtainable for both the slipping and sticking phases of the motion. This eliminates the conventional conceptual approximations based on linearization of the damping. This new concept will not endanger a superstructure mounted on such isolation systems. (author)

  13. [Introduction of hazard analysis and critical control points (HACCP) principles at the flight catering food production plant].

    Science.gov (United States)

    Popova, A Yu; Trukhina, G M; Mikailova, O M

    The article considers the quality control and safety system implemented in one of the largest flight catering food production plants, serving airline passengers and flight crews. The control system was based on the Hazard Analysis and Critical Control Point (HACCP) principles and the hygienic and anti-epidemic measures developed. The identification of hazard factors at each stage of the technical process is considered. Results are presented from the analysis of monitoring data for 6 critical control points over a five-year period. The quality control and safety system permits a reduction of the food contamination risk during acceptance, preparation and supply of in-flight meals. The efficiency of the implemented system was demonstrated. Further ways of harmonizing and implementing the HACCP principles in the plant are determined.

  14. IRB Process Improvements: A Machine Learning Analysis.

    Science.gov (United States)

    Shoenbill, Kimberly; Song, Yiqiang; Cobb, Nichelle L; Drezner, Marc K; Mendonca, Eneida A

    2017-06-01

    Clinical research involving humans is critically important, but it is a lengthy and expensive process. Most studies require institutional review board (IRB) approval. Our objective is to identify predictors of delays or accelerations in the IRB review process and apply this knowledge to inform process change in an effort to improve IRB efficiency, transparency, consistency and communication. We analyzed timelines of protocol submissions to determine protocol or IRB characteristics associated with different processing times. Our evaluation included single variable analysis to identify significant predictors of IRB processing time and machine learning methods to predict processing times through the IRB review system. Based on initial identified predictors, changes to IRB workflow and staffing procedures were instituted and we repeated our analysis. Our analysis identified several predictors of delays in the IRB review process including type of IRB review to be conducted, whether a protocol falls under Veteran's Administration purview and specific staff in charge of a protocol's review. We have identified several predictors of delays in IRB protocol review processing times using statistical and machine learning methods. Application of this knowledge to process improvement efforts in two IRBs has led to increased efficiency in protocol review. The workflow and system enhancements that are being made support our four-part goal of improving IRB efficiency, consistency, transparency, and communication.

  15. Study on characteristic points of boiling curve by using wavelet analysis and genetic algorithm

    International Nuclear Information System (INIS)

    Wei Huiming; Su Guanghui; Qiu Suizheng; Yang Xingbo

    2009-01-01

    Based on the wavelet analysis theory of signal singularity detection, the critical heat flux (CHF) and the minimum film boiling starting point (q_min) of boiling curves can be detected and analyzed using wavelet multi-resolution analysis. To predict the CHF in engineering, empirical relations were obtained based on a genetic algorithm. The results of wavelet detection and genetic-algorithm prediction agree very well with experimental data. (authors)
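The underlying idea, that wavelet detail coefficients become large at a singularity, can be illustrated with a one-level Haar transform on a synthetic signal with an abrupt jump (a stand-in for a characteristic point such as CHF; this is not the authors' implementation):

```python
import math

def haar_detail(signal):
    # One level of the Haar wavelet transform: detail coefficients are
    # scaled differences of adjacent samples, large where the signal
    # changes abruptly (the basis of singularity detection).
    return [(signal[2 * i] - signal[2 * i + 1]) / math.sqrt(2)
            for i in range(len(signal) // 2)]

# Synthetic stand-in for a boiling curve: a smooth ramp with a sharp jump
# at sample 63, mimicking the discontinuity at a characteristic point.
signal = [0.01 * x + (5.0 if x >= 63 else 0.0) for x in range(128)]

detail = haar_detail(signal)
jump_index = max(range(len(detail)), key=lambda i: abs(detail[i]))
print(2 * jump_index + 1)  # prints 63, the sample where the jump begins
```

In a full multi-resolution analysis the same test is repeated across scales, and genuine singularities are the locations where large coefficients persist from coarse to fine scales.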

  16. Computerization of the safeguards analysis decision process

    International Nuclear Information System (INIS)

    Ehinger, M.H.

    1990-01-01

    This paper reports that safeguards regulations are evolving to meet new demands for timeliness and sensitivity in detecting the loss or unauthorized use of sensitive nuclear materials. The opportunities to meet new rules, particularly in bulk processing plants, involve developing techniques which use modern, computerized process control and information systems. Using these computerized systems in the safeguards analysis involves all the challenges of the man-machine interface experienced in the typical process control application and adds new dimensions to accuracy requirements, data analysis, and alarm resolution in the regulatory environment

  17. One-point fluctuation analysis of the high-energy neutrino sky

    DEFF Research Database (Denmark)

    Feyereisen, Michael R.; Tamborra, Irene; Ando, Shin'ichiro

    2017-01-01

    We perform the first one-point fluctuation analysis of the high-energy neutrino sky. This method reveals itself to be especially suited to contemporary neutrino data, as it allows one to study the properties of the astrophysical components of the high-energy flux detected by the IceCube telescope, even...

  18. Chopped or long roughage: what do calves prefer? Using cross point analysis of double demand functions

    NARCIS (Netherlands)

    Webb, L.E.; Bak Jensen, M.; Engel, B.; Reenen, van C.G.; Gerrits, W.J.J.; Boer, de I.J.M.; Bokkers, E.A.M.

    2014-01-01

    The present study aimed to quantify calves' (Bos taurus) preference for long versus chopped hay and straw, and hay versus straw, using cross point analysis of double demand functions, in a context where energy intake was not a limiting factor. Nine calves, fed milk replacer and concentrate, were

  19. An Exploratory Study: A Kinesic Analysis of Academic Library Public Service Points

    Science.gov (United States)

    Kazlauskas, Edward

    1976-01-01

    An analysis of body movements of individuals at reference and circulation public service points in four academic libraries indicated that both receptive and nonreceptive nonverbal behaviors were used by all levels of library employees, and these behaviors influenced patron interaction. (Author/LS)

  20. Microchip capillary electrophoresis for point-of-care analysis of lithium

    NARCIS (Netherlands)

    Vrouwe, E.X.; Luttge, R.; Vermes, I.; Berg, van den A.

    2007-01-01

    Background: Microchip capillary electrophoresis (CE) is a promising method for chemical analysis of complex samples such as whole blood. We evaluated the method for point-of-care testing of lithium. Methods: Chemical separation was performed on standard glass microchip CE devices with a conductivity

  1. [Pharmaceutical Assistance in the Family Healthcare Program: points of affinity and discord in the organization process].

    Science.gov (United States)

    Silva Oliveira, Tatiana de Alencar; Maria, Tatiane de Oliveira Silva; Alves do Nascimento, Angela Maria; do Nascimento, Angela Alves

    2011-09-01

    The scope of this study was to discuss the organization of the pharmaceutical assistance service in the family healthcare program. Qualitative research from a critical/analytical perspective was conducted in family healthcare units in a municipality of the state of Bahia, Brazil. Data was collected on the basis of systematic observation, semi-structured interviews and document analysis from a dialectic standpoint. The organization of Pharmaceutical Assistance consisted of selection, planning, acquisition, storage and dispensing activities. The process was studied in the implementation phase, which was occurring in a centralized and uncoordinated fashion, without the proposed team work. An excess of activity was observed among the healthcare workers, and there was no continued education policy for the workers. To transform this situation and to ensure the organization of pharmaceutical assistance with quality and in an integrated manner, a reworking of the manner of thinking and acting of the players concerned (managers, health workers and users), who participate directly in the organization, is necessary. Furthermore, mechanical, bureaucratic and impersonal work practices need to be abandoned.

  2. PET and diagnostic technology evaluation in a global clinical process. DGN's point of view

    International Nuclear Information System (INIS)

    Kotzerke, J.; Dietlein, M.; Gruenwald, F.; Bockisch, A.

    2010-01-01

    The German Society of Nuclear Medicine (DGN) criticizes the methodological approach of the IQWiG for the evaluation of PET and its conclusions, which represent the opposite point of view to that of most other European countries and health companies in the USA: (1) Real integration of experienced physicians into the interpretation of data and the evaluation of effectiveness should be used for the best possible reporting, instead of only a formal hearing. (2) Data of the National Oncologic PET Registry (NOPR) from the USA have shown that PET changed the therapeutic management in 38% of patients. (3) The decision of the IQWiG to accept only outcome data for its benefit analyses is controversial. Medical knowledge is generated by different methods, and a current analysis of scientific guidelines has shown that only 15% of all guidelines are based on the level of evidence demanded by the IQWiG. Health economics has created different assessment methods for the evaluation of a diagnostic procedure. The strategy chosen by the IQWiG overestimates the perspective of the population and undervalues the benefit for an individual patient. (4) PET evaluates the effectiveness of a therapeutic procedure, but does not create an effective therapy. When the predictive value of PET is already implemented in a specific study design and the result of PET defines a specific management, the trial evaluates the whole algorithm, and PET is only part of this algorithm. When PET is implemented as a test during chemotherapy or at the end of chemotherapy, the predictive value of PET will depend decisively on the effectiveness of the therapy: the better the therapy, the smaller the differences in survival detected by PET. (5) The significance of optimal staging by the integration of PET will increase. The rationale is the current development of "titration" of chemotherapy intensity and radiation dose towards the lowest possible, just-about-effective dosage. (6) The medical therapy of

  3. Concerning the acid dew point in waste gases from combustion processes

    Energy Technology Data Exchange (ETDEWEB)

    Knoche, K.F.; Deutz, W.; Hein, K.; Derichs, W.

    1986-09-01

    The paper discusses the problems associated with the measurement of the acid dew point and of sulphuric acid (i.e. SO3) concentrations in the flue gas from brown-coal-fired boiler plants. The sulphuric acid content in brown coal flue gas has been measured at 0.5 to 3 vpm, at SO2 concentrations of 200 to 800 vpm. Using a conditional equation, the derivation of which from new formulae for phase stability is described in the paper, an acid dew point temperature of 115 to 125 °C is obtained.
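For comparison, the widely quoted Verhoff-Banchero correlation estimates the sulphuric acid dew point from the partial pressures of water vapour and SO3. The sketch below uses that correlation (coefficients as commonly quoted; treat them as an assumption, and the gas composition as illustrative) with values in the range reported above; the result lands close to the paper's 115-125 °C:

```python
import math

def acid_dew_point_C(x_h2o, x_so3, p_total_mmHg=760.0):
    # Verhoff-Banchero correlation for the sulphuric acid dew point:
    # partial pressures in mmHg, temperature in K. Coefficients as
    # commonly quoted in the literature; an engineering estimate only.
    p_h2o = x_h2o * p_total_mmHg
    p_so3 = x_so3 * p_total_mmHg
    inv_T = (2.276
             - 0.02943 * math.log(p_h2o)
             - 0.0858 * math.log(p_so3)
             + 0.0062 * math.log(p_h2o) * math.log(p_so3)) * 1e-3
    return 1.0 / inv_T - 273.15

# Illustrative flue gas: 15 vol% water vapour and 2 vpm SO3, roughly the
# SO3 range reported for the brown-coal flue gas in the study above.
print(round(acid_dew_point_C(0.15, 2e-6), 1))
```

Because the correlation depends on the logarithms of both partial pressures, even order-of-magnitude changes in the trace SO3 level shift the predicted dew point by only a few tens of degrees.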

  4. Comparison of Clothing Cultures from the View Point of Funeral Procession

    OpenAIRE

    増田, 美子; 大枝, 近子; 梅谷, 知世; 杉本, 浄; 内村, 理奈

    2011-01-01

    The object of this study was to research dress at funeral ceremonies and to clarify the points of difference and commonality between the respective cultural spheres of Buddhism, Hinduism, Islam and Christianity. In the year 21, we tried to grasp the reality of funeral-courtesy costumes in modern and present-day times. As a result, it became clear that in Japan, the Buddhist cultural sphere, and in China and Taiwan, the Buddhism, the Confucianism and the Taoism intermingled cultura...

  5. Fixed-point Characterization of Compositionality Properties of Probabilistic Processes Combinators

    Directory of Open Access Journals (Sweden)

    Daniel Gebler

    2014-08-01

    Full Text Available Bisimulation metric is a robust behavioural semantics for probabilistic processes. Given any SOS specification of probabilistic processes, we provide a method to compute for each operator of the language its respective metric compositionality property. The compositionality property of an operator is defined as its modulus of continuity, which gives the relative increase of the distance between processes when they are combined by that operator. The compositionality property of an operator is computed by recursively counting how many times the combined processes are copied along their evolution. The compositionality properties allow one to derive an upper bound on the distance between processes by purely inspecting the operators used to specify those processes.

  6. BUSINESS PROCESS MANAGEMENT SYSTEMS TECHNOLOGY COMPONENTS ANALYSIS

    Directory of Open Access Journals (Sweden)

    Andrea Giovanni Spelta

    2007-05-01

    Full Text Available The information technology that supports the implementation of the business process management approach is called a Business Process Management System (BPMS). The main components of the BPMS solution framework are the process definition repository, process instances repository, transaction manager, connectors framework, process engine and middleware. In this paper we define and characterize the role and importance of the components of the BPMS framework. The research method adopted was the case study, through the analysis of the implementation of a BPMS solution in an insurance company called Chubb do Brasil. In the case study, the process "Manage Coinsured Events" is described and characterized, as well as the components of the BPMS solution adopted and implemented by Chubb do Brasil for managing this process.

  7. Nonlinear consider covariance analysis using a sigma-point filter formulation

    Science.gov (United States)

    Lisano, Michael E.

    2006-01-01

    The research reported here extends the mathematical formulation of nonlinear, sigma-point estimators to enable consider covariance analysis for dynamical systems. This paper presents a novel sigma-point consider filter algorithm, for consider-parameterized nonlinear estimation, following the unscented Kalman filter (UKF) variation on the sigma-point filter formulation, which requires no partial derivatives of dynamics models or measurement models with respect to the parameter list. It is shown that, consistent with the attributes of sigma-point estimators, a consider-parameterized sigma-point estimator can be developed entirely without requiring the derivation of any partial-derivative matrices related to the dynamical system, the measurements, or the considered parameters, which appears to be an advantage over the formulation of a linear-theory sequential consider estimator. It is also demonstrated that a consider covariance analysis performed with this 'partial-derivative-free' formulation yields equivalent results to the linear-theory consider filter, for purely linear problems.
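The derivative-free character of the sigma-point formulation comes from propagating a small deterministic set of weighted samples instead of Jacobians. A minimal sketch of unscented-transform sigma-point generation (diagonal covariance assumed to keep the matrix square root trivial; this is not the paper's consider-filter algorithm):

```python
import math

def sigma_points(mean, cov_diag, alpha=1e-1, beta=2.0, kappa=0.0):
    # Unscented-transform sigma points for a state with diagonal covariance
    # (a full covariance would use a Cholesky factor instead of the sqrt
    # of each diagonal entry). Returns the 2n+1 points plus mean and
    # covariance weights; no partial derivatives are ever formed.
    n = len(mean)
    lam = alpha ** 2 * (n + kappa) - n
    pts = [list(mean)]
    for i in range(n):
        spread = math.sqrt((n + lam) * cov_diag[i])
        plus = list(mean); plus[i] += spread
        minus = list(mean); minus[i] -= spread
        pts += [plus, minus]
    wm = [lam / (n + lam)] + [1.0 / (2 * (n + lam))] * (2 * n)
    wc = list(wm); wc[0] += 1.0 - alpha ** 2 + beta
    return pts, wm, wc

mean, var = [1.0, -2.0], [0.5, 2.0]
pts, wm, wc = sigma_points(mean, var)
# The weighted sigma points reproduce the input mean exactly:
rec = [sum(w * p[i] for w, p in zip(wm, pts)) for i in range(2)]
print([round(v, 6) for v in rec])
```

Pushing the same points through a nonlinear dynamics or measurement function and re-applying the weights is what replaces the partial-derivative matrices of a linear-theory consider filter.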

  8. Identification of 'Point A' as the prevalent source of error in cephalometric analysis of lateral radiographs.

    Science.gov (United States)

    Grogger, P; Sacher, C; Weber, S; Millesi, G; Seemann, R

    2018-04-10

    Deviations in measuring dentofacial components in a lateral X-ray represent a major hurdle in the subsequent treatment of dysgnathic patients. In a retrospective study, we investigated the most prevalent source of error in the following commonly used cephalometric measurements: the angles Sella-Nasion-Point A (SNA), Sella-Nasion-Point B (SNB) and Point A-Nasion-Point B (ANB); the Wits appraisal; the anteroposterior dysplasia indicator (APDI); and the overbite depth indicator (ODI). Preoperative lateral radiographic images of patients with dentofacial deformities were collected and the landmarks digitally traced by three independent raters. Cephalometric analysis was automatically performed based on 1116 tracings. Error analysis identified the x-coordinate of Point A as the prevalent source of error in all investigated measurements, except SNB, in which it is not incorporated. In SNB, the y-coordinate of Nasion predominated error variance. SNB showed lowest inter-rater variation. In addition, our observations confirmed previous studies showing that landmark identification variance follows characteristic error envelopes in the highest number of tracings analysed up to now. Variance orthogonal to defining planes was of relevance, while variance parallel to planes was not. Taking these findings into account, orthognathic surgeons as well as orthodontists would be able to perform cephalometry more accurately and accomplish better therapeutic results. Copyright © 2018 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
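The cephalometric angles involved are read off digitised 2D landmarks as angles at a vertex: SNA is the angle at Nasion between Sella and Point A, SNB the corresponding angle to Point B, and ANB = SNA - SNB. A sketch with made-up landmark coordinates (illustrative only, not clinical data) shows the computation, and why an error in Point A's x-coordinate feeds directly into SNA and ANB:

```python
import math

def angle_deg(p1, vertex, p2):
    # Angle (degrees) at `vertex` between the rays to p1 and p2, the way
    # SNA/SNB are read off traced landmarks on a lateral radiograph.
    ax, ay = p1[0] - vertex[0], p1[1] - vertex[1]
    bx, by = p2[0] - vertex[0], p2[1] - vertex[1]
    dot = ax * bx + ay * by
    na, nb = math.hypot(ax, ay), math.hypot(bx, by)
    return math.degrees(math.acos(dot / (na * nb)))

# Hypothetical 2D positions for Sella, Nasion, Point A and Point B
# (x increases anteriorly, y downward is negative here):
S, N, A, B = (0.0, 0.0), (80.0, 10.0), (78.0, -55.0), (74.0, -60.0)
SNA = angle_deg(S, N, A)
SNB = angle_deg(S, N, B)
ANB = SNA - SNB
print(round(SNA, 1), round(SNB, 1), round(ANB, 1))
```

Because ANB is a small difference of two large angles, a shift of a millimetre or two in Point A's x-coordinate changes ANB by a clinically relevant fraction of its typical value, which is consistent with Point A dominating the error variance reported above.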

  9. [Analysis and research on cleaning points of HVAC systems in public places].

    Science.gov (United States)

    Yang, Jiaolan; Han, Xu; Chen, Dongqing; Jin, Xin; Dai, Zizhu

    2010-03-01

    To analyze the cleaning points of HVAC systems, and to provide a scientific basis for regulating the cleaning of HVAC systems. Based on survey results on the cleaning of HVAC systems around China over the past three years, we analyze the cleaning points of HVAC systems from various aspects: the major health risk factors of HVAC systems; the strategy for formulating the cleaning of HVAC systems; cleaning methods and acceptance points for the air ducts and parts of HVAC systems; on-site protection and individual protection; waste treatment and the cleaning of removed equipment; inspection of the cleaning results; video recording; and the final acceptance of the cleaning. An analysis of the major health risk factors of HVAC systems and of the strategy for formulating their cleaning is given. Specific methods for cleaning the air ducts, machine units, air ports, coil pipes and water cooling towers of HVAC systems, the acceptance points for HVAC systems, and the requirements of the report on the final acceptance of the cleaning are proposed. By analyzing the cleaning points of HVAC systems and proposing corresponding measures, this study provides a basis for the scientific and regular conduct of HVAC system cleaning, a novel technology service, and lays a foundation for the revision of the existing cleaning regulations, which may generate technical and social benefits to some extent.

  10. Coupling aerosol-cloud-radiative processes in the WRF-Chem model: Investigating the radiative impact of elevated point sources

    Directory of Open Access Journals (Sweden)

    E. G. Chapman

    2009-02-01

    Full Text Available The local and regional influence of elevated point sources on summertime aerosol forcing and cloud-aerosol interactions in northeastern North America was investigated using the WRF-Chem community model. The direct effects of aerosols on incoming solar radiation were simulated using existing modules to relate aerosol sizes and chemical composition to aerosol optical properties. Indirect effects were simulated by adding a prognostic treatment of cloud droplet number and adding modules that activate aerosol particles to form cloud droplets, simulate aqueous-phase chemistry, and tie a two-moment treatment of cloud water (cloud water mass and cloud droplet number to precipitation and an existing radiation scheme. Fully interactive feedbacks thus were created within the modified model, with aerosols affecting cloud droplet number and cloud radiative properties, and clouds altering aerosol size and composition via aqueous processes, wet scavenging, and gas-phase-related photolytic processes. Comparisons of a baseline simulation with observations show that the model captured the general temporal cycle of aerosol optical depths (AODs and produced clouds of comparable thickness to observations at approximately the proper times and places. The model overpredicted SO2 mixing ratios and PM2.5 mass, but reproduced the range of observed SO2 to sulfate aerosol ratios, suggesting that atmospheric oxidation processes leading to aerosol sulfate formation are captured in the model. The baseline simulation was compared to a sensitivity simulation in which all emissions at model levels above the surface layer were set to zero, thus removing stack emissions. Instantaneous, site-specific differences for aerosol and cloud related properties between the two simulations could be quite large, as removing above-surface emission sources influenced when and where clouds formed within the modeling domain. When summed spatially over the finest

  11. Processability analysis of candidate waste forms

    International Nuclear Information System (INIS)

    Gould, T.H. Jr.; Dunson, J.B. Jr.; Eisenberg, A.M.; Haight, H.G. Jr.; Mello, V.E.; Schuyler, R.L. III.

    1982-01-01

A quantitative merit evaluation, or processability analysis, was performed to assess the relative difficulty of remote processing of Savannah River Plant high-level wastes for seven alternative waste form candidates. The reference borosilicate glass process was rated as the simplest, followed by FUETAP concrete, glass marbles in a lead matrix, high-silica glass, crystalline ceramics (SYNROC-D and tailored ceramics), and coated ceramic particles. Cost estimates for the borosilicate glass, high-silica glass, and ceramic waste form processing facilities are also reported.

  12. Focal Points, Endogenous Processes, and Exogenous Shocks in the Autism Epidemic

    Science.gov (United States)

    Liu, Kayuet; Bearman, Peter S.

    2015-01-01

    Autism prevalence has increased rapidly in the United States during the past two decades. We have previously shown that the diffusion of information about autism through spatially proximate social relations has contributed significantly to the epidemic. This study expands on this finding by identifying the focal points for interaction that drive…

  13. CPAS Preflight Drop Test Analysis Process

    Science.gov (United States)

    Englert, Megan E.; Bledsoe, Kristin J.; Romero, Leah M.

    2015-01-01

    Throughout the Capsule Parachute Assembly System (CPAS) drop test program, the CPAS Analysis Team has developed a simulation and analysis process to support drop test planning and execution. This process includes multiple phases focused on developing test simulations and communicating results to all groups involved in the drop test. CPAS Engineering Development Unit (EDU) series drop test planning begins with the development of a basic operational concept for each test. Trajectory simulation tools include the Flight Analysis and Simulation Tool (FAST) for single bodies, and the Automatic Dynamic Analysis of Mechanical Systems (ADAMS) simulation for the mated vehicle. Results are communicated to the team at the Test Configuration Review (TCR) and Test Readiness Review (TRR), as well as at Analysis Integrated Product Team (IPT) meetings in earlier and intermediate phases of the pre-test planning. The ability to plan and communicate efficiently with rapidly changing objectives and tight schedule constraints is a necessity for safe and successful drop tests.

  14. Dosimetric analysis at ICRU reference points in HDR-brachytherapy of cervical carcinoma.

    Science.gov (United States)

    Eich, H T; Haverkamp, U; Micke, O; Prott, F J; Müller, R P

    2000-01-01

In vivo dosimetry in the bladder and rectum, as well as determining doses at the reference points suggested in ICRU report 38, contributes to quality assurance in HDR-brachytherapy of cervical carcinoma, especially to minimize side effects. In order to gain information regarding the radiation exposure at ICRU reference points in the rectum, bladder, ureter and regional lymph nodes, these doses were calculated (by digitisation) from orthogonal radiographs of 11 applications in patients with cervical carcinoma who received primary radiotherapy. In addition, the dose at the ICRU rectum reference point was compared to the results of in vivo measurements in the rectum. The in vivo measurements were a factor of 1.5 below the doses determined for the ICRU rectum reference point (4.05 +/- 0.68 Gy versus 6.11 +/- 1.63 Gy). Reasons for this were: calibration errors, non-orthogonal radiographs, movement of applicator and probe in the time span between X-ray and application, and missing contact between the probe and the anterior rectal wall. The standard deviation of calculations at ICRU reference points was on average +/- 30%. Possible reasons for the relatively large standard deviation were difficulties in defining the points, identifying them on radiographs, and the different locations of the applicators. Although 3D CT-, US- or MR-based treatment planning using dose-volume histogram analysis is becoming more and more established, this simple procedure of marking and digitising the ICRU reference points lengthened treatment planning by only 5 to 10 minutes. The advantages of in vivo dosimetry are easy practicability and the possibility of determining rectum doses during radiation. The advantages of computer-aided planning at ICRU reference points are that calculations are available before radiation and can still be taken into account for treatment planning. Both methods should be applied in HDR-brachytherapy of cervical carcinoma.

  15. IMAGE-PLANE ANALYSIS OF n-POINT-MASS LENS CRITICAL CURVES AND CAUSTICS

    Energy Technology Data Exchange (ETDEWEB)

    Danek, Kamil; Heyrovský, David, E-mail: kamil.danek@utf.mff.cuni.cz, E-mail: heyrovsky@utf.mff.cuni.cz [Institute of Theoretical Physics, Faculty of Mathematics and Physics, Charles University in Prague (Czech Republic)

    2015-06-10

    The interpretation of gravitational microlensing events caused by planetary systems or multiple stars is based on the n-point-mass lens model. The first planets detected by microlensing were well described by the two-point-mass model of a star with one planet. By the end of 2014, four events involving three-point-mass lenses had been announced. Two of the lenses were stars with two planetary companions each; two were binary stars with a planet orbiting one component. While the two-point-mass model is well understood, the same cannot be said for lenses with three or more components. Even the range of possible critical-curve topologies and caustic geometries of the three-point-mass lens remains unknown. In this paper we provide new tools for mapping the critical-curve topology and caustic cusp number in the parameter space of n-point-mass lenses. We perform our analysis in the image plane of the lens. We show that all contours of the Jacobian are critical curves of re-scaled versions of the lens configuration. Utilizing this property further, we introduce the cusp curve to identify cusp-image positions on all contours simultaneously. In order to track cusp-number changes in caustic metamorphoses, we define the morph curve, which pinpoints the positions of metamorphosis-point images along the cusp curve. We demonstrate the usage of both curves on simple two- and three-point-mass lens examples. For the three simplest caustic metamorphoses we illustrate the local structure of the image and source planes.
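The image-plane analysis described above starts from the Jacobian of the lens mapping, whose zero contour is the critical curve. A minimal numerical sketch of that first step is given below; it is not the authors' code, and the masses and lens positions are arbitrary illustrative inputs, but the complex-notation lens equation and Jacobian determinant are standard in microlensing.

```python
import numpy as np

def jacobian_det(z, masses, positions):
    """Jacobian determinant of the n-point-mass lens mapping at image-plane
    points z (complex, in Einstein radii of the total mass).

    Lens equation (complex form): beta = z - sum_i m_i / conj(z - z_i)
    det J = 1 - |sum_i m_i / (z - z_i)^2|^2  (masses real)

    The critical curve is the zero contour of det J; evaluating this on a
    grid and contouring at 0 recovers it in the image plane.
    """
    z = np.asarray(z, dtype=complex)
    s = np.zeros_like(z)
    for m, zi in zip(masses, positions):
        s += m / (z - zi) ** 2
    return 1.0 - np.abs(s) ** 2
```

For a single unit point mass at the origin the critical curve is the Einstein ring |z| = 1, which provides a quick sanity check; for two or three masses, contouring det J over a grid traces the topologies discussed in the paper.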

  16. Meta-Analysis of Effect Sizes Reported at Multiple Time Points Using General Linear Mixed Model

    Science.gov (United States)

    Musekiwa, Alfred; Manda, Samuel O. M.; Mwambi, Henry G.; Chen, Ding-Geng

    2016-01-01

    Meta-analysis of longitudinal studies combines effect sizes measured at pre-determined time points. The most common approach involves performing separate univariate meta-analyses at individual time points. This simplistic approach ignores dependence between longitudinal effect sizes, which might result in less precise parameter estimates. In this paper, we show how to conduct a meta-analysis of longitudinal effect sizes where we contrast different covariance structures for dependence between effect sizes, both within and between studies. We propose new combinations of covariance structures for the dependence between effect size and utilize a practical example involving meta-analysis of 17 trials comparing postoperative treatments for a type of cancer, where survival is measured at 6, 12, 18 and 24 months post randomization. Although the results from this particular data set show the benefit of accounting for within-study serial correlation between effect sizes, simulations are required to confirm these results. PMID:27798661
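The multivariate alternative to separate univariate meta-analyses can be sketched as a generalized least squares pool across time points. The toy fixed-effect version below is only illustrative: it assumes known within-study covariance matrices and omits the between-study heterogeneity terms of the paper's mixed models, but it shows exactly where the dependence between longitudinal effect sizes enters.

```python
import numpy as np

def pooled_effects(effects, covariances):
    """Fixed-effect multivariate meta-analysis across T time points.

    effects:     list of length-T vectors y_i (effect sizes of study i)
    covariances: list of TxT within-study covariance matrices V_i

    Returns the GLS pooled estimate and its covariance:
        theta = (sum_i V_i^-1)^-1  sum_i V_i^-1 y_i
    Off-diagonal elements of V_i encode serial correlation between a
    study's effect sizes; setting them to zero recovers the simplistic
    separate-time-point analysis criticized in the abstract.
    """
    W = [np.linalg.inv(np.asarray(V, float)) for V in covariances]
    S = sum(W)
    Sinv = np.linalg.inv(S)
    theta = Sinv @ sum(w @ np.asarray(y, float) for w, y in zip(W, effects))
    return theta, Sinv
```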

  17. Comparative analysis among several methods used to solve the point kinetic equations

    International Nuclear Information System (INIS)

    Nunes, Anderson L.; Goncalves, Alessandro da C.; Martinez, Aquilino S.; Silva, Fernando Carvalho da

    2007-01-01

The main objective of this work is the development of a methodology for comparing several methods of solving the point kinetics equations. The evaluated methods are: the finite differences method, the stiffness confinement method, the improved stiffness confinement method and the piecewise constant approximations method. These methods were implemented and compared through a systematic analysis that consists basically of determining which method yields the smallest computational time with the highest precision. A relative performance factor, whose function is to combine both criteria, was calculated for this purpose. Through analysis of the performance factor it is possible to choose the best method for the solution of the point kinetics equations. (author)

  18. Comparative analysis among several methods used to solve the point kinetic equations

    Energy Technology Data Exchange (ETDEWEB)

    Nunes, Anderson L.; Goncalves, Alessandro da C.; Martinez, Aquilino S.; Silva, Fernando Carvalho da [Universidade Federal, Rio de Janeiro, RJ (Brazil). Coordenacao dos Programas de Pos-graduacao de Engenharia. Programa de Engenharia Nuclear; E-mails: alupo@if.ufrj.br; agoncalves@con.ufrj.br; aquilino@lmp.ufrj.br; fernando@con.ufrj.br

    2007-07-01

The main objective of this work is the development of a methodology for comparing several methods of solving the point kinetics equations. The evaluated methods are: the finite differences method, the stiffness confinement method, the improved stiffness confinement method and the piecewise constant approximations method. These methods were implemented and compared through a systematic analysis that consists basically of determining which method yields the smallest computational time with the highest precision. A relative performance factor, whose function is to combine both criteria, was calculated for this purpose. Through analysis of the performance factor it is possible to choose the best method for the solution of the point kinetics equations. (author)
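For context, the point kinetics system being solved is stiff because the neutron generation time is orders of magnitude smaller than the precursor decay times. The one-delayed-group backward-Euler sketch below illustrates the kind of system the compared methods solve; the parameter values are illustrative, and this is not one of the four methods evaluated in the paper.

```python
import numpy as np

def point_kinetics(rho, beta=0.0065, Lam=1e-4, lam=0.08,
                   n0=1.0, t_end=1.0, dt=1e-4):
    """One-delayed-group point kinetics solved with implicit (backward) Euler.

    dn/dt = (rho - beta)/Lam * n + lam * C
    dC/dt = beta/Lam * n - lam * C

    The implicit step handles the stiffness caused by the small generation
    time Lam. Returns the neutron density at t_end, starting from the
    rho = 0 equilibrium.
    """
    steps = int(round(t_end / dt))
    C0 = beta * n0 / (Lam * lam)          # precursor equilibrium for rho = 0
    A = np.array([[(rho - beta) / Lam, lam],
                  [beta / Lam, -lam]])
    M = np.linalg.inv(np.eye(2) - dt * A)  # backward-Euler update matrix
    y = np.array([n0, C0])
    for _ in range(steps):
        y = M @ y
    return y[0]
```

With zero reactivity the equilibrium is preserved exactly; a small positive reactivity below prompt critical produces the familiar prompt jump followed by slow growth on the delayed-neutron time scale.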

  19. Bayesian analysis of log Gaussian Cox processes for disease mapping

    DEFF Research Database (Denmark)

    Benes, Viktor; Bodlák, Karel; Møller, Jesper

We consider a data set of locations where people in Central Bohemia have been infected by tick-borne encephalitis, and where population census data and covariates concerning vegetation and altitude are available. The aims are to estimate the risk map of the disease and to study the dependence...... of the risk on the covariates. Instead of using the common area level approaches we consider a Bayesian analysis for a log Gaussian Cox point process with covariates. Posterior characteristics for a discretized version of the log Gaussian Cox process are computed using Markov chain Monte Carlo methods...
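The discretized log Gaussian Cox process mentioned above can be sketched by sampling a Gaussian field on a grid and drawing Poisson counts with intensity exp(field). The exponential covariance and all hyperparameter values below are illustrative assumptions, not those fitted in the Central Bohemia analysis, and the sketch simulates forward only (the paper's MCMC inverts this construction).

```python
import numpy as np

def simulate_lgcp_counts(n=20, mu=2.0, sigma2=1.0, scale=0.2, seed=0):
    """Simulate a discretized log Gaussian Cox process on an n x n unit grid.

    A Gaussian field with mean mu and covariance sigma2 * exp(-d / scale) is
    sampled at cell centres via a Cholesky factor; each cell count is then
    Poisson with mean exp(field) * cell_area.
    """
    rng = np.random.default_rng(seed)
    xs = (np.arange(n) + 0.5) / n
    gx, gy = np.meshgrid(xs, xs)
    pts = np.column_stack([gx.ravel(), gy.ravel()])
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    cov = sigma2 * np.exp(-d / scale) + 1e-8 * np.eye(n * n)  # jitter for PD
    field = mu + np.linalg.cholesky(cov) @ rng.standard_normal(n * n)
    lam = np.exp(field) / (n * n)         # expected count per grid cell
    return rng.poisson(lam).reshape(n, n)
```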

  20. Trend analysis and change point detection of annual and seasonal temperature series in Peninsular Malaysia

    Science.gov (United States)

    Suhaila, Jamaludin; Yusop, Zulkifli

    2017-06-01

Most trend analyses that have been conducted have not considered the existence of a change point in the time series. If a change point occurred, the trend analysis will not be able to detect an obvious increasing or decreasing trend over certain parts of the series. Furthermore, the lack of discussion of the possible factors that influenced either the decreasing or the increasing trend needs to be addressed in any trend analysis. Hence, this study investigates the trends and change point detection of mean, maximum and minimum temperature series, both annually and seasonally, in Peninsular Malaysia, and determines the possible factors that could contribute to the significant trends. In this study, the Pettitt and sequential Mann-Kendall (SQ-MK) tests were used to examine the occurrence of any abrupt climate changes in the independent series. The analyses of the abrupt changes in temperature series suggested that most of the change points in Peninsular Malaysia were detected during the years 1996, 1997 and 1998. These change points captured by the Pettitt and SQ-MK tests are possibly related to climatic factors, such as El Niño and La Niña events. The findings also showed that the majority of the significant change points in the series are related to significant trends at the stations. Significant increasing trends of annual and seasonal mean, maximum and minimum temperatures in Peninsular Malaysia were found, with a range of 2-5 °C/100 years during the last 32 years. It was observed that the magnitudes of the increasing trends in minimum temperatures were larger than those of the maximum temperatures for most of the studied stations, particularly the urban stations. These increases are suspected to be linked with the urban heat island effect in addition to the El Niño event.
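The Pettitt test used in the study locates a single abrupt shift from a rank-based statistic. A compact sketch of the standard formulation is given below (the SQ-MK test and the study's temperature data are not reproduced here):

```python
import numpy as np

def pettitt_test(x):
    """Pettitt change-point test for a single series.

    U_t = sum_{i <= t} sum_{j > t} sign(x_j - x_i); the change point is the
    index t maximising |U_t|, with the usual approximate p-value
    p = 2 * exp(-6 K^2 / (n^3 + n^2)), K = max |U_t|.
    """
    x = np.asarray(x, float)
    n = len(x)
    sgn = np.sign(x[None, :] - x[:, None])      # sgn[i, j] = sign(x_j - x_i)
    U = np.array([sgn[: t + 1, t + 1:].sum() for t in range(n - 1)])
    t_hat = int(np.argmax(np.abs(U)))
    K = float(np.abs(U[t_hat]))
    p = min(1.0, 2.0 * np.exp(-6.0 * K * K / (n ** 3 + n ** 2)))
    return t_hat, K, p
```

A series with a clear level shift yields the shift index as the change point with a very small approximate p-value, mirroring how the study flags 1996-1998 in the temperature records.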

  1. Phase-equilibria for design of coal-gasification processes: dew points of hot gases containing condensible tars. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Prausnitz, J.M.

    1980-05-01

    This research is concerned with the fundamental physical chemistry and thermodynamics of condensation of tars (dew points) from the vapor phase at advanced temperatures and pressures. Fundamental quantitative understanding of dew points is important for rational design of heat exchangers to recover sensible heat from hot, tar-containing gases that are produced in coal gasification. This report includes essentially six contributions toward establishing the desired understanding: (1) Characterization of Coal Tars for Dew-Point Calculations; (2) Fugacity Coefficients for Dew-Point Calculations in Coal-Gasification Process Design; (3) Vapor Pressures of High-Molecular-Weight Hydrocarbons; (4) Estimation of Vapor Pressures of High-Boiling Fractions in Liquefied Fossil Fuels Containing Heteroatoms Nitrogen or Sulfur; and (5) Vapor Pressures of Heavy Liquid Hydrocarbons by a Group-Contribution Method.
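In the simplest ideal-mixture limit, the dew-point calculations discussed above reduce to finding the temperature at which the vapour is just saturated. The Raoult's-law bisection sketch below uses placeholder Antoine coefficients purely for illustration; real coal-tar vapours require the fugacity coefficients and high-molecular-weight vapor-pressure correlations developed in the report.

```python
def dew_point(y, antoine, P, t_lo=200.0, t_hi=800.0, tol=1e-8):
    """Dew-point temperature of an ideal vapour mixture at pressure P [bar].

    y:       mole fractions of the vapour
    antoine: list of (A, B, C) with log10 Psat[bar] = A - B / (T[K] + C)
             (coefficients here are hypothetical placeholders)

    At the dew point: sum_i y_i * P / Psat_i(T) = 1 (Raoult's law).
    Solved by bisection; assumes the root lies in [t_lo, t_hi].
    """
    def psat(coeffs, T):
        A, B, C = coeffs
        return 10.0 ** (A - B / (T + C))

    def f(T):  # positive below the dew point (liquid would condense)
        return sum(yi * P / psat(c, T) for yi, c in zip(y, antoine)) - 1.0

    lo, hi = t_lo, t_hi
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if f(mid) > 0.0:   # too cold: raise the temperature bound
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)
```

For a single component the routine simply recovers the temperature where Psat equals the system pressure, i.e. the boiling point at P, which is a convenient consistency check.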

  2. Multiview 3D sensing and analysis for high quality point cloud reconstruction

    Science.gov (United States)

    Satnik, Andrej; Izquierdo, Ebroul; Orjesek, Richard

    2018-04-01

    Multiview 3D reconstruction techniques enable digital reconstruction of 3D objects from the real world by fusing different viewpoints of the same object into a single 3D representation. This process is by no means trivial and the acquisition of high quality point cloud representations of dynamic 3D objects is still an open problem. In this paper, an approach for high fidelity 3D point cloud generation using low cost 3D sensing hardware is presented. The proposed approach runs in an efficient low-cost hardware setting based on several Kinect v2 scanners connected to a single PC. It performs autocalibration and runs in real-time exploiting an efficient composition of several filtering methods including Radius Outlier Removal (ROR), Weighted Median filter (WM) and Weighted Inter-Frame Average filtering (WIFA). The performance of the proposed method has been demonstrated through efficient acquisition of dense 3D point clouds of moving objects.
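Of the filters composed in the pipeline above, Radius Outlier Removal is the simplest to state: discard points with too few neighbours within a fixed radius. The naive brute-force sketch below conveys the idea; a real-time pipeline like the one described would use a spatial index such as a k-d tree rather than the O(n^2) distance matrix, and the parameter values are illustrative.

```python
import numpy as np

def radius_outlier_removal(points, radius=0.05, min_neighbors=3):
    """Radius Outlier Removal (ROR) for a point cloud, brute force.

    Keeps points that have at least `min_neighbors` other points within
    `radius`. Isolated measurement noise (e.g. depth-sensor speckle) is
    removed while dense surface regions survive.
    """
    pts = np.asarray(points, float)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    neighbor_counts = (d <= radius).sum(axis=1) - 1  # exclude the point itself
    return pts[neighbor_counts >= min_neighbors]
```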

  3. Use of the hazard analysis and critical control points (HACCP) risk assessment on a medical device for parenteral application.

    Science.gov (United States)

    Jahnke, Michael; Kühn, Klaus-Dieter

    2003-01-01

In order to guarantee the consistently high quality of medical products for human use, it is absolutely necessary that flawless hygiene conditions are maintained by the strict observance of hygiene rules. With the growing understanding of the impact of process conditions on the quality of the resulting product, process controls (surveillance) have gained increasing importance, complementing the quality profile traditionally defined by post-process product testing. Today, process controls have become an important GMP requirement for the pharmaceutical industry. However, before quality process controls can be introduced, the manufacturing process has to be analyzed, with the focus on its critical quality-influencing steps. The HACCP (Hazard Analysis and Critical Control Points) method is well recognized as a useful tool in the pharmaceutical industry. A risk analysis following the guidelines of the HACCP method, together with the monitoring of critical steps during the manufacturing process, was applied to the manufacture of the methyl methacrylate solution used for bone cement; it led to the establishment of a preventative monitoring system and constitutes an effective concept for quality assurance of hygiene and all other parameters influencing the quality of the product.

  4. [Multiple time scales analysis of spatial differentiation characteristics of non-point source nitrogen loss within watershed].

    Science.gov (United States)

    Liu, Mei-bing; Chen, Xing-wei; Chen, Ying

    2015-07-01

Identification of the critical source areas of non-point source pollution is an important means to control non-point source pollution within a watershed. In order to further reveal the impact of multiple time scales on the spatial differentiation characteristics of non-point source nitrogen loss, a SWAT model of the Shanmei Reservoir watershed was developed. Based on the simulation of total nitrogen (TN) loss intensity of all 38 subbasins, spatial distribution characteristics of nitrogen loss and critical source areas were analyzed at three time scales: yearly average, monthly average and rainstorm flood process. Furthermore, multiple linear correlation analysis was conducted to analyze the contributions of the natural environment and anthropogenic disturbance to nitrogen loss. The results showed that there were significant spatial differences in TN loss in the Shanmei Reservoir watershed at different time scales, and the degree of spatial differentiation of nitrogen loss was in the order of monthly average > yearly average > rainstorm flood process. TN loss load mainly came from the upland Taoxi subbasin, which was identified as the critical source area. At different time scales, land use types (such as farmland and forest) were always the dominant factor affecting the spatial distribution of nitrogen loss, while the effect of precipitation and runoff on nitrogen loss was evident only in months without fertilization and in several storm flood processes on dates without fertilization. This was mainly due to the significant spatial variation of land use and fertilization, as well as the low spatial variability of precipitation and runoff.

  5. Processing of pulse oximeter data using discrete wavelet analysis.

    Science.gov (United States)

    Lee, Seungjoon; Ibey, Bennett L; Xu, Weijian; Wilson, Mark A; Ericson, M Nance; Coté, Gerard L

    2005-07-01

A wavelet-based signal processing technique was employed to improve an implantable blood perfusion monitoring system. Data were acquired from both in vitro and in vivo sources: a perfusion model and the proximal jejunum of an adult pig. Results showed that wavelet analysis could isolate perfusion signals from raw, periodic, in vitro data as effectively as fast Fourier transform (FFT) methods. However, for the quasi-periodic in vivo data segments, wavelet analysis provided more consistent results than FFT analysis for data segments of 50, 10, and 5 s in length. Wavelet analysis has thus been shown to require fewer data points for quasi-periodic data than FFT analysis, making it a good choice for an indwelling perfusion monitor where power consumption and reaction time are paramount.
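The discrete wavelet decomposition underlying this kind of analysis can be illustrated with the Haar wavelet, the simplest orthonormal case; the abstract does not state which wavelet family the authors actually used, so this is only a stand-in for the idea of separating a signal into approximation and detail bands.

```python
import numpy as np

def haar_dwt(signal, levels=3):
    """Multi-level Haar discrete wavelet transform (orthonormal).

    Each level splits the running approximation into pairwise sums
    (approximation) and differences (detail), both scaled by 1/sqrt(2).
    Returns [cA_n, cD_n, ..., cD_1]; the signal is truncated to even
    length at each level.
    """
    coeffs = []
    a = np.asarray(signal, float)
    for _ in range(levels):
        a = a[: len(a) // 2 * 2]
        approx = (a[0::2] + a[1::2]) / np.sqrt(2.0)
        detail = (a[0::2] - a[1::2]) / np.sqrt(2.0)
        coeffs.append(detail)
        a = approx
    coeffs.append(a)
    return coeffs[::-1]
```

Because the transform is orthonormal, signal energy is preserved across the coefficient bands; thresholding or selecting detail levels is then the mechanism by which a quasi-periodic perfusion component can be isolated from noise.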

  6. Assessment of hygiene standards and Hazard Analysis Critical Control Points implementation on passenger ships.

    Science.gov (United States)

    Mouchtouri, Varavara; Malissiova, Eleni; Zisis, Panagiotis; Paparizou, Evina; Hadjichristodoulou, Christos

    2013-01-01

The level of hygiene on ferries can have an impact on travellers' health. The aim of this study was to assess the hygiene standards of ferries in Greece and to investigate whether Hazard Analysis Critical Control Points (HACCP) implementation contributes to the hygiene status, and particularly food safety, aboard passenger ships. Hygiene inspections on 17 ferries in Greece were performed using a standardized inspection form with a 135-point scale. Thirty-four water and 17 food samples were collected and analysed. About 65% (11/17) of ferries were scored with >100 points. Ferries with HACCP received higher scores during inspection compared to those without HACCP. Of the 17 food samples, only one was found positive for Salmonella spp. Implementation of management systems including HACCP principles can help to raise the level of hygiene aboard passenger ships.

  7. Analysis of payload bay magnetic fields due to dc power multipoint and single point ground configurations

    Science.gov (United States)

    Lawton, R. M.

    1976-01-01

An analysis of magnetic fields in the Orbiter Payload Bay resulting from the present grounding configuration (structure return) was presented, and the amount of improvement that would result from installing wire returns for the three dc power buses was determined. Ac and dc magnetic fields at five points in a cross-section of the bay are calculated for both grounding configurations. Y and Z components of the field at each point are derived in terms of a constant coefficient and the current amplitude of each bus. The dc loads assumed are 100 Amperes for each bus. The ac noise current used is a spectrum 6 dB higher than the Orbiter equipment limit for narrowband conducted emissions. It was concluded that installing return wiring to provide a single point ground for the dc buses in the Payload Bay would reduce the ac and dc magnetic field intensity by approximately 30 dB.
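The reported reduction of roughly 30 dB can be put in linear terms with the standard field-amplitude conversion:

```python
import math

def db_reduction_factor(db):
    """Convert a field-amplitude reduction in dB to a linear ratio.

    For field quantities (magnetic flux density, current, voltage):
    dB = 20 * log10(B_before / B_after), so the ratio is 10^(dB/20).
    """
    return 10.0 ** (db / 20.0)
```

db_reduction_factor(30.0) is about 31.6, i.e. the single-point-ground wiring would reduce the field amplitude by roughly a factor of 32.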

  8. Systemic analysis of the caulking assembly process

    Directory of Open Access Journals (Sweden)

    Rodean Claudiu

    2017-01-01

Full Text Available The present paper highlights the importance of the caulking process, which is studied far less than the growth of its usage in the automotive industry would warrant. Because the caulking operation is used in domains of high importance, such as shock absorbers and brake systems, this paper sets out to detail the parameters that characterize the process, viewed as input and output data, and the requirements placed on the final product. The paper presents the measurement methods currently used for analysing the performance of the caulking assembly. These parameters lead to a performance analysis algorithm for the caulking process, which is then used in the paper for an experimental study. This study provides a basis for further research aimed at optimizing the process.

  9. Fractal Point Process and Queueing Theory and Application to Communication Networks

    National Research Council Canada - National Science Library

    Wornel, Gregory

    1999-01-01

    .... A unifying theme in the approaches to these problems has been an integration of interrelated perspectives from communication theory, information theory, signal processing theory, and control theory...

  10. Process of extracting oil from stones and sands. [heating below cracking temperature and above boiling point of oil

    Energy Technology Data Exchange (ETDEWEB)

    Bergfeld, K

    1935-03-09

    A process of extracting oil from stones or sands bearing oils is characterized by the stones and sands being heated in a suitable furnace to a temperature below that of cracking and preferably slightly higher than the boiling-point of the oils. The oily vapors are removed from the treating chamber by means of flushing gas.

  11. A three-dimensional point process model for the spatial distribution of disease occurrence in relation to an exposure source

    DEFF Research Database (Denmark)

    Grell, Kathrine; Diggle, Peter J; Frederiksen, Kirsten

    2015-01-01

    We study methods for how to include the spatial distribution of tumours when investigating the relation between brain tumours and the exposure from radio frequency electromagnetic fields caused by mobile phone use. Our suggested point process model is adapted from studies investigating spatial...... the Interphone Study, a large multinational case-control study on the association between brain tumours and mobile phone use....

  12. The solution of the neutron point kinetics equation with stochastic extension: an analysis of two moments

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Milena Wollmann da; Vilhena, Marco Tullio M.B.; Bodmann, Bardo Ernst J.; Vasques, Richard, E-mail: milena.wollmann@ufrgs.br, E-mail: vilhena@mat.ufrgs.br, E-mail: bardobodmann@ufrgs.br, E-mail: richard.vasques@fulbrightmail.org [Universidade Federal do Rio Grande do Sul (UFRGS), Porto Alegre, RS (Brazil). Programa de Pos-Graduacao em Engenharia Mecanica

    2015-07-01

The neutron point kinetics equation, which models the time-dependent behavior of nuclear reactors, is often used to understand the dynamics of nuclear reactor operations. It consists of a system of coupled differential equations that models the interaction between (i) the neutron population; and (ii) the concentration of the delayed neutron precursors, which are radioactive isotopes formed in the fission process that decay through neutron emission. These equations are deterministic in nature, and therefore can provide only average values of the modeled populations. However, the actual dynamical process is stochastic: the neutron density and the delayed neutron precursor concentrations vary randomly with time. To address this stochastic behavior, Hayes and Allen have generalized the standard deterministic point kinetics equation. They derived a system of stochastic differential equations that can accurately model the random behavior of the neutron density and the precursor concentrations in a point reactor. Due to the stiffness of these equations, this system was numerically implemented using a stochastic piecewise constant approximation method (Stochastic PCA). Here, we present a study of the influence of stochastic fluctuations on the results of the neutron point kinetics equation. We reproduce the stochastic formulation introduced by Hayes and Allen and compute Monte Carlo numerical results for examples with constant and time-dependent reactivity, comparing these results with stochastic and deterministic methods found in the literature. Moreover, we introduce a modified version of the stochastic method to obtain a non-stiff solution, analogous to a previously derived deterministic approach. (author)
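The stochastic formulation can be illustrated with a much-simplified Euler-Maruyama step. The noise term below (a sigma*sqrt(n) perturbation on the neutron density alone) is a toy assumption for illustration, not the Hayes-Allen covariance-matrix square root used in the paper, and all parameter values are illustrative.

```python
import numpy as np

def stochastic_point_kinetics(rho, sigma, t_end=0.1, dt=1e-5, seed=1,
                              beta=0.0065, Lam=1e-3, lam=0.08):
    """Euler-Maruyama sketch of one-group point kinetics with a simple
    demographic-style noise term on the neutron density.

    dn = ((rho - beta)/Lam * n + lam * C) dt + sigma * sqrt(n) dW
    dC = (beta/Lam * n - lam * C) dt

    Starts from the rho = 0 equilibrium; returns n at t_end. A single
    sample path; the paper's Monte Carlo study averages many such paths.
    """
    rng = np.random.default_rng(seed)
    n = 1.0
    C = beta / (Lam * lam)               # precursor equilibrium for n = 1
    steps = int(round(t_end / dt))
    for _ in range(steps):
        dW = rng.standard_normal() * np.sqrt(dt)
        dn = ((rho - beta) / Lam * n + lam * C) * dt \
             + sigma * np.sqrt(max(n, 0.0)) * dW
        dC = (beta / Lam * n - lam * C) * dt
        n, C = n + dn, C + dC
    return n
```

With sigma = 0 the scheme reduces to the deterministic explicit-Euler solution, which is how the deterministic and stochastic results are compared in studies of this kind.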

  13. The solution of the neutron point kinetics equation with stochastic extension: an analysis of two moments

    International Nuclear Information System (INIS)

    Silva, Milena Wollmann da; Vilhena, Marco Tullio M.B.; Bodmann, Bardo Ernst J.; Vasques, Richard

    2015-01-01

The neutron point kinetics equation, which models the time-dependent behavior of nuclear reactors, is often used to understand the dynamics of nuclear reactor operations. It consists of a system of coupled differential equations that models the interaction between (i) the neutron population; and (ii) the concentration of the delayed neutron precursors, which are radioactive isotopes formed in the fission process that decay through neutron emission. These equations are deterministic in nature, and therefore can provide only average values of the modeled populations. However, the actual dynamical process is stochastic: the neutron density and the delayed neutron precursor concentrations vary randomly with time. To address this stochastic behavior, Hayes and Allen have generalized the standard deterministic point kinetics equation. They derived a system of stochastic differential equations that can accurately model the random behavior of the neutron density and the precursor concentrations in a point reactor. Due to the stiffness of these equations, this system was numerically implemented using a stochastic piecewise constant approximation method (Stochastic PCA). Here, we present a study of the influence of stochastic fluctuations on the results of the neutron point kinetics equation. We reproduce the stochastic formulation introduced by Hayes and Allen and compute Monte Carlo numerical results for examples with constant and time-dependent reactivity, comparing these results with stochastic and deterministic methods found in the literature. Moreover, we introduce a modified version of the stochastic method to obtain a non-stiff solution, analogous to a previously derived deterministic approach. (author)

  14. Book: Marine Bioacoustic Signal Processing and Analysis

    Science.gov (United States)

    2011-09-30

physicists, and mathematicians. However, more and more biologists and psychologists are starting to use advanced signal processing techniques and... chapters than it should be, since the project must be finished by Dec. 31. I have started setting aside 2 hours of uninterrupted time per workday to work

  15. A cost analysis: processing maple syrup products

    Science.gov (United States)

    Neil K. Huyler; Lawrence D. Garrett

    1979-01-01

    A cost analysis of processing maple sap to syrup for three fuel types, oil-, wood-, and LP gas-fired evaporators, indicates that: (1) fuel, capital, and labor are the major cost components of processing sap to syrup; (2) wood-fired evaporators show a slight cost advantage over oil- and LP gas-fired evaporators; however, as the cost of wood approaches $50 per cord, wood...

  16. Parallel processing of structural integrity analysis codes

    International Nuclear Information System (INIS)

    Swami Prasad, P.; Dutta, B.K.; Kushwaha, H.S.

    1996-01-01

    Structural integrity analysis forms an important role in assessing and demonstrating the safety of nuclear reactor components. This analysis is performed using analytical tools such as Finite Element Method (FEM) with the help of digital computers. The complexity of the problems involved in nuclear engineering demands high speed computation facilities to obtain solutions in reasonable amount of time. Parallel processing systems such as ANUPAM provide an efficient platform for realising the high speed computation. The development and implementation of software on parallel processing systems is an interesting and challenging task. The data and algorithm structure of the codes plays an important role in exploiting the parallel processing system capabilities. Structural analysis codes based on FEM can be divided into two categories with respect to their implementation on parallel processing systems. The first category codes such as those used for harmonic analysis, mechanistic fuel performance codes need not require the parallelisation of individual modules of the codes. The second category of codes such as conventional FEM codes require parallelisation of individual modules. In this category, parallelisation of equation solution module poses major difficulties. Different solution schemes such as domain decomposition method (DDM), parallel active column solver and substructuring method are currently used on parallel processing systems. Two codes, FAIR and TABS belonging to each of these categories have been implemented on ANUPAM. The implementation details of these codes and the performance of different equation solvers are highlighted. (author). 5 refs., 12 figs., 1 tab

  17. Estimating functions for inhomogeneous spatial point processes with incomplete covariate data

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus

    and this leads to parameter estimation error which is difficult to quantify. In this paper we introduce a Monte Carlo version of the estimating function used in "spatstat" for fitting inhomogeneous Poisson processes and certain inhomogeneous cluster processes. For this modified estimating function it is feasible...

  18. Estimating functions for inhomogeneous spatial point processes with incomplete covariate data

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus

    2008-01-01

    and this leads to parameter estimation error which is difficult to quantify. In this paper, we introduce a Monte Carlo version of the estimating function used in spatstat for fitting inhomogeneous Poisson processes and certain inhomogeneous cluster processes. For this modified estimating function, it is feasible...

  19. Congruence from the operator's point of view: compositionality requirements on process semantics

    NARCIS (Netherlands)

    Gazda, M.; Fokkink, W.J.

    2010-01-01

    One of the basic sanity properties of a behavioural semantics is that it constitutes a congruence with respect to standard process operators. This issue has been traditionally addressed by the development of rule formats for transition system specifications that define process algebras. In this

  20. Congruence from the operator's point of view : compositionality requirements on process semantics

    NARCIS (Netherlands)

    Gazda, M.W.; Fokkink, W.J.; Aceto, L.; Sobocinski, P.

    2010-01-01

    One of the basic sanity properties of a behavioural semantics is that it constitutes a congruence with respect to standard process operators. This issue has been traditionally addressed by the development of rule formats for transition system specifications that define process algebras. In this

  1. Identification of the Key Fields and Their Key Technical Points of Oncology by Patent Analysis.

    Science.gov (United States)

    Zhang, Ting; Chen, Juan; Jia, Xiaofeng

    2015-01-01

    This paper aims to identify the key fields and their key technical points of oncology by patent analysis. Patents of oncology applied from 2006 to 2012 were searched in the Thomson Innovation database. The key fields and their key technical points were determined by analyzing the Derwent Classification (DC) and the International Patent Classification (IPC), respectively. Patent applications in the top ten DC occupied 80% of all the patent applications of oncology, which were the ten fields of oncology to be analyzed. The number of patent applications in these ten fields of oncology was standardized based on patent applications of oncology from 2006 to 2012. For each field, standardization was conducted separately for each of the seven years (2006-2012) and the mean of the seven standardized values was calculated to reflect the relative amount of patent applications in that field; meanwhile, regression analysis using time (year) and the standardized values of patent applications in seven years (2006-2012) was conducted so as to evaluate the trend of patent applications in each field. Two-dimensional quadrant analysis, together with the professional knowledge of oncology, was taken into consideration in determining the key fields of oncology. The fields located in the quadrant with high relative amount or increasing trend of patent applications are identified as key ones. By using the same method, the key technical points in each key field were identified. Altogether 116,820 patents of oncology applied from 2006 to 2012 were retrieved, and four key fields with twenty-nine key technical points were identified, including "natural products and polymers" with nine key technical points, "fermentation industry" with twelve ones, "electrical medical equipment" with four ones, and "diagnosis, surgery" with four ones. The results of this study could provide guidance on the development direction of oncology, and also help researchers broaden innovative ideas and discover new
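
The screening logic described above (standardize the yearly counts, then combine relative volume with a regression trend) can be sketched as follows. This is a toy illustration: the z-score standardization, the slope threshold, and the `key_fields` helper are assumptions standing in for the paper's two-dimensional quadrant analysis.

```python
import numpy as np

def key_fields(counts, fields):
    """Flag fields with a high relative volume of patent applications
    or a rising trend (toy stand-in for the quadrant analysis)."""
    years = np.arange(counts.shape[1])
    z = (counts - counts.mean(axis=0)) / counts.std(axis=0)  # standardize per year
    amount = z.mean(axis=1)                                  # mean standardized volume
    trend = np.array([np.polyfit(years, row, 1)[0] for row in z])  # regression slope
    # small positive slope threshold to ignore numerical noise
    return [f for f, a, t in zip(fields, amount, trend) if a > 0 or t > 0.01]

counts = np.array([[1, 2, 3, 4, 5, 6, 7],    # steadily rising field
                   [7, 6, 5, 4, 3, 2, 1],    # declining but historically large field
                   [2, 2, 2, 2, 2, 2, 2]])   # small, flat field
print(key_fields(counts, ["A", "B", "C"]))
```

With this toy data the rising field A is flagged as key, while the small flat field C is not.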

  2. [Power, interdependence and complementarity in hospital work: an analysis from the nursing point of view].

    Science.gov (United States)

    Lopes, M J

    1997-01-01

    This essay discusses recent transformations in hospital work and, more specifically, in nursing work. The analysis focuses on the inter- and intra-relations of the multidisciplinary team, whose practices constitute the therapeutic process in the hospital's space-time.

  3. Flux balance analysis of ammonia assimilation network in E. coli predicts preferred regulation point.

    Science.gov (United States)

    Wang, Lu; Lai, Luhua; Ouyang, Qi; Tang, Chao

    2011-01-25

    Nitrogen assimilation is a critical biological process for the synthesis of biomolecules in Escherichia coli. The central ammonium assimilation network in E. coli converts the carbon skeleton α-ketoglutarate and ammonium into glutamate and glutamine, which further serve as nitrogen donors for nitrogen metabolism in the cell. This reaction network involves three enzymes: glutamate dehydrogenase (GDH), glutamine synthetase (GS) and glutamate synthase (GOGAT). In minimal media, E. coli tries to maintain an optimal growth rate by regulating the activity of the enzymes to match the availability of the external ammonia. The molecular mechanism and the strategy of the regulation in this network have been research topics for many investigators. In this paper, we develop a flux balance model for the nitrogen metabolism, taking into account the cellular composition and biosynthetic requirements for nitrogen. The model agrees well with known experimental results. Specifically, it reproduces all the (15)N isotope labeling experiments in the wild type and the two mutant (ΔGDH and ΔGOGAT) strains of E. coli. Furthermore, the predicted catalytic activities of GDH, GS and GOGAT at different ammonium concentrations and growth rates for the wild type, ΔGDH and ΔGOGAT strains agree well with the enzyme concentrations obtained from western blots. Based on this flux balance model, we show that GS is the preferred regulation point among the three enzymes in the nitrogen assimilation network. Our analysis reveals the pattern of regulation in this central and highly regulated network, thus providing insights into the regulation strategy adopted by the bacteria. Our model and methods may also be useful in future investigations of this and other networks.
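
A flux balance model of this kind reduces to a linear program over steady-state reaction fluxes. The sketch below, assuming SciPy is available, solves a toy version of the GDH/GS/GOGAT network; the stoichiometric matrix, the 0.8/0.2 biomass composition and the uptake bounds are illustrative assumptions, not the authors' model.

```python
import numpy as np
from scipy.optimize import linprog

# Flux variables: [GDH, GS, GOGAT, biomass, AKG uptake, NH4 uptake]
# Steady-state balances S @ v = 0 for AKG, NH4, GLU, GLN (rows)
S = np.array([
    [-1,  0, -1,  0.0, 1, 0],  # AKG: used by GDH and GOGAT, supplied by uptake
    [-1, -1,  0,  0.0, 0, 1],  # NH4: used by GDH and GS, supplied by uptake
    [ 1, -1,  2, -0.8, 0, 0],  # GLU: GDH +1, GS -1, GOGAT +2, biomass -0.8
    [ 0,  1, -1, -0.2, 0, 0],  # GLN: GS +1, GOGAT -1, biomass -0.2
])
c = [0, 0, 0, -1, 0, 0]               # maximize the biomass flux
bounds = [(0, 10)] * 5 + [(0, 5)]     # ammonium uptake limited to 5
res = linprog(c, A_eq=S, b_eq=np.zeros(4), bounds=bounds)
print(round(res.x[3], 4))             # ammonium-limited optimum, ≈ 5 / 1.2
```

Nitrogen conservation caps growth at 5/1.2 ≈ 4.1667 here, mirroring how the model ties enzyme fluxes to biosynthetic nitrogen demand.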

  4. Flux balance analysis of ammonia assimilation network in E. coli predicts preferred regulation point.

    Directory of Open Access Journals (Sweden)

    Lu Wang

    Full Text Available Nitrogen assimilation is a critical biological process for the synthesis of biomolecules in Escherichia coli. The central ammonium assimilation network in E. coli converts the carbon skeleton α-ketoglutarate and ammonium into glutamate and glutamine, which further serve as nitrogen donors for nitrogen metabolism in the cell. This reaction network involves three enzymes: glutamate dehydrogenase (GDH), glutamine synthetase (GS) and glutamate synthase (GOGAT). In minimal media, E. coli tries to maintain an optimal growth rate by regulating the activity of the enzymes to match the availability of the external ammonia. The molecular mechanism and the strategy of the regulation in this network have been research topics for many investigators. In this paper, we develop a flux balance model for the nitrogen metabolism, taking into account the cellular composition and biosynthetic requirements for nitrogen. The model agrees well with known experimental results. Specifically, it reproduces all the (15)N isotope labeling experiments in the wild type and the two mutant (ΔGDH and ΔGOGAT) strains of E. coli. Furthermore, the predicted catalytic activities of GDH, GS and GOGAT at different ammonium concentrations and growth rates for the wild type, ΔGDH and ΔGOGAT strains agree well with the enzyme concentrations obtained from western blots. Based on this flux balance model, we show that GS is the preferred regulation point among the three enzymes in the nitrogen assimilation network. Our analysis reveals the pattern of regulation in this central and highly regulated network, thus providing insights into the regulation strategy adopted by the bacteria. Our model and methods may also be useful in future investigations of this and other networks.

  5. Object-Based Point Cloud Analysis of Full-Waveform Airborne Laser Scanning Data for Urban Vegetation Classification

    Directory of Open Access Journals (Sweden)

    Norbert Pfeifer

    2008-08-01

    Full Text Available Airborne laser scanning (ALS) is a remote sensing technique well-suited for 3D vegetation mapping and structure characterization because the emitted laser pulses are able to penetrate small gaps in the vegetation canopy. The backscattered echoes from the foliage, woody vegetation, the terrain, and other objects are detected, leading to a cloud of points. Higher echo densities (> 20 echoes/m²) and additional classification variables from full-waveform (FWF) ALS data, namely echo amplitude, echo width and information on multiple echoes from one shot, offer new possibilities in classifying the ALS point cloud. Currently, FWF sensor information is hardly used for classification purposes. This contribution presents an object-based point cloud analysis (OBPA) approach, combining segmentation and classification of the 3D FWF ALS points, designed to detect tall vegetation in urban environments. The definition of tall vegetation includes trees and shrubs, but excludes grassland and herbage. In the applied procedure, FWF ALS echoes are segmented by a seeded region growing procedure. All echoes, sorted in descending order by their surface roughness, are used as seed points. Segments are grown based on echo width homogeneity. Next, segment statistics (mean, standard deviation, and coefficient of variation) are calculated by aggregating echo features such as amplitude and surface roughness. For classification, a rule base is derived automatically from a training area using a statistical classification tree. To demonstrate our method we present data of three sites with around 500,000 echoes each. The accuracy of the classified vegetation segments is evaluated for two independent validation sites. In a point-wise error assessment, where the classification is compared with manually classified 3D points, completeness and correctness better than 90% are reached for the validation sites. In comparison to many other algorithms the proposed 3D point classification works on the original

  6. Process based analysis of manually controlled drilling processes for bone

    Science.gov (United States)

    Teicher, Uwe; Achour, Anas Ben; Nestler, Andreas; Brosius, Alexander; Lauer, Günter

    2018-05-01

    The machining operation of drilling is part of the standard repertoire for medical applications. This machining cycle, usually a multi-stage process, generates the geometric element for the subsequent integration of implants, which are screwed into the bone in later process steps. In addition to the form, shape and position of the generated drill hole, it is also necessary to use a technology that ensures an operation with minimal damage. A surface damaged by excessive mechanical and thermal energy input shows a deterioration in the healing capacity of implants and represents a structure prone to inflammatory reactions. The resulting loads are influenced by the material properties of the bone, the technology used and the tool properties. An important aspect of the process analysis is the fact that machining of bone is in most cases a manual process that depends mainly on the skills of the operator. This includes, among other things, the machining time for the production of a drill hole, since manual drilling is a force-controlled process. Experimental work was carried out on the bone of a porcine mandible in order to investigate the load applied during drilling. It can be shown that the load application can be subdivided according to the working feed direction. The entire drilling process thus consists of several time domains, which can be divided into the geometry-generating feed motion and a retraction movement of the tool. It has been shown that the removal of the tool from the drill hole has a significant influence on the mechanical load input. This fact is demonstrated in detail by a new evaluation methodology. The causes of this characteristic can also be identified, as well as possible ways of reducing the load input.

  7. Four points function fitted and first derivative procedure for determining the end points in potentiometric titration curves: statistical analysis and method comparison.

    Science.gov (United States)

    Kholeif, S A

    2001-06-01

    A new method belonging to the differential category for determining end points from potentiometric titration curves is presented. It uses a preprocessing step that finds first-derivative values by fitting four data points in and around the region of inflection to a non-linear function, and then locates the end point, usually as a maximum or minimum, using an inverse parabolic interpolation procedure that has an analytical solution. The behavior and accuracy of the sigmoid and cumulative non-linear functions used are investigated against three factors. A statistical evaluation of the new method using linear least-squares method validation and multifactor data analysis is covered. The new method applies generally to symmetrical and unsymmetrical potentiometric titration curves, and the end point is calculated using numerical procedures only. It outperforms the "parent" regular differential method at almost all factor levels and gives accurate results comparable to the true or estimated true end points. End points calculated from selected experimental titration curves by methods of the equivalence point category, such as Gran or Fortuin, are also compared with those of the new method.
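
The inverse parabolic interpolation step has a closed-form solution for the vertex of a parabola through three points. The sketch below is a simplified variant of the procedure: it replaces the paper's four-point non-linear fit with a plain finite-difference derivative and then refines the extremum analytically.

```python
import numpy as np

def end_point(volume, ph):
    """Locate the end point as the maximum of dpH/dV, refined by
    inverse parabolic interpolation around the discrete peak."""
    dp = np.gradient(ph, volume)              # numerical first derivative
    i = int(np.argmax(dp))
    x1, x2, x3 = volume[i - 1:i + 2]
    y1, y2, y3 = dp[i - 1:i + 2]
    num = (x2 - x1) ** 2 * (y2 - y3) - (x2 - x3) ** 2 * (y2 - y1)
    den = (x2 - x1) * (y2 - y3) - (x2 - x3) * (y2 - y1)
    return x2 - 0.5 * num / den               # vertex of the fitted parabola

# synthetic sigmoid titration curve with its equivalence point at 10.0 mL
v = np.linspace(5, 15, 201)
print(round(end_point(v, 7 + 4 * np.tanh(3 * (v - 10.0))), 2))  # → 10.0
```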

  8. Visualizing nD Point Clouds as Topological Landscape Profiles to Guide Local Data Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Oesterling, Patrick [Univ. of Leipzig (Germany). Computer Science Dept.; Heine, Christian [Univ. of Leipzig (Germany). Computer Science Dept.; Federal Inst. of Technology (ETH), Zurich (Switzerland). Dept. of Computer Science; Weber, Gunther H. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Computational Research Division; Scheuermann, Gerik [Univ. of Leipzig (Germany). Computer Science Dept.

    2012-05-04

    Analyzing high-dimensional point clouds is a classical challenge in visual analytics. Traditional techniques, such as projections or axis-based techniques, suffer from projection artifacts, occlusion, and visual complexity. We propose to split data analysis into two parts to address these shortcomings. First, a structural overview phase abstracts data by its density distribution. This phase performs topological analysis to support accurate and non-overlapping presentation of the high-dimensional cluster structure as a topological landscape profile. Utilizing a landscape metaphor, it presents clusters and their nesting as hills whose height, width, and shape reflect cluster coherence, size, and stability, respectively. A second local analysis phase utilizes this global structural knowledge to select individual clusters or point sets for further, localized data analysis. Focusing on structural entities significantly reduces visual clutter in established geometric visualizations and permits a clearer, more thorough data analysis. In conclusion, this analysis complements the global topological perspective and enables the user to study subspaces or geometric properties, such as shape.

  9. Fuzzy Risk Analysis for a Production System Based on the Nagel Point of a Triangle

    Directory of Open Access Journals (Sweden)

    Handan Akyar

    2016-01-01

    Full Text Available Ordering and ranking fuzzy numbers and their comparisons play a significant role in decision-making problems such as social and economic systems, forecasting, optimization, and risk analysis problems. In this paper, a new method for ordering triangular fuzzy numbers using the Nagel point of a triangle is presented. With the aid of the proposed method, reasonable properties of ordering fuzzy numbers are verified. Certain comparative examples are given to illustrate the advantages of the new method. Many papers have been devoted to studies on fuzzy ranking methods, but some of these studies have certain shortcomings. The proposed method overcomes the drawbacks of the existing methods in the literature. The suggested method can order triangular fuzzy numbers as well as crisp numbers and fuzzy numbers with the same centroid point. An application to the fuzzy risk analysis problem is given, based on the suggested ordering approach.
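
For reference, the Nagel point itself is elementary to compute: it has barycentric coordinates (s − a : s − b : s − c) with respect to the vertices, where a, b, c are the side lengths and s the semiperimeter. A minimal sketch (the fuzzy-number ranking built on top of it is not reproduced here):

```python
from math import dist

def nagel_point(A, B, C):
    """Nagel point of triangle ABC from barycentric weights (s-a, s-b, s-c)."""
    a, b, c = dist(B, C), dist(A, C), dist(A, B)   # side lengths opposite A, B, C
    s = (a + b + c) / 2                            # semiperimeter
    w = (s - a, s - b, s - c)
    x = sum(wi * P[0] for wi, P in zip(w, (A, B, C))) / s
    y = sum(wi * P[1] for wi, P in zip(w, (A, B, C))) / s
    return x, y

print(nagel_point((0, 0), (4, 0), (0, 3)))  # 3-4-5 right triangle → (2.0, 1.0)
```

For a triangular fuzzy number, the triangle formed by its support and peak would be fed to this function, so numbers with the same centroid can still receive distinct Nagel points.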

  10. A Deep Learning Prediction Model Based on Extreme-Point Symmetric Mode Decomposition and Cluster Analysis

    OpenAIRE

    Li, Guohui; Zhang, Songling; Yang, Hong

    2017-01-01

    To address the irregularity of nonlinear signals and the difficulty of predicting them, a deep learning prediction model based on extreme-point symmetric mode decomposition (ESMD) and cluster analysis is proposed. Firstly, the original data is decomposed by ESMD to obtain a finite number of intrinsic mode functions (IMFs) and a residual. Secondly, fuzzy c-means is used to cluster the decomposed components, and then a deep belief network (DBN) is used for prediction. Finally, the reconstructed ...

  11. PREMOR: a point reactor exposure model computer code for survey analysis of power plant performance

    International Nuclear Information System (INIS)

    Vondy, D.R.

    1979-10-01

    The PREMOR computer code was written to exploit a simple, two-group point nuclear reactor power plant model for survey analysis. Up to thirteen actinides, fourteen fission products, and one lumped absorber nuclide density are followed over a reactor history. Successive feed batches are accounted for with provision for from one to twenty batches resident. The effect of exposure of each of the batches to the same neutron flux is determined

  12. PREMOR: a point reactor exposure model computer code for survey analysis of power plant performance

    Energy Technology Data Exchange (ETDEWEB)

    Vondy, D.R.

    1979-10-01

    The PREMOR computer code was written to exploit a simple, two-group point nuclear reactor power plant model for survey analysis. Up to thirteen actinides, fourteen fission products, and one lumped absorber nuclide density are followed over a reactor history. Successive feed batches are accounted for with provision for from one to twenty batches resident. The effect of exposure of each of the batches to the same neutron flux is determined.

  13. Scientific evidence is just the starting point: A generalizable process for developing sports injury prevention interventions

    Directory of Open Access Journals (Sweden)

    Alex Donaldson

    2016-09-01

    Conclusion: This systematic yet pragmatic and iterative intervention development process is potentially applicable to any injury prevention topic across all sports settings and levels. It will guide researchers wishing to undertake intervention development.

  14. Software Process Improvement Using Force Field Analysis ...

    African Journals Online (AJOL)

    An improvement plan is then drawn and implemented. This paper studied the state of Nigerian software development organizations based on selected attributes. Force field analysis is used to partition the factors obtained into driving and restraining forces. An attempt was made to improve the software development process ...

  15. Applied Behavior Analysis and Statistical Process Control?

    Science.gov (United States)

    Hopkins, B. L.

    1995-01-01

    Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.…

  16. Main points of research in crude oil processing and petrochemistry. [German Democratic Republic

    Energy Technology Data Exchange (ETDEWEB)

    Keil, G.; Nowak, S.; Fiedrich, G.; Klare, H.; Apelt, E.

    1982-04-01

    This article analyzes general aspects in the development of petrochemistry and carbochemistry on a global scale and for industry in the German Democratic Republic. Diagrams are given for liquid and solid carbon resources and their natural hydrogen content showing the increasing hydrogen demand for chemical fuel conversion processes. The petrochemical and carbochemical industry must take a growing level of hydrogen demand into account, which is at present 25 Mt/a on a global scale and which increases by 7% annually. Various methods for chemical processing of crude oil and crude oil residues are outlined. Advanced coal conversion processes with prospects for future application in the GDR are also explained, including the methanol carbonylation process, which achieves 90% selectivity and which is based on carbon monoxide hydrogenation, further the Transcat process, using ethane for vinyl chloride production. Acetylene and carbide carbochemistry in the GDR is a further major line in research and development. Technological processes for the pyrolysis of vacuum gas oil are also evaluated. (27 refs.)

  17. The Hazard Analysis and Critical Control Points (HACCP) generic model for the production of Thai fermented pork sausage (Nham).

    Science.gov (United States)

    Paukatong, K V; Kunawasen, S

    2001-01-01

    Nham is a traditional Thai fermented pork sausage. The major ingredients of Nham are ground pork meat and shredded pork rind. Nham has been reported to be contaminated with Salmonella spp., Staphylococcus aureus, and Listeria monocytogenes. Therefore, it is a potential cause of foodborne diseases for consumers. A Hazard Analysis and Critical Control Points (HACCP) generic model has been developed for the Nham process. Nham processing plants were observed and a generic flow diagram of Nham processes was constructed. Hazard analysis was then conducted. In addition to the microbial hazards from the pathogens previously found in Nham, sodium nitrite and metal were identified as the chemical and physical hazards in this product, respectively. Four steps in the Nham process have been identified as critical control points: the weighing of the nitrite compound, stuffing, fermentation, and labeling. The chemical hazard of nitrite must be controlled during the weighing step. The critical limit of nitrite levels in the Nham mixture has been set at 100-200 ppm. This level is high enough to control Clostridium botulinum but does not cause chemical hazards to the consumer. The physical hazard from metal clips could be prevented by visual inspection of every Nham product during stuffing. The microbiological hazard in Nham could be reduced in the fermentation process. The critical limit of the pH of Nham was set at lower than 4.6. Finally, since this product is not cooked during processing, educating the consumer by providing information on the label, such as "safe if cooked before consumption", could be an alternative way to prevent the microbiological hazards of this product.
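
The stated critical limits translate directly into simple monitoring checks. A minimal sketch (the function name and return format are illustrative; only the limits themselves, nitrite 100-200 ppm, pH below 4.6, and no metal at stuffing, come from the model above):

```python
def check_ccp(nitrite_ppm, ph, metal_detected):
    """Return the list of critical-limit violations for a Nham batch."""
    issues = []
    if not 100 <= nitrite_ppm <= 200:   # weighing step: nitrite 100-200 ppm
        issues.append("nitrite outside 100-200 ppm")
    if ph >= 4.6:                       # fermentation step: pH below 4.6
        issues.append("pH not below 4.6")
    if metal_detected:                  # stuffing step: visual metal inspection
        issues.append("metal found during stuffing")
    return issues

print(check_ccp(nitrite_ppm=150, ph=4.4, metal_detected=False))  # → []
```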

  18. Steam generators secondary side chemical cleaning at Point Lepreau using the Siemens high temperature process

    International Nuclear Information System (INIS)

    Verma, K.; MacNeil, C.; Odar, S.; Kuhnke, K.

    1997-01-01

    This paper describes the chemical cleaning of the four steam generators at the Point Lepreau facility, which was accomplished as part of a normal service outage. The steam generators had been in service for twelve years. Sludge samples showed the main elements to be Fe, P and Na, with minor amounts of Ca, Mg, Mn, Cr, Zn, Cl, Cu, Ni, Ti, Si, and Pb; 90% was in the form of magnetite, with substantial phosphate and trace amounts of silicates. The steam generators were experiencing partial blockage of broached holes in the tube support plates (TSPs), and corrosion on tube ODs in the form of pitting and wastage. In addition, heat transfer was clearly deteriorating. More than 1000 kg of magnetite and 124 kg of salts were removed from the four steam generators.

  19. The role of point defects and defect complexes in silicon device processing. Summary report and papers

    Energy Technology Data Exchange (ETDEWEB)

    Sopori, B.; Tan, T.Y.

    1994-08-01

    This report is a summary of a workshop held on August 24-26, 1992. Session 1 of the conference discussed the characteristics of various commercial photovoltaic silicon substrates, the nature of impurities and defects in them, and how they are related to the material growth. Session 2, on point defects, reviewed the capabilities of theoretical approaches to determine the equilibrium structure of defects in the silicon lattice arising from transition metal impurities and hydrogen. Session 3 was devoted to a discussion of the surface photovoltage method for characterizing bulk wafer lifetimes, and to detailed studies on the effectiveness of various gettering operations in reducing the deleterious effects of transition metals. Papers presented at the conference are also included in this summary report.

  20. Congruence from the Operator's Point of View: Compositionality Requirements on Process Semantics

    Directory of Open Access Journals (Sweden)

    Maciej Gazda

    2010-08-01

    Full Text Available One of the basic sanity properties of a behavioural semantics is that it constitutes a congruence with respect to standard process operators. This issue has been traditionally addressed by the development of rule formats for transition system specifications that define process algebras. In this paper we suggest a novel, orthogonal approach. Namely, we focus on a number of process operators, and for each of them attempt to find the widest possible class of congruences. To this end, we impose restrictions on sublanguages of Hennessy-Milner logic, so that a semantics whose modal characterization satisfies a given criterion is guaranteed to be a congruence with respect to the operator in question. We investigate action prefix, alternative composition, two restriction operators, and parallel composition.

  1. Accuracy of heart rate variability estimation by photoplethysmography using a smartphone: Processing optimization and fiducial point selection.

    Science.gov (United States)

    Ferrer-Mileo, V; Guede-Fernandez, F; Fernandez-Chimeno, M; Ramos-Castro, J; Garcia-Gonzalez, M A

    2015-08-01

    This work compares several fiducial points for detecting the arrival of a new pulse in a photoplethysmographic signal acquired using the built-in camera of smartphones or a photoplethysmograph. An optimization of the signal preprocessing stage has also been performed. Finally, we characterize the error produced when the best cutoff frequencies and fiducial point are used for smartphones and the photoplethysmograph, and examine whether the error of smartphones can reasonably be explained by variations in pulse transit time. The results reveal that the peak of the first derivative and the minimum of the second derivative of the pulse wave have the lowest error. Moreover, for these points, high-pass filtering the signal between 0.1 and 0.8 Hz and low-pass filtering around 2.7 Hz or 3.5 Hz are the best cutoff frequencies. Finally, the error in smartphones is slightly higher than in a photoplethysmograph.
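
The two best-performing fiducial points can be located with plain numerical derivatives. A sketch on a synthetic single pulse (the Gaussian pulse shape and 100 Hz sampling rate are assumptions; the filtering stage discussed above is omitted):

```python
import numpy as np

fs = 100.0                                # sampling rate in Hz (assumed)
t = np.arange(0, 1, 1 / fs)
pulse = np.exp(-((t - 0.3) / 0.08) ** 2)  # synthetic PPG-like pulse

d1 = np.gradient(pulse, t)                # first derivative
d2 = np.gradient(d1, t)                   # second derivative
t_d1max = t[np.argmax(d1)]                # fiducial: peak of first derivative
t_d2min = t[np.argmin(d2)]                # fiducial: minimum of second derivative
print(round(t_d1max, 2), round(t_d2min, 2))
```

For this pulse the first-derivative peak falls on the rising edge (near t = 0.24 s) and the second-derivative minimum at the summit (t = 0.30 s).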

  2. Feedwater line break accident analysis for SMART in the view point of minimum departure from nucleate boiling ratio

    International Nuclear Information System (INIS)

    Kim Soo Hyoung; Bae, Kyoo Hwan; Chung, Young Jong; Kim, Keung Koo

    2012-01-01

    The KAERI and KEPCO consortium performed the standard design of SMART (System-integrated Modular Advanced ReacTor) from 2009 to 2011 and obtained standard design approval in July 2012. To confirm the safety of the SMART design, all of the safety-related design basis events were analyzed. A feedwater line break (FLB) is a postulated accident and is a limiting accident for a decrease in heat removal by the secondary system from the viewpoint of the peak RCS pressure. It is well known that the departure from nucleate boiling ratio (DNBR) increases with the system pressure for conventional nuclear power plants. However, SMART has a comparatively low RCS flow rate, and it may therefore show a different DNBR behavior depending on the system pressure. To confirm that SMART is safe in case of an FLB accident, the Korean nuclear regulatory body required that the safety analysis be performed from the viewpoint of the minimum DNBR (MDNBR) during the licensing review process for standard design approval (SDA) of the SMART design. In this paper, the safety analysis results of the FLB accident for SMART from the viewpoint of MDNBR are described.

  3. Feedwater line break accident analysis for SMART in the view point of minimum departure from nucleate boiling ratio

    Energy Technology Data Exchange (ETDEWEB)

    Kim Soo Hyoung; Bae, Kyoo Hwan; Chung, Young Jong; Kim, Keung Koo [KAERI, Daejeon (Korea, Republic of)

    2012-10-15

    The KAERI and KEPCO consortium performed the standard design of SMART (System-integrated Modular Advanced ReacTor) from 2009 to 2011 and obtained standard design approval in July 2012. To confirm the safety of the SMART design, all of the safety-related design basis events were analyzed. A feedwater line break (FLB) is a postulated accident and is a limiting accident for a decrease in heat removal by the secondary system from the viewpoint of the peak RCS pressure. It is well known that the departure from nucleate boiling ratio (DNBR) increases with the system pressure for conventional nuclear power plants. However, SMART has a comparatively low RCS flow rate, and it may therefore show a different DNBR behavior depending on the system pressure. To confirm that SMART is safe in case of an FLB accident, the Korean nuclear regulatory body required that the safety analysis be performed from the viewpoint of the minimum DNBR (MDNBR) during the licensing review process for standard design approval (SDA) of the SMART design. In this paper, the safety analysis results of the FLB accident for SMART from the viewpoint of MDNBR are described.

  4. Portable Dew Point Mass Spectrometry System for Real-Time Gas and Moisture Analysis

    Science.gov (United States)

    Arkin, C.; Gillespie, Stacey; Ratzel, Christopher

    2010-01-01

    A portable instrument incorporates both mass spectrometry and dew point measurement to provide real-time, quantitative gas measurements of helium, nitrogen, oxygen, argon, and carbon dioxide, along with real-time, quantitative moisture analysis. The Portable Dew Point Mass Spectrometry (PDP-MS) system comprises a single quadrupole mass spectrometer and a high vacuum system consisting of a turbopump and a diaphragm-backing pump. A capacitive membrane dew point sensor was placed upstream of the MS, but still within the pressure-flow control pneumatic region. Pressure-flow control was achieved with an upstream precision metering valve, a capacitance diaphragm gauge, and a downstream mass flow controller. User configurable LabVIEW software was developed to provide real-time concentration data for the MS, dew point monitor, and sample delivery system pressure control, pressure and flow monitoring, and recording. The system has been designed to include in situ, NIST-traceable calibration. Certain sample tubing retains sufficient water that even if the sample is dry, the sample tube will desorb water to an amount resulting in moisture concentration errors up to 500 ppm for as long as 10 minutes. It was determined that Bev-A-Line IV was the best sample line to use. As a result of this issue, it is prudent to add a high-level humidity sensor to PDP-MS so such events can be prevented in the future.

  5. [Evaluation of a new blood gas analysis system: RapidPoint 500(®)].

    Science.gov (United States)

    Nicolas, Thierry; Cabrolier, Nadège; Bardonnet, Karine; Davani, Siamak

    2013-01-01

    We present here an evaluation of a new blood gas analysis system, the RapidPoint 500(®) (Siemens Healthcare Diagnostics). The aim of this study was to compare the ergonomics and analytical performance of this analyser with those of the RapidLab 1265 for the following parameters: pH, partial oxygen pressure, partial carbon dioxide pressure, sodium, potassium, ionized calcium, lactate, and the CO-oximetry parameters hemoglobin, oxyhemoglobin, carboxyhemoglobin, methemoglobin, reduced hemoglobin and neonatal bilirubin; as well as with the Dimension Vista 500 results for chloride and glucose. The Valtec protocol, recommended by the French Society of Clinical Biology (SFBC), was used to analyze the study results. The experiment was carried out over a period of one month in the department of medical biochemistry. One hundred and sixty-five samples from adult patients admitted to the ER or hospitalized in intensive care were tested. The RapidPoint 500(®) was highly satisfactory from an ergonomic point of view. Intra- and inter-assay coefficients of variation (CV) at the three control levels were below those recommended by the SFBC for all parameters, and the comparative study gave coefficients of determination higher than 0.91. Taken together, the RapidPoint 500(®) appears fully satisfactory in terms of ergonomics and analytical performance.

  6. A Traffic Model for Machine-Type Communications Using Spatial Point Processes

    DEFF Research Database (Denmark)

    Thomsen, Henning; Manchón, Carles Navarro; Fleury, Bernard Henri

    2018-01-01

    , where the generated traffic by a given device depends on its position and event positions. We first consider the case where devices and events are static and devices generate traffic according to a Bernoulli process, where we derive the total rate from the devices at the base station. We then extend...
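The setup sketched in the abstract (devices scattered at random, each transmitting in a slot with some Bernoulli probability) can be illustrated with a small Monte Carlo sketch. The intensity, transmission probability and slot count below are made-up illustration values, not parameters from the paper:

```python
import math
import random

def total_rate_per_slot(lam_dev=50.0, p_tx=0.1, n_slots=1000, seed=1):
    """Monte Carlo sketch: devices form a homogeneous Poisson point process
    (only the device *count* matters for the total rate when transmissions
    are i.i.d. Bernoulli per slot). Returns (number of devices, mean number
    of transmissions per slot seen at the base station)."""
    rng = random.Random(seed)
    # Sample n_dev ~ Poisson(lam_dev) via Knuth's product method
    limit = math.exp(-lam_dev)
    k, p = 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    n_dev = k - 1
    # Each device transmits independently with probability p_tx per slot
    total = sum(sum(1 for _ in range(n_dev) if rng.random() < p_tx)
                for _ in range(n_slots))
    return n_dev, total / n_slots

n_dev, mean_rate = total_rate_per_slot()
```

As expected, the empirical mean rate is close to `n_dev * p_tx`; position-dependent transmission probabilities, as in the paper, would replace the constant `p_tx` with a function of the device-event geometry.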

  7. Optimal estimation of the intensity function of a spatial point process

    DEFF Research Database (Denmark)

    Guan, Yongtao; Jalilian, Abdollah; Waagepetersen, Rasmus

    easily computable estimating functions. We derive the optimal estimating function in a class of first-order estimating functions. The optimal estimating function depends on the solution of a certain Fredholm integral equation and reduces to the likelihood score in case of a Poisson process. We discuss...
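In the Poisson special case mentioned above, the first-order estimating function reduces to the likelihood score, whose root has a closed form. A minimal sketch, assuming a homogeneous intensity on a unit window (illustrative, not the paper's general estimator):

```python
import random

def poisson_intensity_mle(points, window_area):
    """Homogeneous Poisson case: the likelihood score
    n / lambda - |W| = 0 has the closed-form root lambda_hat = n / |W|,
    which is what the optimal estimating function reduces to here."""
    return len(points) / window_area

# 120 points placed uniformly on the unit square W = [0, 1]^2
rng = random.Random(0)
pts = [(rng.random(), rng.random()) for _ in range(120)]
lam_hat = poisson_intensity_mle(pts, window_area=1.0)
```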

  8. Entry points to stimulation of expansion in hides and skins processing

    African Journals Online (AJOL)

    Only 3.4% of respondents add value to hides and skins by processing. ... Given this status of the chain, it was proposed that a workable intervention model has to encompass the placement of tanneries and slaughter slabs in the chain as new actors, linking chain actors, improving livestock services (especially dipping), and ...

  9. Mentoring Novice Teachers: Motives, Process, and Outcomes from the Mentor's Point of View

    Science.gov (United States)

    Iancu-Haddad, Debbie; Oplatka, Izhar

    2009-01-01

    The purpose of this paper is to present the major motives leading senior teachers to be involved in a mentoring process of newly appointed teachers and its benefits for the mentor teacher. Based on semi-structured interviews with 12 experienced teachers who participated in a university-based mentoring program in Israel, the current study found a…

  10. Stressors and Turning Points in High School and Dropout: A Stress Process, Life Course Framework

    Science.gov (United States)

    Dupéré, Véronique; Leventhal, Tama; Dion, Eric; Crosnoe, Robert; Archambault, Isabelle; Janosz, Michel

    2015-01-01

    High school dropout is commonly seen as the result of a long-term process of failure and disengagement. As useful as it is, this view has obscured the heterogeneity of pathways leading to dropout. Research suggests, for instance, that some students leave school not as a result of protracted difficulties but in response to situations that emerge…

  11. Proposition for Improvement of Economics Situation with Use of Analysis of Break Even Point

    OpenAIRE

    Starečková, Alena

    2015-01-01

    The bachelor's thesis deals with carrying out a break-even-point analysis in a company, with cost analysis, and with proposals for improving the company's financial situation, particularly from the cost point of view. The first part defines the terms and formulas relating to break-even-point analysis and to cost issues. In the second part, a break-even analysis is carried out for a specific company, followed by proposals for improving the current situation.
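The break-even quantity discussed in the thesis is the output level at which revenue covers fixed plus variable costs. A minimal sketch with hypothetical cost figures (not taken from the thesis):

```python
def break_even_units(fixed_costs, price_per_unit, variable_cost_per_unit):
    """Break-even quantity: fixed costs divided by the contribution
    margin (price minus variable cost) per unit."""
    margin = price_per_unit - variable_cost_per_unit
    if margin <= 0:
        raise ValueError("price must exceed variable cost per unit")
    return fixed_costs / margin

# Hypothetical figures: 200 000 in fixed costs, price 50, variable cost 30
q_star = break_even_units(200_000, 50, 30)   # 10 000 units to break even
```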

  12. Meeting points in the VPL process - a key challenge for VPL activities

    DEFF Research Database (Denmark)

    Aagaard, Kirsten; Enggaard, Ellen

    2014-01-01

    The right to have your competences recognized and validated as a means to gain access to or exemptions from a higher education has existed since 2007, but the knowledge of this opportunity is still not very well spread and the potentials of the law are not exploited. This goes for individuals as well... the individual in his or her individual career strategies benefit from the option of VPL in the process of managing his or her career strategy? What are the main barriers and obstacles the individual might meet in his or her attempt to move on in his career whether the motivation is change of career direction, a step up the career ladder, personal development or threat of losing his job and the work place's demand for new competences? There are three main players on this scene: the individual, the (HE) educational institution and the work place. There may be more players involved in the process...

  13. Integrating human factors into process hazard analysis

    International Nuclear Information System (INIS)

    Kariuki, S.G.; Loewe, K.

    2007-01-01

    A comprehensive process hazard analysis (PHA) needs to address human factors. This paper describes an approach that systematically identifies human error in process design and the human factors that influence its production and propagation. It is deductive in nature and therefore considers human error as a top event. The combinations of different factors that may lead to this top event are analysed. It is qualitative in nature and is used in combination with other PHA methods. The method has an advantage in that it does not treat operator error as the sole contributor to human failure within a system, but rather as a combination of all underlying factors.

  14. The Development of Point Doppler Velocimeter Data Acquisition and Processing Software

    Science.gov (United States)

    Cavone, Angelo A.

    2008-01-01

    In order to develop efficient and quiet aircraft and validate Computational Fluid Dynamics predictions, aerodynamic researchers require flow parameter measurements to characterize flow fields about wind tunnel models and jet flows. A one-component Point Doppler Velocimeter (pDv), a non-intrusive, laser-based instrument, was constructed using a design/develop/test/validate/deploy approach. A primary component of the instrument is the software required for system control/management and data collection/reduction. This software, along with evaluation algorithms, advanced pDv from a laboratory curiosity to a production-level instrument. Simultaneous pDv and pitot probe velocity measurements obtained at the centerline of a flow exiting a two-inch jet matched within 0.4%. Flow turbulence spectra obtained with pDv and a hot-wire detected, with equal dynamic range, the primary and secondary harmonics produced by the fan driving the flow. Novel hardware and software methods were developed, tested and incorporated into the system to eliminate and/or minimize error sources and improve system reliability.

  15. ANALYSIS, THEMATIC MAPS AND DATA MINING FROM POINT CLOUD TO ONTOLOGY FOR SOFTWARE DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    R. Nespeca

    2016-06-01

    Full Text Available The primary purpose of the survey for the restoration of Cultural Heritage is the interpretation of the state of building preservation. For this, the advantages of remote sensing systems that generate dense point clouds (range-based or image-based) are not limited to the acquired data alone. The paper shows that it is possible to extract very useful diagnostic information using spatial annotation, with algorithms already implemented in open-source software. Generally, the drawing of degradation maps is the result of manual work, and so depends on the subjectivity of the operator. This paper describes a method of extraction and visualization of information obtained by mathematical procedures that are quantitative, repeatable and verifiable. The case study is a part of the east facade of the Eglise collégiale Saint-Maurice, also called Notre Dame des Grâces, in Caromb, in southern France. The work was conducted on the matrix of information contained in the point cloud in ASCII format. The first result is the extraction of new geometric descriptors. First, we create digital maps of the calculated quantities. Subsequently, we move to semi-quantitative analyses that transform the new data into useful information. We have written algorithms for accurate selection, for the segmentation of the point cloud, and for automatic calculation of the real surface area and the volume. Furthermore, we have created graphs of the spatial distribution of the descriptors. This work shows that by working on the data during processing we can transform the point cloud into an enriched database: its use, management and mining become easy, fast and effective for everyone involved in the restoration process.
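One family of per-point geometric descriptors extracted from such a point-cloud matrix can be sketched as a PCA "surface variation" score on local neighbourhoods; this is a generic descriptor of this kind, not necessarily the exact quantity computed in the paper:

```python
import numpy as np

def surface_variation(points, k=10):
    """Per-point 'surface variation': smallest eigenvalue of the
    k-nearest-neighbour covariance, normalised by the eigenvalue sum.
    Flat regions score near 0; rough or edge regions score higher."""
    pts = np.asarray(points, dtype=float)
    d2 = ((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1)  # pairwise sq. distances
    scores = []
    for i in range(len(pts)):
        nn = pts[np.argsort(d2[i])[:k]]   # k nearest neighbours (incl. the point itself)
        w = np.linalg.eigvalsh(np.cov(nn.T))  # ascending eigenvalues of 3x3 covariance
        scores.append(w[0] / w.sum())
    return np.array(scores)

rng = np.random.default_rng(0)
flat = np.c_[rng.random((30, 2)), np.zeros(30)]  # points lying exactly on a plane
score = surface_variation(flat, k=8)             # ~0 everywhere for a flat patch
```

Thresholding such a score gives the kind of repeatable, operator-independent selection the abstract argues for.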

  16. From Takeoff to Landing: Looking at the Design Process for the Development of NASA Blast at Thanksgiving Point

    Directory of Open Access Journals (Sweden)

    Stephen Ashton

    2011-01-01

    Full Text Available In this article we discuss the design process used to develop the NASA Blast exhibition at Thanksgiving Point, a museum complex in Lehi, Utah. This was a class project for the Advanced Instructional Design class at Brigham Young University. In an attempt to create a new discourse (Krippendorff, 2006) for Thanksgiving Point visitors and staff members, the design class used a very fluid design approach, utilizing brainstorming, research, class-member personas, and prototyping to create ideas for the new exhibition. Because of the nature of the experience, the design class developed their own techniques to enhance their design process. The result was a compelling narrative that brought all the elements of the exhibition together in a cohesive piece.

  17. LIFE CYCLE ASSESSMENT AND HAZARD ANALYSIS AND CRITICAL CONTROL POINTS TO THE PASTA PRODUCT

    Directory of Open Access Journals (Sweden)

    Yulexis Meneses Linares

    2016-10-01

    Full Text Available The objective of this work is to combine the Life Cycle Assessment (LCA) and Hazard Analysis and Critical Control Points (HACCP) methodologies to determine the risks that food production poses to human health and the ecosystem. The environmental performance of pasta production at the "Marta Abreu" Pasta Factory of Cienfuegos is assessed. The critical control points, determined by the biological hazards (fungi and pests) and the physical hazards (wood, paper, thread and ferromagnetic particles), were the raw materials (flour, semolina and their mixtures) and their handling and extraction. Resources are the most affected damage category, owing to the consumption of fossil fuels.

  18. Using thermal analysis techniques for identifying the flash point temperatures of some lubricant and base oils

    Directory of Open Access Journals (Sweden)

    Aksam Abdelkhalik

    2018-03-01

    Full Text Available The flash point (FP) temperatures of some lubricant and base oils were measured according to ASTM D92 and ASTM D93. In addition, the thermal stability of the oils was studied using differential scanning calorimetry (DSC) and thermogravimetric analysis (TGA) under a nitrogen atmosphere. The DSC results showed that the FP temperature of each oil fell within the first decomposition step, and the temperature at the peak of the first decomposition step was usually higher than the FP temperature. The TGA results indicated that the temperature at which 17.5% weight loss takes place (T17.5%) was nearly identical to the FP temperature (±10 °C) measured according to ASTM D92. The deviation between FP and T17.5% ranged from −0.8% to 3.6%. Keywords: Flash point, TGA, DSC
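The T17.5% criterion can be read off a TGA curve by linear interpolation between the two samples that bracket 17.5% mass loss. A minimal sketch with a made-up curve (not the paper's data):

```python
def temp_at_weight_loss(temps, masses, loss_pct=17.5):
    """Linearly interpolate the temperature at which the sample has lost
    `loss_pct` percent of its initial mass (TGA curve: mass vs temperature)."""
    target = masses[0] * (1 - loss_pct / 100.0)
    for (t0, m0), (t1, m1) in zip(zip(temps, masses), zip(temps[1:], masses[1:])):
        if m0 >= target >= m1:  # mass falls through the target on this segment
            return t0 + (m0 - target) * (t1 - t0) / (m0 - m1)
    raise ValueError("curve never reaches the requested weight loss")

# Hypothetical TGA curve: temperature in deg C, residual mass in %
T = temp_at_weight_loss([100, 150, 200, 250, 300], [100.0, 98.0, 90.0, 75.0, 50.0])
```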

  19. Search for neutrino point sources with an all-sky autocorrelation analysis in IceCube

    Energy Technology Data Exchange (ETDEWEB)

    Turcati, Andrea; Bernhard, Anna; Coenders, Stefan [TU, Munich (Germany); Collaboration: IceCube-Collaboration

    2016-07-01

    The IceCube Neutrino Observatory is a cubic-kilometre-scale neutrino telescope located in the Antarctic ice. Its full-sky field of view gives unique opportunities to study neutrino emission from the Galactic and extragalactic sky. Recently, IceCube found the first signal of astrophysical neutrinos with energies up to the PeV scale, but the origin of these particles remains unresolved. Given the observed flux, the absence of bright point sources can be explained by the presence of numerous weak sources. This scenario can be tested using autocorrelation methods. We present here the sensitivities and discovery potentials of a two-point angular correlation analysis performed on seven years of IceCube data, taken between 2008 and 2015. The test is applied to the northern and southern skies separately, using the neutrino energy information to improve the effectiveness of the method.
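A basic ingredient of such a two-point angular autocorrelation test is counting event pairs closer than a given angular scale, to be compared against the distribution obtained from scrambled skies. A minimal pair-counting sketch with illustrative coordinates (not IceCube data):

```python
import math

def pair_counts(coords, max_angle_deg):
    """Count sky pairs with angular separation below max_angle_deg.
    coords: list of (ra, dec) in degrees. Separation is computed via the
    dot product of the corresponding unit vectors."""
    def to_vec(ra, dec):
        ra, dec = math.radians(ra), math.radians(dec)
        return (math.cos(dec) * math.cos(ra),
                math.cos(dec) * math.sin(ra),
                math.sin(dec))
    vecs = [to_vec(*c) for c in coords]
    cos_max = math.cos(math.radians(max_angle_deg))
    n = 0
    for i in range(len(vecs)):
        for j in range(i + 1, len(vecs)):
            if sum(a * b for a, b in zip(vecs[i], vecs[j])) >= cos_max:
                n += 1
    return n

# Three events: two about 1 degree apart, one on the opposite sky
n_close = pair_counts([(10.0, 0.0), (11.0, 0.0), (200.0, -45.0)], max_angle_deg=5.0)
```

An excess of `n_close` over the scrambled-sky expectation would indicate clustering from many weak sources.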

  20. A systematic analysis of the Braitenberg vehicle 2b for point-like stimulus sources

    International Nuclear Information System (INIS)

    Rañó, Iñaki

    2012-01-01

    Braitenberg vehicles have been used experimentally for decades in robotics with limited empirical understanding. This paper presents the first mathematical model of vehicle 2b, which displays so-called aggression behaviour, and analyses the possible trajectories for point-like smooth stimulus sources. This sensory-motor steering control mechanism is used to implement biologically grounded target-approach, target-seeking or obstacle-avoidance behaviour. However, the analysis of the resulting model reveals that complex and unexpected trajectories can result even for point-like stimuli. We also prove how the implementation of the controller and the vehicle morphology interact to affect the behaviour of the vehicle. This work provides a better understanding of Braitenberg vehicle 2b, explains experimental results and paves the way for a formally grounded application in robotics as well as for a new way of understanding target seeking in biology. (paper)
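The vehicle-2b wiring (each wheel driven by the contralateral sensor) can be sketched with a simple kinematic simulation. All numbers below, including the bounded 1/(1+d²) stimulus model, geometry and gains, are illustrative assumptions, not the paper's model:

```python
import math

def simulate_vehicle_2b(source=(0.0, 0.0), start=(5.0, 2.0), heading=math.pi,
                        steps=100, dt=0.05, base=0.3, gain=0.5, width=0.3):
    """Kinematic sketch of Braitenberg vehicle 2b ('aggression'): each wheel
    speed is base plus gain times the *contralateral* sensor reading, so the
    vehicle steers towards a point-like stimulus. Stimulus model: 1/(1+d^2).
    Returns the final distance to the source."""
    x, y, th = start[0], start[1], heading
    for _ in range(steps):
        # left/right sensor positions, offset perpendicular to the heading
        lx = x + (width / 2) * math.cos(th + math.pi / 2)
        ly = y + (width / 2) * math.sin(th + math.pi / 2)
        rx = x + (width / 2) * math.cos(th - math.pi / 2)
        ry = y + (width / 2) * math.sin(th - math.pi / 2)
        sl = 1.0 / (1.0 + (lx - source[0]) ** 2 + (ly - source[1]) ** 2)
        sr = 1.0 / (1.0 + (rx - source[0]) ** 2 + (ry - source[1]) ** 2)
        vl = base + gain * sr   # crossed wiring: left wheel <- right sensor
        vr = base + gain * sl
        v, w = (vl + vr) / 2.0, (vr - vl) / width
        x += v * math.cos(th) * dt
        y += v * math.sin(th) * dt
        th += w * dt
    return math.hypot(x - source[0], y - source[1])

final_dist = simulate_vehicle_2b()   # the vehicle closes in on the stimulus
```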

  1. Control system for technological processes in tritium processing plants with process analysis

    International Nuclear Information System (INIS)

    Retevoi, Carmen Maria; Stefan, Iuliana; Balteanu, Ovidiu; Stefan, Liviu; Bucur, Ciprian

    2005-01-01

    Integration of a large variety of installations and equipment into a unitary system for controlling the technological process in tritium processing nuclear facilities is a rather complex task, particularly when experimental or new technologies are being developed. A major requirement imposed on experimental installations is a high degree of versatility, allowing easy modification of configurations and process parameters. The large amount of data which must be processed, stored and easily accessed for subsequent analyses requires a large information network based on a highly integrated system containing the data acquisition, control and technological process analysis components as well as a database system. On this basis, integrated computation and control systems able to conduct the technological process can be developed, as well as protection systems for cases of failure or breakdown. The integrated system responds to the control and security requirements arising in emergencies and in the technological processes specific to industries that process radioactive or toxic substances with severe consequences in case of technological failure, as in a tritium processing nuclear plant. To lower the risk of technological failure of these processes, an integrated software, database and process analysis system is developed which, based on an identification algorithm for the parameters important to the protection and security systems, displays the process evolution trend. The system was checked on an existing plant that includes a tritium removal unit, finally used in a nuclear power plant, by simulating failure events as well as the process. The system also includes a complete database monitoring all the parameters and process analysis software for the main modules of the tritium processing plant, namely isotope separation, catalytic purification and cryogenic distillation.

  2. Point defect characterization in HAADF-STEM images using multivariate statistical analysis

    International Nuclear Information System (INIS)

    Sarahan, Michael C.; Chi, Miaofang; Masiel, Daniel J.; Browning, Nigel D.

    2011-01-01

    Quantitative analysis of point defects is demonstrated through the use of multivariate statistical analysis. This analysis consists of principal component analysis for dimensional estimation and reduction, followed by independent component analysis to obtain physically meaningful, statistically independent factor images. Results from these analyses are presented in the form of factor images and scores. Factor images show characteristic intensity variations corresponding to physical structure changes, while scores relate how much those variations are present in the original data. The application of this technique is demonstrated on a set of experimental images of dislocation cores along a low-angle tilt grain boundary in strontium titanate. A relationship between chemical composition and lattice strain is highlighted in the analysis results, with picometer-scale shifts in several columns measurable from compositional changes in a separate column. Research highlights: multivariate analysis of HAADF-STEM images; distinct structural variations among SrTiO3 dislocation cores; picometer atomic column shifts correlated with atomic column population changes.
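The PCA stage of such a multivariate analysis can be sketched with plain NumPy on a synthetic image stack; the paper additionally applies ICA to obtain statistically independent factors, but only the PCA step (factor images, scores, and explained variance for dimensional estimation) is shown here:

```python
import numpy as np

def pca_factor_images(stack, n_components):
    """PCA on an image stack of shape (n_images, height, width).
    Returns (scores, factor_images, explained_variance_ratio): per-image
    scores, eigen-images, and the variance fraction of each component."""
    n, h, w = stack.shape
    X = stack.reshape(n, h * w).astype(float)
    Xc = X - X.mean(axis=0)                        # centre each pixel over the stack
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    evr = (S ** 2) / (S ** 2).sum()
    scores = U[:, :n_components] * S[:n_components]
    factors = Vt[:n_components].reshape(n_components, h, w)
    return scores, factors, evr

# Synthetic stack: a fixed background plus a 'defect' patch of growing intensity
rng = np.random.default_rng(0)
base = rng.random((8, 8))
defect = np.zeros((8, 8)); defect[3:5, 3:5] = 1.0
stack = np.array([base + a * defect + 0.01 * rng.standard_normal((8, 8))
                  for a in np.linspace(0, 1, 12)])
scores, factors, evr = pca_factor_images(stack, 1)  # one dominant component
```

Here the first component captures nearly all the variance, mirroring how PCA estimates the number of physically meaningful factors before ICA is applied.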

  3. Validation of capillary blood analysis and capillary testing mode on the epoc Point of Care system

    Directory of Open Access Journals (Sweden)

    Jing Cao

    2017-12-01

    Full Text Available Background: Laboratory testing during transport is a critical component of patient care, and capillary blood is a preferred sample type, particularly in children. This study evaluated the performance of capillary blood testing on the epoc Point of Care Blood Analysis System (Alere Inc.). Methods: Ten fresh venous blood samples were tested on the epoc system in capillary mode. Correlation with the GEM 4000 (Instrumentation Laboratory) was examined for Na+, K+, Cl-, Ca2+, glucose, lactate, hematocrit, hemoglobin, pO2, pCO2, and pH, and correlation with serum tested on the Vitros 5600 (Ortho Clinical Diagnostics) was examined for creatinine. Eight paired capillary and venous blood samples were tested on the epoc and ABL800 (Radiometer) systems for the correlation of Na+, K+, Cl-, Ca2+, glucose, lactate, hematocrit, hemoglobin, pCO2, and pH. Capillary blood from 23 apparently healthy volunteers was tested on the epoc system to assess concordance with the reference ranges used locally. Results: Deming regression correlation coefficients for all the comparisons were above 0.65 except for ionized Ca2+. Accordance of greater than 85% with the local reference ranges was found for all assays with the exception of pO2 and Cl-. Conclusion: Data from this study indicate that capillary blood tests on the epoc system provide results comparable to the reference methods for the assays Na+, K+, glucose, lactate, hematocrit, hemoglobin, pCO2, and pH. Further validation in critically ill patients is needed before implementing the epoc system in patient transport. Impact of the study: This study demonstrated that capillary blood tests on the epoc Point of Care Blood Analysis System give results comparable to other chemistry analyzers for major blood gas and critical tests. The results are informative for institutions where pre-hospital and inter-hospital laboratory testing on capillary blood is a critical component of point-of-care testing. Keywords: Epoc, Capillary, Transport, Blood gas, Point of care
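Method-comparison studies like this one use Deming regression because it allows measurement error in both analysers, unlike ordinary least squares. A generic textbook implementation (not the authors' code; `delta` is the assumed ratio of error variances):

```python
import math

def deming(x, y, delta=1.0):
    """Deming regression for method comparison: errors in both variables,
    with error-variance ratio `delta`. Returns (slope, intercept)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x) / (n - 1)
    syy = sum((yi - my) ** 2 for yi in y) / (n - 1)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / (n - 1)
    slope = (syy - delta * sxx
             + math.sqrt((syy - delta * sxx) ** 2 + 4 * delta * sxy ** 2)) / (2 * sxy)
    return slope, my - slope * mx

# Two perfectly proportional methods recover slope 2 and intercept 0
slope, intercept = deming([1, 2, 3, 4], [2, 4, 6, 8])
```

Agreement between two analysers would show up as a slope near 1 and an intercept near 0 on paired measurements.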

  4. Big Data Analysis of Manufacturing Processes

    Science.gov (United States)

    Windmann, Stefan; Maier, Alexander; Niggemann, Oliver; Frey, Christian; Bernardi, Ansgar; Gu, Ying; Pfrommer, Holger; Steckel, Thilo; Krüger, Michael; Kraus, Robert

    2015-11-01

    The high complexity of manufacturing processes and the continuously growing amount of data lead to excessive demands on the users with respect to process monitoring, data analysis and fault detection. For these reasons, problems and faults are often detected too late, maintenance intervals are chosen too short and optimization potential for higher output and increased energy efficiency is not sufficiently used. A possibility to cope with these challenges is the development of self-learning assistance systems, which identify relevant relationships by observation of complex manufacturing processes so that failures, anomalies and need for optimization are automatically detected. The assistance system developed in the present work accomplishes data acquisition, process monitoring and anomaly detection in industrial and agricultural processes. The assistance system is evaluated in three application cases: Large distillation columns, agricultural harvesting processes and large-scale sorting plants. In this paper, the developed infrastructures for data acquisition in these application cases are described as well as the developed algorithms and initial evaluation results.

  5. Big Data Analysis of Manufacturing Processes

    International Nuclear Information System (INIS)

    Windmann, Stefan; Maier, Alexander; Niggemann, Oliver; Frey, Christian; Bernardi, Ansgar; Gu, Ying; Pfrommer, Holger; Steckel, Thilo; Krüger, Michael; Kraus, Robert

    2015-01-01

    The high complexity of manufacturing processes and the continuously growing amount of data lead to excessive demands on the users with respect to process monitoring, data analysis and fault detection. For these reasons, problems and faults are often detected too late, maintenance intervals are chosen too short and optimization potential for higher output and increased energy efficiency is not sufficiently used. A possibility to cope with these challenges is the development of self-learning assistance systems, which identify relevant relationships by observation of complex manufacturing processes so that failures, anomalies and need for optimization are automatically detected. The assistance system developed in the present work accomplishes data acquisition, process monitoring and anomaly detection in industrial and agricultural processes. The assistance system is evaluated in three application cases: Large distillation columns, agricultural harvesting processes and large-scale sorting plants. In this paper, the developed infrastructures for data acquisition in these application cases are described as well as the developed algorithms and initial evaluation results. (paper)

  6. Design and fabrication of a diffractive beam splitter for dual-wavelength and concurrent irradiation of process points.

    Science.gov (United States)

    Amako, Jun; Shinozaki, Yu

    2016-07-11

    We report on a dual-wavelength diffractive beam splitter designed for use in parallel laser processing. This novel optical element generates two beam arrays of different wavelengths and allows their overlap at the process points on a workpiece. To design the deep surface-relief profile of a splitter using a simulated annealing algorithm, we introduce a heuristic but practical scheme to determine the maximum depth and the number of quantization levels. The designed corrugations were fabricated in a photoresist by maskless grayscale exposure using a high-resolution spatial light modulator. We characterized the photoresist splitter, thereby validating the proposed beam-splitting concept.

  7. Three-dimensional distribution of cortical synapses: a replicated point pattern-based analysis

    Science.gov (United States)

    Anton-Sanchez, Laura; Bielza, Concha; Merchán-Pérez, Angel; Rodríguez, José-Rodrigo; DeFelipe, Javier; Larrañaga, Pedro

    2014-01-01

    The biggest problem when analyzing the brain is that its synaptic connections are extremely complex. Generally, the billions of neurons making up the brain exchange information through two types of highly specialized structures: chemical synapses (the vast majority) and so-called gap junctions (a substrate of one class of electrical synapse). Here we are interested in exploring the three-dimensional spatial distribution of chemical synapses in the cerebral cortex. Recent research has shown that the three-dimensional spatial distribution of synapses in layer III of the neocortex can be modeled by a random sequential adsorption (RSA) point process, i.e., synapses are distributed in space almost randomly, with the only constraint that they cannot overlap. In this study we hypothesize that RSA processes can also explain the distribution of synapses in all cortical layers. We also investigate whether there are differences in both the synaptic density and the spatial distribution of synapses between layers. Using combined focused ion beam milling and scanning electron microscopy (FIB/SEM), we obtained three-dimensional samples from the six layers of the rat somatosensory cortex and identified and reconstructed the synaptic junctions. A total tissue volume of approximately 4500 μm³ and around 4000 synapses from three different animals were analyzed. Different samples, layers and/or animals were aggregated and compared using RSA replicated spatial point processes. The results showed no significant differences in the synaptic distribution across the different rats used in the study. We found that RSA processes described the spatial distribution of synapses in all samples of each layer. We also found that the synaptic distributions in layers II to VI conform to a common underlying RSA process with different densities per layer. Interestingly, the results showed that synapses in layer I had a slightly different spatial distribution from the other layers. PMID:25206325
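The RSA model used here is easy to state: points arrive uniformly at random and each is kept only if it does not overlap any previously accepted point. A minimal 3D sketch with made-up radius and volume (illustrative, not the paper's fitted parameters):

```python
import random

def rsa_points(radius, box, n_attempts, seed=0):
    """Random sequential adsorption in 3D: propose uniform points and accept
    each only if it lies at least 2*radius from every accepted point
    (hard-core constraint, as in the synapse model)."""
    rng = random.Random(seed)
    pts = []
    min_d2 = (2 * radius) ** 2
    for _ in range(n_attempts):
        p = tuple(rng.uniform(0, s) for s in box)
        if all(sum((a - b) ** 2 for a, b in zip(p, q)) >= min_d2 for q in pts):
            pts.append(p)
    return pts

pts = rsa_points(radius=0.05, box=(1.0, 1.0, 1.0), n_attempts=500)
# Smallest centre-to-centre gap among accepted points
min_gap = min(sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
              for i, p in enumerate(pts) for q in pts[i + 1:])
```

By construction no two accepted points are closer than twice the hard-core radius, which is exactly the "almost random, but non-overlapping" property described above.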

  8. Analysis of Variance in Statistical Image Processing

    Science.gov (United States)

    Kurz, Ludwik; Hafed Benteftifa, M.

    1997-04-01

    A key problem in practical image processing is the detection of specific features in a noisy image. Analysis of variance (ANOVA) techniques can be very effective in such situations, and this book gives a detailed account of the use of ANOVA in statistical image processing. The book begins by describing the statistical representation of images in the various ANOVA models. The authors present a number of computationally efficient algorithms and techniques to deal with such problems as line, edge, and object detection, as well as image restoration and enhancement. By describing the basic principles of these techniques, and showing their use in specific situations, the book will facilitate the design of new algorithms for particular applications. It will be of great interest to graduate students and engineers in the field of image processing and pattern recognition.

  9. Neyman, Markov processes and survival analysis.

    Science.gov (United States)

    Yang, Grace

    2013-07-01

    J. Neyman used stochastic processes extensively in his applied work. One example is the Fix and Neyman (F-N) competing risks model (1951) that uses finite homogeneous Markov processes to analyse clinical trials with breast cancer patients. We revisit the F-N model, and compare it with the Kaplan-Meier (K-M) formulation for right censored data. The comparison offers a way to generalize the K-M formulation to include risks of recovery and relapses in the calculation of a patient's survival probability. The generalization is to extend the F-N model to a nonhomogeneous Markov process. Closed-form solutions of the survival probability are available in special cases of the nonhomogeneous processes, like the popular multiple decrement model (including the K-M model) and Chiang's staging model, but these models do not consider recovery and relapses while the F-N model does. An analysis of sero-epidemiology current status data with recurrent events is illustrated. Fix and Neyman used Neyman's RBAN (regular best asymptotic normal) estimates for the risks, and provided a numerical example showing the importance of considering both the survival probability and the length of time of a patient living a normal life in the evaluation of clinical trials. The said extension would result in a complicated model and it is unlikely to find analytical closed-form solutions for survival analysis. With ever increasing computing power, numerical methods offer a viable way of investigating the problem.
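The K-M formulation that the F-N model is compared against has a compact product-limit form. A standard textbook implementation on illustrative right-censored data (not from the paper):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimator for right-censored data.
    events[i] is 1 for an observed event, 0 for a censored observation.
    Returns the survival curve as (time, S(t)) pairs at event times."""
    data = sorted(zip(times, events))
    s, curve, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t)     # events at time t
        n = sum(1 for tt, _ in data if tt >= t)     # at risk just before t
        if d:
            s *= 1.0 - d / n
            curve.append((t, s))
        i += sum(1 for tt, _ in data if tt == t)    # skip ties at t
    return curve

# Five patients: events at times 2, 3 and 5; censored at 3 and 8
curve = kaplan_meier([2, 3, 3, 5, 8], [1, 1, 0, 1, 0])
```

The F-N generalization discussed above replaces this single absorbing-state view with Markov transition probabilities that also allow recovery and relapse.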

  10. Breed differences in dogs' sensitivity to human points: a meta-analysis.

    Science.gov (United States)

    Dorey, Nicole R; Udell, Monique A R; Wynne, Clive D L

    2009-07-01

    The last decade has seen a substantial increase in research on the behavioral and cognitive abilities of pet dogs, Canis familiaris. The most commonly used experimental paradigm is the object-choice task in which a dog is given a choice of two containers and guided to the reinforced object by human pointing gestures. We review here studies of this type and attempt a meta-analysis of the available data. In the meta-analysis breeds of dogs were grouped into the eight categories of the American Kennel Club, and into four clusters identified by Parker and Ostrander [Parker, H.G., Ostrander, E.A., 2005. Canine genomics and genetics: running with the pack. PLoS Genet. 1, 507-513] on the basis of a genetic analysis. No differences in performance between breeds categorized in either fashion were identified. Rather, all dog breeds appear to be similarly and highly successful in following human points to locate desired food. We suggest this result could be due to the paucity of data available in published studies, and the restricted range of breeds tested.

  11. Benefits analysis of Soft Open Points for electrical distribution network operation

    International Nuclear Information System (INIS)

    Cao, Wanyu; Wu, Jianzhong; Jenkins, Nick; Wang, Chengshan; Green, Timothy

    2016-01-01

    Highlights: (1) An analysis framework was developed to quantify the operational benefits. (2) The framework considers both network reconfiguration and SOP control. (3) Benefits were analyzed through both quantitative and sensitivity analysis. Abstract: Soft Open Points (SOPs) are power electronic devices installed in place of normally-open points in electrical power distribution networks. They are able to provide active power flow control, reactive power compensation and voltage regulation under normal network operating conditions, as well as fast fault isolation and supply restoration under abnormal conditions. A steady-state analysis framework was developed to quantify the operational benefits of a distribution network with SOPs under normal network operating conditions. A generic power injection model was developed and used to determine the optimal SOP operation using an improved Powell's direct set method. Physical limits and power losses of the SOP device (based on back-to-back voltage-source converters) were considered in the model. Distribution network reconfiguration algorithms, with and without SOPs, were developed and used to identify the benefits of using SOPs. Test results on a 33-bus distribution network compared the benefits of using SOPs, traditional network reconfiguration and the combination of both. The results showed that using only one SOP achieved an improvement in network operation similar to that of network reconfiguration with all branches equipped with remotely controlled switches. A combination of SOP control and network reconfiguration provided the optimal network operation.

  12. Point Analysis in Java applied to histological images of the perforant pathway: a user's account.

    Science.gov (United States)

    Scorcioni, Ruggero; Wright, Susan N; Patrick Card, J; Ascoli, Giorgio A; Barrionuevo, Germán

    2008-01-01

    The freeware Java tool Point Analysis in Java (PAJ), created to perform 3D point analysis, was tested in an independent laboratory setting. The input data consisted of images of the hippocampal perforant pathway from serial immunocytochemical localizations of the rat brain in multiple views at different resolutions. The low magnification set (x2 objective) comprised the entire perforant pathway, while the high magnification set (x100 objective) allowed the identification of individual fibers. A preliminary stereological study revealed a striking linear relationship between the fiber count at high magnification and the optical density at low magnification. PAJ enabled fast analysis of down-sampled data sets and offered a friendly interface with automated plot drawing. Noted strengths included the multi-platform support as well as the free availability of the source code, conducive to a broad user base and maximum flexibility for ad hoc requirements. PAJ has great potential to extend its usability by (a) improving its graphical user interface, (b) increasing its input size limit, (c) improving response time for large data sets, and (d) potentially being integrated with other Java graphical tools such as ImageJ.

  13. Accessibility analysis in manufacturing processes using visibility cones

    Institute of Scientific and Technical Information of China (English)

    尹周平; 丁汉; 熊有伦

    2002-01-01

    Accessibility is an important design feature of products, and accessibility analysis has been acknowledged as a powerful tool for solving computational manufacturing problems arising from different manufacturing processes. After exploring the relations among approachability, accessibility and visibility, a general method for accessibility analysis using visibility cones (VC) is proposed. With the definition of the VC of a point, three kinds of visibility of a feature, namely the complete visibility cone (CVC), partial visibility cone (PVC) and local visibility cone (LVC), are defined. A novel approach to computing VCs is formulated by identifying C-obstacles in the C-space, for which a general and efficient algorithm is proposed and implemented by making use of visibility culling. Lastly, we discuss briefly how to realize accessibility analysis in numerically controlled (NC) machining planning, coordinate measuring machine (CMM) inspection planning and assembly sequence planning with the proposed methods.

  14. Dual time point 18FDG-PET/CT versus single time point 18FDG-PET/CT for the differential diagnosis of pulmonary nodules - A meta-analysis

    International Nuclear Information System (INIS)

    Zhang, Li; Wang, Yinzhong; Lei, Junqiang; Tian, Jinhui; Zhai, Yanan

    2013-01-01

    Background: Lung cancer is one of the most common cancer types in the world. An accurate diagnosis of lung cancer is crucial for early treatment and management. Purpose: To perform a comprehensive meta-analysis to evaluate the diagnostic performance of dual time point 18F-fluorodeoxyglucose positron emission tomography/computed tomography (FDG-PET/CT) and single time point 18FDG-PET/CT in the diagnosis of pulmonary nodules. Material and Methods: PubMed (1966-2011.11), EMBASE (1974-2011.11), Web of Science (1972-2011.11), Cochrane Library (-2011.11), and four Chinese databases; CBM (1978-2011.11), CNKI (1994-2011.11), VIP (1989-2011.11), and Wanfang Database (1994-2011.11) were searched. Summary sensitivity, summary specificity, summary diagnostic odds ratios (DOR), and summary positive likelihood ratios (LR+) and negative likelihood ratios (LR-) were obtained using Meta-Disc software. Summary receiver-operating characteristic (SROC) curves were used to evaluate the diagnostic performance of dual time point 18FDG-PET/CT and single time point 18FDG-PET/CT. Results: The inclusion criteria were fulfilled by eight articles, with a total of 415 patients and 430 pulmonary nodules. Compared with the gold standard (pathology or clinical follow-up), the summary sensitivity of dual time point 18FDG-PET/CT was 79% (95%CI, 74.0-84.0%), and its summary specificity was 73% (95%CI, 65.0-79.0%); the summary LR+ was 2.61 (95%CI, 1.96-3.47), and the summary LR- was 0.29 (95%CI, 0.21-0.41); the summary DOR was 10.25 (95%CI, 5.79-18.14), and the area under the SROC curve (AUC) was 0.8244. The summary sensitivity for single time point 18FDG-PET/CT was 77% (95%CI, 71.9-82.3%), and its summary specificity was 59% (95%CI, 50.6-66.2%); the summary LR+ was 1.97 (95%CI, 1.32-2.93), and the summary LR- was 0.37 (95%CI, 0.29-0.49); the summary DOR was 6.39 (95%CI, 3.39-12.05), and the AUC was 0.8220. Conclusion: The results indicate that dual time point 18FDG-PET/CT and single
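
    The summary likelihood ratios and DOR are algebraically linked to sensitivity and specificity. A minimal sketch, computing them directly from the pooled point estimates (Meta-Disc instead pools each measure across studies with weights, so the published summary values differ slightly from this direct calculation):

```python
def likelihood_ratios(sens, spec):
    """Positive/negative likelihood ratios and diagnostic odds ratio
    derived from a (pooled) sensitivity and specificity."""
    lr_pos = sens / (1.0 - spec)        # P(test+|disease) / P(test+|no disease)
    lr_neg = (1.0 - sens) / spec        # P(test-|disease) / P(test-|no disease)
    dor = lr_pos / lr_neg               # diagnostic odds ratio
    return lr_pos, lr_neg, dor

# Dual time point summary estimates from the review: sens 79%, spec 73%.
lrp, lrn, dor = likelihood_ratios(0.79, 0.73)
# lrp ~ 2.93, lrn ~ 0.29, dor ~ 10.2 -- close to the weighted Meta-Disc
# summaries (2.61, 0.29, 10.25), which pool per-study ratios instead.
```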

  15. Process control analysis of IMRT QA: implications for clinical trials

    International Nuclear Information System (INIS)

    Pawlicki, Todd; Rice, Roger K; Yoo, Sua; Court, Laurence E; McMillan, Sharon K; Russell, J Donald; Pacyniak, John M; Woo, Milton K; Basran, Parminder S; Boyer, Arthur L; Bonilla, Claribel

    2008-01-01

    The purpose of this study is two-fold: first is to investigate the process of IMRT QA using control charts and second is to compare control chart limits to limits calculated using the standard deviation (σ). Head and neck and prostate IMRT QA cases from seven institutions in both academic and community settings are considered. The percent difference between the point dose measurement in phantom and the corresponding result from the treatment planning system (TPS) is used for analysis. The average of the percent difference calculations defines the accuracy of the process and is called the process target. This represents the degree to which the process meets the clinical goal of 0% difference between the measurements and TPS. IMRT QA process ability defines the ability of the process to meet clinical specifications (e.g. 5% difference between the measurement and TPS). The process ability is defined in two ways: (1) the half-width of the control chart limits, and (2) the half-width of ±3σ limits. Process performance is characterized as being in one of four possible states that describes the stability of the process and its ability to meet clinical specifications. For the head and neck cases, the average process target across institutions was 0.3% (range: -1.5% to 2.9%). The average process ability using control chart limits was 7.2% (range: 5.3% to 9.8%) compared to 6.7% (range: 5.3% to 8.2%) using standard deviation limits. For the prostate cases, the average process target across the institutions was 0.2% (range: -1.8% to 1.4%). The average process ability using control chart limits was 4.4% (range: 1.3% to 9.4%) compared to 5.3% (range: 2.3% to 9.8%) using standard deviation limits. Using the standard deviation to characterize IMRT QA process performance resulted in processes being preferentially placed in one of the four states. This is in contrast to using control charts for process characterization where the IMRT QA processes were spread over three of the
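
    For an individuals chart, control limits are conventionally derived from the average moving range rather than from σ, which is one reason the two sets of limits above differ. A minimal sketch under that assumption, with made-up QA percent differences (the study's exact charting details may differ):

```python
from statistics import mean, stdev

def qa_limits(percent_diffs):
    """Individuals-chart limits (centre +/- 2.66 * average moving range)
    alongside naive +/- 3 sigma limits for IMRT QA percent differences."""
    centre = mean(percent_diffs)
    moving_ranges = [abs(b - a) for a, b in zip(percent_diffs, percent_diffs[1:])]
    mr_bar = mean(moving_ranges)
    chart = (centre - 2.66 * mr_bar, centre + 2.66 * mr_bar)
    sigma = (centre - 3 * stdev(percent_diffs), centre + 3 * stdev(percent_diffs))
    return chart, sigma

# Hypothetical QA percent differences (measurement vs TPS) for one process.
qa = [0.5, -1.2, 0.8, 2.1, -0.4, 1.0, 0.2, -0.9, 1.5, 0.3]
chart_limits, sigma_limits = qa_limits(qa)
```

    Because the moving range responds only to point-to-point variation, the two half-widths disagree whenever the data drift or cluster, which is exactly the behaviour the study uses to classify process states.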

  16. Mass customization process for the Social Housing. Potentiality, critical points, research lines

    Directory of Open Access Journals (Sweden)

    Michele Di Sivo

    2012-10-01

    Full Text Available The demand for lengthening the life cycle of the residential estate, engendered with the economical and housing crisis since the last few years, brings out, in the course of time, the need for conservation and improvement works of the property house performances, through the direct involvement of the users. The possibility of reducing maintenance and adjustment costs may develop into a project resource, consistent to the participation and cooperation principles, identifying social housing interventions. With this aim, the BETHA group of the d’Annunzio University is investigating the potentiality of technological transfer of the ‘mass customization’ process from the industrial products field to the social housing segment, by detecting issues, strategies and opportunities.

  17. Stochastic dynamical model of a growing citation network based on a self-exciting point process.

    Science.gov (United States)

    Golosovsky, Michael; Solomon, Sorin

    2012-08-31

    We put under experimental scrutiny the preferential attachment model that is commonly accepted as a generating mechanism of the scale-free complex networks. To this end we chose a citation network of physics papers and traced the citation history of 40,195 papers published in one year. Contrary to common belief, we find that the citation dynamics of the individual papers follows the superlinear preferential attachment, with the exponent α=1.25-1.3. Moreover, we show that the citation process cannot be described as a memoryless Markov chain since there is a substantial correlation between the present and recent citation rates of a paper. Based on our findings we construct a stochastic growth model of the citation network, perform numerical simulations based on this model and achieve an excellent agreement with the measured citation distributions.
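
    The attachment rule can be sketched as a toy simulation (our simplification: each citation goes to a paper with probability proportional to (k + 1)^α, the unit offset letting uncited papers receive a first citation; the paper's full model also includes the memory effects noted above, which this sketch omits):

```python
import random

def simulate_citations(n_papers=200, n_events=2000, alpha=1.28, seed=7):
    """Assign citation events to papers with probability proportional to
    (k + 1)**alpha, where k is a paper's current citation count.
    A toy version of superlinear preferential attachment."""
    rng = random.Random(seed)
    counts = [0] * n_papers
    for _ in range(n_events):
        weights = [(k + 1) ** alpha for k in counts]
        total = sum(weights)
        r = rng.random() * total
        acc = 0.0
        for i, w in enumerate(weights):
            acc += w
            if r <= acc:
                counts[i] += 1
                break
    return counts

counts = simulate_citations()
# Superlinear attachment concentrates citations: the most-cited paper ends
# up far above the average of n_events / n_papers = 10 citations.
```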

  18. An Investigation of Three-point Shooting through an Analysis of NBA Player Tracking Data

    OpenAIRE

    Sliz, Bradley A.

    2017-01-01

    I address the difficult challenge of measuring the relative influence of competing basketball game strategies, and I apply my analysis to plays resulting in three-point shots. I use a glut of SportVU player tracking data from over 600 NBA games to derive custom position-based features that capture tangible game strategies from game-play data, such as teamwork, player matchups, and on-ball defender distances. Then, I demonstrate statistical methods for measuring the relative importance of any ...

  19. [Incorporation of the Hazard Analysis and Critical Control Point system (HACCP) in food legislation].

    Science.gov (United States)

    Castellanos Rey, Liliana C; Villamil Jiménez, Luis C; Romero Prada, Jaime R

    2004-01-01

    The Hazard Analysis and Critical Control Point system (HACCP), recommended by different international organizations as the Codex Alimentarius Commission, the World Trade Organization (WTO), the International Office of Epizootics (OIE) and the International Convention for Vegetables Protection (ICPV) amongst others, contributes to ensuring the innocuity of food along the agro-alimentary chain and requires of Good Manufacturing Practices (GMP) for its implementation, GMP's which are legislated in most countries. Since 1997, Colombia has set rules and legislation for application of HACCP system in agreement with international standards. This paper discusses the potential and difficulties of the legislation enforcement and suggests some policy implications towards food safety.

  20. [Process and key points of clinical literature evaluation of post-marketing traditional Chinese medicine].

    Science.gov (United States)

    Liu, Huan; Xie, Yanming

    2011-10-01

    The clinical literature evaluation of the post-marketing traditional Chinese medicine is a comprehensive evaluation by the comprehensive gain, analysis of the drug, literature of drug efficacy, safety, economy, based on the literature evidence and is part of the evaluation of evidence-based medicine. The literature evaluation in the post-marketing Chinese medicine clinical evaluation is in the foundation and the key position. Through the literature evaluation, it can fully grasp the information, grasp listed drug variety of traditional Chinese medicines second development orientation, make clear further clinical indications, perfect the medicines, etc. This paper discusses the main steps and emphasis of the clinical literature evaluation. Emphasizing security literature evaluation should attach importance to the security of a comprehensive collection drug information. Safety assessment should notice traditional Chinese medicine validity evaluation in improving syndrome, improveing the living quality of patients with special advantage. The economics literature evaluation should pay attention to reliability, sensitivity and practicability of the conclusion.

  1. Change-Point and Trend Analysis on Annual Maximum Discharge in Continental United States

    Science.gov (United States)

    Serinaldi, F.; Villarini, G.; Smith, J. A.; Krajewski, W. F.

    2008-12-01

    Annual maximum discharge records from 36 stations representing different hydro-climatic regimes in the continental United States with at least 100 years of records are used to investigate the presence of temporal trends and abrupt changes in mean and variance. Change point analysis is performed by means of two non-parametric (Pettitt and CUSUM), one semi-parametric (Guan), and two parametric (Rodionov and Bayesian Change Point) tests. Two non-parametric (Mann-Kendall and Spearman) and one parametric (Pearson) tests are applied to detect the presence of temporal trends. Generalized Additive Models for Location, Scale and Shape (GAMLSS) are also used to parametrically model the streamflow data, exploiting their flexibility to account for changes and temporal trends in the parameters of distribution functions. Additionally, serial correlation is assessed in advance by computing the autocorrelation function (ACF), and the Hurst parameter is estimated using two estimators (aggregated variance and differenced variance methods) to investigate the presence of long range dependence. The results of this study indicate lack of long range dependence in the maximum streamflow series. At some stations the authors found a statistically significant change point in the mean and/or variance, while in general they detected no statistically significant temporal trends.
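
    Of the tests listed, Pettitt's is the easiest to sketch: a rank-based statistic U_t is computed for every possible split of the series, and the maximum |U_t| locates the candidate change point. A minimal implementation using the common approximation p ≈ 2·exp(−6K²/(n³ + n²)) for the significance level:

```python
import math

def pettitt(series):
    """Pettitt's non-parametric change-point test.
    Returns (first index of the second segment, approximate p-value)."""
    n = len(series)
    def sign(v):
        return (v > 0) - (v < 0)
    u_stats = []
    for t in range(1, n):  # split into series[:t] and series[t:]
        u = sum(sign(series[j] - series[i])
                for i in range(t) for j in range(t, n))
        u_stats.append(abs(u))
    k = max(u_stats)
    t_star = u_stats.index(k) + 1
    p = 2.0 * math.exp(-6.0 * k * k / (n ** 3 + n ** 2))
    return t_star, min(p, 1.0)

# A series with an obvious shift in the mean after the tenth value:
data = [1.0] * 10 + [5.0] * 10
change_at, p_value = pettitt(data)
# change_at == 10, p_value well below 0.05
```

    The O(n²) double sum is fine for century-long annual series; the other tests in the abstract differ mainly in the statistic used to score each split.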

  2. One-point fluctuation analysis of the high-energy neutrino sky

    Energy Technology Data Exchange (ETDEWEB)

    Feyereisen, Michael R.; Ando, Shin'ichiro [GRAPPA Institute, University of Amsterdam, Science Park 904, 1098 XH Amsterdam (Netherlands); Tamborra, Irene, E-mail: m.r.feyereisen@uva.nl, E-mail: tamborra@nbi.ku.dk, E-mail: s.ando@uva.nl [Niels Bohr International Academy, Niels Bohr Institute, Blegdamsvej 17, 2100 Copenhagen (Denmark)

    2017-03-01

    We perform the first one-point fluctuation analysis of the high-energy neutrino sky. This method proves especially well suited to contemporary neutrino data, as it allows us to study the properties of the astrophysical components of the high-energy flux detected by the IceCube telescope, even with low statistics and in the absence of point source detection. Besides the veto-passing atmospheric foregrounds, we adopt a simple model of the high-energy neutrino background by assuming two main extra-galactic components: star-forming galaxies and blazars. By leveraging multi-wavelength data from Herschel and Fermi, we predict the spectral and anisotropic probability distributions for their expected neutrino counts in IceCube. We find that star-forming galaxies are likely to remain a diffuse background due to the poor angular resolution of IceCube, and we determine an upper limit on the number of shower events that can reasonably be associated to blazars. We also find that upper limits on the contribution of blazars to the measured flux are unfavourably affected by the skewness of the blazar flux distribution. One-point event clustering and likelihood analyses of the IceCube HESE data suggest that this method has the potential to dramatically improve over more conventional model-based analyses, especially for the next generation of neutrino telescopes.

  3. Electron-density critical points analysis and catastrophe theory to forecast structure instability in periodic solids.

    Science.gov (United States)

    Merli, Marcello; Pavese, Alessandro

    2018-03-01

    The critical points analysis of electron density, i.e. ρ(x), from ab initio calculations is used in combination with the catastrophe theory to show a correlation between ρ(x) topology and the appearance of instability that may lead to transformations of crystal structures, as a function of pressure/temperature. In particular, this study focuses on the evolution of coalescing non-degenerate critical points, i.e. such that ∇ρ(x_c) = 0 and λ₁, λ₂, λ₃ ≠ 0 [λ being the eigenvalues of the Hessian of ρ(x) at x_c], towards degenerate critical points, i.e. ∇ρ(x_c) = 0 and at least one λ equal to zero. The catastrophe theory formalism provides a mathematical tool to model ρ(x) in the neighbourhood of x_c and allows one to rationalize the occurrence of instability in terms of electron-density topology and Gibbs energy. The phase/state transitions that TiO₂ (rutile structure), MgO (periclase structure) and Al₂O₃ (corundum structure) undergo because of pressure and/or temperature are here discussed. An agreement of 3-5% is observed between the theoretical model and experimental pressure/temperature of transformation.
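
    The classification of a non-degenerate critical point by the signs of the Hessian eigenvalues can be illustrated numerically. A sketch using a model axis-aligned Gaussian density as a stand-in for the ab initio ρ(x) (our choice; because the model is axis-aligned, the numerical Hessian is diagonal and its diagonal entries are the eigenvalues):

```python
import math

def rho(x, y, z):
    """Model electron density: an axis-aligned Gaussian peak at the origin
    (a stand-in for the ab initio rho(x) discussed in the abstract)."""
    return math.exp(-(x * x + 2.0 * y * y + 3.0 * z * z))

def hessian_diag(f, p, h=1e-4):
    """Second partial derivatives f_xx, f_yy, f_zz by central differences.
    For an axis-aligned density these are exactly the Hessian eigenvalues."""
    x, y, z = p
    f0 = f(x, y, z)
    fxx = (f(x + h, y, z) - 2 * f0 + f(x - h, y, z)) / h ** 2
    fyy = (f(x, y + h, z) - 2 * f0 + f(x, y - h, z)) / h ** 2
    fzz = (f(x, y, z + h) - 2 * f0 + f(x, y, z - h)) / h ** 2
    return fxx, fyy, fzz

lams = hessian_diag(rho, (0.0, 0.0, 0.0))
signature = sum(-1 if l < 0 else 1 for l in lams)
# All three eigenvalues are negative at a maximum: a (3, -3) critical
# point in the usual (rank, signature) notation.
```

    A degenerate critical point, the trigger for the catastrophe-theory treatment, is the case where one of these eigenvalues passes through zero as pressure or temperature varies.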

  4. Archiving, sharing, processing and publishing historical earthquakes data: the IT point of view

    Science.gov (United States)

    Locati, Mario; Rovida, Andrea; Albini, Paola

    2014-05-01

    Digital tools devised for seismological data are mostly designed for handling instrumentally recorded data. Researchers working on historical seismology are forced to perform their daily job using general purpose tools and/or coding their own to address their specific tasks. The lack of out-of-the-box tools expressly conceived to deal with historical data leads to a huge amount of time lost in tedious tasks: searching for the data and manually reformatting it in order to move from one tool to the other, sometimes causing a loss of the original data. This reality is common to all activities related to the study of earthquakes of the past centuries, from the interpretation of historical sources to the compilation of earthquake catalogues. A platform able to preserve historical earthquake data, trace back their source, and fulfil many common tasks was very much needed. In the framework of two European projects (NERIES and SHARE) and one global project (Global Earthquake History, GEM), two new data portals were designed and implemented: the European "Archive of Historical Earthquakes Data" (AHEAD) and the worldwide "Global Historical Earthquake Archive" (GHEA), aimed at addressing at least some of the above mentioned issues. The availability of these new portals and their well-defined standards makes the development of side tools for archiving, publishing and processing the available historical earthquake data easier than before. The AHEAD and GHEA portals, their underlying technologies and the developed side tools are presented.

  5. Dissolution Dominating Calcification Process in Polar Pteropods Close to the Point of Aragonite Undersaturation

    Science.gov (United States)

    Bednaršek, Nina; Tarling, Geraint A.; Bakker, Dorothee C. E.; Fielding, Sophie; Feely, Richard A.

    2014-01-01

    Thecosome pteropods are abundant upper-ocean zooplankton that build aragonite shells. Ocean acidification results in the lowering of aragonite saturation levels in the surface layers, and several incubation studies have shown that rates of calcification in these organisms decrease as a result. This study provides a weight-specific net calcification rate function for thecosome pteropods that includes both rates of dissolution and calcification over a range of plausible future aragonite saturation states (Ωar). We measured gross dissolution in the pteropod Limacina helicina antarctica in the Scotia Sea (Southern Ocean) by incubating living specimens across a range of aragonite saturation states for a maximum of 14 days. Specimens started dissolving almost immediately upon exposure to undersaturated conditions (Ωar∼0.8), losing 1.4% of shell mass per day. The observed rate of gross dissolution was different from that predicted by rate law kinetics of aragonite dissolution, in being higher at Ωar levels slightly above 1 and lower at Ωar levels of between 1 and 0.8. This indicates that shell mass is affected by even transitional levels of saturation, but there is, nevertheless, some partial means of protection for shells when in undersaturated conditions. A function for gross dissolution against Ωar derived from the present observations was compared to a function for gross calcification derived by a different study, and showed that dissolution became the dominating process even at Ωar levels close to 1, with net shell growth ceasing at an Ωar of 1.03. Gross dissolution increasingly dominated net change in shell mass as saturation levels decreased below 1. As well as influencing their viability, such dissolution of pteropod shells in the surface layers will result in slower sinking velocities and decreased carbon and carbonate fluxes to the deep ocean. PMID:25285916

  6. The effect of post-processing treatments on inflection points in current–voltage curves of roll-to-roll processed polymer photovoltaics

    DEFF Research Database (Denmark)

    Lilliedal, Mathilde Raad; Medford, Andrew James; Vesterager Madsen, Morten

    2010-01-01

    Inflection point behaviour is often observed in the current–voltage (IV) curve of polymer solar cells. This phenomenon is examined in the context of flexible roll-to-roll (R2R) processed polymer solar cells in a large series of devices with a layer structure of PET–ITO–ZnO–P3HT:PCBM–PEDOT:PSS–Ag. The devices were manufactured using a combination of slot-die coating and screen printing; they were then encapsulated by lamination using a polymer based barrier material. All manufacturing steps were carried out in ambient air. The freshly prepared devices showed a consistent inflection point in the IV curve. Characterization of device interfaces was carried out in order to identify possible chemical processes that are related to photo-annealing. A possible mechanism based on ZnO photoconductivity, photooxidation and redistribution of oxygen inside the cell is proposed, and it is anticipated that the findings …
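
    Flagging an inflection point in a measured IV sweep amounts to finding a sign change in the curvature. A minimal numeric sketch on a synthetic S-shaped curve (illustration only; real device data would need smoothing before differencing):

```python
def inflection_indices(voltages, currents):
    """Return the index of the first sample after each curvature sign
    change in an IV sweep (uniform voltage spacing assumed)."""
    # Discrete second derivative; d2[j] corresponds to sample j + 1.
    d2 = [currents[i - 1] - 2 * currents[i] + currents[i + 1]
          for i in range(1, len(currents) - 1)]
    return [i + 1 for i in range(1, len(d2)) if d2[i - 1] * d2[i] < 0]

# Synthetic S-shaped IV curve: a cubic with its inflection at V = 0.42,
# deliberately placed between sample points.
vs = [i * 0.05 for i in range(17)]        # 0.00 .. 0.80 V
iv = [(v - 0.42) ** 3 for v in vs]
kinks = inflection_indices(vs, iv)
# kinks == [9]: curvature changes sign between samples 8 (0.40 V) and 9 (0.45 V)
```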

  7. Flame analysis using image processing techniques

    Science.gov (United States)

    Her Jie, Albert Chang; Zamli, Ahmad Faizal Ahmad; Zulazlan Shah Zulkifli, Ahmad; Yee, Joanne Lim Mun; Lim, Mooktzeng

    2018-04-01

    This paper presents image processing techniques with the use of fuzzy logic and a neural network approach to perform flame analysis. Flame diagnostics are important in industry for extracting relevant information from flame images. Experiments were carried out in a model industrial burner with different flow rates. Flame features such as luminous and spectral parameters are extracted using image processing and the Fast Fourier Transform (FFT). Flame images are acquired using a FLIR infrared camera. Non-linearities such as thermo-acoustic oscillations and background noise affect the stability of the flame. Flame velocity is one of the important characteristics that determine flame stability. In this paper, an image processing method is proposed to determine flame velocity. A power spectral density (PSD) graph is a good tool for vibration analysis, from which flame stability can be approximated. However, a more intelligent diagnostic system is needed to determine flame stability automatically. In this paper, flame features at different flow rates are compared and analyzed. The selected flame features are used as inputs to the proposed fuzzy inference system to determine flame stability. A neural network is used to test the performance of the fuzzy inference system.
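
    The PSD step can be sketched with a naive DFT periodogram that locates a dominant flicker frequency in a synthetic signal (our toy example, not the paper's pipeline; production code would use an FFT library and windowing):

```python
import cmath
import math

def periodogram(signal):
    """Naive one-sided DFT power spectral density (bins 0 .. N//2).
    Sufficient to locate a dominant oscillation in a short record."""
    n = len(signal)
    psd = []
    for k in range(n // 2 + 1):
        x = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
        psd.append(abs(x) ** 2 / n)
    return psd

# Synthetic "flame flicker": a 12 Hz oscillation plus a weaker 31 Hz
# component, sampled at 128 Hz for 1 s, so bin k corresponds to k Hz.
fs, n = 128, 128
sig = [math.sin(2 * math.pi * 12 * t / fs)
       + 0.3 * math.sin(2 * math.pi * 31 * t / fs) for t in range(n)]
psd = periodogram(sig)
dominant_hz = psd.index(max(psd)) * fs / n
# dominant_hz == 12.0
```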

  8. Spectral analysis of growing graphs a quantum probability point of view

    CERN Document Server

    Obata, Nobuaki

    2017-01-01

    This book is designed as a concise introduction to the recent achievements on spectral analysis of graphs or networks from the point of view of quantum (or non-commutative) probability theory. The main topics are spectral distributions of the adjacency matrices of finite or infinite graphs and their limit distributions for growing graphs. The main vehicle is quantum probability, an algebraic extension of the traditional probability theory, which provides a new framework for the analysis of adjacency matrices revealing their non-commutative nature. For example, the method of quantum decomposition makes it possible to study spectral distributions by means of interacting Fock spaces or equivalently by orthogonal polynomials. Various concepts of independence in quantum probability and corresponding central limit theorems are used for the asymptotic study of spectral distributions for product graphs. This book is written for researchers, teachers, and students interested in graph spectra, their (asymptotic) spectr...

  9. Preliminary Hazards Analysis Plasma Hearth Process

    International Nuclear Information System (INIS)

    Aycock, M.; Coordes, D.; Russell, J.; TenBrook, W.; Yimbo, P.

    1993-11-01

    This Preliminary Hazards Analysis (PHA) for the Plasma Hearth Process (PHP) follows the requirements of United States Department of Energy (DOE) Order 5480.23 (DOE, 1992a), DOE Order 5480.21 (DOE, 1991d), DOE Order 5480.22 (DOE, 1992c), DOE Order 5481.1B (DOE, 1986), and the guidance provided in DOE Standards DOE-STD-1027-92 (DOE, 1992b). Consideration is given to the proposed regulations published as 10 CFR 830 (DOE, 1993) and DOE Safety Guide SG 830.110 (DOE, 1992b). The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during, and provides input to, project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title I and II design. The PSAR then leads to the Final Safety Analysis Report, performed during construction, testing, and acceptance and completed before routine operation. Radiological assessments indicate that a PHP facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous material assessments indicate that a PHP facility will be a Low Hazard facility having no significant impacts either onsite or offsite to personnel and the environment

  10. Validation of capillary blood analysis and capillary testing mode on the epoc Point of Care system.

    Science.gov (United States)

    Cao, Jing; Edwards, Rachel; Chairez, Janette; Devaraj, Sridevi

    2017-12-01

    Laboratory testing in transport is a critical component of patient care, and capillary blood is a preferred sample type, particularly in children. This study evaluated the performance of capillary blood testing on the epoc Point of Care Blood Analysis System (Alere Inc). Ten fresh venous blood samples were tested on the epoc system under the capillary mode. Correlation with the GEM 4000 (Instrumentation Laboratory) was examined for Na+, K+, Cl-, Ca2+, glucose, lactate, hematocrit, hemoglobin, pO2, pCO2, and pH, and correlation with serum tested on the Vitros 5600 (Ortho Clinical Diagnostics) was examined for creatinine. Eight paired capillary and venous blood samples were tested on the epoc and ABL800 (Radiometer) for the correlation of Na+, K+, Cl-, Ca2+, glucose, lactate, hematocrit, hemoglobin, pCO2, and pH. Capillary blood from 23 apparently healthy volunteers was tested on the epoc system to assess concordance with the reference ranges used locally. Deming regression correlation coefficients for all the comparisons were above 0.65 except for ionized Ca2+. Concordance of greater than 85% with the local reference ranges was found in all assays with the exception of pO2 and Cl-. Data from this study indicate that capillary blood tests on the epoc system provide results comparable to reference methods for the assays Na+, K+, glucose, lactate, hematocrit, hemoglobin, pCO2, and pH. Further validation in critically ill patients is needed before implementing the epoc system in patient transport. This study demonstrated that capillary blood tests on the epoc Point of Care Blood Analysis System give results comparable to other chemistry analyzers for major blood gas and critical tests. The results are informative for institutions where pre-hospital and inter-hospital laboratory testing on capillary blood is a critical component of patient point of care testing.
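
    Deming regression, used for the method comparisons above because both measurements carry error, has a closed form. A sketch with hypothetical paired capillary/venous values (not data from the study; delta is the assumed ratio of the two error variances, 1.0 for orthogonal regression):

```python
from statistics import mean

def deming(xs, ys, delta=1.0):
    """Deming regression for method comparison (errors in both variables).
    Returns (slope, intercept)."""
    mx, my = mean(xs), mean(ys)
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = ((syy - delta * sxx
              + ((syy - delta * sxx) ** 2 + 4 * delta * sxy ** 2) ** 0.5)
             / (2 * sxy))
    return slope, my - slope * mx

# Hypothetical paired capillary (x) vs venous (y) glucose values, mmol/L.
cap = [4.1, 5.0, 5.6, 6.2, 7.4, 8.8]
ven = [4.0, 5.1, 5.5, 6.4, 7.2, 8.9]
slope, intercept = deming(cap, ven)
# A slope near 1 and an intercept near 0 indicate agreement between methods.
```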

  11. AUTOMATED VOXEL MODEL FROM POINT CLOUDS FOR STRUCTURAL ANALYSIS OF CULTURAL HERITAGE

    Directory of Open Access Journals (Sweden)

    G. Bitelli

    2016-06-01

    Full Text Available In the context of cultural heritage, an accurate and comprehensive digital survey of a historical building is today essential in order to measure its geometry in detail for documentation or restoration purposes, for supporting special studies regarding materials and constructive characteristics, and finally for structural analysis. Some proven geomatic techniques, such as photogrammetry and terrestrial laser scanning, are increasingly used to survey buildings of different complexity and dimensions; one typical product is a point cloud. We developed a semi-automatic procedure to convert point clouds, acquired by laser scanning or digital photogrammetry, to a filled volume model of the whole structure. The filled volume model, in a voxel format, can be useful for further analysis and also for the generation of a Finite Element Model (FEM) of the surveyed building. In this paper a new approach is presented with the aim of decreasing operator intervention in the workflow and obtaining a better description of the structure. In order to achieve this result a voxel model with variable resolution is produced. Different parameters are compared and different steps of the procedure are tested and validated in the case study of the North tower of the San Felice sul Panaro Fortress, a monumental historical building located in San Felice sul Panaro (Modena, Italy) that was hit by an earthquake in 2012.
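
    The core of the point-cloud-to-voxel conversion can be sketched as binning points into a regular grid (a minimal occupied-cell version, our simplification; the procedure described above additionally fills the interior volume and varies the voxel resolution):

```python
def voxelize(points, voxel_size):
    """Map a 3-D point cloud to the set of occupied voxel indices.
    A minimal stand-in for the filled-volume model in the abstract."""
    xs, ys, zs = zip(*points)
    origin = (min(xs), min(ys), min(zs))
    filled = set()
    for p in points:
        idx = tuple(int((c - o) // voxel_size) for c, o in zip(p, origin))
        filled.add(idx)
    return origin, filled

# A tiny synthetic cloud: the first two points fall in the same 0.5 m cell.
cloud = [(0.1, 0.1, 0.1), (0.12, 0.11, 0.13),
         (1.4, 0.2, 0.1), (0.1, 2.3, 3.4)]
origin, voxels = voxelize(cloud, voxel_size=0.5)
# 3 occupied cells, since the first two points share a voxel.
```

    A FEM mesh can then be generated directly from the voxel indices, which is why the voxel format is a convenient intermediate between survey data and structural analysis.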

  12. Bayesian change-point analysis reveals developmental change in a classic theory of mind task.

    Science.gov (United States)

    Baker, Sara T; Leslie, Alan M; Gallistel, C R; Hood, Bruce M

    2016-12-01

    Although learning and development reflect changes situated in an individual brain, most discussions of behavioral change are based on the evidence of group averages. Our reliance on group-averaged data creates a dilemma. On the one hand, we need to use traditional inferential statistics. On the other hand, group averages are highly ambiguous when we need to understand change in the individual; the average pattern of change may characterize all, some, or none of the individuals in the group. Here we present a new method for statistically characterizing developmental change in each individual child we study. Using false-belief tasks, fifty-two children in two cohorts were repeatedly tested for varying lengths of time between 3 and 5 years of age. Using a novel Bayesian change point analysis, we determined both the presence and, just as importantly, the absence of change in individual longitudinal cumulative records. Whenever the analysis supports a change conclusion, it identifies in that child's record the most likely point at which change occurred. Results show striking variability in patterns of change and stability across individual children. We then group the individuals by their various patterns of change or no change. The resulting patterns provide scarce support for sudden changes in competence and shed new light on the concepts of "passing" and "failing" in developmental studies. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
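
    The idea of scoring every possible change point by its likelihood can be sketched for a simple Bernoulli pass/fail record (a much simpler model than the authors', our assumptions: at most one change, and uniform Beta(1,1) priors on the success rates before and after it):

```python
import math

def change_point_posterior(record):
    """Posterior over the change index c for a 0/1 success record,
    scoring every split by its Beta-binomial marginal likelihood."""
    def log_marglik(seg):
        s, n = sum(seg), len(seg)
        # Integral of p^s (1-p)^(n-s) over a uniform prior on p:
        # B(s + 1, n - s + 1), written with log-gamma for stability.
        return (math.lgamma(s + 1) + math.lgamma(n - s + 1)
                - math.lgamma(n + 2))
    logs = [log_marglik(record[:c]) + log_marglik(record[c:])
            for c in range(1, len(record))]
    m = max(logs)
    weights = [math.exp(v - m) for v in logs]
    total = sum(weights)
    return [w / total for w in weights]   # posterior over c = 1 .. len-1

# A child failing a false-belief task 8 times, then passing 8 times:
record = [0] * 8 + [1] * 8
post = change_point_posterior(record)
map_change = post.index(max(post)) + 1
# map_change == 8: the most probable change sits at the first pass.
```

    Because the posterior is computed over every split, "no change" shows up as a flat posterior, which is how the absence of change can be asserted rather than merely not detected.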

  13. Spectral analysis of point-vortex dynamics: first application to vortex polygons in a circular domain

    International Nuclear Information System (INIS)

    Speetjens, M F M; Meleshko, V V; Van Heijst, G J F

    2014-01-01

    The present study addresses the classical problem of the dynamics and stability of a cluster of N point vortices of equal strength arranged in a polygonal configuration (‘N-vortex polygons’). In unbounded domains, such N-vortex polygons are unconditionally stable for N⩽7. Confinement in a circular domain tightens the stability conditions to N⩽6 and a maximum polygon size relative to the domain radius. This work expands on existing studies of stability and integrability by first giving an exploratory spectral analysis of the dynamics of N-vortex polygons in circular domains. Key to this is that the spectral signature of the time evolution of the vortex positions reflects their qualitative behaviour. Expressing vortex motion by a generic evolution operator (the so-called Koopman operator) provides a rigorous framework for such spectral analyses. This paves the way to further differentiation and classification of point-vortex behaviour beyond stability and integrability. The concept of Koopman-based spectral analysis is demonstrated for N-vortex polygons. This reveals that conditional stability can be seen as a local form of integrability and confirms an important generic link between spectrum and dynamics: discrete spectra imply regular (quasi-periodic) motion; continuous (sub-)spectra imply chaotic motion. Moreover, this exposes rich nonlinear dynamics such as intermittency between regular and chaotic motion and quasi-coherent structures formed by chaotic vortices.
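The generic spectrum-dynamics link stated above (discrete spectra for regular motion, continuous spectra for chaotic motion) can be illustrated with an ordinary FFT of a coordinate time series. This is a toy surrogate, not the Koopman-operator machinery of the paper: a quasi-periodic signal stands in for regular vortex motion, and seeded white noise for broadband chaotic motion.

```python
import numpy as np

def spectral_concentration(signal, k=5):
    """Fraction of total spectral power held by the k largest frequency
    bins: close to 1 for (quasi-)periodic motion, small for broadband
    (chaotic-like) signals."""
    power = np.abs(np.fft.rfft(signal - np.mean(signal))) ** 2
    power = np.sort(power)[::-1]
    return power[:k].sum() / power.sum()

rng = np.random.default_rng(0)
t = np.linspace(0, 100, 4096, endpoint=False)
regular = np.sin(2 * np.pi * 0.7 * t) + 0.5 * np.sin(2 * np.pi * 1.3 * t)  # two tones
broadband = rng.standard_normal(t.size)  # surrogate for chaotic motion

print(spectral_concentration(regular) > 0.9)    # discrete spectrum
print(spectral_concentration(broadband) < 0.2)  # continuous spectrum
```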

  14. Area, and Power Performance Analysis of a Floating-Point Based Application on FPGAs

    National Research Council Canada - National Science Library

    Govindu, Gokul

    2003-01-01

    .... However, the inevitable quantization effects and the complexity of converting a floating-point algorithm into a fixed-point one limit the use of fixed-point arithmetic for high-precision embedded computing...
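The quantization effect mentioned in this (truncated) abstract can be illustrated with a short sketch: rounding a value to a fixed number of fractional bits introduces an error that grows as the word length shrinks. The helper name and values below are our own.

```python
def to_fixed(x, frac_bits):
    """Quantize x to a fixed-point value with frac_bits fractional bits."""
    scale = 1 << frac_bits
    return round(x * scale) / scale

# Quantization error grows as the fractional word length shrinks.
x = 0.1234567
for bits in (24, 16, 8):
    print(bits, abs(x - to_fixed(x, bits)))
```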

  15. On the diffusion process of irradiation-induced point defects in the stress field of a moving dislocation

    International Nuclear Information System (INIS)

    Steinbach, E.

    1987-01-01

    The cellular model of a dislocation is used to investigate the time-dependent diffusion process of irradiation-induced point defects interacting with the stress field of a moving dislocation. An analytic solution is given taking into account the elastic interaction due to the first-order size effect and the stress-induced interaction, the kinematic interaction due to the dislocation motion, as well as the presence of secondary neutral sinks. The results for the space- and time-dependent point defect concentration, represented in terms of Mathieu-Bessel and Mathieu-Hankel functions, emphasize the influence of the parameters taken into consideration. Proceeding from these solutions, formulae are derived for the diffusion flux reaching unit length of the dislocation, which plays an important role with regard to void swelling and irradiation-induced creep.

  16. Processing Cost Analysis for Biomass Feedstocks

    Energy Technology Data Exchange (ETDEWEB)

    Badger, P.C.

    2002-11-20

    The receiving, handling, storing, and processing of woody biomass feedstocks is an overlooked component of biopower systems. The purpose of this study was twofold: (1) to identify and characterize all the receiving, handling, storing, and processing steps required to make woody biomass feedstocks suitable for use in direct combustion and gasification applications, including small modular biopower (SMB) systems, and (2) to estimate the capital and operating costs at each step. Since biopower applications can be varied, a number of conversion systems and feedstocks required evaluation. In addition to limiting this study to woody biomass feedstocks, the boundaries of this study were from the power plant gate to the feedstock entry point into the conversion device. Although some power plants are sited at a source of wood waste fuel, it was assumed for this study that all wood waste would be brought to the power plant site. This study was also confined to the following three feedstocks: (1) forest residues, (2) industrial mill residues, and (3) urban wood residues. Additionally, the study was confined to grate, suspension, and fluidized bed direct combustion systems; gasification systems; and SMB conversion systems. Since scale can play an important role in types of equipment, operational requirements, and capital and operational costs, this study examined these factors for the following direct combustion and gasification system size ranges: 50, 20, 5, and 1 MWe. The scope of the study also included: specific operational issues associated with specific feedstocks (e.g., bark and problems with bridging); opportunities for reducing handling, storage, and processing costs; how environmental restrictions can affect handling and processing costs (e.g., noise, commingling of treated wood or non-wood materials, emissions, and runoff); and feedstock quality issues and/or requirements (e.g., moisture, particle size, presence of non-wood materials).
The study found that over the

  17. POST-PROCESSING ANALYSIS FOR THC SEEPAGE

    International Nuclear Information System (INIS)

    SUN, Y.

    2004-01-01

    This report describes the selection of water compositions for the total system performance assessment (TSPA) model of results from the thermal-hydrological-chemical (THC) seepage model documented in "Drift-Scale THC Seepage Model" (BSC 2004 [DIRS 169856]). The selection has been conducted in accordance with "Technical Work Plan for: Near-Field Environment and Transport: Coupled Processes (Mountain-Scale TH/THC/THM, Drift-Scale THC Seepage, and Post-Processing Analysis for THC Seepage) Report Integration" (BSC 2004 [DIRS 171334]). This technical work plan (TWP) was prepared in accordance with AP-2.27Q, "Planning for Science Activities". Section 1.2.3 of the TWP describes planning information pertaining to the technical scope, content, and management of this report. The post-processing analysis for THC seepage (THC-PPA) documented in this report provides a methodology for evaluating the near-field compositions of water and gas around a typical waste emplacement drift as these relate to the chemistry of seepage, if any, into the drift. The THC-PPA inherits the conceptual basis of the THC seepage model, but is an independently developed process. The relationship between the post-processing analysis and other closely related models, together with their main functions in providing seepage chemistry information for the Total System Performance Assessment for the License Application (TSPA-LA), are illustrated in Figure 1-1. The THC-PPA provides a data selection concept and direct input to the physical and chemical environment (P and CE) report that supports the TSPA model. The purpose of the THC-PPA is further discussed in Section 1.2. The data selection methodology of the post-processing analysis (Section 6.2.1) was initially applied to results of the THC seepage model as presented in "Drift-Scale THC Seepage Model" (BSC 2004 [DIRS 169856]). 
Other outputs from the THC seepage model (DTN: LB0302DSCPTHCS.002 [DIRS 161976]) used in the P and CE (BSC 2004 [DIRS 169860

  18. The signer and the sign: cortical correlates of person identity and language processing from point-light displays.

    Science.gov (United States)

    Campbell, Ruth; Capek, Cheryl M; Gazarian, Karine; MacSweeney, Mairéad; Woll, Bencie; David, Anthony S; McGuire, Philip K; Brammer, Michael J

    2011-09-01

    In this study, the first to explore the cortical correlates of signed language (SL) processing under point-light display conditions, the observer identified either a signer or a lexical sign from a display in which different signers were seen producing a number of different individual signs. Many of the regions activated by point-light displays under these conditions replicated those previously reported for full-image displays, including regions within the inferior temporal cortex that are specialised for face and body-part identification, although such body parts were invisible in the display. Right frontal regions were also recruited, a pattern not usually seen in full-image SL processing. This activation may reflect the recruitment of information about person identity from the reduced display. A direct comparison of the identify-signer and identify-sign conditions showed that these tasks relied to different extents on the posterior inferior regions. Signer identification elicited greater activation than sign identification in (bilateral) inferior temporal gyri (BA 37/19), fusiform gyri (BA 37), middle and posterior portions of the middle temporal gyri (BAs 37 and 19), and superior temporal gyri (BA 22 and 42). Right inferior frontal cortex was a further focus of differential activation (signer>sign). These findings suggest that the neural systems supporting point-light displays for the processing of SL rely on a cortical network including areas of the inferior temporal cortex specialized for face and body identification. While this might be predicted from other studies of whole-body point-light actions (Vaina, Solomon, Chowdhury, Sinha, & Belliveau, 2001), it is not predicted from the perspective of spoken language processing, where voice characteristics and speech content recruit distinct cortical regions (Stevens, 2004) in addition to a common network. In this respect, our findings contrast with studies of voice/speech recognition (Von Kriegstein, Kleinschmidt, Sterzer

  19. Sustainable process design & analysis of hybrid separations

    DEFF Research Database (Denmark)

    Kumar Tula, Anjan; Befort, Bridgette; Garg, Nipun

    2016-01-01

    Distillation is an energy intensive operation in chemical process industries. There are around 40,000 distillation columns in operation in the US, requiring approximately 40% of the total energy consumption in US chemical process industries. However, analysis of separations by distillation has...... shown that more than 50% of energy is spent in purifying the last 5-10% of the distillate product. Membrane modules on the other hand can achieve high purity separations at lower energy costs, but if the flux is high, it requires large membrane area. A hybrid scheme where distillation and membrane...... modules are combined such that each operates at its highest efficiency, has the potential for significant energy reduction without significant increase of capital costs. This paper presents a method for sustainable design of hybrid distillation-membrane schemes with guaranteed reduction of energy...

  20. Warpage analysis in injection moulding process

    Science.gov (United States)

    Hidayah, M. H. N.; Shayfull, Z.; Nasir, S. M.; Fathullah, M.; Hazwan, M. H. M.

    2017-09-01

    This study concentrated on the effects of process parameters in the plastic injection moulding process on the warpage problem, using Autodesk Moldflow Insight (AMI) software for the simulation. A plastic dental-floss dispenser was analysed, with the thermoplastic polypropylene (PP) as the moulded material; the detailed properties of an 80-tonne Nessei NEX 1000 injection moulding machine were also used in this study. The variable parameters of the process are packing pressure, packing time, melt temperature and cooling time. Warpage was minimized using optimization and analysis data from the Design Expert software. The method used in this study integrates Response Surface Methodology (RSM) and Central Composite Design (CCD) with polynomial models obtained from the Design of Experiments (DOE). The results show that packing pressure is the main factor contributing to the formation of warpage in the x-axis and y-axis, while in the z-axis the main factor is melt temperature; packing time is the least significant of the four parameters in the x-, y- and z-axes. At the optimal processing parameters, the warpage in the x-, y- and z-axes was reduced by 21.60%, 26.45% and 24.53%, respectively.
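The RSM step, fitting a second-order polynomial to DOE data and locating its stationary point, can be sketched for a single factor. The packing-pressure/warpage numbers below are invented for illustration and do not come from the paper.

```python
import numpy as np

# Hypothetical DOE data: packing pressure (MPa) vs. measured warpage (mm).
pressure = np.array([60.0, 70.0, 80.0, 90.0, 100.0])
warpage = np.array([0.42, 0.35, 0.31, 0.33, 0.40])

# Second-order response surface in one factor: w = c2*p**2 + c1*p + c0.
c2, c1, c0 = np.polyfit(pressure, warpage, deg=2)
p_opt = -c1 / (2 * c2)                    # stationary point of the quadratic
w_opt = np.polyval([c2, c1, c0], p_opt)   # predicted minimum warpage
print(round(p_opt, 1), round(w_opt, 3))
```

In the paper's full method the surface is fitted jointly over four factors (packing pressure, packing time, melt temperature, cooling time) from a CCD design, but the fit-then-minimize logic is the same.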