WorldWideScience

Sample records for based model-free analysis

  1. Comparing model-based and model-free analysis methods for QUASAR arterial spin labeling perfusion quantification.

    Science.gov (United States)

    Chappell, Michael A; Woolrich, Mark W; Petersen, Esben T; Golay, Xavier; Payne, Stephen J

    2013-05-01

    Amongst the various implementations of arterial spin labeling MRI methods for quantifying cerebral perfusion, the QUASAR method is unique. By using a combination of labeling with and without flow suppression gradients, the QUASAR method offers the separation of macrovascular and tissue signals. This permits local arterial input functions to be defined and "model-free" analysis, using numerical deconvolution, to be used. However, it remains unclear whether arterial spin labeling data are best treated using model-free or model-based analysis. This work provides a critical comparison of these two approaches for QUASAR arterial spin labeling in the healthy brain. An existing two-component (arterial and tissue) model was extended to the mixed flow suppression scheme of QUASAR to provide an optimal model-based analysis. The model-based analysis was extended to incorporate dispersion of the labeled bolus, generally regarded as the major source of discrepancy between the two analysis approaches. Model-free and model-based analyses were compared for perfusion quantification including absolute measurements, uncertainty estimation, and spatial variation in cerebral blood flow estimates. Major sources of discrepancies between model-free and model-based analysis were attributed to the effects of dispersion and the degree to which the two methods can separate macrovascular and tissue signal. Copyright © 2012 Wiley Periodicals, Inc.
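
    The "model-free" route described above amounts to a numerical deconvolution of the tissue signal by the locally defined arterial input function. The sketch below illustrates one common way to do this (truncated-SVD regularised deconvolution); the threshold, synthetic curves and function names are illustrative assumptions, not the paper's implementation.

      import numpy as np

      def modelfree_deconvolve(tissue, aif, dt, svd_thresh=0.2):
          """Model-free perfusion estimate via regularised numerical deconvolution.

          tissue : measured labelled-blood concentration curve in a voxel
          aif    : locally defined arterial input function
          dt     : sampling interval (s)
          The tissue curve is treated as the convolution of the AIF with a
          flow-scaled residue function; deconvolution recovers that product.
          """
          n = len(aif)
          # convolution (lower-triangular Toeplitz) matrix built from the AIF
          A = dt * np.array([[aif[i - j] if i >= j else 0.0 for j in range(n)]
                             for i in range(n)])
          # truncated-SVD inversion stabilises the ill-posed deconvolution
          U, s, Vt = np.linalg.svd(A)
          s_inv = np.where(s > svd_thresh * s.max(), 1.0 / s, 0.0)
          flow_residue = Vt.T @ (s_inv * (U.T @ tissue))
          return flow_residue.max(), flow_residue   # peak = flow estimate

      # toy usage with synthetic curves
      t = np.arange(0, 4, 0.1)
      aif = np.exp(-((t - 1.0) / 0.3) ** 2)
      tissue = 0.01 * 0.1 * np.convolve(aif, np.exp(-t / 1.5))[:len(t)]
      print(modelfree_deconvolve(tissue, aif, dt=0.1)[0])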

  2. Evaluation-Function-based Model-free Adaptive Fuzzy Control

    Directory of Open Access Journals (Sweden)

    Agus Naba

    2016-12-01

    Designs of adaptive fuzzy controllers (AFC) are commonly based on the Lyapunov approach, which requires a known model of the controlled plant, and they must adopt a Lyapunov function candidate as the evaluation function to be minimized. In this study these drawbacks were addressed by designing a model-free adaptive fuzzy controller (MFAFC) that uses an approximate evaluation function defined in terms of the current state, the next state, and the control action. MFAFC treats the approximate evaluation function as an evaluative measure of control performance, similar to the state-action value function in reinforcement learning. Simulation results of applying MFAFC to the inverted pendulum benchmark verified the proposed scheme's efficacy.

  3. A novel model-free data analysis technique based on clustering in a mutual information space: application to resting-state fMRI

    Directory of Open Access Journals (Sweden)

    Simon Benjaminsson

    2010-08-01

    Non-parametric data-driven analysis techniques can be used to study datasets with few assumptions about the data and underlying experiment. Variations of Independent Component Analysis (ICA) have been the methods most used on fMRI data, e.g. in finding resting-state networks thought to reflect the connectivity of the brain. Here we present a novel data analysis technique and demonstrate it on resting-state fMRI data. It is a generic method with few underlying assumptions about the data. The results are built from the statistical relations between all input voxels, resulting in a whole-brain analysis on a voxel level. It has good scalability properties and the parallel implementation is capable of handling large datasets and databases. From the mutual information between the activities of the voxels over time, a distance matrix is created for all voxels in the input space. Multidimensional scaling is used to put the voxels in a lower-dimensional space reflecting the dependency relations based on the distance matrix. By performing clustering in this space we can find the strong statistical regularities in the data, which for the resting-state data turn out to be the resting-state networks. The decomposition is performed in the last step of the algorithm and is computationally simple. This opens up rapid analysis and visualization of the data on different spatial levels, as well as automatically finding a suitable number of decomposition components.
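
    The pipeline just described (pairwise mutual information between voxel time courses, a distance matrix, a multidimensional-scaling embedding, then clustering) can be sketched as below. The discretisation, the MI-to-distance mapping and the cluster count are illustrative assumptions, not the authors' exact choices.

      import numpy as np
      from sklearn.metrics import mutual_info_score
      from sklearn.manifold import MDS
      from sklearn.cluster import KMeans

      def mi_distance_matrix(X, bins=16):
          """Pairwise distances from mutual information between time courses.
          X: (n_voxels, n_timepoints). Distance = 1/(1+MI) is an assumed mapping."""
          n = X.shape[0]
          digitised = [np.digitize(x, np.histogram(x, bins)[1][:-1]) for x in X]
          D = np.zeros((n, n))
          for i in range(n):
              for j in range(i + 1, n):
                  mi = mutual_info_score(digitised[i], digitised[j])
                  D[i, j] = D[j, i] = 1.0 / (1.0 + mi)   # higher MI -> closer
          return D

      def cluster_in_mi_space(X, n_components=3, n_clusters=5):
          D = mi_distance_matrix(X)
          # embed voxels in a low-dimensional space respecting the MI distances
          coords = MDS(n_components=n_components, dissimilarity="precomputed",
                       random_state=0).fit_transform(D)
          return KMeans(n_clusters=n_clusters, n_init=10,
                        random_state=0).fit_predict(coords)

      # toy usage: 50 synthetic "voxels" with 200 time points each
      X = np.random.default_rng(0).standard_normal((50, 200))
      print(cluster_in_mi_space(X)[:10])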

  4. Model-free and model-based reward prediction errors in EEG.

    Science.gov (United States)

    Sambrook, Thomas D; Hardwick, Ben; Wills, Andy J; Goslin, Jeremy

    2018-05-24

    Learning theorists posit two reinforcement learning systems: model-free and model-based. Model-based learning incorporates knowledge about structure and contingencies in the world to assign candidate actions with an expected value. Model-free learning is ignorant of the world's structure; instead, actions hold a value based on prior reinforcement, with this value updated by expectancy violation in the form of a reward prediction error. Because they use such different learning mechanisms, it has been previously assumed that model-based and model-free learning are computationally dissociated in the brain. However, recent fMRI evidence suggests that the brain may compute reward prediction errors to both model-free and model-based estimates of value, signalling the possibility that these systems interact. Because of its poor temporal resolution, fMRI risks confounding reward prediction errors with other feedback-related neural activity. In the present study, EEG was used to show the presence of both model-based and model-free reward prediction errors and their place in a temporal sequence of events including state prediction errors and action value updates. This demonstration of model-based prediction errors questions a long-held assumption that model-free and model-based learning are dissociated in the brain. Copyright © 2018 Elsevier Inc. All rights reserved.
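
    As a point of reference, the model-free reward prediction error discussed here is the temporal-difference error of cached values, whereas a model-based system evaluates outcomes prospectively through a learned world model. The toy sketch below contrasts the two computations; the transition matrix, rewards and learning rates are illustrative, not taken from the study.

      import numpy as np

      gamma, alpha = 0.95, 0.1
      V_mf = np.zeros(3)                      # model-free cached state values
      T = np.array([[0.0, 0.7, 0.3],          # assumed known transition model
                    [0.0, 0.0, 1.0],
                    [0.0, 0.0, 0.0]])
      R = np.array([0.0, 1.0, 0.0])           # expected immediate rewards

      def mf_update(s, r, s_next):
          """Model-free: value changed only by the reward prediction error."""
          delta = r + gamma * V_mf[s_next] - V_mf[s]   # reward prediction error
          V_mf[s] += alpha * delta
          return delta

      def mb_value(s, depth=3):
          """Model-based: value computed prospectively from the world model."""
          if depth == 0:
              return R[s]
          return R[s] + gamma * sum(T[s, s2] * mb_value(s2, depth - 1)
                                    for s2 in range(3))

      print(mf_update(0, r=1.0, s_next=1), mb_value(0))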

  5. Model-based and model-free Pavlovian reward learning: revaluation, revision, and revelation.

    Science.gov (United States)

    Dayan, Peter; Berridge, Kent C

    2014-06-01

    Evidence supports at least two methods for learning about reward and punishment and making predictions for guiding actions. One method, called model-free, progressively acquires cached estimates of the long-run values of circumstances and actions from retrospective experience. The other method, called model-based, uses representations of the environment, expectations, and prospective calculations to make cognitive predictions of future value. Extensive attention has been paid to both methods in computational analyses of instrumental learning. By contrast, although a full computational analysis has been lacking, Pavlovian learning and prediction has typically been presumed to be solely model-free. Here, we revise that presumption and review compelling evidence from Pavlovian revaluation experiments showing that Pavlovian predictions can involve their own form of model-based evaluation. In model-based Pavlovian evaluation, prevailing states of the body and brain influence value computations, and thereby produce powerful incentive motivations that can sometimes be quite new. We consider the consequences of this revised Pavlovian view for the computational landscape of prediction, response, and choice. We also revisit differences between Pavlovian and instrumental learning in the control of incentive motivation.

  6. Model-free stochastic processes studied with q-wavelet-based informational tools

    International Nuclear Information System (INIS)

    Perez, D.G.; Zunino, L.; Martin, M.T.; Garavaglia, M.; Plastino, A.; Rosso, O.A.

    2007-01-01

    We undertake a model-free investigation of stochastic processes employing q-wavelet-based quantifiers, which constitute a generalization of their Shannon counterparts. It is shown that (i) interesting physical information becomes accessible in such a way, (ii) for special q values the quantifiers are more sensitive than the Shannon ones, and (iii) there exists an implicit relationship between the Hurst parameter H and q within this wavelet framework.

  7. Pitch control of wind turbines using model free adaptive control based on wind turbine code

    DEFF Research Database (Denmark)

    Zhang, Yunqian; Chen, Zhe; Cheng, Ming

    2011-01-01

    As the wind turbine is a nonlinear high-order system, to achieve good pitch control performance, a model free adaptive control (MFAC) approach which doesn't need the mathematical model of the wind turbine is adopted in the pitch control system in this paper. A pseudo gradient vector, whose estimation value is only based on I/O data of the wind turbine, is identified, and the wind turbine system is then replaced by a dynamic linear time-varying model. In order to verify the correctness and robustness of the proposed model free adaptive pitch controller, the wind turbine code FAST, which can predict the wind turbine loads and response with high accuracy, is used. The results show that the controller produces good dynamic performance, good robustness and adaptability.
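
    For orientation, the MFAC scheme referred to above estimates a pseudo gradient (pseudo-partial-derivative) from I/O data alone and uses it in place of a plant model. The sketch below shows the standard compact-form dynamic-linearization update; the step sizes, weights and the stand-in plant are illustrative assumptions, not the paper's FAST-based setup.

      # compact-form dynamic-linearization MFAC sketch (assumed standard form)
      eta, mu = 0.5, 1.0       # estimator step size and weight (illustrative)
      rho, lam = 0.6, 2.0      # controller step size and weight (illustrative)

      def mfac_step(phi, u_prev, du_prev, y, y_prev, y_ref):
          # update the pseudo-partial-derivative estimate from the latest I/O data
          dy = y - y_prev
          phi = phi + eta * du_prev * (dy - phi * du_prev) / (mu + du_prev ** 2)
          if abs(phi) < 1e-5:          # reset keeps the estimate usable
              phi = 1.0
          # control update driven by the tracking error, with no plant model
          u = u_prev + rho * phi * (y_ref - y) / (lam + phi ** 2)
          return phi, u

      # toy closed loop on an "unknown" first-order plant standing in for pitch dynamics
      y, y_prev, u, du, phi = 0.0, 0.0, 0.0, 0.0, 1.0
      for k in range(50):
          phi, u_new = mfac_step(phi, u, du, y, y_prev, y_ref=1.0)
          du, u = u_new - u, u_new
          y_prev, y = y, 0.8 * y + 0.2 * u   # plant used only to simulate I/O data
      print(round(y, 3))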

  8. Variability in Dopamine Genes Dissociates Model-Based and Model-Free Reinforcement Learning.

    Science.gov (United States)

    Doll, Bradley B; Bath, Kevin G; Daw, Nathaniel D; Frank, Michael J

    2016-01-27

    Considerable evidence suggests that multiple learning systems can drive behavior. Choice can proceed reflexively from previous actions and their associated outcomes, as captured by "model-free" learning algorithms, or flexibly from prospective consideration of outcomes that might occur, as captured by "model-based" learning algorithms. However, differential contributions of dopamine to these systems are poorly understood. Dopamine is widely thought to support model-free learning by modulating plasticity in striatum. Model-based learning may also be affected by these striatal effects, or by other dopaminergic effects elsewhere, notably on prefrontal working memory function. Indeed, prominent demonstrations linking striatal dopamine to putatively model-free learning did not rule out model-based effects, whereas other studies have reported dopaminergic modulation of verifiably model-based learning, but without distinguishing a prefrontal versus striatal locus. To clarify the relationships between dopamine, neural systems, and learning strategies, we combine a genetic association approach in humans with two well-studied reinforcement learning tasks: one isolating model-based from model-free behavior and the other sensitive to key aspects of striatal plasticity. Prefrontal function was indexed by a polymorphism in the COMT gene, differences of which reflect dopamine levels in the prefrontal cortex. This polymorphism has been associated with differences in prefrontal activity and working memory. Striatal function was indexed by a gene coding for DARPP-32, which is densely expressed in the striatum where it is necessary for synaptic plasticity. We found evidence for our hypothesis that variations in prefrontal dopamine relate to model-based learning, whereas variations in striatal dopamine function relate to model-free learning. Decisions can stem reflexively from their previously associated outcomes or flexibly from deliberative consideration of potential choice outcomes

  9. Model-based and model-free “plug-and-play” building energy efficient control

    International Nuclear Information System (INIS)

    Baldi, Simone; Michailidis, Iakovos; Ravanis, Christos; Kosmatopoulos, Elias B.

    2015-01-01

    Highlights: • “Plug-and-play” Building Optimization and Control (BOC) driven by building data. • Ability to handle the large-scale and complex nature of the BOC problem. • Adaptation to learn the optimal BOC policy when no building model is available. • Comparisons with rule-based and advanced BOC strategies. • Simulation and real-life experiments in a ten-office building. - Abstract: Considerable research efforts in Building Optimization and Control (BOC) have been directed toward the development of “plug-and-play” BOC systems that can achieve energy efficiency without compromising thermal comfort and without the need of qualified personnel engaged in a tedious and time-consuming manual fine-tuning phase. In this paper, we report on how a recently introduced Parametrized Cognitive Adaptive Optimization – abbreviated as PCAO – can be used toward the design of both model-based and model-free “plug-and-play” BOC systems, with minimum human effort required to accomplish the design. In the model-based case, PCAO assesses the performance of its control strategy via a simulation model of the building dynamics; in the model-free case, PCAO optimizes its control strategy without relying on any model of the building dynamics. Extensive simulation and real-life experiments performed on a 10-office building demonstrate the effectiveness of the PCAO–BOC system in providing significant energy efficiency and improved thermal comfort. The mechanisms embedded within PCAO render it capable of automatically and quickly learning an efficient BOC strategy either in the presence of complex nonlinear simulation models of the building dynamics (model-based) or when no model for the building dynamics is available (model-free). Comparative studies with alternative state-of-the-art BOC systems show the effectiveness of the PCAO–BOC solution

  10. Model-Free Coordinated Control for MHTGR-Based Nuclear Steam Supply Systems

    Directory of Open Access Journals (Sweden)

    Zhe Dong

    2016-01-01

    The modular high temperature gas-cooled reactor (MHTGR) is a typical small modular reactor (SMR) that offers a simpler, standardized and safer modular design by being factory built, requiring smaller initial capital investment, and having a shorter construction period. Thanks to their small size, MHTGRs could be beneficial in providing electric power to remote areas that are deficient in transmission or distribution and in generating local power for large population centers. Based on the multi-modular operation scheme, the inherent safety feature of the MHTGRs can be applicable to large nuclear plants of any desired power rating. The MHTGR-based nuclear steam supply system (NSSS) is constituted by an MHTGR, a side-by-side arranged helical-coil once-through steam generator (OTSG) and some connecting pipes. Due to the side-by-side arrangement, there is a tight coupling effect between the MHTGR and OTSG. Moreover, there always exists parameter perturbation of the NSSSs. Thus, it is meaningful to study the model-free coordinated control of MHTGR-based NSSSs for safe, stable, robust and efficient operation. In this paper, a new model-free coordinated control strategy that regulates the nuclear power, MHTGR outlet helium temperature and OTSG outlet overheated steam temperature by properly adjusting the control rod position, helium flowrate and feed-water flowrate is established for the MHTGR-based NSSSs. Sufficient conditions for globally asymptotic closed-loop stability are given. Finally, numerical simulation results in the cases of large-range power decrease and increase illustrate the satisfactory performance of this newly-developed model-free coordinated NSSS control law.

  11. Model-free prediction and regression a transformation-based approach to inference

    CERN Document Server

    Politis, Dimitris N

    2015-01-01

    The Model-Free Prediction Principle expounded upon in this monograph is based on the simple notion of transforming a complex dataset to one that is easier to work with, e.g., i.i.d. or Gaussian. As such, it restores the emphasis on observable quantities, i.e., current and future data, as opposed to unobservable model parameters and estimates thereof, and yields optimal predictors in diverse settings such as regression and time series. Furthermore, the Model-Free Bootstrap takes us beyond point prediction in order to construct frequentist prediction intervals without resort to unrealistic assumptions such as normality. Prediction has been traditionally approached via a model-based paradigm, i.e., (a) fit a model to the data at hand, and (b) use the fitted model to extrapolate/predict future data. Due to both mathematical and computational constraints, 20th century statistical practice focused mostly on parametric models. Fortunately, with the advent of widely accessible powerful computing in the late 1970s, co...

  12. Parameters Tuning of Model Free Adaptive Control Based on Minimum Entropy

    Institute of Scientific and Technical Information of China (English)

    Chao Ji; Jing Wang; Liulin Cao; Qibing Jin

    2014-01-01

    Dynamic linearization based model free adaptive control (MFAC) algorithm has been widely used in practical systems, in which some parameters should be tuned before it is successfully applied to process industries. Considering the random noise existing in real processes, a parameter tuning method based on minimum entropy optimization is proposed, and the feature of entropy is used to accurately describe the system uncertainty. For cases of Gaussian stochastic noise and non-Gaussian stochastic noise, an entropy recursive optimization algorithm is derived based on an approximate model or an identified model. The extensive simulation results show the effectiveness of the minimum entropy optimization for the partial form dynamic linearization based MFAC. The parameters tuned by the minimum entropy optimization index show stronger stability and more robustness than those tuned by other traditional indices, such as integral of the squared error (ISE) or integral of time-weighted absolute error (ITAE), when system stochastic noise exists.

  13. Model-free methods of analyzing domain motions in proteins from simulation : A comparison of normal mode analysis and molecular dynamics simulation of lysozyme

    NARCIS (Netherlands)

    Hayward, S.; Kitao, A.; Berendsen, H.J.C.

    Model-free methods are introduced to determine quantities pertaining to protein domain motions from normal mode analyses and molecular dynamics simulations. For the normal mode analysis, the methods are based on the assumption that in low frequency modes, domain motions can be well approximated by

  14. Dual RBFNNs-Based Model-Free Adaptive Control With Aspen HYSYS Simulation.

    Science.gov (United States)

    Zhu, Yuanming; Hou, Zhongsheng; Qian, Feng; Du, Wenli

    2017-03-01

    In this brief, we propose a new data-driven model-free adaptive control (MFAC) method with dual radial basis function neural networks (RBFNNs) for a class of discrete-time nonlinear systems. The main novelty lies in that it provides a systematic design method for controller structure by the direct usage of I/O data, rather than using the first-principle model or offline identified plant model. The controller structure is determined by equivalent-dynamic-linearization representation of the ideal nonlinear controller, and the controller parameters are tuned by the pseudogradient information extracted from the I/O data of the plant, which can deal with the unknown nonlinear system. The stability of the closed-loop control system and the stability of the training process for RBFNNs are guaranteed by rigorous theoretical analysis. Meanwhile, the effectiveness and the applicability of the proposed method are further demonstrated by the numerical example and Aspen HYSYS simulation of distillation column in crude styrene produce process.

  15. Dissolution process analysis using model-free Noyes-Whitney integral equation.

    Science.gov (United States)

    Hattori, Yusuke; Haruna, Yoshimasa; Otsuka, Makoto

    2013-02-01

    The drug dissolution process of solid dosage forms is theoretically described by the Noyes-Whitney-Nernst equation; in practice, however, the process is usually analyzed by assuming particular models, and such model-dependent methods are idealized and carry some limitations. In this study, a Noyes-Whitney integral equation was proposed and applied to represent the drug dissolution profiles of a solid formulation via the non-linear least squares (NLLS) method. The integral equation is a model-free formula involving the dissolution rate constant as a parameter. In the present study, several solid formulations were prepared by changing the blending time of magnesium stearate (MgSt) with theophylline monohydrate, α-lactose monohydrate, and crystalline cellulose. The formula could excellently represent the dissolution profile, and thereby the rate constant and specific surface area could be obtained by the NLLS method. Since long blending times coated the particle surface with MgSt, it was found that water permeation was disturbed by this layer, which hindered dissociation into disintegrant particles. In the end, the solid formulations did not disintegrate; however, the specific surface area gradually increased during the process of dissolution. The X-ray CT observation supported this result, demonstrating that the surface roughened as dissolution proceeded and thus that the specific surface area of the solid formulation gradually increased. Copyright © 2012 Elsevier B.V. All rights reserved.
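
    As a simplified illustration of the fitting step, the sketch below fits a Noyes-Whitney-type dissolution curve to data by non-linear least squares to recover a rate constant. The constant-surface-area closed form and the numbers used are assumptions of this sketch, not the paper's integral equation or data.

      import numpy as np
      from scipy.optimize import curve_fit

      def noyes_whitney(t, k, cs):
          # solution of dC/dt = k (Cs - C) with C(0) = 0, i.e. fixed surface area
          return cs * (1.0 - np.exp(-k * t))

      # illustrative dissolution data: time (min) and % drug dissolved
      t_obs = np.array([0, 5, 10, 20, 30, 45, 60], dtype=float)
      c_obs = np.array([0, 28, 47, 71, 84, 93, 97], dtype=float)

      (k_fit, cs_fit), _ = curve_fit(noyes_whitney, t_obs, c_obs, p0=[0.05, 100.0])
      print(f"k = {k_fit:.3f} 1/min, plateau = {cs_fit:.1f} %")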

  16. A Model-free Approach to Fault Detection of Continuous-time Systems Based on Time Domain Data

    Institute of Scientific and Technical Information of China (English)

    Ping Zhang; Steven X. Ding

    2007-01-01

    In this paper, a model-free approach is presented to design an observer-based fault detection system for linear continuous-time systems based on input and output data in the time domain. The core of the approach is to directly identify parameters of the observer-based residual generator based on a numerically reliable data equation obtained by filtering and sampling the input and output signals.

  17. Propagation of experimental uncertainties using the Lipari-Szabo model-free analysis of protein dynamics

    International Nuclear Information System (INIS)

    Jin Danqing; Andrec, Michael; Montelione, Gaetano T.; Levy, Ronald M.

    1998-01-01

    In this paper we make use of the graphical procedure previously described [Jin, D. et al. (1997) J. Am. Chem. Soc., 119, 6923-6924] to analyze NMR relaxation data using the Lipari-Szabo model-free formalism. The graphical approach is advantageous in that it allows the direct visualization of the experimental uncertainties in the motional parameter space. Some general 'rules' describing the relationship between the precision of the relaxation measurements and the precision of the model-free parameters and how this relationship changes with the overall tumbling time (τm) are summarized. The effect of the precision in the relaxation measurements on the detection of internal motions not close to the extreme narrowing limit is analyzed. We also show that multiple timescale internal motions may be obscured by experimental uncertainty, and that the collection of relaxation data at very high field strength can improve the ability to detect such deviations from the simple Lipari-Szabo model
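
    For reference, the Lipari-Szabo model-free formalism referred to here expresses the spectral density in terms of a generalized order parameter S² and correlation times (standard single-internal-correlation-time form):

      % Lipari-Szabo model-free spectral density
      J(\omega) = \frac{2}{5}\left[\frac{S^2\,\tau_m}{1+(\omega\tau_m)^2}
                + \frac{(1-S^2)\,\tau}{1+(\omega\tau)^2}\right],
      \qquad \frac{1}{\tau} = \frac{1}{\tau_m} + \frac{1}{\tau_e},

    where τm is the overall tumbling time and τe the effective internal correlation time; the motional parameters discussed in the abstract are S² and τe.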

  18. Kinetics of the Thermal Degradation of Granulated Scrap Tyres: a Model-free Analysis

    Directory of Open Access Journals (Sweden)

    Félix A. LÓPEZ

    2013-12-01

    Pyrolysis is a technology with a promising future in the recycling of scrap tyres. This paper determines the thermal decomposition behaviour and kinetics of granulated scrap tyres (GST) by examining the thermogravimetric/derivative thermogravimetric (TGA/DTG) data obtained during their pyrolysis in an inert atmosphere at different heating rates. The model-free methods of Friedman, Flynn-Wall-Ozawa and Coats-Redfern were used to determine the reaction kinetics from the DTG data. The apparent activation energy and pre-exponential factor for the degradation of GST were calculated. A comparison with the results obtained by other authors was made. DOI: http://dx.doi.org/10.5755/j01.ms.19.4.2947
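
    The Flynn-Wall-Ozawa step can be illustrated as follows: at a fixed conversion level, ln β is regressed against 1/T across heating rates, and the slope yields the apparent activation energy via Doyle's approximation. The heating rates and temperatures below are illustrative numbers, not the paper's data.

      import numpy as np

      R = 8.314  # J/(mol K)

      def fwo_activation_energy(betas, T_at_alpha):
          """betas: heating rates (K/min); T_at_alpha: temperatures (K) at which a
          chosen conversion is reached. Slope of ln(beta) vs 1/T is -1.052*Ea/R."""
          slope, _ = np.polyfit(1.0 / np.asarray(T_at_alpha), np.log(betas), 1)
          return -slope * R / 1.052 / 1000.0   # Ea in kJ/mol

      # one conversion level observed at three heating rates (illustrative)
      print(round(fwo_activation_energy([5, 10, 20], [620.0, 633.0, 646.0]), 1), "kJ/mol")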

  19. Model-free method for isothermal and non-isothermal decomposition kinetics analysis of PET sample

    International Nuclear Information System (INIS)

    Saha, B.; Maiti, A.K.; Ghoshal, A.K.

    2006-01-01

    Pyrolysis, one possible alternative to recover valuable products from waste plastics, has recently been the subject of renewed interest. In the present study, an isoconversional method, the Vyazovkin model-free approach, is applied to study the non-isothermal decomposition kinetics of waste PET samples using various temperature integral approximations, such as the Coats and Redfern, Gorbachev, and Agrawal and Sivasubramanian approximations, and direct integration (recursive adaptive Simpson quadrature scheme) to analyze the decomposition kinetics. The results show that the activation energy (Eα) is a weak but increasing function of conversion (α) in the case of non-isothermal decomposition and a strong, decreasing function of conversion in the case of isothermal decomposition. This indicates the possible existence of nucleation, nuclei growth and gas diffusion mechanisms during non-isothermal pyrolysis and of nucleation and gas diffusion mechanisms during isothermal pyrolysis. Optimum Eα dependencies on α obtained for non-isothermal data showed a similar nature for all the types of temperature integral approximations.

  20. Model-free control

    Science.gov (United States)

    Fliess, Michel; Join, Cédric

    2013-12-01

    'Model-free control' and the corresponding 'intelligent' PID controllers (iPIDs), which have already had many successful concrete applications, are presented here for the first time in a unified manner, where the new advances are taken into account. The basics of model-free control now employ some old functional analysis and some elementary differential algebra. The estimation techniques become quite straightforward via a recent online parameter identification approach. The importance of iPIs and especially of iPs is deduced from the presence of friction. The strange industrial ubiquity of classic PIDs, and the great difficulty of tuning them in complex situations, is explained, via an elementary sampling argument, by their connections with iPIDs. Several numerical simulations are presented which include some infinite-dimensional systems. They demonstrate not only the power of our intelligent controllers but also the great simplicity of tuning them.
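
    The "intelligent proportional" controller mentioned above rests on the ultra-local model y' = F + alpha*u, where the lumped unknown term F is estimated from recent data and cancelled. The discrete-time sketch below is a minimal illustration; the gain, alpha and the simulated plant are assumptions, not the paper's examples.

      import numpy as np

      dt, alpha, kp = 0.01, 1.0, 5.0     # illustrative settings

      def ip_controller(y, y_prev, u_prev, y_ref, dy_ref):
          # estimate the unknown lumped term F from the latest measurements
          F_hat = (y - y_prev) / dt - alpha * u_prev
          e = y - y_ref
          # cancel F_hat and close the loop with a simple proportional term
          return (dy_ref - F_hat - kp * e) / alpha

      # toy loop: an "unknown" nonlinear plant tracked to a constant reference
      y, y_prev, u = 0.0, 0.0, 0.0
      for k in range(500):
          u = ip_controller(y, y_prev, u, y_ref=1.0, dy_ref=0.0)
          y_prev, y = y, y + dt * (-y + 0.5 * np.sin(y) + u)   # stand-in dynamics
      print(round(y, 3))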

  1. The "proactive" model of learning: Integrative framework for model-free and model-based reinforcement learning utilizing the associative learning-based proactive brain concept.

    Science.gov (United States)

    Zsuga, Judit; Biro, Klara; Papp, Csaba; Tajti, Gabor; Gesztelyi, Rudolf

    2016-02-01

    Reinforcement learning (RL) is a powerful concept underlying forms of associative learning governed by the use of a scalar reward signal, with learning taking place if expectations are violated. RL may be assessed using model-based and model-free approaches. Model-based reinforcement learning involves the amygdala, the hippocampus, and the orbitofrontal cortex (OFC). The model-free system involves the pedunculopontine-tegmental nucleus (PPTgN), the ventral tegmental area (VTA) and the ventral striatum (VS). Based on the functional connectivity of the VS, the model-free and model-based RL systems center on the VS, which computes value by integrating model-free signals (received as reward prediction errors) and model-based reward-related input. Using the concept of a reinforcement learning agent, we propose that the VS serves as the value function component of the RL agent. Regarding the model utilized for model-based computations, we turned to the proactive brain concept, which offers a ubiquitous function for the default network based on its great functional overlap with contextual associative areas. Hence, by means of the default network the brain continuously organizes its environment into context frames, enabling the formulation of analogy-based associations that are turned into predictions of what to expect. The OFC integrates reward-related information into context frames upon computing reward expectation by compiling stimulus-reward and context-reward information offered by the amygdala and hippocampus, respectively. Furthermore, we suggest that the integration of model-based expectations regarding reward into the value signal is further supported by the efferents of the OFC that reach structures canonical for model-free learning (e.g., the PPTgN, VTA, and VS). (c) 2016 APA, all rights reserved.

  2. Language acquisition is model-based rather than model-free.

    Science.gov (United States)

    Wang, Felix Hao; Mintz, Toben H

    2016-01-01

    Christiansen & Chater (C&C) propose that learning language is learning to process language. However, we believe that the general-purpose prediction mechanism they propose is insufficient to account for many phenomena in language acquisition. We argue from theoretical considerations and empirical evidence that many acquisition tasks are model-based, and that different acquisition tasks require different, specialized models.

  3. Textural features of dynamic contrast-enhanced MRI derived model-free and model-based parameter maps in glioma grading.

    Science.gov (United States)

    Xie, Tian; Chen, Xiao; Fang, Jingqin; Kang, Houyi; Xue, Wei; Tong, Haipeng; Cao, Peng; Wang, Sumei; Yang, Yizeng; Zhang, Weiguo

    2018-04-01

    Presurgical glioma grading by dynamic contrast-enhanced MRI (DCE-MRI) has unresolved issues. The aim of this study was to investigate the ability of textural features derived from pharmacokinetic model-based or model-free parameter maps of DCE-MRI in discriminating between different grades of gliomas, and their correlation with pathological indices. Study Type: Retrospective. Population: Forty-two adults with brain gliomas. Field Strength/Sequence: 3.0T, including conventional anatomic sequences and DCE-MRI sequences (variable flip angle T1-weighted imaging and three-dimensional gradient echo volumetric imaging). Assessment: Regions of interest on the cross-sectional images with maximal tumor lesion; five commonly used textural features, including Energy, Entropy, Inertia, Correlation, and Inverse Difference Moment (IDM), were generated. All textural features of the model-free parameters (initial area under curve [IAUC], maximal signal intensity [Max SI], maximal up-slope [Max Slope]) could effectively differentiate between grade II (n = 15), grade III (n = 13), and grade IV (n = 14) gliomas. Two textural features, Entropy and IDM, of four DCE-MRI parameters, including Max SI, Max Slope (model-free parameters), vp (Extended Tofts), and vp (Patlak), could differentiate grade III and IV gliomas, whereas no textural feature of any DCE-MRI parameter map could discriminate between subtypes of grade II and III gliomas. Several textural features showed relatively lower inter-observer agreement. No significant correlation was found between microvascular density and textural features, whereas a moderate correlation was found between the cellular proliferation index and those features. Textural features of DCE-MRI parameter maps displayed a good ability in glioma grading. Level of Evidence: 3. Technical Efficacy: Stage 2. J. Magn. Reson. Imaging 2018;47:1099-1111. © 2017 International Society for Magnetic Resonance in Medicine.

  4. Can model-free reinforcement learning explain deontological moral judgments?

    Science.gov (United States)

    Ayars, Alisabeth

    2016-05-01

    Dual-systems frameworks propose that moral judgments are derived from both an immediate emotional response, and controlled/rational cognition. Recently Cushman (2013) proposed a new dual-system theory based on model-free and model-based reinforcement learning. Model-free learning attaches values to actions based on their history of reward and punishment, and explains some deontological, non-utilitarian judgments. Model-based learning involves the construction of a causal model of the world and allows for far-sighted planning; this form of learning fits well with utilitarian considerations that seek to maximize certain kinds of outcomes. I present three concerns regarding the use of model-free reinforcement learning to explain deontological moral judgment. First, many actions that humans find aversive from model-free learning are not judged to be morally wrong. Moral judgment must require something in addition to model-free learning. Second, there is a dearth of evidence for central predictions of the reinforcement account-e.g., that people with different reinforcement histories will, all else equal, make different moral judgments. Finally, to account for the effect of intention within the framework requires certain assumptions which lack support. These challenges are reasonable foci for future empirical/theoretical work on the model-free/model-based framework. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. Model-free stabilization by extremum seeking

    CERN Document Server

    Scheinker, Alexander

    2017-01-01

    With this brief, the authors present algorithms for model-free stabilization of unstable dynamic systems. An extremum-seeking algorithm assigns the role of a cost function to the dynamic system's control Lyapunov function (clf), aiming at its minimization. The minimization of the clf drives the clf to zero and achieves asymptotic stabilization. This approach does not rely on, or require knowledge of, the system model. Instead, it employs periodic perturbation signals along with the clf. The same effect is achieved as by using clf-based feedback laws that profit from modeling knowledge, but in a time-average sense. Rather than use integrals of the system's vector field, we employ Lie-bracket-based (i.e., derivative-based) averaging. The brief contains numerous examples and applications, including examples with unknown control directions and experiments with charged particle accelerators. It is intended for theoretical control engineers and mathematicians, and practitioners working in various industrial areas ...
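
    One common form of such Lyapunov-function-driven extremum seeking from the literature applies u = sqrt(alpha*omega) * cos(omega*t + k*V(x)), whose Lie-bracket average descends along the gradient of the clf. The toy simulation below uses this form with illustrative gains and a single integrator; it is a sketch, not one of the book's examples.

      import numpy as np

      alpha, k, omega, dt = 1.0, 4.0, 200.0, 1e-4

      def clf(x):
          return x ** 2              # control Lyapunov function for the toy system

      x, t = 2.0, 0.0
      for _ in range(int(2.0 / dt)):                 # simulate 2 seconds
          u = np.sqrt(alpha * omega) * np.cos(omega * t + k * clf(x))
          x += dt * u                                # model-free: only V(x) is used
          t += dt
      print(round(x, 3))   # x is steered toward 0 without a model of x' = u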

  6. Is there any correlation between model-based perfusion parameters and model-free parameters of time-signal intensity curve on dynamic contrast enhanced MRI in breast cancer patients?

    Energy Technology Data Exchange (ETDEWEB)

    Yi, Boram; Kang, Doo Kyoung; Kim, Tae Hee [Ajou University School of Medicine, Department of Radiology, Suwon, Gyeonggi-do (Korea, Republic of); Yoon, Dukyong [Ajou University School of Medicine, Department of Biomedical Informatics, Suwon (Korea, Republic of); Jung, Yong Sik; Kim, Ku Sang [Ajou University School of Medicine, Department of Surgery, Suwon (Korea, Republic of); Yim, Hyunee [Ajou University School of Medicine, Department of Pathology, Suwon (Korea, Republic of)

    2014-05-15

    To find out any correlation between dynamic contrast-enhanced (DCE) model-based parameters and model-free parameters, and to evaluate correlations between perfusion parameters and histologic prognostic factors. Model-based parameters (Ktrans, Kep and Ve) of 102 invasive ductal carcinomas were obtained using DCE-MRI and post-processing software. Correlations between model-based and model-free parameters and between perfusion parameters and histologic prognostic factors were analysed. Mean Kep was significantly higher in cancers showing initial rapid enhancement (P = 0.002) and a delayed washout pattern (P = 0.001). Ve was significantly lower in cancers showing a delayed washout pattern (P = 0.015). Kep significantly correlated with time to peak enhancement (TTP) (ρ = -0.33, P < 0.001) and washout slope (ρ = 0.39, P = 0.002). Ve was significantly correlated with TTP (ρ = 0.33, P = 0.002). Mean Kep was higher in tumours with high nuclear grade (P = 0.017). Mean Ve was lower in tumours with high histologic grade (P = 0.005) and in tumours with negative oestrogen receptor status (P = 0.047). TTP was shorter in tumours with negative oestrogen receptor status (P = 0.037). We could acquire general information about the tumour vascular physiology, interstitial space volume and pathologic prognostic factors by analyzing the time-signal intensity curve without a complicated acquisition process for the model-based parameters. (orig.)

  7. Model-Free Adaptive Control Algorithm with Data Dropout Compensation

    Directory of Open Access Journals (Sweden)

    Xuhui Bu

    2012-01-01

    The convergence of the model-free adaptive control (MFAC) algorithm can be guaranteed when the system is subject to measurement data dropout. The system output convergence speed gets slower as the dropout rate increases. This paper proposes an MFAC algorithm with data compensation. The missing data is first estimated using the dynamical linearization method, and then the estimated value is introduced to update the control input. The convergence analysis of the proposed MFAC algorithm is given, and the effectiveness is also validated by simulations. It is shown that the proposed algorithm can compensate for the effect of data dropout, and better output performance can be obtained.

  8. General Form of Model-Free Control Law and Convergence Analyzing

    Directory of Open Access Journals (Sweden)

    Xiuying Li

    2012-01-01

    The general form of the model-free control law is introduced, and its convergence is analyzed. Firstly, the necessity of improving the basic form of the model-free control law is explained, and the functional combination method is presented as the approach for improvement. Then, a series of sufficient conditions for convergence are given. The analysis shows that these conditions can easily be satisfied in engineering practice.

  9. Model-Free Adaptive Control for Unknown Nonlinear Zero-Sum Differential Game.

    Science.gov (United States)

    Zhong, Xiangnan; He, Haibo; Wang, Ding; Ni, Zhen

    2018-05-01

    In this paper, we present a new model-free globalized dual heuristic dynamic programming (GDHP) approach for discrete-time nonlinear zero-sum game problems. First, the online learning algorithm is proposed based on the GDHP method to solve the Hamilton-Jacobi-Isaacs equation associated with the optimal regulation control problem. By shifting the definition of the performance index backward one step, the requirement for the system dynamics, or an identifier, is relaxed in the proposed method. Then, three neural networks are established to approximate the optimal saddle point feedback control law, the disturbance law, and the performance index, respectively. The explicit updating rules for these three neural networks are provided based on the data generated during the online learning along the system trajectories. The stability analysis in terms of the neural network approximation errors is discussed based on the Lyapunov approach. Finally, two simulation examples are provided to show the effectiveness of the proposed method.

  10. Model-Free Autotuning Testing on a Model of a Three-Tank Cascade

    Directory of Open Access Journals (Sweden)

    Stanislav VRÁNA

    2009-06-01

    A newly developed model-free autotuning method based on frequency response analysis has been tested on a laboratory set-up that represents a physical model of a three-tank cascade. This laboratory model was chosen for the following reasons: (a) the laboratory model was ready for computer control; (b) simultaneously, computer simulation could be effectively utilized, because a mathematical description of the cascade based on quite exactly valid relations was available; (c) the set-up provided the necessary degree of nonlinearity and changeable properties. The improvement of the laboratory set-up instrumentation presented here was necessary because the results obtained from the first experimental identification did not correspond to the results provided by the simulation. The data was evidently imprecise, because the available sensors and the conditions for process settling were inadequate.

  11. Hyper-chaos encryption using convolutional masking and model free unmasking

    International Nuclear Information System (INIS)

    Qi Guo-Yuan; Matondo Sandra Bazebo

    2014-01-01

    In this paper, during the masking process the encrypted message is convolved and embedded into a Qi hyper-chaotic system characterized by a high degree of disorder. The masking scheme was tested using both Qi hyper-chaos and Lorenz chaos and indicated that Qi hyper-chaos based masking can resist attacks based on filtering and power spectrum analysis, while the Lorenz-based scheme fails for high amplitude data. To unmask the message at the receiving end, two methods are proposed. In the first method, a model-free synchronizer, i.e. a multivariable higher-order differential feedback controller between the transmitter and receiver, is employed to de-convolve the message embedded in the received signal. In the second method, no synchronization is required since the message is de-convolved using the information of the estimated derivative. (general)

  12. A Model-Free Diagnostic for Single-Peakedness of Item Responses Using Ordered Conditional Means

    Science.gov (United States)

    Polak, Marike; De Rooij, Mark; Heiser, Willem J.

    2012-01-01

    In this article we propose a model-free diagnostic for single-peakedness (unimodality) of item responses. Presuming a unidimensional unfolding scale and a given item ordering, we approximate item response functions of all items based on ordered conditional means (OCM). The proposed OCM methodology is based on Thurstone & Chave's (1929) "criterion…

  13. Determination of pyrolysis characteristics and kinetics of palm kernel shell using TGA–FTIR and model-free integral methods

    International Nuclear Information System (INIS)

    Ma, Zhongqing; Chen, Dengyu; Gu, Jie; Bao, Binfu; Zhang, Qisheng

    2015-01-01

    Highlights: • Model-free integral kinetics method and analytical TGA–FTIR were conducted on the pyrolysis process of PKS. • The pyrolysis mechanism of PKS was elaborated. • Thermal stability was established: lignin > cellulose > xylan. • Detailed compositions in the volatiles of PKS pyrolysis were determined. • The interaction of the three biomass components led to the fluctuation of activation energy in PKS pyrolysis. - Abstract: Palm kernel shell (PKS) from palm oil production is a potential biomass source for bio-energy production. A fundamental understanding of PKS pyrolysis behavior and kinetics is essential to its efficient thermochemical conversion. The thermal degradation profile in derivative thermogravimetry (DTG) analysis showed two significant mass-loss peaks mainly related to the decomposition of hemicellulose and cellulose, respectively. This characteristic differs from that of other biomass (e.g. corn stover), which presents just one peak, sometimes accompanied by an extra "shoulder" peak (e.g. wheat straw). According to the Fourier transform infrared spectrometry (FTIR) analysis, the prominent volatile components generated by the pyrolysis of PKS were CO2 (2400–2250 cm−1 and 586–726 cm−1), aldehydes, ketones, organic acids (1900–1650 cm−1), and alkanes, phenols (1475–1000 cm−1). The activation energy dependent on the conversion rate was estimated by two model-free integral methods, the Flynn–Wall–Ozawa (FWO) and Kissinger–Akahira–Sunose (KAS) methods, at different heating rates. The fluctuation of the activation energy can be interpreted as the result of interactive reactions related to cellulose, hemicellulose and lignin degradation occurring in the pyrolysis process. Based on the TGA–FTIR analysis and the model-free integral kinetics method, the pyrolysis mechanism of PKS is elaborated in this paper.

  14. Formation of model-free motor memories during motor adaptation depends on perturbation schedule.

    Science.gov (United States)

    Orban de Xivry, Jean-Jacques; Lefèvre, Philippe

    2015-04-01

    Motor adaptation to an external perturbation relies on several mechanisms such as model-based, model-free, strategic, or repetition-dependent learning. Depending on the experimental conditions, each of these mechanisms has more or less weight in the final adaptation state. Here we focused on the conditions that lead to the formation of a model-free motor memory (Huang VS, Haith AM, Mazzoni P, Krakauer JW. Neuron 70: 787-801, 2011), i.e., a memory that does not depend on an internal model or on the size or direction of the errors experienced during the learning. The formation of such model-free motor memory was hypothesized to depend on the schedule of the perturbation (Orban de Xivry JJ, Ahmadi-Pajouh MA, Harran MD, Salimpour Y, Shadmehr R. J Neurophysiol 109: 124-136, 2013). Here we built on this observation by directly testing the nature of the motor memory after abrupt or gradual introduction of a visuomotor rotation, in an experimental paradigm where the presence of model-free motor memory can be identified (Huang VS, Haith AM, Mazzoni P, Krakauer JW. Neuron 70: 787-801, 2011). We found that relearning was faster after abrupt than gradual perturbation, which suggests that model-free learning is reduced during gradual adaptation to a visuomotor rotation. In addition, the presence of savings after abrupt introduction of the perturbation but gradual extinction of the motor memory suggests that unexpected errors are necessary to induce a model-free motor memory. Overall, these data support the hypothesis that different perturbation schedules do not lead to a more or less stabilized motor memory but to distinct motor memories with different attributes and neural representations. Copyright © 2015 the American Physiological Society.

  15. Self-consistent residual dipolar coupling based model-free analysis for the robust determination of nanosecond to microsecond protein dynamics

    International Nuclear Information System (INIS)

    Lakomek, Nils-Alexander; Walter, Korvin F. A.; Fares, Christophe; Lange, Oliver F.; Groot, Bert L. de; Grubmueller, Helmut; Brueschweiler, Rafael; Munk, Axel; Becker, Stefan; Meiler, Jens; Griesinger, Christian

    2008-01-01

    Residual dipolar couplings (RDCs) provide information about the dynamic average orientation of inter-nuclear vectors and amplitudes of motion up to milliseconds. They complement relaxation methods, especially on a time-scale window that we have called supra-τc (slower than the overall tumbling time τc). For ubiquitin, the RDC-based order parameters average to ⟨S²rdc⟩ = 0.72 ± 0.02, compared to ⟨S²LS⟩ = 0.778 ± 0.003 for the Lipari-Szabo order parameters, indicating that the inclusion of the supra-τc window increases the averaged amplitude of mobility observed in the sub-τc window by about 34%. For the β-strand spanned by residues Lys48 to Leu50, an alternating pattern of backbone NH RDC order parameters S²rdc(NH) = (0.59, 0.72, 0.59) was extracted. The backbone of Lys48, whose side chain is known to be involved in the poly-ubiquitylation process that leads to protein degradation, is very mobile on the supra-τc time scale (S²rdc(NH) = 0.59 ± 0.03), while it is inconspicuous (S²LS(NH) = 0.82) on the sub-τc as well as on the μs-ms relaxation dispersion time scales. The results of this work differ from previous RDC dynamics studies of ubiquitin in the sense that the results are essentially independent of structural noise, providing a much more robust assessment of the dynamic effects that underlie the RDC data.

  16. Trajectory Based Traffic Analysis

    DEFF Research Database (Denmark)

    Krogh, Benjamin Bjerre; Andersen, Ove; Lewis-Kelham, Edwin

    2013-01-01

    We present the INTRA system for interactive path-based traffic analysis. The analyses are developed in collaboration with traffic researchers and provide novel insights into conditions such as congestion, travel-time, choice of route, and traffic-flow. INTRA supports interactive point-and-click analysis, due to a novel and efficient indexing structure. With the web-site daisy.aau.dk/its/spqdemo/ we will demonstrate several analyses, using a very large real-world data set consisting of 1.9 billion GPS records (1.5 million trajectories) recorded from more than 13000 vehicles, and touching most ...

  17. Model-free adaptive control of advanced power plants

    Science.gov (United States)

    Cheng, George Shu-Xing; Mulkey, Steven L.; Wang, Qiang

    2015-08-18

    A novel 3-Input-3-Output (3×3) Model-Free Adaptive (MFA) controller with a set of artificial neural networks as part of the controller is introduced. A 3×3 MFA control system using the inventive 3×3 MFA controller is described to control key process variables including Power, Steam Throttle Pressure, and Steam Temperature of boiler-turbine-generator (BTG) units in conventional and advanced power plants. Those advanced power plants may comprise Once-Through Supercritical (OTSC) Boilers, Circulating Fluidized-Bed (CFB) Boilers, and Once-Through Supercritical Circulating Fluidized-Bed (OTSC CFB) Boilers.

  18. Optimal model-free prediction from multivariate time series

    Science.gov (United States)

    Runge, Jakob; Donner, Reik V.; Kurths, Jürgen

    2015-05-01

    Forecasting a time series from multivariate predictors constitutes a challenging problem, especially using model-free approaches. Most techniques, such as nearest-neighbor prediction, quickly suffer from the curse of dimensionality and overfitting for more than a few predictors which has limited their application mostly to the univariate case. Therefore, selection strategies are needed that harness the available information as efficiently as possible. Since often the right combination of predictors matters, ideally all subsets of possible predictors should be tested for their predictive power, but the exponentially growing number of combinations makes such an approach computationally prohibitive. Here a prediction scheme that overcomes this strong limitation is introduced utilizing a causal preselection step which drastically reduces the number of possible predictors to the most predictive set of causal drivers making a globally optimal search scheme tractable. The information-theoretic optimality is derived and practical selection criteria are discussed. As demonstrated for multivariate nonlinear stochastic delay processes, the optimal scheme can even be less computationally expensive than commonly used suboptimal schemes like forward selection. The method suggests a general framework to apply the optimal model-free approach to select variables and subsequently fit a model to further improve a prediction or learn statistical dependencies. The performance of this framework is illustrated on a climatological index of El Niño Southern Oscillation.

  19. Model-Free Trajectory Optimisation for Unmanned Aircraft Serving as Data Ferries for Widespread Sensors

    Directory of Open Access Journals (Sweden)

    Ben Pearre

    2012-10-01

    Given multiple widespread stationary data sources such as ground-based sensors, an unmanned aircraft can fly over the sensors and gather the data via a wireless link. Performance criteria for such a network may incorporate costs such as trajectory length for the aircraft or the energy required by the sensors for radio transmission. Planning is hampered by the complex vehicle and communication dynamics and by uncertainty in the locations of sensors, so we develop a technique based on model-free learning. We present a stochastic optimisation method that allows the data-ferrying aircraft to optimise data collection trajectories through an unknown environment in situ, obviating the need for system identification. We compare two trajectory representations, one that learns near-optimal trajectories at low data requirements but that fails at high requirements, and one that gives up some performance in exchange for a data collection guarantee. With either encoding the ferry is able to learn significantly improved trajectories compared with alternative heuristics. To demonstrate the versatility of the model-free learning approach, we also learn a policy to minimise the radio transmission energy required by the sensor nodes, allowing prolonged network lifetime.

  20. Policy improvement by a model-free Dyna architecture.

    Science.gov (United States)

    Hwang, Kao-Shing; Lo, Chia-Yue

    2013-05-01

    The objective of this paper is to accelerate the process of policy improvement in reinforcement learning. The proposed Dyna-style system combines two learning schemes, one of which utilizes a temporal difference method for direct learning; the other uses relative values for indirect learning in planning between two successive direct learning cycles. Instead of establishing a complicated world model, the approach introduces a simple predictor of average rewards to actor-critic architecture in the simulation (planning) mode. The relative value of a state, defined as the accumulated differences between immediate reward and average reward, is used to steer the improvement process in the right direction. The proposed learning scheme is applied to control a pendulum system for tracking a desired trajectory to demonstrate its adaptability and robustness. Through reinforcement signals from the environment, the system takes the appropriate action to drive an unknown dynamic to track desired outputs in few learning cycles. Comparisons are made between the proposed model-free method, a connectionist adaptive heuristic critic, and an advanced method of Dyna-Q learning in the experiments of labyrinth exploration. The proposed method outperforms its counterparts in terms of elapsed time and convergence rate.

  1. DTI analysis methods : Voxel-based analysis

    NARCIS (Netherlands)

    Van Hecke, Wim; Leemans, Alexander; Emsell, Louise

    2016-01-01

    Voxel-based analysis (VBA) of diffusion tensor imaging (DTI) data permits the investigation of voxel-wise differences or changes in DTI metrics in every voxel of a brain dataset. It is applied primarily in the exploratory analysis of hypothesized group-level alterations in DTI parameters, as it does

  2. A generalized mean-squared displacement from inelastic fixed window scans of incoherent neutron scattering as a model-free indicator of anomalous diffusion confinement

    International Nuclear Information System (INIS)

    Roosen-Runge, F.; Seydel, T.

    2015-01-01

    Elastic fixed window scans of incoherent neutron scattering are an established and frequently employed method to study dynamical changes, usually over a broad temperature range or during a process such as a conformational change in the sample. In particular, the apparent mean-squared displacement can be extracted via a model-free analysis based on a solid physical interpretation as an effective amplitude of molecular motions. Here, we provide a new account of elastic and inelastic fixed window scans, defining a generalized mean-squared displacement for all fixed energy transfers. We show that this generalized mean-squared displacement in principle contains all information on the real mean-squared displacement accessible in the instrumental time window. The derived formula provides a clear understanding of the effects of instrumental resolution on the apparent mean-squared displacement. Finally, we show that the generalized mean-squared displacement can be used as a model-free indicator of confinement effects within the instrumental time window. (authors)
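
    For context, the apparent mean-squared displacement from a purely elastic scan is usually obtained from the Gaussian approximation (standard relation; the paper's contribution is its generalization to all fixed energy transfers):

      % elastic intensity vs apparent mean-squared displacement
      S_{\mathrm{el}}(Q, T) \propto \exp\!\Big(-\tfrac{1}{3} Q^2 \langle u^2 \rangle(T)\Big)
      \quad\Longrightarrow\quad
      \langle u^2 \rangle(T) = -3\,\frac{\partial \ln S_{\mathrm{el}}(Q, T)}{\partial (Q^2)} .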

  3. Model-free approach to the estimation of radiation hazards. I. Theory

    International Nuclear Information System (INIS)

    Zaider, M.; Brenner, D.J.

    1986-01-01

    The experience of the Japanese atomic bomb survivors constitutes to date the major data base for evaluating the effects of low doses of ionizing radiation on human populations. Although numerous analyses have been performed and published concerning this experience, it is clear that no consensus has emerged as to the conclusions that may be drawn to assist in setting realistic radiation protection guidelines. In part this is an inherent consequence of the rather limited amount of data available. In this paper the authors address an equally important problem; namely, the use of arbitrary parametric risk models which have little theoretical foundation, yet almost totally determine the final conclusions drawn. They propose the use of a model-free approach to the estimation of radiation hazards.

  4. A model-free description of the experience of the Japanese atomic bomb survivors

    International Nuclear Information System (INIS)

    Zaider, M.; Brenner, D.J.

    1985-01-01

    The effects of low doses of radiation on human populations are primarily evaluated from the experience of the Japanese atomic bomb survivors. Most evaluations performed to date have in common: a) the use of a parametric model for estimating risks, and b) factorizing, in the model, the time post-exposure and dose dependencies. Since there is very little theoretical understanding of carcinogenesis, the choice of any one model is arbitrary. Furthermore, different models make risk predictions which might differ by factors as large as 100. The Japanese data base is analyzed here using non-parametric monotonic regression methods. These methods make use of only one assumption, namely that the effect is monotonically changing with its covariates (neutron and gamma dose, and time). With these mild restrictions, model-free predictions are made of the risks of irradiation with indirectly ionizing radiation, whether delivered singly or in combination

  5. Model-free adaptive speed control on travelling wave ultrasonic motor

    Science.gov (United States)

    Di, Sisi; Li, Huafeng

    2018-01-01

    This paper introduced a new data-driven control (DDC) method for the speed control of an ultrasonic motor (USM). The model-free adaptive control (MFAC) strategy was presented in terms of its principles, algorithms, and parameter selection. To verify the efficiency of the proposed method, a speed-frequency-time model, which contained all the measurable nonlinearity and uncertainties based on experimental data, was established for simulation to mimic the USM operation system. Furthermore, the model was identified using the particle swarm optimization (PSO) method. Then, the control of the simulated system using MFAC was evaluated under different expectations in terms of overshoot, rise time and steady-state error. Finally, the MFAC results were compared with those of proportional-integral-derivative (PID) control to demonstrate its advantages in controlling a general random system.

  6. Base compaction specification feasibility analysis.

    Science.gov (United States)

    2012-12-01

    The objective of this research is to establish the technical engineering and cost : analysis concepts that will enable WisDOT management to objectively evaluate the : feasibility of switching construction specification philosophies for aggregate base...

  7. Hand-Based Biometric Analysis

    Science.gov (United States)

    Bebis, George (Inventor); Amayeh, Gholamreza (Inventor)

    2015-01-01

    Hand-based biometric analysis systems and techniques are described which provide robust hand-based identification and verification. An image of a hand is obtained, which is then segmented into a palm region and separate finger regions. Acquisition of the image is performed without requiring particular orientation or placement restrictions. Segmentation is performed without the use of reference points on the images. Each segment is analyzed by calculating a set of Zernike moment descriptors for the segment. The feature parameters thus obtained are then fused and compared to stored sets of descriptors in enrollment templates to arrive at an identity decision. By using Zernike moments, and through additional manipulation, the biometric analysis is invariant to rotation, scale, or translation of an input image. Additionally, the analysis re-uses commonly seen terms in the Zernike calculations to achieve additional efficiencies over traditional Zernike moment calculation.

  8. Model-free and analytical EAP reconstruction via spherical polar Fourier diffusion MRI.

    Science.gov (United States)

    Cheng, Jian; Ghosh, Aurobrata; Jiang, Tianzi; Deriche, Rachid

    2010-01-01

    How to estimate the diffusion Ensemble Average Propagator (EAP) from the DWI signals in q-space is an open problem in the diffusion MRI field. Many methods have been proposed to estimate the Orientation Distribution Function (ODF), which is used to describe the fiber direction. However, the ODF is just one of the features of the EAP. Compared with the ODF, the EAP contains the full information about the diffusion process, which reflects the complex tissue micro-structure. Diffusion Orientation Transform (DOT) and Diffusion Spectrum Imaging (DSI) are two important methods to estimate the EAP from the signal. However, DOT is based on a mono-exponential assumption and DSI needs many samples and very large b-values. In this paper, we propose Spherical Polar Fourier Imaging (SPFI), a novel model-free, fast, robust, analytical EAP reconstruction method, which requires almost no assumptions about the data and does not need many samples. SPFI naturally combines the DWI signals with different b-values. It is an analytical linear transformation from the q-space signal to the EAP profile represented by Spherical Harmonics (SH). We validated the proposed method on synthetic data, phantom data and real data. It works well in all experiments, especially for data with low SNR, low anisotropy, and non-exponential decay.

  9. A Model-Free Scheme for Meme Ranking in Social Media

    Science.gov (United States)

    He, Saike; Zheng, Xiaolong; Zeng, Daniel

    2015-01-01

    The prevalence of social media has greatly catalyzed the dissemination and proliferation of online memes (e.g., ideas, topics, melodies, tags, etc.). However, this information abundance is exceeding the capability of online users to consume it. Ranking memes based on their popularity could promote online advertisement and content distribution. Despite such importance, few existing works solve this problem well. They are hampered either by impractical assumptions or by an inability to characterize dynamic information. As such, in this paper, we elaborate a model-free scheme to rank online memes in the context of social media. This scheme is capable of characterizing the nonlinear interactions of online users, which mark the process of meme diffusion. Empirical studies on two large-scale, real-world datasets (one in English and one in Chinese) demonstrate the effectiveness and robustness of the proposed scheme. In addition, due to its fine-grained modeling of user dynamics, this ranking scheme can also be utilized to explain meme popularity through the lens of social influence. PMID:26823638

  10. A Model-Free Scheme for Meme Ranking in Social Media.

    Science.gov (United States)

    He, Saike; Zheng, Xiaolong; Zeng, Daniel

    2016-01-01

    The prevalence of social media has greatly catalyzed the dissemination and proliferation of online memes (e.g., ideas, topics, melodies, tags, etc.). However, this information abundance is exceeding the capability of online users to consume it. Ranking memes based on their popularity could promote online advertisement and content distribution. Despite such importance, few existing works solve this problem well. They are hampered either by impractical assumptions or by an inability to characterize dynamic information. As such, in this paper, we elaborate a model-free scheme to rank online memes in the context of social media. This scheme is capable of characterizing the nonlinear interactions of online users, which mark the process of meme diffusion. Empirical studies on two large-scale, real-world datasets (one in English and one in Chinese) demonstrate the effectiveness and robustness of the proposed scheme. In addition, due to its fine-grained modeling of user dynamics, this ranking scheme can also be utilized to explain meme popularity through the lens of social influence.

  11. Importance-performance analysis based SWOT analysis

    OpenAIRE

    Phadermrod, Boonyarat; Crowder, Richard M.; Wills, Gary B.

    2016-01-01

    SWOT analysis, a commonly used tool for strategic planning, is traditionally a form of brainstorming. Hence, it has been criticised that it is likely to hold subjective views of the individuals who participate in a brainstorming session and that SWOT factors are not prioritized by their significance thus it may result in an improper strategic action. While most studies of SWOT analysis have only focused on solving these shortcomings separately, this study offers an approach to diminish both s...

  12. Model-free adaptive control optimization using a chaotic particle swarm approach

    International Nuclear Information System (INIS)

    Santos Coelho, Leandro dos; Rodrigues Coelho, Antonio Augusto

    2009-01-01

    It is well known that conventional control theories are widely suited for applications where the processes can be reasonably described in advance. However, when the plant's dynamics are hard to characterize precisely or are subject to environmental uncertainties, one may encounter difficulties in applying the conventional controller design methodologies. Despite the difficulty in achieving high control performance, the fine tuning of controller parameters is a tedious task that always requires experts with knowledge in both control theory and process information. Nowadays, more and more studies have focused on the development of adaptive control algorithms that can be directly applied to complex processes whose dynamics are poorly modeled and/or have severe nonlinearities. In this context, this paper presents the design of a Model-Free Learning Adaptive Control (MFLAC) based on pseudo-gradient concepts and an optimization procedure using a Particle Swarm Optimization (PSO) approach with a constriction coefficient and Henon chaotic sequences (CPSOH). PSO is a stochastic global optimization technique inspired by the social behavior of bird flocking. PSO models the exploration of a problem space by a population of particles. Each particle in PSO has a randomized velocity associated with it, which moves it through the space of the problem. Since chaotic mapping enjoys certainty, ergodicity and the stochastic property, the proposed CPSOH incorporates chaotic mapping, which adds some flexibility to particle movements in each iteration. The chaotic sequences also allow exploration at early stages and exploitation at later stages of the CPSOH search procedure. The motivation for applying the CPSOH approach is to overcome the limitation of the conventional MFLAC design, which cannot guarantee satisfactory control performance when the plant has different gains over its operational range and the controller is tuned by user trial-and-error. Numerical results of the MFLAC with CPSOH
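    The abstract gives no code; as a hedged sketch of how a Henon-map chaotic sequence and Clerc's constriction coefficient can enter a PSO velocity update, one might write the following (the objective function, particle counts and parameters are illustrative, not the authors' MFLAC setup):

        import numpy as np

        def henon_stream(n, a=1.4, b=0.3, x=0.1, y=0.1):
            """Generate n Henon-map values, rescaled into [0, 1]."""
            xs = np.empty(n)
            for i in range(n):
                x, y = 1.0 - a * x * x + y, b * x
                xs[i] = x
            return (xs - xs.min()) / (xs.max() - xs.min())

        def objective(p):                      # toy cost: distance from the origin
            return np.sum(p**2, axis=-1)

        c1 = c2 = 2.05
        phi = c1 + c2
        chi = 2.0 / abs(2.0 - phi - np.sqrt(phi**2 - 4.0 * phi))   # constriction, ~0.729

        n_particles, dim, iters = 20, 2, 50
        pos = np.random.uniform(-5, 5, (n_particles, dim))
        vel = np.zeros_like(pos)
        pbest, pbest_cost = pos.copy(), objective(pos)
        gbest = pbest[np.argmin(pbest_cost)]

        # Chaotic numbers replace the usual uniform random draws r1, r2.
        chaos = henon_stream(2 * n_particles * dim * iters).reshape(iters, 2, n_particles, dim)

        for t in range(iters):
            r1, r2 = chaos[t]
            vel = chi * (vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos))
            pos = pos + vel
            cost = objective(pos)
            improved = cost < pbest_cost
            pbest[improved], pbest_cost[improved] = pos[improved], cost[improved]
            gbest = pbest[np.argmin(pbest_cost)]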

  13. Model-free adaptive control optimization using a chaotic particle swarm approach

    Energy Technology Data Exchange (ETDEWEB)

    Santos Coelho, Leandro dos [Industrial and Systems Engineering Graduate Program, LAS/PPGEPS, Pontifical Catholic University of Parana, PUCPR, Imaculada Conceicao, 1155, 80215-901 Curitiba, Parana (Brazil)], E-mail: leandro.coelho@pucpr.br; Rodrigues Coelho, Antonio Augusto [Department of Automation and Systems, Federal University of Santa Catarina, Box 476, 88040-900 Florianopolis, Santa Catarina (Brazil)], E-mail: aarc@das.ufsc.br

    2009-08-30

    It is well known that conventional control theories are widely suited for applications where the processes can be reasonably described in advance. However, when the plant's dynamics are hard to characterize precisely or are subject to environmental uncertainties, one may encounter difficulties in applying the conventional controller design methodologies. Despite the difficulty in achieving high control performance, the fine tuning of controller parameters is a tedious task that always requires experts with knowledge in both control theory and process information. Nowadays, more and more studies have focused on the development of adaptive control algorithms that can be directly applied to complex processes whose dynamics are poorly modeled and/or have severe nonlinearities. In this context, this paper presents the design of a Model-Free Learning Adaptive Control (MFLAC) based on pseudo-gradient concepts and an optimization procedure using a Particle Swarm Optimization (PSO) approach with a constriction coefficient and Henon chaotic sequences (CPSOH). PSO is a stochastic global optimization technique inspired by the social behavior of bird flocking. PSO models the exploration of a problem space by a population of particles. Each particle in PSO has a randomized velocity associated with it, which moves it through the space of the problem. Since chaotic mapping enjoys certainty, ergodicity and the stochastic property, the proposed CPSOH incorporates chaotic mapping, which adds some flexibility to particle movements in each iteration. The chaotic sequences also allow exploration at early stages and exploitation at later stages of the CPSOH search procedure. The motivation for applying the CPSOH approach is to overcome the limitation of the conventional MFLAC design, which cannot guarantee satisfactory control performance when the plant has different gains over its operational range and the controller is tuned by user trial-and-error. Numerical results of the MFLAC with

  14. Simple Model-Free Controller for the Stabilization of Planetary Inverted Pendulum

    Directory of Open Access Journals (Sweden)

    Huanhuan Mai

    2012-01-01

    Full Text Available A simple model-free controller is presented for solving nonlinear dynamic control problems. As an example of the problem, a planetary gear-type inverted pendulum (PIP) is discussed. Controlling this inherently unstable system, which requires real-time control responses, makes the design of a smart and simple controller necessary. The proposed model-free controller includes a swing-up controller part and a stabilization controller part; neither controller has any information about the PIP. Since the input/output scaling parameters of the fuzzy controller are highly sensitive, we use a genetic algorithm (GA) to obtain the optimal control parameters. The experimental results show the effectiveness and robustness of the present controller.

  15. Model-free prediction of noisy chaotic time series by deep learning

    OpenAIRE

    Yeo, Kyongmin

    2017-01-01

    We present a deep neural network for a model-free prediction of a chaotic dynamical system from noisy observations. The proposed deep learning model aims to predict the conditional probability distribution of a state variable. The Long Short-Term Memory network (LSTM) is employed to model the nonlinear dynamics and a softmax layer is used to approximate a probability distribution. The LSTM model is trained by minimizing a regularized cross-entropy function. The LSTM model is validated against...

  16. Complex Wavelet Based Modulation Analysis

    DEFF Research Database (Denmark)

    Luneau, Jean-Marc; Lebrun, Jérôme; Jensen, Søren Holdt

    2008-01-01

    Low-frequency modulation of sound carries important information for speech and music. The modulation spectrum is commonly obtained by spectral analysis of only the temporal envelopes of the sub-bands from a time-frequency analysis. Processing in this domain usually creates undesirable distortions...... polynomial trends. Moreover an analytic Hilbert-like transform is possible with complex wavelets implemented as an orthogonal filter bank. By working in an alternative transform domain coined as “Modulation Subbands”, this transform shows very promising denoising capabilities and suggests new approaches for joint
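    For context, the conventional modulation-spectrum computation mentioned above (spectral analysis of sub-band temporal envelopes) can be sketched as follows; the filter settings and test signal are illustrative, and the paper's complex-wavelet transform itself is not reproduced here:

        import numpy as np
        from scipy.signal import butter, filtfilt, hilbert

        fs = 16000
        t = np.arange(0, 1.0, 1.0 / fs)
        # 1 kHz carrier amplitude-modulated at 4 Hz (synthetic test signal)
        signal = np.sin(2 * np.pi * 1000 * t) * (1 + 0.5 * np.sin(2 * np.pi * 4 * t))

        def band_envelope(x, lo, hi, fs):
            """Temporal envelope of one sub-band (band-pass filter + Hilbert magnitude)."""
            b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
            sub = filtfilt(b, a, x)
            return np.abs(hilbert(sub))

        env = band_envelope(signal, 800, 1200, fs)
        mod_spectrum = np.abs(np.fft.rfft(env - env.mean()))
        mod_freqs = np.fft.rfftfreq(env.size, 1.0 / fs)
        print("dominant modulation frequency: %.1f Hz"
              % mod_freqs[np.argmax(mod_spectrum)])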

  17. CONTEXT BASED FOOD IMAGE ANALYSIS

    OpenAIRE

    He, Ye; Xu, Chang; Khanna, Nitin; Boushey, Carol J.; Delp, Edward J.

    2013-01-01

    We are developing a dietary assessment system that records daily food intake through the use of food images. Recognizing food in an image is difficult due to large visual variance with respect to eating or preparation conditions. This task becomes even more challenging when different foods have similar visual appearance. In this paper we propose to incorporate two types of contextual dietary information, food co-occurrence patterns and personalized learning models, in food image analysis to r...

  18. Lattice and strain analysis of atomic resolution Z-contrast images based on template matching

    Energy Technology Data Exchange (ETDEWEB)

    Zuo, Jian-Min, E-mail: jianzuo@uiuc.edu [Department of Materials Science and Engineering, University of Illinois, Urbana, IL 61801 (United States); Seitz Materials Research Laboratory, University of Illinois, Urbana, IL 61801 (United States); Shah, Amish B. [Center for Microanalysis of Materials, Materials Research Laboratory, University of Illinois at Urbana-Champaign, Urbana, IL 61801 (United States); Kim, Honggyu; Meng, Yifei; Gao, Wenpei [Department of Materials Science and Engineering, University of Illinois, Urbana, IL 61801 (United States); Seitz Materials Research Laboratory, University of Illinois, Urbana, IL 61801 (United States); Rouviére, Jean-Luc [CEA-INAC/UJF-Grenoble UMR-E, SP2M, LEMMA, Minatec, Grenoble 38054 (France)

    2014-01-15

    A real space approach is developed based on template matching for quantitative lattice analysis using atomic resolution Z-contrast images. The method, called TeMA, uses the template of an atomic column, or a group of atomic columns, to transform the image into a lattice of correlation peaks. This is helped by using a local intensity-adjusted correlation and by the design of templates. Lattice analysis is performed on the correlation peaks. A reference lattice is used to correct for scan noise and scan distortions in the recorded images. Using these methods, we demonstrate that a precision of a few picometers is achievable in lattice measurement using aberration corrected Z-contrast images. As an application, we apply the methods to strain analysis of a molecular beam epitaxy (MBE) grown LaMnO3 and SrMnO3 superlattice. The results show alternating epitaxial strain inside the superlattice and its variations across interfaces at the spatial resolution of a single perovskite unit cell. Our methods are general, model-free and provide high spatial resolution for lattice analysis. - Highlights: • A real space approach is developed for strain analysis using atomic resolution Z-contrast images and template matching. • A precision of a few picometers is achievable in the measurement of lattice displacements. • The spatial resolution of a single perovskite unit cell is demonstrated for a LaMnO3 and SrMnO3 superlattice grown by MBE.
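    A generic illustration of the template-matching step described in this record (not the TeMA code; sub-pixel refinement, reference-lattice correction and scan-distortion handling are omitted, and the template, image and threshold are synthetic):

        import numpy as np
        from scipy.signal import correlate2d
        from scipy.ndimage import maximum_filter

        def column_template(size=9, sigma=2.0):
            """Gaussian blob standing in for an atomic-column template."""
            y, x = np.mgrid[:size, :size] - (size - 1) / 2.0
            return np.exp(-(x**2 + y**2) / (2 * sigma**2))

        def lattice_peaks(image, template, threshold=0.5):
            """Correlate the template with the image and keep local correlation maxima."""
            tpl = template - template.mean()
            corr = correlate2d(image - image.mean(), tpl, mode="same")
            corr /= corr.max()
            local_max = (corr == maximum_filter(corr, size=template.shape[0]))
            return np.argwhere(local_max & (corr > threshold))   # (row, col) positions

        # Toy image: a noisy square lattice of Gaussian columns.
        img = np.random.rand(64, 64) * 0.1
        for r in range(8, 64, 16):
            for c in range(8, 64, 16):
                img[r - 4:r + 5, c - 4:c + 5] += column_template()
        print(lattice_peaks(img, column_template()).shape)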

  19. Model-free adaptive control of supercritical circulating fluidized-bed boilers

    Science.gov (United States)

    Cheng, George Shu-Xing; Mulkey, Steven L

    2014-12-16

    A novel 3-Input-3-Output (3×3) Fuel-Air Ratio Model-Free Adaptive (MFA) controller is introduced, which can effectively control key process variables including Bed Temperature, Excess O2, and Furnace Negative Pressure of combustion processes of advanced boilers. A novel 7-input-7-output (7×7) MFA control system is also described for controlling a combined 3-Input-3-Output (3×3) process of Boiler-Turbine-Generator (BTG) units and a 5×5 CFB combustion process of advanced boilers. Those boilers include Circulating Fluidized-Bed (CFB) Boilers and Once-Through Supercritical Circulating Fluidized-Bed (OTSC CFB) Boilers.

  20. A Model-Free No-arbitrage Price Bound for Variance Options

    Energy Technology Data Exchange (ETDEWEB)

    Bonnans, J. Frederic, E-mail: frederic.bonnans@inria.fr [Ecole Polytechnique, INRIA-Saclay (France); Tan Xiaolu, E-mail: xiaolu.tan@polytechnique.edu [Ecole Polytechnique, CMAP (France)

    2013-08-01

    We suggest a numerical approximation for an optimization problem, motivated by its applications in finance to find the model-free no-arbitrage bound of variance options given the marginal distributions of the underlying asset. A first approximation restricts the computation to a bounded domain. Then we propose a gradient projection algorithm together with the finite difference scheme to solve the optimization problem. We prove the general convergence, and derive some convergence rate estimates. Finally, we give some numerical examples to test the efficiency of the algorithm.

  1. Evidence based practice readiness: A concept analysis.

    Science.gov (United States)

    Schaefer, Jessica D; Welton, John M

    2018-01-15

    To analyse and define the concept "evidence based practice readiness" in nurses. Evidence based practice readiness is a term commonly used in health literature, but without a clear understanding of what readiness means. Concept analysis is needed to define the meaning of evidence based practice readiness. A concept analysis was conducted using Walker and Avant's method to clarify the defining attributes of evidence based practice readiness as well as antecedents and consequences. A Boolean search of PubMed and Cumulative Index for Nursing and Allied Health Literature was conducted and limited to those published after the year 2000. Eleven articles met the inclusion criteria for this analysis. Evidence based practice readiness incorporates personal and organisational readiness. Antecedents include the ability to recognize the need for evidence based practice, the ability to access and interpret evidence based practice, and a supportive environment. The concept analysis demonstrates the complexity of the concept and its implications for nursing practice. The four pillars of evidence based practice readiness: nursing, training, equipping and leadership support are necessary to achieve evidence based practice readiness. Nurse managers are in the position to address all elements of evidence based practice readiness. Creating an environment that fosters evidence based practice can improve patient outcomes, decrease health care costs, increase nurses' job satisfaction and decrease nursing turnover. © 2018 John Wiley & Sons Ltd.

  2. Model-free optimization based feedforward control for an inkjet printhead

    NARCIS (Netherlands)

    Ezzeldin Mahdy Abdelmonem, M.; Bosch, van den P.P.J.; Jokic, A.; Waarsing, R.

    2010-01-01

    Inkjet is an important technology in document printing and many new industrial applications. As inkjet developments are moving towards higher productivity and quality, it is required to achieve small droplet size which is fired at a high jetting frequency. Inkjet printers are now widely used to form

  3. JAVA based LCD Reconstruction and Analysis Tools

    International Nuclear Information System (INIS)

    Bower, G.

    2004-01-01

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities

  4. Java based LCD reconstruction and analysis tools

    International Nuclear Information System (INIS)

    Bower, Gary; Cassell, Ron; Graf, Norman; Johnson, Tony; Ronan, Mike

    2001-01-01

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities

  5. Structured Performance Analysis for Component Based Systems

    OpenAIRE

    Salmi , N.; Moreaux , Patrice; Ioualalen , M.

    2012-01-01

    International audience; The Component Based System (CBS) paradigm is now largely used to design software systems. In addition, performance and behavioural analysis remains a required step for the design and the construction of efficient systems. This is especially the case of CBS, which involve interconnected components running concurrent processes. This paper proposes a compositional method for modeling and structured performance analysis of CBS. Modeling is based on Stochastic Well-formed...

  6. Reliability analysis of software based safety functions

    International Nuclear Information System (INIS)

    Pulkkinen, U.

    1993-05-01

    The methods applicable in the reliability analysis of software-based safety functions are described in the report. Although the safety functions also include other components, the main emphasis in the report is on the reliability analysis of software. Checklist-type qualitative reliability analysis methods, such as failure mode and effects analysis (FMEA), are described, as well as software fault tree analysis. Safety analysis based on Petri nets is discussed. The most essential concepts and models of quantitative software reliability analysis are described. The most common software metrics and their combined use with software reliability models are discussed. The application of software reliability models in PSA is evaluated; it is observed that the recent software reliability models do not directly produce the estimates needed in PSA. As a result of the study, some recommendations and conclusions are drawn. Among these are the need for formal methods in the analysis and development of software-based systems, the applicability of qualitative reliability engineering methods in connection with PSA, and the need to make the requirements for software-based systems and their analyses in the regulatory guides more precise. (orig.). (46 refs., 13 figs., 1 tab.)

  7. Preference-Based Recommendations for OLAP Analysis

    Science.gov (United States)

    Jerbi, Houssem; Ravat, Franck; Teste, Olivier; Zurfluh, Gilles

    This paper presents a framework for integrating OLAP and recommendations. We focus on the anticipatory recommendation process that assists the user during his OLAP analysis by proposing the forthcoming analysis step. We present a context-aware preference model that matches decision-makers' intuition, and we discuss a preference-based approach for generating personalized recommendations.

  8. Team-Based Care: A Concept Analysis.

    Science.gov (United States)

    Baik, Dawon

    2017-10-01

    The purpose of this concept analysis is to clarify and analyze the concept of team-based care in clinical practice. Team-based care has garnered attention as a way to enhance healthcare delivery and patient care related to quality and safety. However, there is no consensus on the concept of team-based care; as a result, the lack of common definition impedes further studies on team-based care. This analysis was conducted using Walker and Avant's strategy. Literature searches were conducted using PubMed, Cumulative Index to Nursing and Allied Health Literature (CINAHL), and PsycINFO, with a timeline from January 1985 to December 2015. The analysis demonstrates that the concept of team-based care has three core attributes: (a) interprofessional collaboration, (b) patient-centered approach, and (c) integrated care process. This is accomplished through understanding other team members' roles and responsibilities, a climate of mutual respect, and organizational support. Consequences of team-based care are identified with three aspects: (a) patient, (b) healthcare professional, and (c) healthcare organization. This concept analysis helps better understand the characteristics of team-based care in the clinical practice as well as promote the development of a theoretical definition of team-based care. © 2016 Wiley Periodicals, Inc.

  9. Robust Mediation Analysis Based on Median Regression

    Science.gov (United States)

    Yuan, Ying; MacKinnon, David P.

    2014-01-01

    Mediation analysis has many applications in psychology and the social sciences. The most prevalent methods typically assume that the error distribution is normal and homoscedastic. However, this assumption may rarely be met in practice, which can affect the validity of the mediation analysis. To address this problem, we propose robust mediation analysis based on median regression. Our approach is robust to various departures from the assumption of homoscedasticity and normality, including heavy-tailed, skewed, contaminated, and heteroscedastic distributions. Simulation studies show that under these circumstances, the proposed method is more efficient and powerful than standard mediation analysis. We further extend the proposed robust method to multilevel mediation analysis, and demonstrate through simulation studies that the new approach outperforms the standard multilevel mediation analysis. We illustrate the proposed method using data from a program designed to increase reemployment and enhance mental health of job seekers. PMID:24079925

  10. The PIT-trap-A "model-free" bootstrap procedure for inference about regression models with discrete, multivariate responses.

    Science.gov (United States)

    Warton, David I; Thibaut, Loïc; Wang, Yi Alice

    2017-01-01

    Bootstrap methods are widely used in statistics, and bootstrapping of residuals can be especially useful in the regression context. However, difficulties are encountered extending residual resampling to regression settings where residuals are not identically distributed (thus not amenable to bootstrapping)-common examples including logistic or Poisson regression and generalizations to handle clustered or multivariate data, such as generalised estimating equations. We propose a bootstrap method based on probability integral transform (PIT-) residuals, which we call the PIT-trap, which assumes data come from some marginal distribution F of known parametric form. This method can be understood as a type of "model-free bootstrap", adapted to the problem of discrete and highly multivariate data. PIT-residuals have the key property that they are (asymptotically) pivotal. The PIT-trap thus inherits the key property, not afforded by any other residual resampling approach, that the marginal distribution of data can be preserved under PIT-trapping. This in turn enables the derivation of some standard bootstrap properties, including second-order correctness of pivotal PIT-trap test statistics. In multivariate data, bootstrapping rows of PIT-residuals affords the property that it preserves correlation in data without the need for it to be modelled, a key point of difference as compared to a parametric bootstrap. The proposed method is illustrated on an example involving multivariate abundance data in ecology, and demonstrated via simulation to have improved properties as compared to competing resampling methods.
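    A minimal sketch of randomized PIT residuals and one resampling step for a Poisson model, assuming fitted means are already available (e.g. from a GLM fit); the full PIT-trap machinery of the paper (multivariate rows, pivotal test statistics) is not reproduced, and the numbers are placeholders:

        import numpy as np
        from scipy.stats import poisson

        rng = np.random.default_rng(1)
        mu_hat = np.array([1.5, 2.0, 0.8, 3.2, 1.1])    # placeholder fitted means
        y = rng.poisson(mu_hat)

        # Randomized probability integral transform residuals: uniform on
        # [F(y-1), F(y)] so that they are (asymptotically) U(0,1) for discrete data.
        lower = poisson.cdf(y - 1, mu_hat)
        upper = poisson.cdf(y, mu_hat)
        u = lower + rng.uniform(size=y.size) * (upper - lower)

        # One bootstrap resample: resample the PIT residuals, then push them back
        # through the fitted marginal distributions to get new counts.
        u_star = rng.choice(u, size=u.size, replace=True)
        y_star = poisson.ppf(u_star, mu_hat).astype(int)
        print(y, y_star)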

  11. Chip based electroanalytical systems for cell analysis

    DEFF Research Database (Denmark)

    Spegel, C.; Heiskanen, A.; Skjolding, L.H.D.

    2008-01-01

    ' measurements of processes related to living cells, i.e., systems without lysing the cells. The focus is on chip based amperometric and impedimetric cell analysis systems where measurements utilizing solely carbon fiber microelectrodes (CFME) and other nonchip electrode formats, such as CFME for exocytosis...

  12. Support vector machine learning-based fMRI data group analysis.

    Science.gov (United States)

    Wang, Ze; Childress, Anna R; Wang, Jiongjiong; Detre, John A

    2007-07-15

    To explore the multivariate nature of fMRI data and to consider the inter-subject brain response discrepancies, a multivariate and brain response model-free method is fundamentally required. Two such methods are presented in this paper by integrating a machine learning algorithm, the support vector machine (SVM), and the random effect model. Without any brain response modeling, SVM was used to extract a whole brain spatial discriminance map (SDM), representing the brain response difference between the contrasted experimental conditions. Population inference was then obtained through the random effect analysis (RFX) or permutation testing (PMU) on the individual subjects' SDMs. Applied to arterial spin labeling (ASL) perfusion fMRI data, SDM RFX yielded lower false-positive rates in the null hypothesis test and higher detection sensitivity for synthetic activations with varying cluster size and activation strengths, compared to the univariate general linear model (GLM)-based RFX. For a sensory-motor ASL fMRI study, both SDM RFX and SDM PMU yielded similar activation patterns to GLM RFX and GLM PMU, respectively, but with higher t values and cluster extensions at the same significance level. Capitalizing on the absence of temporal noise correlation in ASL data, this study also incorporated PMU in the individual-level GLM and SVM analyses accompanied by group-level analysis through RFX or group-level PMU. Providing inferences on the probability of being activated or deactivated at each voxel, these individual-level PMU-based group analysis methods can be used to threshold the analysis results of GLM RFX, SDM RFX or SDM PMU.
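    The two-level scheme described here can be roughly sketched as follows, with synthetic data standing in for the fMRI time series (a linear SVM weight map per subject, then a voxel-wise one-sample t-test across subjects; dimensions and effect sizes are illustrative only):

        import numpy as np
        from scipy import stats
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        n_subjects, n_trials, n_voxels = 12, 40, 500

        sdm = np.zeros((n_subjects, n_voxels))
        for s in range(n_subjects):
            X = rng.standard_normal((n_trials, n_voxels))
            labels = np.repeat([0, 1], n_trials // 2)
            X[labels == 1, :50] += 0.8                     # synthetic "activation" in 50 voxels
            clf = SVC(kernel="linear").fit(X, labels)
            sdm[s] = clf.coef_.ravel()                     # subject-level discriminance map

        # Random-effects analysis: test whether each voxel's weight differs from 0.
        t_map, p_map = stats.ttest_1samp(sdm, popmean=0.0, axis=0)
        print("voxels significant at p<0.001:", int(np.sum(p_map < 1e-3)))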

  13. Network-based analysis of proteomic profiles

    KAUST Repository

    Wong, Limsoon

    2016-01-26

    Mass spectrometry (MS)-based proteomics is a widely used and powerful tool for profiling systems-wide protein expression changes. It can be applied for various purposes, e.g. biomarker discovery in diseases and study of drug responses. Although RNA-based high-throughput methods have been useful in providing glimpses into the underlying molecular processes, the evidences they provide are indirect. Furthermore, RNA and corresponding protein levels have been known to have poor correlation. On the other hand, MS-based proteomics tend to have consistency issues (poor reproducibility and inter-sample agreement) and coverage issues (inability to detect the entire proteome) that need to be urgently addressed. In this talk, I will discuss how these issues can be addressed by proteomic profile analysis techniques that use biological networks (especially protein complexes) as the biological context. In particular, I will describe several techniques that we have been developing for network-based analysis of proteomics profile. And I will present evidence that these techniques are useful in identifying proteomics-profile analysis results that are more consistent, more reproducible, and more biologically coherent, and that these techniques allow expansion of the detected proteome to uncover and/or discover novel proteins.

  14. Texture-based analysis of COPD

    DEFF Research Database (Denmark)

    Sørensen, Lauge; Nielsen, Mads; Lo, Pechin Chien Pau

    2012-01-01

    This study presents a fully automatic, data-driven approach for texture-based quantitative analysis of chronic obstructive pulmonary disease (COPD) in pulmonary computed tomography (CT) images. The approach uses supervised learning where the class labels are, in contrast to previous work, based...... on measured lung function instead of on manually annotated regions of interest (ROIs). A quantitative measure of COPD is obtained by fusing COPD probabilities computed in ROIs within the lung fields where the individual ROI probabilities are computed using a k nearest neighbor (kNN ) classifier. The distance...... and subsequently applied to classify 200 independent images from the same screening trial. The texture-based measure was significantly better at discriminating between subjects with and without COPD than were the two most common quantitative measures of COPD in the literature, which are based on density...

  15. Watershed-based Morphometric Analysis: A Review

    Science.gov (United States)

    Sukristiyanti, S.; Maria, R.; Lestiana, H.

    2018-02-01

    Drainage basin/watershed analysis based on morphometric parameters is very important for watershed planning. Morphometric analysis of a watershed is the best method to identify the relationship of various aspects in the area. Although many technical papers have dealt with this area of study, there is no particular standard classification and implication of each parameter, which makes it confusing to evaluate the value of every morphometric parameter. This paper deals with the meaning of values of the various morphometric parameters, with adequate contextual information. A critical review is presented on each classification, the range of values, and their implications. Besides classification and its impact, the authors are also concerned about the quality of input data, whether in data preparation or in the scale/level of detail of mapping. This review hopefully gives a comprehensive explanation to assist upcoming research dealing with morphometric analysis.

  16. XML-based analysis interface for particle physics data analysis

    International Nuclear Information System (INIS)

    Hu Jifeng; Lu Xiaorui; Zhang Yangheng

    2011-01-01

    The letter focuses on an XML-based interface and its framework for particle physics data analysis. The interface uses a concise XML syntax to describe the basic tasks in data analysis (event selection, kinematic fitting, particle identification, etc.) and a basic processing logic: the next step goes on if and only if this step succeeds. The framework can perform an analysis without compiling, by loading the XML-interface file, setting parameters at run-time and running dynamically. An analysis coded in XML instead of C++ is easy to understand and use, effectively reduces the workload, and enables users to carry out their analyses quickly. The framework has been developed on the BESⅢ offline software system (BOSS) with object-oriented C++ programming. The functions required by the regular tasks and the basic processing logic are implemented either as standard modules or are inherited from modules in BOSS. The interface and its framework have been tested in performing physics analysis. (authors)

  17. Chapter 11. Community analysis-based methods

    Energy Technology Data Exchange (ETDEWEB)

    Cao, Y.; Wu, C.H.; Andersen, G.L.; Holden, P.A.

    2010-05-01

    Microbial communities are each a composite of populations whose presence and relative abundance in water or other environmental samples are a direct manifestation of environmental conditions, including the introduction of microbe-rich fecal material and factors promoting persistence of the microbes therein. As shown by culture-independent methods, different animal-host fecal microbial communities appear distinctive, suggesting that their community profiles can be used to differentiate fecal samples and to potentially reveal the presence of host fecal material in environmental waters. Cross-comparisons of microbial communities from different hosts also reveal relative abundances of genetic groups that can be used to distinguish sources. In increasing order of their information richness, several community analysis methods hold promise for MST applications: phospholipid fatty acid (PLFA) analysis, denaturing gradient gel electrophoresis (DGGE), terminal restriction fragment length polymorphism (TRFLP), cloning/sequencing, and PhyloChip. Specific case studies involving TRFLP and PhyloChip approaches demonstrate the ability of community-based analyses of contaminated waters to confirm a diagnosis of water quality based on host-specific marker(s). The success of community-based MST for comprehensively confirming fecal sources relies extensively upon using appropriate multivariate statistical approaches. While community-based MST is still under evaluation and development as a primary diagnostic tool, results presented herein demonstrate its promise. Coupled with its inherently comprehensive ability to capture an unprecedented amount of microbiological data that is relevant to water quality, the tools for microbial community analysis are increasingly accessible, and community-based approaches have unparalleled potential for translation into rapid, perhaps real-time, monitoring platforms.

  18. Web-based Analysis Services Report

    CERN Document Server

    AUTHOR|(CDS)2108758; Canali, Luca; Grancher, Eric; Lamanna, Massimo; McCance, Gavin; Mato Vila, Pere; Piparo, Danilo; Moscicki, Jakub; Pace, Alberto; Brito Da Rocha, Ricardo; Simko, Tibor; Smith, Tim; Tejedor Saavedra, Enric; CERN. Geneva. IT Department

    2017-01-01

    Web-based services (cloud services) is an important trend to innovate end-user services while optimising the service operational costs. CERN users are constantly proposing new approaches (inspired from services existing on the web, tools used in education or other science or based on their experience in using existing computing services). In addition, industry and open source communities have recently made available a large number of powerful and attractive tools and platforms that enable large scale data processing. “Big Data” software stacks notably provide solutions for scalable storage, distributed compute and data analysis engines, data streaming, web-based interfaces (notebooks). Some of those platforms and tools, typically available as open source products, are experiencing a very fast adoption in industry and science such that they are becoming “de facto” references in several areas of data engineering, data science and machine learning. In parallel to users' requests, WLCG is considering to c...

  19. System based practice: a concept analysis

    Directory of Open Access Journals (Sweden)

    SHAHRAM YAZDANI

    2016-04-01

    Full Text Available Introduction: Systems-Based Practice (SBP) is one of the six competencies introduced by the ACGME for physicians to provide high-quality care, and also the most challenging of them in the performance, training, and evaluation of medical students. This concept analysis clarifies the concept of SBP by identifying its components to make it possible to differentiate it from other similar concepts. For proper training of SBP and to ensure these competencies in physicians, it is necessary to have an operational definition, and SBP's components must be precisely defined in order to provide valid and reliable assessment tools. Methods: Walker & Avant's approach to concept analysis was performed in eight stages: choosing a concept, determining the purpose of analysis, identifying all uses of the concept, defining attributes, identifying a model case, identifying borderline, related, and contrary cases, identifying antecedents and consequences, and defining empirical referents. Results: Based on the analysis undertaken, the attributes of SBP include knowledge of the system, balanced decisions between patients' needs and system goals, effective role playing in the interprofessional health care team, system-level health advocacy, and acting for system improvement. System thinking and a functional system are antecedents and system goals are consequences. A model case, as well as borderline and contrary cases of SBP, has been introduced. Conclusion: The identification of SBP attributes in this study contributes to the body of knowledge in SBP and reduces the ambiguity of this concept, making it possible to apply it in the training of different medical specialties. Also, it would be possible to develop and use more precise tools to evaluate SBP competency by using the empirical referents of the analysis.

  20. NASA Lunar Base Wireless System Propagation Analysis

    Science.gov (United States)

    Hwu, Shian U.; Upanavage, Matthew; Sham, Catherine C.

    2007-01-01

    There have been many radio wave propagation studies using both experimental and theoretical techniques over recent years. However, most of these studies have been in support of commercial cellular phone wireless applications. The signal frequencies are mostly at the commercial cellular and Personal Communications Service bands. The antenna configurations are mostly one on a high tower and one near the ground to simulate communications between a cellular base station and a mobile unit. There is great interest in wireless communication and sensor systems for NASA lunar missions because of the emerging importance of establishing permanent lunar human exploration bases. Because of the specific lunar terrain geometries and RF frequencies of interest to the NASA missions, much of the published literature for the commercial cellular and PCS bands of 900 and 1800 MHz may not be directly applicable to the lunar base wireless system and environment. There are various communication and sensor configurations required to support all elements of a lunar base, for example, communications between astronauts, between astronauts and lunar vehicles, and between lunar vehicles and satellites in lunar orbit. There are also various wireless sensor systems linking scientific and experimental sensors with data collection ground stations. This presentation illustrates the propagation analysis of the lunar wireless communication and sensor systems, taking into account three-dimensional terrain multipath effects. It is observed that the propagation characteristics are significantly affected by the presence of the lunar terrain. The obtained results indicate that the lunar surface material, terrain geometry and antenna location are the important factors affecting the propagation characteristics of the lunar wireless systems. The path loss can be much more severe than free space propagation and is greatly affected by the antenna height, surface material and operating frequency. The

  1. Gait Correlation Analysis Based Human Identification

    Directory of Open Access Journals (Sweden)

    Jinyan Chen

    2014-01-01

    Full Text Available Human gait identification aims to identify people by a sequence of walking images. Compared with fingerprint- or iris-based identification, the most important advantage of gait identification is that it can be done at a distance. In this paper, a silhouette correlation analysis based human identification approach is proposed. By a background subtraction algorithm, the moving silhouette figure can be extracted from the walking image sequence. Every pixel in the silhouette has three dimensions: horizontal axis (x), vertical axis (y), and temporal axis (t). By moving every pixel in the silhouette image along these three dimensions, we can get a new silhouette. The correlation result between the original silhouette and the new one can be used as the raw feature of human gait. Discrete Fourier transform is used to extract features from this correlation result. Then, these features are normalized to minimize the effect of noise. Principal component analysis is used to reduce the features' dimensions. Experiments based on the CASIA database show that this method has an encouraging recognition performance.
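    A rough sketch of the silhouette-correlation feature described above (the array sizes, shift range and normalization are illustrative assumptions, and the final PCA over a gallery of subjects is omitted):

        import numpy as np

        rng = np.random.default_rng(0)
        # Synthetic binary silhouette volume with axes (t, y, x).
        silhouette = (rng.random((30, 64, 48)) > 0.7).astype(float)
        silhouette -= silhouette.mean()

        def shift_correlation(vol, axis, max_shift=8):
            """Correlation between the volume and copies shifted along one axis."""
            return np.array([np.sum(vol * np.roll(vol, s, axis=axis))
                             for s in range(max_shift)])

        # Correlate against shifts along t, y and x, then take DFT magnitudes as features.
        corr = np.concatenate([shift_correlation(silhouette, axis=a) for a in range(3)])
        feature = np.abs(np.fft.rfft(corr))
        feature /= np.linalg.norm(feature)       # normalize to reduce noise effects
        print(feature.shape)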

  2. Optimal depth-based regional frequency analysis

    Directory of Open Access Journals (Sweden)

    H. Wazneh

    2013-06-01

    Full Text Available Classical methods of regional frequency analysis (RFA) of hydrological variables face two drawbacks: (1) the restriction to a particular region which can lead to a loss of some information and (2) the definition of a region that generates a border effect. To reduce the impact of these drawbacks on regional modeling performance, an iterative method was proposed recently, based on the statistical notion of the depth function and a weight function φ. This depth-based RFA (DBRFA) approach was shown to be superior to traditional approaches in terms of flexibility, generality and performance. The main difficulty of the DBRFA approach is the optimal choice of the weight function ϕ (e.g., φ minimizing estimation errors). In order to avoid a subjective choice and naïve selection procedures of φ, the aim of the present paper is to propose an algorithm-based procedure to optimize the DBRFA and automate the choice of ϕ according to objective performance criteria. This procedure is applied to estimate flood quantiles in three different regions in North America. One of the findings from the application is that the optimal weight function depends on the considered region and can also quantify the region's homogeneity. By comparing the DBRFA to the canonical correlation analysis (CCA) method, results show that the DBRFA approach leads to better performances both in terms of relative bias and mean square error.

  3. Optimal depth-based regional frequency analysis

    Science.gov (United States)

    Wazneh, H.; Chebana, F.; Ouarda, T. B. M. J.

    2013-06-01

    Classical methods of regional frequency analysis (RFA) of hydrological variables face two drawbacks: (1) the restriction to a particular region which can lead to a loss of some information and (2) the definition of a region that generates a border effect. To reduce the impact of these drawbacks on regional modeling performance, an iterative method was proposed recently, based on the statistical notion of the depth function and a weight function φ. This depth-based RFA (DBRFA) approach was shown to be superior to traditional approaches in terms of flexibility, generality and performance. The main difficulty of the DBRFA approach is the optimal choice of the weight function ϕ (e.g., φ minimizing estimation errors). In order to avoid a subjective choice and naïve selection procedures of φ, the aim of the present paper is to propose an algorithm-based procedure to optimize the DBRFA and automate the choice of ϕ according to objective performance criteria. This procedure is applied to estimate flood quantiles in three different regions in North America. One of the findings from the application is that the optimal weight function depends on the considered region and can also quantify the region's homogeneity. By comparing the DBRFA to the canonical correlation analysis (CCA) method, results show that the DBRFA approach leads to better performances both in terms of relative bias and mean square error.

  4. Dependent failure analysis of NPP data bases

    International Nuclear Information System (INIS)

    Cooper, S.E.; Lofgren, E.V.; Samanta, P.K.; Wong Seemeng

    1993-01-01

    A technical approach for analyzing plant-specific data bases for vulnerabilities to dependent failures has been developed and applied. Since the focus of this work is to aid in the formulation of defenses to dependent failures, rather than to quantify dependent failure probabilities, the approach of this analysis is critically different. For instance, the determination of component failure dependencies has been based upon identical failure mechanisms related to component piecepart failures, rather than failure modes. Also, component failures involving all types of component function loss (e.g., catastrophic, degraded, incipient) are equally important to the predictive purposes of dependent failure defense development. Consequently, dependent component failures are identified with a different dependent failure definition which uses a component failure mechanism categorization scheme in this study. In this context, clusters of component failures which satisfy the revised dependent failure definition are termed common failure mechanism (CFM) events. Motor-operated valves (MOVs) in two nuclear power plant data bases have been analyzed with this approach. The analysis results include seven different failure mechanism categories; identified potential CFM events; an assessment of the risk-significance of the potential CFM events using existing probabilistic risk assessments (PRAs); and postulated defenses to the identified potential CFM events. (orig.)

  5. Rweb: Web-based Statistical Analysis

    Directory of Open Access Journals (Sweden)

    Jeff Banfield

    1999-03-01

    Full Text Available Rweb is a freely accessible statistical analysis environment that is delivered through the World Wide Web (WWW). It is based on R, a well known statistical analysis package. The only requirement to run the basic Rweb interface is a WWW browser that supports forms. If you want graphical output you must, of course, have a browser that supports graphics. The interface provides access to WWW accessible data sets, so you may run Rweb on your own data. Rweb can provide a four window statistical computing environment (code input, text output, graphical output, and error information) through browsers that support Javascript. There is also a set of point and click modules under development for use in introductory statistics courses.

  6. Qualitative safety analysis in accelerator based systems

    International Nuclear Information System (INIS)

    Sarkar, P.K.; Chowdhury, Lekha M.

    2006-01-01

    In recent developments connected to high energy and high current accelerators, accelerator driven systems (ADS) and Radioactive Ion Beam (RIB) facilities have come to the forefront of application. For medical and industrial applications, high current accelerators often need to be located in populated areas. These facilities pose significant radiological hazards during their operation and in accidental situations. We have done a qualitative evaluation of radiological safety analysis using probabilistic safety analysis (PSA) methods for accelerator-based systems. The major contribution to hazard comes from a target rupture scenario in both ADS and RIB facilities. Other significant contributors to hazard in the facilities are also discussed using fault tree and event tree methodologies. (author)

  7. Thermal-Signature-Based Sleep Analysis Sensor

    Directory of Open Access Journals (Sweden)

    Ali Seba

    2017-10-01

    Full Text Available This paper addresses the development of a new technique in the sleep analysis domain. Sleep is defined as a periodic physiological state during which vigilance is suspended and reactivity to external stimulations diminished. We sleep on average between six and nine hours per night and our sleep is composed of four to six cycles of about 90 min each. Each of these cycles is composed of a succession of several stages of sleep that vary in depth. Analysis of sleep is usually done via polysomnography. This examination consists of recording, among other things, electrical cerebral activity by electroencephalography (EEG), ocular movements by electrooculography (EOG), and chin muscle tone by electromyography (EMG). Recordings are made mostly in a hospital, more specifically in a service for monitoring the pathologies related to sleep. The readings are then interpreted manually by an expert to generate a hypnogram, a curve showing the succession of sleep stages during the night in 30s epochs. The proposed method is based on the follow-up of the thermal signature that makes it possible to classify the activity into three classes: “awakening,” “calm sleep,” and “restless sleep”. The contribution of this non-invasive method is part of the screening of sleep disorders, to be validated by a more complete analysis of the sleep. The measure provided by this new system, based on temperature monitoring (patient and ambient), aims to be integrated into the tele-medicine platform developed within the framework of the Smart-EEG project by the SYEL–SYstèmes ELectroniques team. Analysis of the data collected during the first surveys carried out with this method showed a correlation between thermal signature and activity during sleep. The advantage of this method lies in its simplicity and the possibility of carrying out measurements of activity during sleep and without direct contact with the patient at home or hospitals.

  8. Experimental data base for containment thermalhydraulic analysis

    International Nuclear Information System (INIS)

    Cheng, X.; Bazin, P.; Cornet, P.; Hittner, D.; Jackson, J.D.; Lopez Jimenez, J.; Naviglio, A.; Oriolo, F.; Petzold, H.

    2001-01-01

    This paper describes the joint research project DABASCO, which is supported by the European Community under a cost-shared contract and in which nine European institutions participate. The main objective of the project is to provide a generic experimental data base for the development of physical models and correlations for containment thermalhydraulic analysis. The project consists of seven separate-effects experimental programs which deal with new innovative conceptual features, e.g. passive decay heat removal and spray systems. The results of the various stages of the test programs will be assessed by industrial partners in relation to their applicability to reactor conditions

  9. Constructing storyboards based on hierarchical clustering analysis

    Science.gov (United States)

    Hasebe, Satoshi; Sami, Mustafa M.; Muramatsu, Shogo; Kikuchi, Hisakazu

    2005-07-01

    There are growing needs for quick preview of video contents for the purpose of improving accessibility of video archives as well as reducing network traffic. In this paper, a storyboard that contains a user-specified number of keyframes is produced from a given video sequence. It is based on hierarchical cluster analysis of feature vectors that are derived from wavelet coefficients of video frames. Consistent use of extracted feature vectors is the key to avoiding repeated, computationally intensive parsing of the same video sequence. Experimental results suggest that a significant reduction in computational time is gained by this strategy.
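    A hedged sketch of keyframe selection by hierarchical clustering of per-frame features; a simple block-mean descriptor stands in for the wavelet-based features used in the paper, and the frame data are synthetic:

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster

        rng = np.random.default_rng(0)
        frames = rng.random((120, 72, 96))                 # (n_frames, height, width)

        def frame_feature(frame, block=12):
            """Block-mean descriptor used here as a stand-in for wavelet features."""
            h, w = frame.shape
            return frame[:h - h % block, :w - w % block] \
                .reshape(h // block, block, w // block, block).mean(axis=(1, 3)).ravel()

        features = np.array([frame_feature(f) for f in frames])

        n_keyframes = 6
        tree = linkage(features, method="ward")
        labels = fcluster(tree, t=n_keyframes, criterion="maxclust")

        # Storyboard: the frame closest to each cluster centroid, in temporal order.
        keyframes = sorted(
            min(np.flatnonzero(labels == c),
                key=lambda i: np.linalg.norm(features[i] - features[labels == c].mean(axis=0)))
            for c in range(1, n_keyframes + 1))
        print(keyframes)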

  10. Robust Face Recognition Based on Texture Analysis

    Directory of Open Access Journals (Sweden)

    Sanun Srisuk

    2013-01-01

    Full Text Available In this paper, we present a new framework for face recognition under varying illumination based on DCT total variation minimization (DTV), a Gabor filter, sub-micro-pattern analysis (SMP) and the discriminated accumulative feature transform (DAFT). We first suppress the illumination effect by using the DCT with the help of TV as a tool for face normalization. The DTV image is then emphasized by the Gabor filter. The facial features are encoded by our proposed method, the SMP. The SMP image is then transformed into a 2D histogram using DAFT. Our system is verified with experiments on the AR and the Yale face database B.

  11. AR(p) -based detrended fluctuation analysis

    Science.gov (United States)

    Alvarez-Ramirez, J.; Rodriguez, E.

    2018-07-01

    Autoregressive models are commonly used for modeling time series from nature, economics and finance. This work explored simple autoregressive AR(p) models to remove long-term trends in detrended fluctuation analysis (DFA). Crude oil prices and the bitcoin exchange rate were considered, the former corresponding to a mature market and the latter to an emergent market. Results showed that AR(p)-based DFA performs similarly to traditional DFA. However, the AR(p)-based variant also provides information on the stability of long-term trends, which is valuable for understanding and quantifying the dynamics of complex time series from financial systems.
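
    As a rough illustration of the idea (not taken from the paper), the sketch below replaces the usual polynomial detrending of each DFA window with the residuals of a least-squares AR(p) fit; the window sizes, the order p and the synthetic input series are assumptions.

```python
import numpy as np

def ar_residuals(x, p):
    """Residuals of a least-squares AR(p) fit to the segment x."""
    X = np.column_stack([x[p - k - 1:len(x) - k - 1] for k in range(p)])
    y = x[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ coef

def dfa_ar(series, p=1, scales=(16, 32, 64, 128, 256)):
    """AR(p)-based DFA: returns the scales and the fluctuation function F(s)."""
    profile = np.cumsum(series - np.mean(series))   # integrated (profile) series
    F = []
    for s in scales:
        n_win = len(profile) // s
        rms = []
        for w in range(n_win):
            seg = profile[w * s:(w + 1) * s]
            res = ar_residuals(seg, p)              # detrend the window with AR(p)
            rms.append(np.sqrt(np.mean(res ** 2)))
        F.append(np.mean(rms))
    return np.asarray(scales), np.asarray(F)

# Scaling exponent from the log-log slope of F(s) versus s.
rng = np.random.default_rng(1)
x = rng.normal(size=4096)
s, F = dfa_ar(x, p=2)
alpha = np.polyfit(np.log(s), np.log(F), 1)[0]
print(f"estimated scaling exponent: {alpha:.2f}")
```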

  12. Similarity-based pattern analysis and recognition

    CERN Document Server

    Pelillo, Marcello

    2013-01-01

    This accessible text/reference presents a coherent overview of the emerging field of non-Euclidean similarity learning. The book presents a broad range of perspectives on similarity-based pattern analysis and recognition methods, from purely theoretical challenges to practical, real-world applications. The coverage includes both supervised and unsupervised learning paradigms, as well as generative and discriminative models. Topics and features: explores the origination and causes of non-Euclidean (dis)similarity measures, and how they influence the performance of traditional classification alg

  13. Motion Analysis Based on Invertible Rapid Transform

    Directory of Open Access Journals (Sweden)

    J. Turan

    1999-06-01

    Full Text Available This paper presents the results of a study on the use of the invertible rapid transform (IRT) for motion estimation in a sequence of images. Motion estimation algorithms based on the analysis of the matrix of states (produced in the IRT calculation) are described. The new method was used experimentally to estimate crowd and traffic motion from image data sequences captured at railway stations and on highways in large cities. The motion vectors may be used to produce a polar plot (showing velocity magnitude and direction) for moving objects, in which the dominant motion tendency can be seen. Experimental results comparing the new motion estimation methods with other well-known block matching methods (full search, 2D-log, and methods based on the conventional cross-correlation (CC) function or the phase correlation (PC) function) for crowd motion estimation are also presented.

  14. Online UV-visible spectroscopy and multivariate curve resolution as powerful tool for model-free investigation of laccase-catalysed oxidation.

    Science.gov (United States)

    Kandelbauer, A; Kessler, W; Kessler, R W

    2008-03-01

    The laccase-catalysed transformation of indigo carmine (IC) with and without a redox-active mediator was studied using online UV-visible spectroscopy. Deconvolution of the mixture spectra obtained during the reaction was performed on a model-free basis using multivariate curve resolution (MCR). In this way, the time courses of the educts, products, and reaction intermediates involved in the transformation were reconstructed without prior mechanistic assumptions. Furthermore, the spectral signature of a reactive intermediate which could not have been detected by a classical hard-modelling approach was extracted from the chemometric analysis. The findings suggest that the combined use of UV-visible spectroscopy and MCR may lead to unexpectedly deep mechanistic evidence otherwise buried in the experimental data. Thus, although it is a rather unspecific method, UV-visible spectroscopy can prove useful in the monitoring of chemical reactions when combined with MCR. This offers a wide range of chemists a cheap, readily available, and highly sensitive tool for online monitoring of chemical reactions.
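
    A minimal sketch of model-free curve resolution in the spirit of MCR, assuming a simple alternating-least-squares scheme with non-negativity clipping and a two-component synthetic mixture; it is an illustration, not the authors' implementation.

```python
import numpy as np

def mcr_als(D, n_components, n_iter=200, seed=0):
    """Factor D (times x wavelengths) into C (concentration profiles) and S (spectra),
    D ≈ C @ S, with simple non-negativity constraints."""
    rng = np.random.default_rng(seed)
    n_times, _ = D.shape
    C = np.abs(rng.normal(size=(n_times, n_components)))      # initial guess
    for _ in range(n_iter):
        # Spectra from current concentrations, then clip negatives.
        S = np.linalg.lstsq(C, D, rcond=None)[0]
        S = np.clip(S, 0.0, None)
        # Concentrations from current spectra, then clip negatives.
        C = np.linalg.lstsq(S.T, D.T, rcond=None)[0].T
        C = np.clip(C, 0.0, None)
    return C, S

# Two-component synthetic mixture as a stand-in for the UV-visible data.
t = np.linspace(0, 1, 50)[:, None]
wl = np.linspace(0, 1, 200)[None, :]
true_C = np.hstack([np.exp(-3 * t), 1 - np.exp(-3 * t)])      # educt decays, product grows
true_S = np.vstack([np.exp(-((wl - 0.3) ** 2) / 0.01),
                    np.exp(-((wl - 0.7) ** 2) / 0.01)])
D = true_C @ true_S + 0.01 * np.random.default_rng(1).normal(size=(50, 200))
C_hat, S_hat = mcr_als(D, n_components=2)
print(C_hat.shape, S_hat.shape)   # (50, 2) (2, 200)
```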

  15. Recurrence quantity analysis based on matrix eigenvalues

    Science.gov (United States)

    Yang, Pengbo; Shang, Pengjian

    2018-06-01

    Recurrence plots are a powerful tool for the visualization and analysis of dynamical systems. Recurrence quantification analysis (RQA), based on point density and on diagonal and vertical line structures in the recurrence plot, provides alternative measures to quantify the complexity of dynamical systems. In this paper, we present a new measure based on the recurrence matrix to quantify the dynamical properties of a given system. Matrix eigenvalues can reflect the basic characteristics of complex systems, so we characterize a system by exploring the eigenvalues of its recurrence matrix. Considering that Shannon entropy has been defined as a complexity measure, we propose the entropy of matrix eigenvalues (EOME) as a new RQA measure. We confirm that EOME can be used as a metric to quantify behavioral changes of the system. As a given dynamical system changes from a non-chaotic to a chaotic regime, the EOME increases as well. Larger EOME values imply higher complexity and lower predictability. We also study the effect of several factors on EOME, including data length, recurrence threshold, embedding dimension, and additive noise. Finally, we demonstrate an application in physiology. The advantage of this measure lies in its high sensitivity and simple computation.
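
    The following sketch illustrates the EOME idea under stated assumptions (the delay-embedding parameters, recurrence threshold and test signals are all illustrative): build a thresholded recurrence matrix, take its eigenvalue spectrum, and compute the Shannon entropy of the normalised absolute eigenvalues.

```python
import numpy as np

def recurrence_matrix(x, dim=3, tau=1, eps=0.2):
    """Binary recurrence matrix of a time series after delay embedding."""
    n = len(x) - (dim - 1) * tau
    emb = np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])
    dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    return (dists <= eps).astype(float)

def eome(x, **kwargs):
    """Entropy of the normalised absolute eigenvalues of the recurrence matrix."""
    R = recurrence_matrix(x, **kwargs)
    lam = np.abs(np.linalg.eigvalsh(R))       # R is symmetric
    p = lam / lam.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Compare a periodic series with a chaotic (logistic-map) series.
t = np.arange(500)
periodic = np.sin(0.1 * t)
chaotic = np.empty(500)
chaotic[0] = 0.4
for i in range(1, 500):
    chaotic[i] = 4.0 * chaotic[i - 1] * (1.0 - chaotic[i - 1])
print("EOME periodic:", eome(periodic), " EOME chaotic:", eome(chaotic))
```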

  16. Visibility Graph Based Time Series Analysis.

    Science.gov (United States)

    Stephen, Mutua; Gu, Changgui; Yang, Huijie

    2015-01-01

    Network-based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series into a temporal network and, at the same time, a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks.
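
    As an illustration of the basic building block (not the authors' code), the sketch below constructs the natural visibility graph of successive series segments with a straightforward O(n^2) check; the segment length and the random test series are assumptions.

```python
import numpy as np

def visibility_graph(x):
    """Adjacency matrix of the natural visibility graph of series x.

    Points (i, x[i]) and (j, x[j]) are connected if every intermediate
    sample lies strictly below the straight line joining them."""
    n = len(x)
    A = np.zeros((n, n), dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            k = np.arange(i + 1, j)
            # Height of the connecting line at the intermediate positions.
            line = x[j] + (x[i] - x[j]) * (j - k) / (j - i)
            if k.size == 0 or np.all(x[k] < line):
                A[i, j] = A[j, i] = 1
    return A

# Map successive segments of a series to visibility graphs
# (a temporal network of networks, one graph per state).
rng = np.random.default_rng(0)
series = rng.normal(size=300)
segments = series.reshape(10, 30)            # ten successive states
graphs = [visibility_graph(seg) for seg in segments]
print("mean degree of first segment:", graphs[0].sum(axis=1).mean())
```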

  17. Visibility Graph Based Time Series Analysis.

    Directory of Open Access Journals (Sweden)

    Mutua Stephen

    Full Text Available Network-based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series into a temporal network and, at the same time, a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks.

  18. DTI Analysis of Presbycusis Using Voxel-Based Analysis.

    Science.gov (United States)

    Ma, W; Li, M; Gao, F; Zhang, X; Shi, L; Yu, L; Zhao, B; Chen, W; Wang, G; Wang, X

    2016-07-14

    Presbycusis is the most common sensory deficit in the aging population. A recent study reported using a DTI-based tractography technique to identify a lack of integrity in a portion of the auditory pathway in patients with presbycusis. The aim of our study was to investigate the white matter pathology of patients with presbycusis by using a voxel-based analysis that is highly sensitive to local intensity changes in DTI data. Fifteen patients with presbycusis and 14 age- and sex-matched healthy controls were scanned on a 3T scanner. Fractional anisotropy, mean diffusivity, axial diffusivity, and radial diffusivity were obtained from the DTI data. Intergroup statistics were implemented on these measurements, which were transformed to Montreal Neurological Institute coordinates by using a nonrigid image registration method called large deformation diffeomorphic metric mapping. Increased axial diffusivity, radial diffusivity, and mean diffusivity and decreased fractional anisotropy were found near the right-side hearing-related areas in patients with presbycusis. Increased radial diffusivity and mean diffusivity were also found near a language-related area (Broca area) in patients with presbycusis. Our findings could be important for exploring reliable imaging evidence of presbycusis and could complement an ROI-based approach. © 2016 American Society of Neuroradiology.

  19. Data Extraction Based on Page Structure Analysis

    Directory of Open Access Journals (Sweden)

    Ren Yichao

    2017-01-01

    Full Text Available The information we need is often dispersed and organized in heterogeneous structures. In addition, because of the existence of unstructured data such as natural language and images, extracting content from local pages is extremely difficult. In light of the problems above, this article applies a method that combines a page structure analysis algorithm with a page data extraction algorithm to accomplish the gathering of network data. In this way, the poor performance of traditional, complex extraction models on large-scale data is addressed and page data extraction efficiency is boosted to a new level. The article also compares pages and content of different types between DOM-structure-based methods and methods based on the distribution regularities of HTML, in order to identify a more efficient extraction method.

  20. Joint multifractal analysis based on wavelet leaders

    Science.gov (United States)

    Jiang, Zhi-Qiang; Yang, Yan-Hong; Wang, Gang-Jin; Zhou, Wei-Xing

    2017-12-01

    Mutually interacting components form complex systems and these components usually have long-range cross-correlated outputs. Using wavelet leaders, we propose a method for characterizing the joint multifractal nature of these long-range cross correlations; we call this method joint multifractal analysis based on wavelet leaders (MF-X-WL). We test the validity of the MF-X-WL method by performing extensive numerical experiments on dual binomial measures with multifractal cross correlations and bivariate fractional Brownian motions (bFBMs) with monofractal cross correlations. Both experiments indicate that MF-X-WL is capable of detecting cross correlations in synthetic data with acceptable estimating errors. We also apply the MF-X-WL method to pairs of series from financial markets (returns and volatilities) and online worlds (online numbers of different genders and different societies) and determine intriguing joint multifractal behavior.

  1. Node-based analysis of species distributions

    DEFF Research Database (Denmark)

    Borregaard, Michael Krabbe; Rahbek, Carsten; Fjeldså, Jon

    2014-01-01

    overrepresentation score (SOS) and the geographic node divergence (GND) score, which together combine ecological and evolutionary patterns into a single framework and avoid many of the problems that characterize community phylogenetic methods in current use. This approach goes through each node in the phylogeny...... with case studies on two groups with well-described biogeographical histories: a local-scale community data set of hummingbirds in the North Andes, and a large-scale data set of the distribution of all species of New World flycatchers. The node-based analysis of these two groups generates a set...... of intuitively interpretable patterns that are consistent with current biogeographical knowledge. Importantly, the results are statistically tractable, opening many possibilities for their use in analyses of evolutionary, historical and spatial patterns of species diversity. The method is implemented...

  2. Voxel-Based LIDAR Analysis and Applications

    Science.gov (United States)

    Hagstrom, Shea T.

    One of the greatest recent changes in the field of remote sensing is the addition of high-quality Light Detection and Ranging (LIDAR) instruments. In particular, the past few decades have been greatly beneficial to these systems because of increases in data collection speed and accuracy, as well as a reduction in the costs of components. These improvements allow modern airborne instruments to resolve sub-meter details, making them ideal for a wide variety of applications. Because LIDAR uses active illumination to capture 3D information, its output is fundamentally different from other modalities. Despite this difference, LIDAR datasets are often processed using methods appropriate for 2D images and that do not take advantage of its primary virtue of 3-dimensional data. It is this problem we explore by using volumetric voxel modeling. Voxel-based analysis has been used in many applications, especially medical imaging, but rarely in traditional remote sensing. In part this is because the memory requirements are substantial when handling large areas, but with modern computing and storage this is no longer a significant impediment. Our reason for using voxels to model scenes from LIDAR data is that there are several advantages over standard triangle-based models, including better handling of overlapping surfaces and complex shapes. We show how incorporating system position information from early in the LIDAR point cloud generation process allows radiometrically-correct transmission and other novel voxel properties to be recovered. This voxelization technique is validated on simulated data using the Digital Imaging and Remote Sensing Image Generation (DIRSIG) software, a first-principles based ray-tracer developed at the Rochester Institute of Technology. Voxel-based modeling of LIDAR can be useful on its own, but we believe its primary advantage is when applied to problems where simpler surface-based 3D models conflict with the requirement of realistic geometry. To

  3. [Concept analysis "Competency-based education"].

    Science.gov (United States)

    Loosli, Clarence

    2016-03-01

    Competency-based education (CBE) stands out at the global level as best educational practice. Indeed, CBE is expected to improve the quality of care provided by newly graduated nurses. Yet there is a dearth of knowledge in the nursing literature regarding the definition of the CBE concept, and CBE is implemented differently in each institution, even within the same discipline in a single country. What accounts for CBE in nursing education? The aim was to clarify the meaning of the CBE concept through a literature review in order to propose a definition. Wilson's concept analysis method framed the literature review, which drew on two databases: CINAHL and ERIC. Following the eleven Wilson analysis techniques, we identified CBE as a multidimensional concept clustering three dimensions: learning, teaching and assessment. Nurse educators are accountable for providing society with competent newly graduated professionals. Schools should strive for visibility and transparency regarding the means they use to accomplish their educational activities. This first attempt to understand the CBE concept opens a debate concerning its further development and clarification, and this first description is a step toward its identification and assessment.

  4. Constraints based analysis of extended cybernetic models.

    Science.gov (United States)

    Mandli, Aravinda R; Venkatesh, Kareenhalli V; Modak, Jayant M

    2015-11-01

    The cybernetic modeling framework provides an interesting approach to model the regulatory phenomena occurring in microorganisms. In the present work, we adopt a constraints based approach to analyze the nonlinear behavior of the extended equations of the cybernetic model. We first show that the cybernetic model exhibits linear growth behavior under the constraint of no resource allocation for the induction of the key enzyme. We then quantify the maximum achievable specific growth rate of microorganisms on mixtures of substitutable substrates under various kinds of regulation and show its use in gaining an understanding of the regulatory strategies of microorganisms. Finally, we show that Saccharomyces cerevisiae exhibits suboptimal dynamic growth with a long diauxic lag phase when growing on a mixture of glucose and galactose and discuss its potential to achieve optimal growth with a significantly reduced diauxic lag period. The analysis carried out in the present study illustrates the utility of adopting a constraints based approach to understand the dynamic growth strategies of microorganisms. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  5. Neutronographic Texture Analysis of Zirconium Based Alloys

    International Nuclear Information System (INIS)

    Kruz'elová, M; Vratislav, S; Kalvoda, L; Dlouhá, M

    2012-01-01

    Neutron diffraction is a very powerful tool in the texture analysis of zirconium-based alloys used in nuclear technology. Textures of five samples (two rolled sheets and three tubes) were investigated using basal pole figures, inverse pole figures, and the ODF distribution function. The texture measurements were performed on the diffractometer KSN2 at the Laboratory of Neutron Diffraction, Department of Solid State Engineering, Faculty of Nuclear Sciences and Physical Engineering, CTU in Prague. Procedures for studying textures with thermal neutrons and for obtaining texture parameters (direct and inverse pole figures, three-dimensional orientation distribution function) are also described. The observed data were processed with the software packages HEXAL and GSAS. Our results can be summarized as follows: i) all samples of the zirconium alloys show a splitting of the central region into two maxima in the basal pole figures, which is caused by the alloying elements; a characteristic split of the basal pole maxima tilted from the normal direction toward the transverse direction can be observed for all samples; ii) the sheet samples prefer an orientation of the (100) and (110) planes perpendicular to the rolling direction and of the (002) planes perpendicular to the normal direction; iii) the basal planes of the tubes are oriented parallel to the tube axis, while the (100) planes are oriented perpendicular to the tube axis. The level of the resulting texture and the positions of the maxima differ between tubes and sheets. The obtained results are characteristic of zirconium-based alloys.

  6. Interactive analysis of geodata based intelligence

    Science.gov (United States)

    Wagner, Boris; Eck, Ralf; Unmüessig, Gabriel; Peinsipp-Byma, Elisabeth

    2016-05-01

    When a spatiotemporal event happens, multi-source intelligence data are gathered to understand the problem, and strategies for solving it are investigated. The main difficulty arises from handling spatial and temporal intelligence data. The map can be the bridge that visualizes the data and provides the most understandable model for all stakeholders. For the analysis of geodata-based intelligence data, software was developed as a working environment that combines geodata with optimized ergonomics, which essentially facilitates interaction with the common operational picture (COP). The composition of the COP is based on geodata services, which are normalized by international standards of the Open Geospatial Consortium (OGC). The basic geodata are combined with intelligence data from images (IMINT) and humans (HUMINT), stored in a NATO Coalition Shared Data Server (CSD). These intelligence data can be combined with further information sources, e.g., live sensors. As a result, a COP is generated and an interaction suitable for the specific workspace is added. This allows users to work interactively with the COP, e.g., searching with an on-board CSD client for suitable intelligence data and integrating them into the COP. Furthermore, users can enrich the scenario with findings from the data of interactive live sensors and add data from other sources. This allows intelligence services to contribute effectively to the process by which military and disaster management operations are organized.

  7. [Operating cost analysis of anaesthesia: activity based costing (ABC analysis)].

    Science.gov (United States)

    Majstorović, Branislava M; Kastratović, Dragana A; Vučović, Dragan S; Milaković, Branko D; Miličić, Biljana R

    2011-01-01

    Costs of anaesthesiology are defined measures used to determine a precise profile of the expenditure of surgical treatment, which is important for planning healthcare activities, prices and budgets. In order to determine the actual value of anaesthesiological services, we carried out an activity-based costing (ABC) analysis. Retrospectively, for 2005 and 2006, we estimated the direct costs of anaesthesiological services (salaries, drugs, supply materials and other costs: analyses and equipment) of the Institute of Anaesthesia and Resuscitation of the Clinical Centre of Serbia. The group included all anesthetized patients of both sexes and all ages. We compared direct costs with the direct expenditure of "each cost object (service or unit)" of the Republican Healthcare Insurance. Summary data of the Departments of Anaesthesia were documented in the database of the Clinical Centre of Serbia, and the numerical data were analyzed with Microsoft Office Excel 2003 and SPSS for Windows. Using a linear model, we compared the direct costs with the unit costs of anaesthesiological services from the Costs List of the Republican Healthcare Insurance. Direct costs showed that 40% was spent on salaries, 32% on drugs and supplies, and 28% on other costs, such as analyses and equipment. The direct costs of anaesthesiological services showed a linear correlation with the unit costs of the Republican Healthcare Insurance. Costs of anaesthesia increase the cost of a patient's surgical treatment by about 10%. Regarding the actual costs of drugs and supplies, we do not see any possibility of cost reduction. The fixed elements of direct costs offer the possibility of rationalizing resources in anaesthesia.

  8. OHBM 2017: Practical intensity based meta-analysis

    OpenAIRE

    Maumet, Camille

    2017-01-01

    "Practical intensity-based meta-analysis" slides from my talk in the OHBM 2017 educational talk on Neuroimaging meta-analysis.http://www.humanbrainmapping.org/files/2017/ED Courses/Neuroimaging Meta-Analysis.pdf

  9. Micromechanics Based Failure Analysis of Heterogeneous Materials

    Science.gov (United States)

    Sertse, Hamsasew M.

    are performed for both brittle failure/high cycle fatigue (HCF) for negligible plastic strain and ductile failure/low cycle fatigue (LCF) for large plastic strain. The proposed approach is incorporated in SwiftComp and used to predict the initial failure envelope, stress-strain curve for various loading conditions, and fatigue life of heterogeneous materials. The combined effects of strain hardening and progressive fatigue damage on the effective properties of heterogeneous materials are also studied. The capability of the current approach is validated using several representative examples of heterogeneous materials including binary composites, continuous fiber-reinforced composites, particle-reinforced composites, discontinuous fiber-reinforced composites, and woven composites. The predictions of MSG are also compared with the predictions obtained using various micromechanics approaches such as Generalized Methods of Cells (GMC), Mori-Tanaka (MT), and Double Inclusions (DI) and Representative Volume Element (RVE) Analysis (called as 3-dimensional finite element analysis (3D FEA) in this document). This study demonstrates that a micromechanics based failure analysis has a great potential to rigorously and more accurately analyze initiation and progression of damage in heterogeneous materials. However, this approach requires material properties specific to damage analysis, which are needed to be independently calibrated for each constituent.

  10. Temporal expression-based analysis of metabolism.

    Directory of Open Access Journals (Sweden)

    Sara B Collins

    Full Text Available Metabolic flux is frequently rerouted through cellular metabolism in response to dynamic changes in the intra- and extra-cellular environment. Capturing the mechanisms underlying these metabolic transitions in quantitative and predictive models is a prominent challenge in systems biology. Progress in this regard has been made by integrating high-throughput gene expression data into genome-scale stoichiometric models of metabolism. Here, we extend previous approaches to perform a Temporal Expression-based Analysis of Metabolism (TEAM). We apply TEAM to understanding the complex metabolic dynamics of the respiratorily versatile bacterium Shewanella oneidensis grown under aerobic, lactate-limited conditions. TEAM predicts temporal metabolic flux distributions using time-series gene expression data. Increased predictive power is achieved by supplementing these data with a large reference compendium of gene expression, which allows us to take into account the unique character of the distribution of expression of each individual gene. We further propose a straightforward method for studying the sensitivity of TEAM to changes in its fundamental free threshold parameter θ, and reveal that discrete zones of distinct metabolic behavior arise as this parameter is changed. By comparing the qualitative characteristics of these zones to additional experimental data, we are able to constrain the range of θ to a small, well-defined interval. In parallel, the sensitivity analysis reveals the inherently difficult nature of dynamic metabolic flux modeling: small errors early in the simulation propagate to relatively large changes later in the simulation. We expect that handling such "history-dependent" sensitivities will be a major challenge in the future development of dynamic metabolic-modeling techniques.

  11. Derivative based sensitivity analysis of gamma index

    Directory of Open Access Journals (Sweden)

    Biplab Sarkar

    2015-01-01

    Full Text Available Originally developed as a tool for patient-specific quality assurance in advanced treatment delivery methods, to compare measured and calculated dose distributions, the gamma index (γ) concept was later extended to compare any two dose distributions. It takes into account both the dose difference (DD) and distance-to-agreement (DTA) measurements in the comparison. Its strength lies in its capability to give a quantitative value for the analysis, unlike other methods. For every point on the reference curve, if there is at least one point in the evaluated curve that satisfies the pass criteria (e.g., δDD = 1%, δDTA = 1 mm), the point is included in the quantitative score as "pass." Gamma analysis does not account for the gradient of the evaluated curve - it looks only at the minimum gamma value, and if it is <1, then the point passes, no matter what the gradient of the evaluated curve is. In this work, an attempt has been made to present a derivative-based method for the identification of dose gradient. A mathematically derived reference profile (RP) representing the penumbral region of a 6 MV 10 cm × 10 cm field was generated from an error function. A general test profile (GTP) was created from this RP by introducing a 1 mm distance error and a 1% dose error at each point. This was considered the first of the two evaluated curves. By its nature, this curve is smooth and satisfies the pass criteria for all its points. The second evaluated profile was generated as a sawtooth test profile (STTP) which again satisfies the pass criteria for every point on the RP. However, being a sawtooth curve, it is not smooth and is obviously poor when compared with the smooth profile. Considering the smooth GTP as an acceptable profile when it passed the gamma pass criteria (1% DD and 1 mm DTA) against the RP, the first and second order derivatives of the DDs (δD', δD") between these two curves were derived and used as the
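
    For orientation, a minimal 1D gamma computation consistent with the description above might look as follows; the 1%/1 mm criteria, the error-function penumbra and the introduced shift are illustrative assumptions, not the authors' exact setup.

```python
import numpy as np
from scipy.special import erf

def gamma_1d(x_ref, d_ref, x_eval, d_eval, dd=0.01, dta=1.0):
    """Per-point 1D gamma index of a reference profile against an evaluated one.

    dd  : dose-difference criterion as a fraction of the maximum reference dose
    dta : distance-to-agreement criterion in the same units as x (e.g. mm)
    """
    dd_abs = dd * d_ref.max()
    gammas = np.empty_like(d_ref)
    for i, (xr, dr) in enumerate(zip(x_ref, d_ref)):
        # Generalised gamma function over all evaluated points; take the minimum.
        g = np.sqrt(((x_eval - xr) / dta) ** 2 + ((d_eval - dr) / dd_abs) ** 2)
        gammas[i] = g.min()
    return gammas

# Penumbra-like reference profile and a slightly shifted, scaled evaluated profile.
x = np.linspace(-10, 10, 401)                        # mm
ref = 0.5 * (1 - erf(x / 3.0))                       # error-function penumbra
ev = 0.5 * (1 - erf((x - 0.5) / 3.0)) * 1.005        # 0.5 mm shift, 0.5% dose error
g = gamma_1d(x, ref, x, ev, dd=0.01, dta=1.0)
print("pass rate:", np.mean(g <= 1.0))
```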

  12. Communication Base Station Log Analysis Based on Hierarchical Clustering

    Directory of Open Access Journals (Sweden)

    Zhang Shao-Hua

    2017-01-01

    Full Text Available Communication base stations generate massive amounts of data every day, and these base station logs are of great value for mining business-circle information. This paper uses data mining technology and a hierarchical clustering algorithm to group base stations by the scope of their business circle, based on the data recorded at these base stations. By analyzing the data of different business circles through feature extraction and comparing the characteristics of the business-circle categories, operators can choose a suitable area for commercial marketing.

  13. A Requirements Analysis Model Based on QFD

    Institute of Scientific and Technical Information of China (English)

    TANG Zhi-wei; Nelson K.H.Tang

    2004-01-01

    The enterprise resource planning (ERP) system has emerged to offer an integrated IT solution, and more and more enterprises are adopting this system and regarding it as an important innovation. However, there is already evidence of high failure risks in ERP project implementation; one major reason is poor analysis of the requirements for system implementation. In this paper, the importance of requirements analysis for ERP project implementation is highlighted, and a requirements analysis model applying quality function deployment (QFD) is presented, which supports conducting requirements analysis for ERP projects.

  14. Bariatric surgery: an evidence-based analysis.

    Science.gov (United States)

    2005-01-01

    To conduct an evidence-based analysis of the effectiveness and cost-effectiveness of bariatric surgery. Obesity is defined as a body mass index (BMI) of at least 30 kg/m². Morbid obesity is defined as a BMI of at least 40 kg/m² or at least 35 kg/m² with comorbid conditions. Comorbid conditions associated with obesity include diabetes, hypertension, dyslipidemias, obstructive sleep apnea, weight-related arthropathies, and stress urinary incontinence. It is also associated with depression and cancers of the breast, uterus, prostate, and colon, and is an independent risk factor for cardiovascular disease. Obesity is also associated with higher all-cause mortality at any age, even after adjusting for potential confounding factors like smoking. A person with a BMI of 30 kg/m² has about a 50% higher risk of dying than does someone with a healthy BMI. The risk more than doubles at a BMI of 35 kg/m². An expert estimated that about 160,000 people are morbidly obese in Ontario. In the United States, the prevalence of morbid obesity is 4.7% (1999-2000). In Ontario, the 2004 Chief Medical Officer of Health Report said that in 2003, almost one-half of Ontario adults were overweight (BMI 25-29.9 kg/m²) or obese (BMI ≥ 30 kg/m²). About 57% of Ontario men and 42% of Ontario women were overweight or obese. The proportion of the population that was overweight or obese increased gradually from 44% in 1990 to 49% in 2000, and it appears to have stabilized at 49% in 2003. The report also noted that the tendency to be overweight and obese increases with age up to 64 years. BMI should be used cautiously for people aged 65 years and older, because the "normal" range may begin at slightly above 18.5 kg/m² and extend into the "overweight" range. The Chief Medical Officer of Health cautioned that these data may underestimate the true extent of the problem, because they were based on self reports, and people tend to over-report their height and under-report their weight.

  15. Canonical analysis based on mutual information

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Vestergaard, Jacob Schack

    2015-01-01

    combinations with the information theoretical measure mutual information (MI). We term this type of analysis canonical information analysis (CIA). MI allows for the actual joint distribution of the variables involved and not just second order statistics. While CCA is ideal for Gaussian data, CIA facilitates...

  16. Modeling and Analysis of Space Based Transceivers

    Science.gov (United States)

    Moore, Michael S.; Price, Jeremy C.; Abbott, Ben; Liebetreu, John; Reinhart, Richard C.; Kacpura, Thomas J.

    2007-01-01

    This paper presents the tool chain, methodology, and initial results of a study to provide a thorough, objective, and quantitative analysis of the design alternatives for space Software Defined Radio (SDR) transceivers. The approach taken was to develop a set of models and tools for describing communications requirements, the algorithm resource requirements, the available hardware, and the alternative software architectures, and generate analysis data necessary to compare alternative designs. The Space Transceiver Analysis Tool (STAT) was developed to help users identify and select representative designs, calculate the analysis data, and perform a comparative analysis of the representative designs. The tool allows the design space to be searched quickly while permitting incremental refinement in regions of higher payoff.

  17. Graphics based PC analysis of alpha spectra

    International Nuclear Information System (INIS)

    Chapman, T.C.

    1991-01-01

    New personal computer (PC) software performs interactive analysis of alpha spectra using EGA graphics. Spectra are collected with a commercial MCA board and analyzed using the software described here. The operator is required to approve each peak integration area before analysis proceeds. Sample analysis can use detector efficiencies or spike yields or both. Background corrections are made and upper limit values are calculated when specified. Nuclide identification uses a library of up to 64 nuclides with up to 8 alpha lines for each nuclide. Any one of 32 subset libraries can be used in an analysis. Analysis time is short and is limited by interaction with the operator, not by calculation time. Utilities include nuclide library editing, library subset editing, energy calibration, efficiency calibration, and background update

  18. Using Willie's Acid-Base Box for Blood Gas Analysis

    Science.gov (United States)

    Dietz, John R.

    2011-01-01

    In this article, the author describes a method developed by Dr. William T. Lipscomb for teaching blood gas analysis of acid-base status and provides three examples using Willie's acid-base box. Willie's acid-base box is constructed using three of the parameters of standard arterial blood gas analysis: (1) pH; (2) bicarbonate; and (3) CO[subscript…

  19. A model-free approach to eliminate autocorrelation when testing for process capability

    DEFF Research Database (Denmark)

    Vanmann, Kerstin; Kulahci, Murat

    2008-01-01

    There is an increasing use of on-line data acquisition systems in industry. This usually leads to autocorrelated data and implies that the assumption of independent observations has to be re-examined. Most decision procedures for capability analysis assume independent data. In this article we pre...

  20. Cash-Flow Analysis Base of the Company's Performance Evaluation

    OpenAIRE

    Radu Riana Iren; Mihalcea Lucean; Negoescu Gheorghe

    2013-01-01

    Analyses based on the study of financial flows make it possible to coherently combine the study of the firm's financial equilibrium with that of its performance. Static analysis assesses the financial imbalance at a given point in time but does not explain its evolution; in contrast, dynamic analysis highlights the evolution of the financial imbalance but does not indicate its extent. It follows that the two kinds of analysis are complementary and should be pursued simultaneously. Dynamic analysis is based on the concept of st...

  1. Web Based Distributed Coastal Image Analysis System, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This project develops Web based distributed image analysis system processing the Moderate Resolution Imaging Spectroradiometer (MODIS) data to provide decision...

  2. Modelling Free Flow Speed on Two-Lane Rural Highways in Bosnia and Herzegovina

    Directory of Open Access Journals (Sweden)

    Ivan Lovrić

    2014-04-01

    Full Text Available Free flow speed is used as a parameter in transportation planning and capacity analysis models, as well as speed-flow diagrams. Many of these models suggest estimating free flow speed according to measurements from similar highways, which is not a practical method for use in B&H. This paper first discusses problems with using these methodologies in conditions prevailing in B&H and then presents a free flow speed evaluation model developed from a comprehensive field survey conducted on nine homogeneous sections of state and regional roads.

  3. Environmentally based Cost-Benefit Analysis

    International Nuclear Information System (INIS)

    Magnell, M.

    1993-11-01

    The fundamentals of the basic elements of a new comprehensive economic assessment, MILA, developed in Sweden with inspiration from the Total Cost Assessment model, are presented. The core of the MILA approach is an expanded cost and benefit inventory. But MILA also includes a complementary internal waste stream analysis, a tool for evaluating environmental conflicts in monetary terms, an extended time horizon and direct allocation of costs and revenues to products and processes. However, MILA does not ensure profitability for environmentally sound projects. Essentially, MILA is an approach for refining the investment and profitability analysis of a project, investment or product. 109 refs., 38 figs

  4. Scope-Based Method Cache Analysis

    DEFF Research Database (Denmark)

    Huber, Benedikt; Hepp, Stefan; Schoeberl, Martin

    2014-01-01

    The quest for time-predictable systems has led to the exploration of new hardware architectures that simplify analysis and reasoning in the temporal domain, while still providing competitive performance. For the instruction memory, the method cache is a conceptually attractive solution, as it req...

  5. Network-based analysis of proteomic profiles

    KAUST Repository

    Wong, Limsoon

    2016-01-01

    -based high-throughput methods have been useful in providing glimpses into the underlying molecular processes, the evidences they provide are indirect. Furthermore, RNA and corresponding protein levels have been known to have poor correlation. On the other

  6. Automatic circuit analysis based on mask information

    International Nuclear Information System (INIS)

    Preas, B.T.; Lindsay, B.W.; Gwyn, C.W.

    1976-01-01

    The Circuit Mask Translator (CMAT) code has been developed which converts integrated circuit mask information into a circuit schematic. Logical operations, pattern recognition, and special functions are used to identify and interconnect diodes, transistors, capacitors, and resistances. The circuit topology provided by the translator is compatible with the input required for a circuit analysis program

  7. Epoch-based analysis of speech signals

    Indian Academy of Sciences (India)

    on speech production characteristics, but also helps in accurate analysis of speech. ... include time delay estimation, speech enhancement from single and multi- ... log( E[k] / ∑_{l=0}^{K−1} E[l] ), (7) where K is the number of samples in the ...

  8. Knowledge-based analysis of phenotypes

    KAUST Repository

    Hoendorf, Robert

    2016-01-01

    a formal framework to describe phenotypes. A formalized theory of phenotypes is not only useful for domain analysis, but can also be applied to assist in the diagnosis of rare genetic diseases, and I will show how our results on the ontology

  9. EEG Based Analysis of Cognitive Load Enhance Instructional Analysis

    Science.gov (United States)

    Dan, Alex; Reiner, Miriam

    2017-01-01

    One of the recommended approaches in instructional design methods is to optimize the value of working memory capacity and avoid cognitive overload. Educational neuroscience offers novel processes and methodologies to analyze cognitive load based on physiological measures. Observing psychophysiological changes when they occur in response to the…

  10. Optimisation of NMR dynamic models II. A new methodology for the dual optimisation of the model-free parameters and the Brownian rotational diffusion tensor

    International Nuclear Information System (INIS)

    D'Auvergne, Edward J.; Gooley, Paul R.

    2008-01-01

    Finding the dynamics of an entire macromolecule is a complex problem as the model-free parameter values are intricately linked to the Brownian rotational diffusion of the molecule, mathematically through the autocorrelation function of the motion and statistically through model selection. The solution to this problem was formulated using set theory as an element of the universal set U-the union of all model-free spaces (d'Auvergne EJ and Gooley PR (2007) Mol BioSyst 3(7), 483-494). The current procedure commonly used to find the universal solution is to initially estimate the diffusion tensor parameters, to optimise the model-free parameters of numerous models, and then to choose the best model via model selection. The global model is then optimised and the procedure repeated until convergence. In this paper a new methodology is presented which takes a different approach to this diffusion seeded model-free paradigm. Rather than starting with the diffusion tensor this iterative protocol begins by optimising the model-free parameters in the absence of any global model parameters, selecting between all the model-free models, and finally optimising the diffusion tensor. The new model-free optimisation protocol will be validated using synthetic data from Schurr JM et al. (1994) J Magn Reson B 105(3), 211-224 and the relaxation data of the bacteriorhodopsin (1-36)BR fragment from Orekhov VY (1999) J Biomol NMR 14(4), 345-356. To demonstrate the importance of this new procedure the NMR relaxation data of the Olfactory Marker Protein (OMP) of Gitti R et al. (2005) Biochem 44(28), 9673-9679 is reanalysed. The result is that the dynamics for certain secondary structural elements is very different from those originally reported

  11. Thanatophoric dysplasia: case-based bioethical analysis

    Directory of Open Access Journals (Sweden)

    Edgar Abarca López

    2014-04-01

    Full Text Available This paper presents a case report of thanatophoric dysplasia diagnosed in the prenatal period using ultrasound standards. The course of the pregnancy, the birth process, and the postnatal period is described. This report invites bioethical analysis using its principles, appealing to human dignity, diversity and otherness, particularly in the mother-child dyad and their family. An early diagnosis allows the parents to be supported as they face the course of this condition and its potentially fatal outcome.

  12. A Model-Free Diagnosis Approach for Intake Leakage Detection and Characterization in Diesel Engines

    Directory of Open Access Journals (Sweden)

    Ghaleb Hoblos

    2015-07-01

    Full Text Available Feature selection is an essential step for data classification used in fault detection and diagnosis processes. In this work, a new approach is proposed, which combines a feature selection algorithm and a neural network tool for leak detection and characterization tasks in diesel engine air paths. The chi-square method is used as the feature selection algorithm, and a neural network trained with Levenberg-Marquardt is used for system behavior modeling. The obtained neural network is then used for leak detection and characterization. The model is learned and validated using data generated by xMOD, and this tool is used again for testing. The effectiveness of the proposed approach is illustrated in simulation when the system operates at low speed/load and the leak affecting the air path is very small.
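
    As a hedged illustration of the same pipeline shape (not the authors' implementation, which uses a Levenberg-Marquardt-trained network and xMOD-generated data), the sketch below combines chi-square feature selection with a small scikit-learn neural network on synthetic stand-in data.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MinMaxScaler

# Synthetic stand-in for engine air-path sensor features and leak labels.
rng = np.random.default_rng(0)
X = rng.uniform(size=(1000, 20))            # 20 candidate features
y = (X[:, 3] + 0.5 * X[:, 7] + 0.1 * rng.normal(size=1000) > 1.0).astype(int)  # leak / no leak

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Chi-square scoring requires non-negative features, hence the MinMax scaling.
model = make_pipeline(
    MinMaxScaler(),
    SelectKBest(chi2, k=5),                 # keep the 5 most informative features
    MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
)
model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```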

  13. Movement Pattern Analysis Based on Sequence Signatures

    Directory of Open Access Journals (Sweden)

    Seyed Hossein Chavoshi

    2015-09-01

    Full Text Available Increased affordability and deployment of advanced tracking technologies have led researchers from various domains to analyze the resulting spatio-temporal movement data sets for the purpose of knowledge discovery. Two different approaches can be considered in the analysis of moving objects: quantitative analysis and qualitative analysis. This research focuses on the latter and uses the qualitative trajectory calculus (QTC), a type of calculus that represents qualitative data on moving point objects (MPOs), and establishes a framework to analyze the relative movement of multiple MPOs. A visualization technique called sequence signature (SESI) is used, which makes it possible to map QTC patterns in a 2D indexed rasterized space in order to evaluate the similarity of relative movement patterns of multiple MPOs. The applicability of the proposed methodology is illustrated by means of two practical examples of interacting MPOs: cars on a highway and body parts of a samba dancer. The results show that the proposed method can be effectively used to analyze interactions of multiple MPOs in different domains.

  14. Gender-Based Analysis On-Line Dialogue. Final Report.

    Science.gov (United States)

    2001

    An online dialogue on gender-based analysis (GBA) was held from February 15 to March 7, 2001. Invitations and a background paper titled "Why Gender-Based Analysis?" were sent to 350 women's organizations and individuals throughout Canada. Efforts were made to ensure that aboriginal and Metis women, visible minority women, and women with…

  15. Model-free information-theoretic approach to infer leadership in pairs of zebrafish.

    Science.gov (United States)

    Butail, Sachit; Mwaffo, Violet; Porfiri, Maurizio

    2016-04-01

    Collective behavior affords several advantages to fish in avoiding predators, foraging, mating, and swimming. Although fish schools have been traditionally considered egalitarian superorganisms, a number of empirical observations suggest the emergence of leadership in gregarious groups. Detecting and classifying leader-follower relationships is central to elucidate the behavioral and physiological causes of leadership and understand its consequences. Here, we demonstrate an information-theoretic approach to infer leadership from positional data of fish swimming. In this framework, we measure social interactions between fish pairs through the mathematical construct of transfer entropy, which quantifies the predictive power of a time series to anticipate another, possibly coupled, time series. We focus on the zebrafish model organism, which is rapidly emerging as a species of choice in preclinical research for its genetic similarity to humans and reduced neurobiological complexity with respect to mammals. To overcome experimental confounds and generate test data sets on which we can thoroughly assess our approach, we adapt and calibrate a data-driven stochastic model of zebrafish motion for the simulation of a coupled dynamical system of zebrafish pairs. In this synthetic data set, the extent and direction of the coupling between the fish are systematically varied across a wide parameter range to demonstrate the accuracy and reliability of transfer entropy in inferring leadership. Our approach is expected to aid in the analysis of collective behavior, providing a data-driven perspective to understand social interactions.
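
    A minimal plug-in (histogram-binned) transfer entropy estimator illustrates the core quantity used here; the bin count, the lag and the toy leader-follower series are assumptions, and practical analyses typically use more careful estimators.

```python
import numpy as np

def transfer_entropy(source, target, bins=8, lag=1):
    """Plug-in estimate of TE(source -> target) in bits, using binned histograms.

    Measures how much the source's past improves prediction of the target's
    next value beyond what the target's own past already provides.
    """
    x = np.digitize(target, np.histogram_bin_edges(target, bins))   # target symbols
    y = np.digitize(source, np.histogram_bin_edges(source, bins))   # source symbols
    x_next, x_past, y_past = x[lag:], x[:-lag], y[:-lag]

    def entropy(*cols):
        _, counts = np.unique(np.column_stack(cols), axis=0, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    # TE = H(x_next, x_past) + H(x_past, y_past) - H(x_next, x_past, y_past) - H(x_past)
    return (entropy(x_next, x_past) + entropy(x_past, y_past)
            - entropy(x_next, x_past, y_past) - entropy(x_past))

# Leader-follower toy example: trajectory B follows trajectory A with a one-step delay.
rng = np.random.default_rng(0)
a = np.cumsum(rng.normal(size=2000))
b = np.roll(a, 1) + 0.5 * rng.normal(size=2000)
print("TE A->B:", transfer_entropy(a, b), " TE B->A:", transfer_entropy(b, a))
```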

  16. Analysis of Financial Position Based on the Balance Sheet

    OpenAIRE

    Spineanu-Georgescu Luciana

    2011-01-01

    Analysis of financial position based on the balance sheet is mainly aimed at assessing the extent to which the financial structure chosen by the firm, namely its financial resources, covers the needs reflected in the balance sheet. This is done through a horizontal analysis of the balance sheet, which reveals financial imbalances.

  17. Desiccant-Based Preconditioning Market Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, J.

    2001-01-11

    A number of important conclusions can be drawn as a result of this broad, first-phase market evaluation. The more important conclusions include the following: (1) A very significant market opportunity will exist for specialized outdoor air-handling units (SOAHUs) as more construction and renovation projects are designed to incorporate the recommendations made by the ASHRAE 62-1989 standard. Based on this investigation, the total potential market is currently $725,000,000 annually (see Table 6, Sect. 3). Based on the market evaluations completed, it is estimated that approximately $398,000,000 (55%) of this total market could be served by DBC systems if they were made cost-effective through mass production. Approximately $306,000,000 (42%) of the total can be served by a non-regenerated, desiccant-based total recovery approach, based on the information provided by this investigation. Approximately $92,000,000 (13%) can be served by a regenerated desiccant-based cooling approach (see Table 7, Sect. 3). (2) A projection of the market selling price of various desiccant-based SOAHU systems was prepared using prices provided by Trane for central-station, air-handling modules currently manufactured. The wheel-component pricing was added to these components by SEMCO. This resulted in projected pricing for these systems that is significantly less than that currently offered by custom suppliers (see Table 4, Sect. 2). Estimated payback periods for all SOAHU approaches were quite short when compared with conventional over-cooling and reheat systems. Actual paybacks may vary significantly depending on site-specific considerations. (3) In comparing cost vs benefit of each SOAHU approach, it is critical that the total system design be evaluated. For example, the cost premium of a DBC system is very significant when compared to a conventional air handling system, yet the reduced chiller, boiler, cooling tower, and other expense often equals or exceeds this premium, resulting in a

  18. Prostate cancer detection from model-free T1-weighted time series and diffusion imaging

    Science.gov (United States)

    Haq, Nandinee F.; Kozlowski, Piotr; Jones, Edward C.; Chang, Silvia D.; Goldenberg, S. Larry; Moradi, Mehdi

    2015-03-01

    The combination of Dynamic Contrast Enhanced (DCE) images with diffusion MRI has shown great potential in prostate cancer detection. The parameterization of DCE images to generate cancer markers is traditionally performed based on pharmacokinetic modeling. However, pharmacokinetic models make simplistic assumptions about the tissue perfusion process, require the knowledge of contrast agent concentration in a major artery, and the modeling process is sensitive to noise and fitting instabilities. We address this issue by extracting features directly from the DCE T1-weighted time course without modeling. In this work, we employed a set of data-driven features generated by mapping the DCE T1 time course to its principal component space, along with diffusion MRI features to detect prostate cancer. The optimal set of DCE features is extracted with sparse regularized regression through a Least Absolute Shrinkage and Selection Operator (LASSO) model. We show that when our proposed features are used within the multiparametric MRI protocol to replace the pharmacokinetic parameters, the area under ROC curve is 0.91 for peripheral zone classification and 0.87 for whole gland classification. We were able to correctly classify 32 out of 35 peripheral tumor areas identified in the data when the proposed features were used with support vector machine classification. The proposed feature set was used to generate cancer likelihood maps for the prostate gland.
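
    As a sketch of the pipeline shape only (the array names, dimensions and synthetic data are assumptions, not the study's data), one could project per-voxel DCE T1-weighted time courses onto principal components, select components with a LASSO-regularised fit, and classify with a support vector machine:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LassoCV
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

# Stand-ins: per-voxel DCE T1-weighted time courses (60 time points),
# a diffusion feature (e.g. ADC), and cancer / benign labels.
rng = np.random.default_rng(0)
n_voxels, n_timepoints = 400, 60
dce = rng.normal(size=(n_voxels, n_timepoints))
adc = rng.normal(size=(n_voxels, 1))
labels = rng.integers(0, 2, size=n_voxels)

# Model-free DCE features: projection onto the principal components of the time course.
dce_pcs = PCA(n_components=10).fit_transform(dce)

# Sparse (LASSO-regularised) selection of the informative principal components.
lasso = LassoCV(cv=5).fit(StandardScaler().fit_transform(dce_pcs), labels)
selected = np.flatnonzero(np.abs(lasso.coef_) > 1e-6)
if selected.size == 0:                     # fall back to all PCs if LASSO drops everything
    selected = np.arange(dce_pcs.shape[1])

# Combine the selected DCE features with the diffusion feature and classify with an SVM.
features = np.hstack([dce_pcs[:, selected], adc])
scores = cross_val_score(SVC(kernel="rbf", C=1.0),
                         StandardScaler().fit_transform(features), labels, cv=5)
print("cross-validated accuracy:", scores.mean())
```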

  19. Model-Free Machine Learning in Biomedicine: Feasibility Study in Type 1 Diabetes

    Science.gov (United States)

    Daskalaki, Elena; Diem, Peter; Mougiakakou, Stavroula G.

    2016-01-01

    Although reinforcement learning (RL) is suitable for highly uncertain systems, the applicability of this class of algorithms to medical treatment may be limited by the patient variability which dictates individualised tuning for their usually multiple algorithmic parameters. This study explores the feasibility of RL in the framework of artificial pancreas development for type 1 diabetes (T1D). In this approach, an Actor-Critic (AC) learning algorithm is designed and developed for the optimisation of insulin infusion for personalised glucose regulation. AC optimises the daily basal insulin rate and insulin:carbohydrate ratio for each patient, on the basis of his/her measured glucose profile. Automatic, personalised tuning of AC is based on the estimation of information transfer (IT) from insulin to glucose signals. Insulin-to-glucose IT is linked to patient-specific characteristics related to total daily insulin needs and insulin sensitivity (SI). The AC algorithm is evaluated using an FDA-accepted T1D simulator on a large patient database under a complex meal protocol, meal uncertainty and diurnal SI variation. The results showed that 95.66% of time was spent in normoglycaemia in the presence of meal uncertainty and 93.02% when meal uncertainty and SI variation were simultaneously considered. The time spent in hypoglycaemia was 0.27% in both cases. The novel tuning method reduced the risk of severe hypoglycaemia, especially in patients with low SI. PMID:27441367

  20. Model-Free Machine Learning in Biomedicine: Feasibility Study in Type 1 Diabetes.

    Directory of Open Access Journals (Sweden)

    Elena Daskalaki

    Full Text Available Although reinforcement learning (RL is suitable for highly uncertain systems, the applicability of this class of algorithms to medical treatment may be limited by the patient variability which dictates individualised tuning for their usually multiple algorithmic parameters. This study explores the feasibility of RL in the framework of artificial pancreas development for type 1 diabetes (T1D. In this approach, an Actor-Critic (AC learning algorithm is designed and developed for the optimisation of insulin infusion for personalised glucose regulation. AC optimises the daily basal insulin rate and insulin:carbohydrate ratio for each patient, on the basis of his/her measured glucose profile. Automatic, personalised tuning of AC is based on the estimation of information transfer (IT from insulin to glucose signals. Insulin-to-glucose IT is linked to patient-specific characteristics related to total daily insulin needs and insulin sensitivity (SI. The AC algorithm is evaluated using an FDA-accepted T1D simulator on a large patient database under a complex meal protocol, meal uncertainty and diurnal SI variation. The results showed that 95.66% of time was spent in normoglycaemia in the presence of meal uncertainty and 93.02% when meal uncertainty and SI variation were simultaneously considered. The time spent in hypoglycaemia was 0.27% in both cases. The novel tuning method reduced the risk of severe hypoglycaemia, especially in patients with low SI.

  1. Knowledge-based analysis of phenotypes

    KAUST Repository

    Hoendorf, Robert

    2016-01-27

    Phenotypes are the observable characteristics of an organism, and they are widely recorded in biology and medicine. To facilitate data integration, ontologies that formally describe phenotypes are being developed in several domains. I will describe a formal framework to describe phenotypes. A formalized theory of phenotypes is not only useful for domain analysis, but can also be applied to assist in the diagnosis of rare genetic diseases, and I will show how our results on the ontology of phenotypes are now applied in biomedical research.

  2. Model Based Analysis of Insider Threats

    DEFF Research Database (Denmark)

    Chen, Taolue; Han, Tingting; Kammueller, Florian

    2016-01-01

    In order to detect malicious insider attacks it is important to model and analyse infrastructures and policies of organisations and the insiders acting within them. We extend formal approaches that allow modelling such scenarios by quantitative aspects to enable a precise analysis of security...... designs. Our framework enables evaluating the risks of an insider attack to happen quantitatively. The framework first identifies an insider's intention to perform an inside attack, using Bayesian networks, and in a second phase computes the probability of success for an inside attack by this actor, using...

  3. Fault Analysis-based Logic Encryption (Preprint)

    Science.gov (United States)

    2013-11-01

    publication of this paper. This material is based on work funded by AFRL under contract No. FA8750-11-2-0274. Received and cleared for public release by...AFRL on November 19, 2012, case number 88ABW-2012-6072. Any opinions, findings and conclusions or recommendations expressed in this material are...those of the authors and do not necessarily reflect the views of AFRL or its contractors. 10 REFERENCES [1] KPMG. (2006) Managing the risks of

  4. Discretization analysis of bifurcation based nonlinear amplifiers

    Science.gov (United States)

    Feldkord, Sven; Reit, Marco; Mathis, Wolfgang

    2017-09-01

    Recently, for modeling biological amplification processes, nonlinear amplifiers based on the supercritical Andronov-Hopf bifurcation have been widely analyzed analytically. For technical realizations, digital systems have become the most relevant systems in signal processing applications. The underlying continuous-time systems are transferred to the discrete-time domain using numerical integration methods. Within this contribution, effects on the qualitative behavior of the Andronov-Hopf bifurcation based systems concerning numerical integration methods are analyzed. It is shown exemplarily that explicit Runge-Kutta methods transform the truncated normalform equation of the Andronov-Hopf bifurcation into the normalform equation of the Neimark-Sacker bifurcation. Dependent on the order of the integration method, higher order terms are added during this transformation. A rescaled normalform equation of the Neimark-Sacker bifurcation is introduced that allows a parametric design of a discrete-time system which corresponds to the rescaled Andronov-Hopf system. This system approximates the characteristics of the rescaled Hopf-type amplifier for a large range of parameters. The natural frequency and the peak amplitude are preserved for every set of parameters. The Neimark-Sacker bifurcation based systems avoid large computational effort that would be caused by applying higher order integration methods to the continuous-time normalform equations.

  5. Confidence-Based Learning in Investment Analysis

    Science.gov (United States)

    Serradell-Lopez, Enric; Lara-Navarra, Pablo; Castillo-Merino, David; González-González, Inés

    The aim of this study is to determine the effectiveness of using multiple choice tests in subjects related to administration and business management. To this end we used a multiple-choice test with specific questions to verify the extent of knowledge gained and the confidence and trust in the answers. The tests were performed in a group of 200 students in the bachelor's degree in Business Administration and Management. The analysis was implemented in one subject within the scope of investment analysis and measured the level of knowledge gained and the degree of trust and security in the responses at two different times of the course. The measurements took into account different levels of difficulty in the questions asked and the time spent by students to complete the test. The results confirm that students are generally able to gain more knowledge along the way and show increases in the degree of trust and confidence in their answers. It is also confirmed that the difficulty level of the questions, set a priori by those responsible for the subjects, is related to the levels of security and confidence in the answers. It is estimated that the improvement in the skills learned is viewed favourably by businesses and is especially important for the job placement of students.

  6. Model-free aftershock forecasts constructed from similar sequences in the past

    Science.gov (United States)

    van der Elst, N.; Page, M. T.

    2017-12-01

    The basic premise behind aftershock forecasting is that sequences in the future will be similar to those in the past. Forecast models typically use empirically tuned parametric distributions to approximate past sequences, and project those distributions into the future to make a forecast. While parametric models do a good job of describing average outcomes, they are not explicitly designed to capture the full range of variability between sequences, and can suffer from over-tuning of the parameters. In particular, parametric forecasts may produce a high rate of "surprises" - sequences that land outside the forecast range. Here we present a non-parametric forecast method that cuts out the parametric "middleman" between training data and forecast. The method is based on finding past sequences that are similar to the target sequence, and evaluating their outcomes. We quantify similarity as the Poisson probability that the observed event count in a past sequence reflects the same underlying intensity as the observed event count in the target sequence. Event counts are defined in terms of differential magnitude relative to the mainshock. The forecast is then constructed from the distribution of past sequences outcomes, weighted by their similarity. We compare the similarity forecast with the Reasenberg and Jones (RJ95) method, for a set of 2807 global aftershock sequences of M≥6 mainshocks. We implement a sequence-specific RJ95 forecast using a global average prior and Bayesian updating, but do not propagate epistemic uncertainty. The RJ95 forecast is somewhat more precise than the similarity forecast: 90% of observed sequences fall within a factor of two of the median RJ95 forecast value, whereas the fraction is 85% for the similarity forecast. However, the surprise rate is much higher for the RJ95 forecast; 10% of observed sequences fall in the upper 2.5% of the (Poissonian) forecast range. The surprise rate is less than 3% for the similarity forecast. The similarity
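
    A hedged sketch of the similarity-weighting idea (the count windows and the exact probability used by the authors are not reproduced here): weight each past sequence by the Poisson probability of its early aftershock count under a rate set by the target sequence's count, then read forecast quantiles off the weighted distribution of later counts. Uses SciPy; all numbers are synthetic.

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(2)
# Hypothetical training data: (early count, later count) per past sequence.
early_counts = rng.poisson(5, size=300)
later_counts = rng.poisson(8, size=300)

target_early = 7            # observed early count in the ongoing target sequence

# Similarity weight: Poisson pmf of the past early count under a rate equal to
# the target's early count (one simple proxy for "same underlying intensity").
weights = poisson.pmf(early_counts, mu=max(target_early, 0.5))
weights /= weights.sum()

# Weighted empirical forecast distribution for the later count.
order = np.argsort(later_counts)
cdf = np.cumsum(weights[order])
median = later_counts[order][np.searchsorted(cdf, 0.5)]
p95 = later_counts[order][np.searchsorted(cdf, 0.95)]
print(f"forecast median = {median}, 95th percentile = {p95}")
```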

  7. A Goal based methodology for HAZOP analysis

    DEFF Research Database (Denmark)

    Rossing, Netta Liin; Lind, Morten; Jensen, Niels

    2010-01-01

    to nodes with simple functions such as liquid transport, gas transport, liquid storage, gas-liquid contacting etc. From the functions of the nodes the selection of relevant process variables and deviation variables follows directly. The knowledge required to perform the pre-meeting HAZOP task of dividing...... the plant along functional lines is that of chemical unit operations and transport processes plus some familiarity with the plant at hand. Thus the preparatory work may be performed by a chemical engineer with just an introductory course in risk assessment. The goal based methodology lends itself directly......

  8. LEARNING DIFFICULTIES: AN ANALYSIS BASED ON VIGOTSKY

    Directory of Open Access Journals (Sweden)

    Adriane Cenci

    2010-06-01

    Full Text Available We aimed, along the text, to bring a reflection upon learning difficulties based on Socio-Historical Theory, relating what is observed in schools to what has been discussed about learning difficulties and the theory proposed by Vygotsky in the early XX century. We understand that children enter school carrying experiences and knowledge from their cultural group and that school ignores such knowledge very often. Then, it is in such disengagement that emerges what we started to call learning difficulties. One cannot forget to see a child as a whole – a student is a social being constituted by culture, language and specific values to which one must be attentive.

  9. Network Anomaly Detection Based on Wavelet Analysis

    Directory of Open Access Journals (Sweden)

    Ali A. Ghorbani

    2008-11-01

    Full Text Available Signal processing techniques have been applied recently for analyzing and detecting network anomalies due to their potential to find novel or unknown intrusions. In this paper, we propose a new network signal modelling technique for detecting network anomalies, combining the wavelet approximation and system identification theory. In order to characterize network traffic behaviors, we present fifteen features and use them as the input signals in our system. We then evaluate our approach with the 1999 DARPA intrusion detection dataset and conduct a comprehensive analysis of the intrusions in the dataset. Evaluation results show that the approach achieves high detection rates in terms of both attack instances and attack types. Furthermore, we conduct a full day's evaluation in a real large-scale WiFi ISP network where five attack types are successfully detected from over 30 million flows.

  10. Sandia National Laboratories analysis code data base

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, C.W.

    1994-11-01

    Sandia National Laboratories' mission is to solve important problems in the areas of national defense, energy security, environmental integrity, and industrial technology. The Laboratories' strategy for accomplishing this mission is to conduct research to provide an understanding of the important physical phenomena underlying any problem, and then to construct validated computational models of the phenomena which can be used as tools to solve the problem. In the course of implementing this strategy, Sandia's technical staff has produced a wide variety of numerical problem-solving tools which they use regularly in the design, analysis, performance prediction, and optimization of Sandia components, systems and manufacturing processes. This report provides the relevant technical and accessibility data on the numerical codes used at Sandia, including information on the technical competency or capability area that each code addresses, code 'ownership' and release status, and references describing the physical models and numerical implementation.

  11. PUBLIC DEBT ANALYSIS BASED ON SUSTAINABILITY INDICATORS

    Directory of Open Access Journals (Sweden)

    Elena DASCALU

    2016-09-01

    Full Text Available This article is an analysis of public debt, in terms of sustainability and vulnerability indicators, under a functioning market economy. The problems encountered regarding the high level of public debt or the potential risks of budgetary pressure converge to the idea that sustainability of public finances should be a major challenge for public policy. Thus, the policy adequate to address public finance sustainability must have as its starting point the overall strategy of the European Union, as well as the economic development of Member States, focusing on the most important performance components, namely, reducing public debt levels, increasing productivity and employment and, last but not the least, reforming social security systems. In order to achieve sustainable levels of public debt, the European Union Member States are required to establish and accomplish medium term strategic budgetary goals to ensure a downward trend in public debt.

  12. Sandia National Laboratories analysis code data base

    Science.gov (United States)

    Peterson, C. W.

    1994-11-01

    Sandia National Laboratories' mission is to solve important problems in the areas of national defense, energy security, environmental integrity, and industrial technology. The laboratories' strategy for accomplishing this mission is to conduct research to provide an understanding of the important physical phenomena underlying any problem, and then to construct validated computational models of the phenomena which can be used as tools to solve the problem. In the course of implementing this strategy, Sandia's technical staff has produced a wide variety of numerical problem-solving tools which they use regularly in the design, analysis, performance prediction, and optimization of Sandia components, systems, and manufacturing processes. This report provides the relevant technical and accessibility data on the numerical codes used at Sandia, including information on the technical competency or capability area that each code addresses, code 'ownership' and release status, and references describing the physical models and numerical implementation.

  13. Network Anomaly Detection Based on Wavelet Analysis

    Science.gov (United States)

    Lu, Wei; Ghorbani, Ali A.

    2008-12-01

    Signal processing techniques have been applied recently for analyzing and detecting network anomalies due to their potential to find novel or unknown intrusions. In this paper, we propose a new network signal modelling technique for detecting network anomalies, combining the wavelet approximation and system identification theory. In order to characterize network traffic behaviors, we present fifteen features and use them as the input signals in our system. We then evaluate our approach with the 1999 DARPA intrusion detection dataset and conduct a comprehensive analysis of the intrusions in the dataset. Evaluation results show that the approach achieves high detection rates in terms of both attack instances and attack types. Furthermore, we conduct a full day's evaluation in a real large-scale WiFi ISP network where five attack types are successfully detected from over 30 million flows.

  14. IMAGE ANALYSIS BASED ON EDGE DETECTION TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    A method that incorporates edge detection, Markov Random Fields (MRF), watershed segmentation and merging techniques was presented for performing image segmentation and edge detection tasks. It first applies an edge detection technique to obtain a Difference In Strength (DIS) map. An initial segmented result is obtained based on K-means clustering and the minimum distance. The region process is then modeled by an MRF to obtain an image that contains different intensity regions. The gradient values are calculated and the watershed technique is applied. The DIS value is calculated for each pixel to define all the edges (weak or strong) in the image, yielding the DIS map. This serves as prior knowledge about likely region boundaries for the next step (MRF), which gives an image that contains all the edge and region information. In the MRF model, the gray level l at pixel location i in an image X depends on the gray levels of neighboring pixels. The segmentation results are improved by using the watershed algorithm. After all pixels of the segmented regions are processed, a map of primitive regions with edges is generated. The edge map is obtained using a merge process based on averaged intensity mean values. Common edge detectors that work on the MRF-segmented image are used and the results are compared. The segmentation and edge detection result is one closed boundary per actual region in the image.

  15. Platoon Dispersion Analysis Based on Diffusion Theory

    Directory of Open Access Journals (Sweden)

    Badhrudeen Mohamed

    2017-01-01

    Full Text Available Urbanization and growing demand for travel cause the traffic system to work ineffectively in most urban areas, leading to traffic congestion. Many approaches have been adopted to address this problem, one among them being signal coordination. This can be achieved if the platoon of vehicles that gets discharged at one signal gets green at consecutive signals with minimal delay. However, platoons tend to get dispersed as they travel, and this dispersion phenomenon should be taken into account for effective signal coordination. Reported studies in this area are from homogeneous and lane-disciplined traffic conditions. This paper analyses the platoon dispersion characteristics under heterogeneous and lane-less traffic conditions. Out of the various modeling techniques reported, the approach based on diffusion theory is used in this study. The diffusion theory based models so far assumed the data to follow a normal distribution. However, in the present study, the data were found to follow a lognormal distribution and hence the implementation was carried out using the lognormal distribution. The parameters of the lognormal distribution were calibrated for the study condition. For comparison purposes, the normal distribution was also calibrated and the results were evaluated. It was found that the model with the lognormal distribution performed better in all cases than the one with the normal distribution.
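
    An illustrative sketch of the diffusion-theory idea under assumed parameters (not the paper's calibration): an upstream discharge profile is dispersed to a downstream section by convolving it with a lognormal travel-time distribution, the distribution the study found to fit heterogeneous, lane-less traffic better than the normal. Uses SciPy.

```python
import numpy as np
from scipy.stats import lognorm

dt = 1.0                                     # time step [s]
upstream = np.zeros(120)
upstream[10:40] = 1.0                        # platoon discharged during green [veh/s]

# Lognormal travel time to the downstream stop line (hypothetical parameters).
mean_tt, cv = 30.0, 0.3                      # mean travel time [s], coefficient of variation
sigma = np.sqrt(np.log(1 + cv**2))
mu = np.log(mean_tt) - 0.5 * sigma**2
t = np.arange(0, 90, dt)
kernel = lognorm.pdf(t, s=sigma, scale=np.exp(mu)) * dt
kernel /= kernel.sum()                       # discrete travel-time probabilities

downstream = np.convolve(upstream, kernel)[: len(upstream)]
print("upstream peak:", upstream.max(), "downstream peak:", round(downstream.max(), 3))
```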

  16. Advanced overlay analysis through design based metrology

    Science.gov (United States)

    Ji, Sunkeun; Yoo, Gyun; Jo, Gyoyeon; Kang, Hyunwoo; Park, Minwoo; Kim, Jungchan; Park, Chanha; Yang, Hyunjo; Yim, Donggyu; Maruyama, Kotaro; Park, Byungjun; Yamamoto, Masahiro

    2015-03-01

    As design rules shrink, overlay has become a critical factor for semiconductor manufacturing. However, the overlay error which is determined by a conventional measurement with an overlay mark based on IBO and DBO often does not represent the physical placement error in the cell area. The mismatch may arise from the size or pitch difference between the overlay mark and the cell pattern. Pattern distortion caused by etching or CMP also can be a source of the mismatch. In 2014, we demonstrated that the method of overlay measurement in the cell area by using a DBM (Design Based Metrology) tool gives a more accurate overlay value than the conventional method using an overlay mark. We have verified the reproducibility by measuring repeatable patterns in the cell area, and also demonstrated the reliability by comparing with CD-SEM data. We have focused on overlay mismatch between the overlay mark and the cell area until now; furthermore, we are now concerned with cell areas having different pattern density and etch loading. A phenomenon appears in which cells with diverse patterning environments show different overlay values. In this paper, the overlay error was investigated from cell edge to center. For this experiment, we have verified several critical layers in DRAM by using an improved DBM tool (better resolution and speed), NGR3520.

  17. The Route Analysis Based On Flight Plan

    Science.gov (United States)

    Feriyanto, Nur; Saleh, Chairul; Fauzi, Achmad; Rachman Dzakiyullah, Nur; Riza Iwaputra, Kahfi

    2016-02-01

    Economic development affects the use of air transportation since business processes in every aspect have increased. Many people these days prefer using airplanes because they can save time and money. This situation also affects flight routes; many airlines offer new routes to deal with competition. Managing flight routes is one of the problems that must be faced in order to find efficient and effective routes. This paper investigates the best routes based on flight performance by determining the amount of block fuel for the Jakarta-Denpasar flight route. Moreover, this work compares two kinds of aircraft and tracks by calculating flight distance, flight time and block fuel. The result shows that Jakarta-Denpasar on Track II has the most effective and efficient block fuel, which can be achieved by the Airbus 320-200 aircraft. This study can contribute to practice in making effective decisions, especially helping the executive management of the company in selecting the appropriate aircraft and track in the flight plan based on block fuel consumption for business operation.

  18. α-Cut method based importance measure for criticality analysis in fuzzy probability – Based fault tree analysis

    International Nuclear Information System (INIS)

    Purba, Julwan Hendry; Sony Tjahyani, D.T.; Widodo, Surip; Tjahjono, Hendro

    2017-01-01

    Highlights: •FPFTA deals with epistemic uncertainty using fuzzy probability. •Criticality analysis is important for reliability improvement. •An α-cut method based importance measure is proposed for criticality analysis in FPFTA. •The α-cut method based importance measure utilises α-cut multiplication, α-cut subtraction, and area defuzzification technique. •Benchmarking confirm that the proposed method is feasible for criticality analysis in FPFTA. -- Abstract: Fuzzy probability – based fault tree analysis (FPFTA) has been recently developed and proposed to deal with the limitations of conventional fault tree analysis. In FPFTA, reliabilities of basic events, intermediate events and top event are characterized by fuzzy probabilities. Furthermore, the quantification of the FPFTA is based on fuzzy multiplication rule and fuzzy complementation rule to propagate uncertainties from basic event to the top event. Since the objective of the fault tree analysis is to improve the reliability of the system being evaluated, it is necessary to find the weakest path in the system. For this purpose, criticality analysis can be implemented. Various importance measures, which are based on conventional probabilities, have been developed and proposed for criticality analysis in fault tree analysis. However, not one of those importance measures can be applied for criticality analysis in FPFTA, which is based on fuzzy probability. To be fully applied in nuclear power plant probabilistic safety assessment, FPFTA needs to have its corresponding importance measure. The objective of this study is to develop an α-cut method based importance measure to evaluate and rank the importance of basic events for criticality analysis in FPFTA. To demonstrate the applicability of the proposed measure, a case study is performed and its results are then benchmarked to the results generated by the four well known importance measures in conventional fault tree analysis. The results
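
    The fuzzy-arithmetic ingredients named above (α-cuts of triangular fuzzy probabilities, the complementation rule behind an OR gate, area-style defuzzification) can be illustrated as follows; the event numbers and the crude ranking rule are invented for illustration and are not the paper's importance measure.

```python
import numpy as np

ALPHAS = np.linspace(0.0, 1.0, 11)

def alpha_cut(tri, a):
    """Interval [lo, hi] of a triangular fuzzy number (l, m, u) at level a."""
    l, m, u = tri
    return l + a * (m - l), u - a * (u - m)

def or_gate(tris, a):
    """Alpha-cut of an OR-gate output: 1 minus the product of complements."""
    prod_lo, prod_hi = 1.0, 1.0
    for tri in tris:
        lo, hi = alpha_cut(tri, a)
        prod_lo *= (1.0 - lo)
        prod_hi *= (1.0 - hi)
    return 1.0 - prod_lo, 1.0 - prod_hi

def defuzzify(cuts):
    """Area-style defuzzification: mean of the interval midpoints over alpha levels."""
    return float(np.mean([(lo + hi) / 2.0 for lo, hi in cuts]))

events = {"E1": (0.01, 0.02, 0.04), "E2": (0.05, 0.10, 0.20)}  # hypothetical basic events
top = defuzzify([or_gate(events.values(), a) for a in ALPHAS])
print("defuzzified top-event probability:", round(top, 4))

# Crude criticality ranking: drop in the defuzzified top-event probability when
# one basic event is treated as perfectly reliable.
for name in events:
    others = [tri for key, tri in events.items() if key != name]
    reduced = defuzzify([or_gate(others, a) for a in ALPHAS])
    print(name, "importance ~", round(top - reduced, 4))
```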

  19. Is facet analysis based on rationalism?

    DEFF Research Database (Denmark)

    Hjørland, Birger

    2014-01-01

    In several writings I have claimed that the basis of knowledge organisation (KO) must be found in subject knowledge, and that researchers and practitioners in KO must achieve knowledge about the domains that they are organising. Domain knowledge is not neutral, but rather is based on competing...... epistemologies and worldviews, and the classifier is therefore participating in struggles related to worldviews. Different traditions, approaches and paradigms in knowledge organisation research (and practice) can best be understood as more or less associated with one of four epistemologies: empiricism......, rationalism, historicism/hermeneutics, or pragmatism/critical theory (of which only the last position fully acknowledges the non-neutrality of knowledge organisation). Ranganathan – and the whole facet-analytic school – has formerly been exemplified as a rather clear example of rationalism. Some have objected...

  20. Image based SAR product simulation for analysis

    Science.gov (United States)

    Domik, G.; Leberl, F.

    1987-01-01

    SAR product simulation serves to predict SAR image gray values for various flight paths. Input typically consists of a digital elevation model and backscatter curves. A new method of product simulation is described that also employs a real SAR input image for image simulation. This can be denoted as 'image-based simulation'. Different methods to perform this SAR prediction are presented and advantages and disadvantages discussed. Ascending and descending orbit images from NASA's SIR-B experiment were used for verification of the concept: input images from ascending orbits were converted into images from a descending orbit; the results are compared to the available real imagery to verify that the prediction technique produces meaningful image data.

  1. Analysis of Vehicle-Based Security Operations

    Energy Technology Data Exchange (ETDEWEB)

    Carter, Jason M [ORNL; Paul, Nate R [ORNL

    2015-01-01

    Vehicle-to-vehicle (V2V) communications promises to increase roadway safety by providing each vehicle with 360 degree situational awareness of other vehicles in proximity, and by complementing onboard sensors such as radar or camera in detecting imminent crash scenarios. In the United States, approximately three hundred million automobiles could participate in a fully deployed V2V system if Dedicated Short-Range Communication (DSRC) device use becomes mandatory. The system's reliance on continuous communication, however, provides a potential means for unscrupulous persons to transmit false data in an attempt to cause crashes, create traffic congestion, or simply render the system useless. V2V communications must be highly scalable while retaining robust security and privacy preserving features to meet the intra-vehicle and vehicle-to-infrastructure communication requirements for a growing vehicle population. Oak Ridge National Laboratory is investigating a Vehicle-Based Security System (VBSS) to provide security and privacy for a fully deployed V2V and V2I system. In the VBSS an On-board Unit (OBU) generates short-term certificates and signs Basic Safety Messages (BSM) to preserve privacy and enhance security. This work outlines a potential VBSS structure and its operational concepts; it examines how a vehicle-based system might feasibly provide security and privacy, highlights remaining challenges, and explores potential mitigations to address those challenges. Certificate management alternatives that attempt to meet V2V security and privacy requirements have been examined previously by the research community including privacy-preserving group certificates, shared certificates, and functional encryption. Due to real-world operational constraints, adopting one of these approaches for VBSS V2V communication is difficult. Timely misbehavior detection and revocation are still open problems for any V2V system. We explore the alternative approaches that may be

  2. Simulation based analysis of laser beam brazing

    Science.gov (United States)

    Dobler, Michael; Wiethop, Philipp; Schmid, Daniel; Schmidt, Michael

    2016-03-01

    Laser beam brazing is a well-established joining technology in car body manufacturing with main applications in the joining of divided tailgates and the joining of roof and side panels. A key advantage of laser brazed joints is the seam's visual quality which satisfies highest requirements. However, the laser beam brazing process is very complex and process dynamics are only partially understood. In order to gain deeper knowledge of the laser beam brazing process, to determine optimal process parameters and to test process variants, a transient three-dimensional simulation model of laser beam brazing is developed. This model takes into account energy input, heat transfer as well as fluid and wetting dynamics that lead to the formation of the brazing seam. A validation of the simulation model is performed by metallographic analysis and thermocouple measurements for different parameter sets of the brazing process. These results show that the multi-physical simulation model not only can be used to gain insight into the laser brazing process but also offers the possibility of process optimization in industrial applications. The model's capabilities in determining optimal process parameters are exemplarily shown for the laser power. Small deviations in the energy input can affect the brazing results significantly. Therefore, the simulation model is used to analyze the effect of the lateral laser beam position on the energy input and the resulting brazing seam.

  3. An SQL-based approach to physics analysis

    International Nuclear Information System (INIS)

    Limper, Dr Maaike

    2014-01-01

    As part of the CERN openlab collaboration a study was made into the possibility of performing analysis of the data collected by the experiments at the Large Hadron Collider (LHC) through SQL-queries on data stored in a relational database. Currently LHC physics analysis is done using data stored in centrally produced 'ROOT-ntuple' files that are distributed through the LHC computing grid. The SQL-based approach to LHC physics analysis presented in this paper allows calculations in the analysis to be done at the database and can make use of the database's in-built parallelism features. Using this approach it was possible to reproduce results for several physics analysis benchmarks. The study shows the capability of the database to handle complex analysis tasks but also illustrates the limits of using row-based storage for storing physics analysis data, as performance was limited by the I/O read speed of the system.

  4. Model Based Analysis and Test Generation for Flight Software

    Science.gov (United States)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

    We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  5. Indoor air quality analysis based on Hadoop

    International Nuclear Information System (INIS)

    Tuo, Wang; Liang, Yu; Weihong, Cui; Yunhua, Sun; Song, Tian

    2014-01-01

    The air of the office environment is our research object. The data of temperature, humidity, and concentrations of carbon dioxide, carbon monoxide and ammonia are collected every one to eight seconds by the sensor monitoring system, and all the data are stored in the HBase database of the Hadoop platform. With the help of HBase's column-oriented storage and versioning features (the time column is added automatically), the time-series data sets are built based on the primary key Row-key and the timestamp. The parallel computing programming model MapReduce is used to process the millions of data points collected by the sensors. By analysing the changing trend of the parameters' values at different times of the same day and at the same time on various dates, the impact of the human factor and other factors on the room microenvironment is assessed according to the mobility of the office staff. Moreover, an effective way to improve indoor air quality is proposed at the end of this paper

  6. Indoor air quality analysis based on Hadoop

    Science.gov (United States)

    Tuo, Wang; Yunhua, Sun; Song, Tian; Liang, Yu; Weihong, Cui

    2014-03-01

    The air of the office environment is our research object. The data of temperature, humidity, and concentrations of carbon dioxide, carbon monoxide and ammonia are collected every one to eight seconds by the sensor monitoring system, and all the data are stored in the HBase database of the Hadoop platform. With the help of HBase's column-oriented storage and versioning features (the time column is added automatically), the time-series data sets are built based on the primary key Row-key and the timestamp. The parallel computing programming model MapReduce is used to process the millions of data points collected by the sensors. By analysing the changing trend of the parameters' values at different times of the same day and at the same time on various dates, the impact of the human factor and other factors on the room microenvironment is assessed according to the mobility of the office staff. Moreover, an effective way to improve indoor air quality is proposed at the end of this paper.

  7. Analysis of composition-based metagenomic classification.

    Science.gov (United States)

    Higashi, Susan; Barreto, André da Motta Salles; Cantão, Maurício Egidio; de Vasconcelos, Ana Tereza Ribeiro

    2012-01-01

    An essential step of a metagenomic study is the taxonomic classification, that is, the identification of the taxonomic lineage of the organisms in a given sample. The taxonomic classification process involves a series of decisions. Currently, in the context of metagenomics, such decisions are usually based on empirical studies that consider one specific type of classifier. In this study we propose a general framework for analyzing the impact that several decisions can have on the classification problem. Instead of focusing on any specific classifier, we define a generic score function that provides a measure of the difficulty of the classification task. Using this framework, we analyze the impact of the following parameters on the taxonomic classification problem: (i) the length of n-mers used to encode the metagenomic sequences, (ii) the similarity measure used to compare sequences, and (iii) the type of taxonomic classification, which can be conventional or hierarchical, depending on whether the classification process occurs in a single shot or in several steps according to the taxonomic tree. We defined a score function that measures the degree of separability of the taxonomic classes under a given configuration induced by the parameters above. We conducted an extensive computational experiment and found out that reasonable values for the parameters of interest could be (i) intermediate values of n, the length of the n-mers; (ii) any similarity measure, because all of them resulted in similar scores; and (iii) the hierarchical strategy, which performed better in all of the cases. As expected, short n-mers generate lower configuration scores because they give rise to frequency vectors that represent distinct sequences in a similar way. On the other hand, large values for n result in sparse frequency vectors that represent differently metagenomic fragments that are in fact similar, also leading to low configuration scores. Regarding the similarity measure, in
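
    A toy sketch of the composition-based encoding discussed in this record: each fragment is mapped to an n-mer frequency vector, after which any of the similarity measures considered can compare fragments or class representatives (cosine similarity is used here). The sequences below are synthetic.

```python
from itertools import product
import numpy as np

def nmer_profile(seq, n=3):
    """Normalized frequency vector over all 4**n DNA n-mers."""
    index = {"".join(p): i for i, p in enumerate(product("ACGT", repeat=n))}
    counts = np.zeros(len(index))
    for i in range(len(seq) - n + 1):
        kmer = seq[i : i + n]
        if kmer in index:                 # skip n-mers containing N, etc.
            counts[index[kmer]] += 1
    total = counts.sum()
    return counts / total if total else counts

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

frag_a = "ATGCGTATGCGTACGTTAGC" * 5
frag_b = "ATGCGTATGCGAACGTTAGC" * 5
frag_c = "GGGGCCCCGGGGCCCCGGGG" * 5
print("a~b:", round(cosine(nmer_profile(frag_a), nmer_profile(frag_b)), 3))
print("a~c:", round(cosine(nmer_profile(frag_a), nmer_profile(frag_c)), 3))
```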

  8. Model-Based Reasoning in Humans Becomes Automatic with Training.

    Directory of Open Access Journals (Sweden)

    Marcos Economides

    2015-09-01

    Full Text Available Model-based and model-free reinforcement learning (RL) have been suggested as algorithmic realizations of goal-directed and habitual action strategies. Model-based RL is more flexible than model-free but requires sophisticated calculations using a learnt model of the world. This has led model-based RL to be identified with slow, deliberative processing, and model-free RL with fast, automatic processing. In support of this distinction, it has recently been shown that model-based reasoning is impaired by placing subjects under cognitive load--a hallmark of non-automaticity. Here, using the same task, we show that cognitive load does not impair model-based reasoning if subjects receive prior training on the task. This finding is replicated across two studies and a variety of analysis methods. Thus, task familiarity permits use of model-based reasoning in parallel with other cognitive demands. The ability to deploy model-based reasoning in an automatic, parallelizable fashion has widespread theoretical implications, particularly for the learning and execution of complex behaviors. It also suggests a range of important failure modes in psychiatric disorders.

  9. Model-free iterative control of repetitive dynamics for high-speed scanning in atomic force microscopy.

    Science.gov (United States)

    Li, Yang; Bechhoefer, John

    2009-01-01

    We introduce an algorithm for calculating, offline or in real time and with no explicit system characterization, the feedforward input required for repetitive motions of a system. The algorithm is based on the secant method of numerical analysis and gives accurate motion at frequencies limited only by the signal-to-noise ratio and the actuator power and range. We illustrate the secant-solver algorithm on a stage used for atomic force microscopy.
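
    A minimal sketch of a secant-style iterative update for a periodic feedforward drive, offered as one plausible reading of the idea (the authors' formulation and implementation details may differ): the stage dynamics below are a made-up linear stand-in, and each harmonic of the periodic drive is updated independently from the last two trials.

```python
import numpy as np

def plant(u):
    """Stand-in for one scan period of the real stage; unknown to the algorithm."""
    return 0.8 * np.roll(u, 3) + 0.05 * np.roll(u, 6)

n = 256
t = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
target = np.sin(t)                                  # desired periodic motion

# Two initial drive guesses (frequency domain) and the error of the first one.
U_prev, U = np.fft.rfft(target), np.fft.rfft(1.3 * target)
E_prev = np.fft.rfft(plant(np.fft.irfft(U_prev, n=n)) - target)

for trial in range(5):
    E = np.fft.rfft(plant(np.fft.irfft(U, n=n)) - target)
    denom = E - E_prev
    safe = np.where(np.abs(denom) < 1e-12, 1.0, denom)
    # Per-harmonic secant step; leave harmonics unchanged where the step is undefined.
    U_next = np.where(np.abs(denom) < 1e-12, U, U - E * (U - U_prev) / safe)
    U_prev, E_prev, U = U, E, U_next
    rms = np.sqrt(np.mean(np.fft.irfft(E, n=n) ** 2))
    print(f"trial {trial}: rms tracking error = {rms:.2e}")
```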

  10. RESEARCH NOTE Genome-based exome-sequencing analysis ...

    Indian Academy of Sciences (India)

    Navya

    2017-02-22

    Feb 22, 2017 ... Genome-based exome-sequencing analysis identifies GYG1, DIS3L, DDRGK1 genes ... Cardiology Division, Department of Internal Medicine, Severance .... with p values of <0.05 by analyzing differences in allele distribution.

  11. Risk-based decision analysis for groundwater operable units

    International Nuclear Information System (INIS)

    Chiaramonte, G.R.

    1995-01-01

    This document proposes a streamlined approach and methodology for performing risk assessment in support of interim remedial measure (IRM) decisions involving the remediation of contaminated groundwater on the Hanford Site. This methodology, referred to as "risk-based decision analysis," also supports the specification of target cleanup volumes and provides a basis for design and operation of the groundwater remedies. The risk-based decision analysis can be completed within a short time frame and concisely documented. The risk-based decision analysis is more versatile than the qualitative risk assessment (QRA), because it not only supports the need for IRMs, but also provides criteria for defining the success of the IRMs and provides the risk-basis for decisions on final remedies. For these reasons, it is proposed that, for groundwater operable units, the risk-based decision analysis should replace the more elaborate, costly, and time-consuming QRA

  12. Adding trend data to Depletion-Based Stock Reduction Analysis

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — A Bayesian model of Depletion-Based Stock Reduction Analysis (DB-SRA), informed by a time series of abundance indexes, was developed, using the Sampling Importance...

  13. Web-Based Analysis for Decision Support Systems

    African Journals Online (AJOL)

    pc

    2018-03-05

    Mar 5, 2018 ... Faculty of Computing and Information Technology, King Abdulaziz University, Jeddah, SA ... impact of web-based analysis on DSSs and how it affects ... Internal users. ... and control of models, especially accounting, financial,.

  14. An Artificial Intelligence-Based Environment Quality Analysis System

    OpenAIRE

    Oprea, Mihaela; Iliadis, Lazaros

    2011-01-01

    Part 20: Informatics and Intelligent Systems Applications for Quality of Life information Services (ISQLIS) Workshop; International audience; The paper describes an environment quality analysis system based on a combination of some artificial intelligence techniques, artificial neural networks and rule-based expert systems. Two case studies of the system use are discussed: air pollution analysis and flood forecasting with their impact on the environment and on the population health. The syste...

  15. Trojan detection model based on network behavior analysis

    International Nuclear Information System (INIS)

    Liu Junrong; Liu Baoxu; Wang Wenjin

    2012-01-01

    Based on the analysis of existing Trojan detection technology, this paper presents a Trojan detection model based on network behavior analysis. First of all, we abstract a description of the Trojan network behavior, then establish a characteristic behavior library according to certain rules, and then use the support vector machine algorithm to determine whether a Trojan intrusion has occurred. Finally, intrusion detection experiments show that this model can effectively detect Trojans. (authors)

  16. Chemical analysis and base-promoted hydrolysis of locally ...

    African Journals Online (AJOL)

    Abstract. The study was on the chemical analysis and base-promoted hydrolysis of extracted shea nut fat. The local method of extraction of the shea nut oil was employed in comparison with literature report. A simple cold-process alkali hydrolysis of the shea nut oil was used in producing the soap. The chemical analysis of ...

  17. Web-Based Trainer for Electrical Circuit Analysis

    Science.gov (United States)

    Weyten, L.; Rombouts, P.; De Maeyer, J.

    2009-01-01

    A Web-based system for training electric circuit analysis is presented in this paper. It is centered on symbolic analysis techniques and it not only verifies the student's final answer, but it also tracks and coaches him/her through all steps of his/her reasoning path. The system mimics homework assignments, enhanced by immediate personalized…

  18. Content Analysis of a Computer-Based Faculty Activity Repository

    Science.gov (United States)

    Baker-Eveleth, Lori; Stone, Robert W.

    2013-01-01

    The research presents an analysis of faculty opinions regarding the introduction of a new computer-based faculty activity repository (FAR) in a university setting. The qualitative study employs content analysis to better understand the phenomenon underlying these faculty opinions and to augment the findings from a quantitative study. A web-based…

  19. Geographic Object-Based Image Analysis: Towards a new paradigm

    NARCIS (Netherlands)

    Blaschke, T.; Hay, G.J.; Kelly, M.; Lang, S.; Hofmann, P.; Addink, E.A.; Queiroz Feitosa, R.; van der Meer, F.D.; van der Werff, H.M.A.; van Coillie, F.; Tiede, A.

    2014-01-01

    The amount of scientific literature on (Geographic) Object-based Image Analysis – GEOBIA has been and still is sharply increasing. These approaches to analysing imagery have antecedents in earlier research on image segmentation and use GIS-like spatial analysis within classification and feature

  20. Real Time Engineering Analysis Based on a Generative Component Implementation

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Klitgaard, Jens

    2007-01-01

    The present paper outlines the idea of a conceptual design tool with real time engineering analysis which can be used in the early conceptual design phase. The tool is based on a parametric approach using Generative Components with embedded structural analysis. Each of these components uses the g...

  1. Product Lifecycle Management Architecture: A Model Based Systems Engineering Analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Noonan, Nicholas James [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-07-01

    This report is an analysis of the Product Lifecycle Management (PLM) program. The analysis is centered on a need statement generated by a Nuclear Weapons (NW) customer. The need statement captured in this report creates an opportunity for the PLM to provide a robust service as a solution. Lifecycles for both the NW and PLM are analyzed using Model Based System Engineering (MBSE).

  2. Decadal shifts of East Asian summer monsoon in a climate model free of explicit GHGs and aerosols

    Science.gov (United States)

    Lin, Renping; Zhu, Jiang; Zheng, Fei

    2016-12-01

    The East Asian summer monsoon (EASM) experienced decadal transitions over the past few decades, and the associated "wetter-South-drier-North" shifts in rainfall patterns in China significantly affected the social and economic development in China. Two viewpoints stand out to explain these decadal shifts, regarding the shifts as either a result of internal variability of the climate system or a result of external forcings (e.g. greenhouse gases (GHGs) and anthropogenic aerosols). However, most climate models, for example, the Atmospheric Model Intercomparison Project (AMIP)-type simulations and the Coupled Model Intercomparison Project (CMIP)-type simulations, fail to simulate the variation patterns, leaving the mechanisms responsible for these shifts still open to dispute. In this study, we conducted a successful simulation of these decadal transitions in a coupled model where we applied ocean data assimilation in the model free of explicit aerosols and GHGs forcing. The associated decadal shifts of the three-dimensional spatial structure in the 1990s, including the eastward retreat, the northward shift of the western Pacific subtropical high (WPSH), and the south-cool-north-warm pattern of the upper-level tropospheric temperature, were all well captured. Our simulation supports the argument that the variations of the oceanic fields are the dominant factor responsible for the EASM decadal transitions.

  3. Space shuttle booster multi-engine base flow analysis

    Science.gov (United States)

    Tang, H. H.; Gardiner, C. R.; Anderson, W. A.; Navickas, J.

    1972-01-01

    A comprehensive review of currently available techniques pertinent to several prominent aspects of the base thermal problem of the space shuttle booster is given along with a brief review of experimental results. A tractable engineering analysis, capable of predicting the power-on base pressure, base heating, and other base thermal environmental conditions, such as base gas temperature, is presented and used for an analysis of various space shuttle booster configurations. The analysis consists of a rational combination of theoretical treatments of the prominent flow interaction phenomena in the base region. These theories consider jet mixing, plume flow, axisymmetric flow effects, base injection, recirculating flow dynamics, and various modes of heat transfer. Such effects as initial boundary layer expansion at the nozzle lip, reattachment, recompression, choked vent flow, and nonisoenergetic mixing processes are included in the analysis. A unified method was developed and programmed to numerically obtain compatible solutions for the various flow field components in both flight and ground test conditions. Preliminary prediction for a 12-engine space shuttle booster base thermal environment was obtained for a typical trajectory history. Theoretical predictions were also obtained for some clustered-engine experimental conditions. Results indicate good agreement between the data and theoretical predictions.

  4. Some Linguistic-based and temporal analysis on Wikipedia

    International Nuclear Information System (INIS)

    Yasseri, T.

    2010-01-01

    Wikipedia as a web-based, collaborative, multilingual encyclopaedia project is a very suitable field to carry out research on social dynamics and to investigate the complex concepts of conflict, collaboration, competition, dispute, etc in a large community (∼26 million) of Wikipedia users. The other face of Wikipedia as a productive society is its output, consisting of ∼17 million articles written unsupervised by unprofessional editors in more than 270 different languages. In this talk we report some analysis performed on Wikipedia in two different approaches: temporal analysis to characterize disputes and controversies among users and linguistic-based analysis to characterize linguistic features of English texts in Wikipedia. (author)

  5. An Evidence-Based Videotaped Running Biomechanics Analysis.

    Science.gov (United States)

    Souza, Richard B

    2016-02-01

    Running biomechanics play an important role in the development of injuries. Performing a running biomechanics analysis on injured runners can help to develop treatment strategies. This article provides a framework for a systematic video-based running biomechanics analysis plan based on the current evidence on running injuries, using 2-dimensional (2D) video and readily available tools. Fourteen measurements are proposed in this analysis plan from lateral and posterior video. Identifying simple 2D surrogates for 3D biomechanic variables of interest allows for widespread translation of best practices, and have the best opportunity to impact the highly prevalent problem of the injured runner. Copyright © 2016 Elsevier Inc. All rights reserved.
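
    As a toy illustration of the kind of 2D surrogate measurement such a plan relies on (the landmark coordinates below are hypothetical and this is not one of the article's fourteen measurements verbatim): a sagittal-plane knee angle computed from three digitized points in a single video frame.

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle at b (degrees) formed by points a-b-c in image coordinates."""
    v1, v2 = np.asarray(a) - np.asarray(b), np.asarray(c) - np.asarray(b)
    cosang = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return float(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))))

hip, knee, ankle = (412, 310), (430, 452), (415, 590)   # example pixel coordinates
included = joint_angle(hip, knee, ankle)
print(f"included angle {included:.1f} deg -> knee flexion ~ {180 - included:.1f} deg")
```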

  6. Trial-Based Functional Analysis Informs Treatment for Vocal Scripting.

    Science.gov (United States)

    Rispoli, Mandy; Brodhead, Matthew; Wolfe, Katie; Gregori, Emily

    2018-05-01

    Research on trial-based functional analysis has primarily focused on socially maintained challenging behaviors. However, procedural modifications may be necessary to clarify ambiguous assessment results. The purposes of this study were to evaluate the utility of iterative modifications to trial-based functional analysis on the identification of putative reinforcement and subsequent treatment for vocal scripting. For all participants, modifications to the trial-based functional analysis identified a primary function of automatic reinforcement. The structure of the trial-based format led to identification of social attention as an abolishing operation for vocal scripting. A noncontingent attention treatment was evaluated using withdrawal designs for each participant. This noncontingent attention treatment resulted in near zero levels of vocal scripting for all participants. Implications for research and practice are presented.

  7. A scenario-based procedure for seismic risk analysis

    International Nuclear Information System (INIS)

    Kluegel, J.-U.; Mualchin, L.; Panza, G.F.

    2006-12-01

    A new methodology for seismic risk analysis based on probabilistic interpretation of deterministic or scenario-based hazard analysis, in full compliance with the likelihood principle and therefore meeting the requirements of modern risk analysis, has been developed. The proposed methodology can easily be adjusted to deliver its output in a format required for safety analysts and civil engineers. The scenario-based approach allows the incorporation of all available information collected in a geological, seismotectonic and geotechnical database of the site of interest as well as advanced physical modelling techniques to provide a reliable and robust deterministic design basis for civil infrastructures. The robustness of this approach is of special importance for critical infrastructures. At the same time a scenario-based seismic hazard analysis allows the development of the required input for probabilistic risk assessment (PRA) as required by safety analysts and insurance companies. The scenario-based approach removes the ambiguity in the results of probabilistic seismic hazard analysis (PSHA) which relies on the projections of Gutenberg-Richter (G-R) equation. The problems in the validity of G-R projections, because of incomplete to total absence of data for making the projections, are still unresolved. Consequently, the information from G-R must not be used in decisions for design of critical structures or critical elements in a structure. The scenario-based methodology is strictly based on observable facts and data and complemented by physical modelling techniques, which can be submitted to a formalised validation process. By means of sensitivity analysis, knowledge gaps related to lack of data can be dealt with easily, due to the limited amount of scenarios to be investigated. The proposed seismic risk analysis can be used with confidence for planning, insurance and engineering applications. (author)

  8. Quantitative agreement between [(15)O]H2O PET and model free QUASAR MRI-derived cerebral blood flow and arterial blood volume

    NARCIS (Netherlands)

    Heijtel, D. F. R.; Petersen, E. T.; Mutsaerts, H. J. M. M.; Bakker, E.; Schober, P.; Stevens, M. F.; van Berckel, B. N. M.; Majoie, C. B. L. M.; Booij, J.; van Osch, M. J. P.; van Bavel, E. T.; Boellaard, R.; Lammertsma, A. A.; Nederveen, A. J.

    2016-01-01

    The purpose of this study was to assess whether there was an agreement between quantitative cerebral blood flow (CBF) and arterial cerebral blood volume (CBVA) measurements by [(15)O]H2O positron emission tomography (PET) and model-free QUASAR MRI. Twelve healthy subjects were scanned within a week

  9. Abnormal traffic flow data detection based on wavelet analysis

    Directory of Open Access Journals (Sweden)

    Xiao Qian

    2016-01-01

    Full Text Available In view of the non-stationary nature of traffic flow data, abnormal data detection is difficult. This paper proposes abnormal traffic flow data detection based on wavelet analysis and the least squares method. First, wavelet analysis is used to separate the traffic flow data into high-frequency and low-frequency components; then the least squares method is combined with it to find abnormal points in the reconstructed signal data. The simulation results show that using wavelet analysis for abnormal traffic flow data detection effectively reduces the misjudgment rate and the false negative rate of the detection results.
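
    A rough sketch of this idea under assumed details (synthetic flow data, a Daubechies-4 wavelet, a polynomial least-squares fit of the smooth component, and a 3-sigma residual rule); it relies on the third-party PyWavelets package and is not the authors' implementation.

```python
import numpy as np
import pywt

rng = np.random.default_rng(3)
t = np.arange(288)                                   # e.g. 5-min counts over one day
flow = 400 + 200 * np.sin(2 * np.pi * t / 288) + rng.normal(0, 15, t.size)
flow[[60, 150, 220]] += [180, -220, 200]             # injected anomalies

# Wavelet decomposition: keep the approximation (low frequency), zero the details.
coeffs = pywt.wavedec(flow, "db4", level=4)
smooth = pywt.waverec([coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]], "db4")[: t.size]

# Least-squares polynomial fit of the smooth component as the reference trend.
trend = np.polyval(np.polyfit(t, smooth, deg=6), t)
resid = flow - trend
flagged = np.where(np.abs(resid) > 3 * resid.std())[0]
print("flagged time indices:", flagged)
```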

  10. Emergy Analysis and Sustainability Efficiency Analysis of Different Crop-Based Biodiesel in Life Cycle Perspective

    DEFF Research Database (Denmark)

    Ren, Jingzheng; Manzardo, Alessandro; Mazzi, Anna

    2013-01-01

    kinds of crop-based biodiesel including soybean-, rapeseed-, sunflower-, jatropha- and palm-based biodiesel production options are studied by emergy analysis; soybean-based scenario is recognized as the most sustainable scenario that should be chosen for further study in China. DEA method is used...

  11. Analysis of security protocols based on challenge-response

    Institute of Scientific and Technical Information of China (English)

    LUO JunZhou; YANG Ming

    2007-01-01

    Security protocol is specified as the procedure of challenge-response, which uses applied cryptography to confirm the existence of other principals and fulfill some data negotiation such as session keys. Most of the existing analysis methods, which either adopt theorem proving techniques such as state exploration or logic reasoning techniques such as authentication logic, face the conflicts between analysis power and operability. To solve the problem, a new efficient method is proposed that provides SSM semantics-based definition of secrecy and authentication goals and applies authentication logic as fundamental analysis techniques, in which secrecy analysis is split into two parts: Explicit-Information-Leakage and Implicit-Information-Leakage, and correspondence analysis is concluded as the analysis of the existence relationship of Strands and the agreement of Strand parameters. This new method owns both the power of the Strand Space Model and concision of authentication logic.

  12. Bayesian-network-based safety risk analysis in construction projects

    International Nuclear Information System (INIS)

    Zhang, Limao; Wu, Xianguo; Skibniewski, Miroslaw J.; Zhong, Jingbing; Lu, Yujie

    2014-01-01

    This paper presents a systemic decision support approach for safety risk analysis under uncertainty in tunnel construction. Fuzzy Bayesian Networks (FBN) is used to investigate causal relationships between tunnel-induced damage and its influential variables based upon the risk/hazard mechanism analysis. Aiming to overcome limitations on the current probability estimation, an expert confidence indicator is proposed to ensure the reliability of the surveyed data for fuzzy probability assessment of basic risk factors. A detailed fuzzy-based inference procedure is developed, which has a capacity of implementing deductive reasoning, sensitivity analysis and abductive reasoning. The “3σ criterion” is adopted to calculate the characteristic values of a triangular fuzzy number in the probability fuzzification process, and the α-weighted valuation method is adopted for defuzzification. The construction safety analysis progress is extended to the entire life cycle of risk-prone events, including the pre-accident, during-construction continuous and post-accident control. A typical hazard concerning the tunnel leakage in the construction of Wuhan Yangtze Metro Tunnel in China is presented as a case study, in order to verify the applicability of the proposed approach. The results demonstrate the feasibility of the proposed approach and its application potential. A comparison of advantages and disadvantages between FBN and fuzzy fault tree analysis (FFTA) as risk analysis tools is also conducted. The proposed approach can be used to provide guidelines for safety analysis and management in construction projects, and thus increase the likelihood of a successful project in a complex environment. - Highlights: • A systemic Bayesian network based approach for safety risk analysis is developed. • An expert confidence indicator for probability fuzzification is proposed. • Safety risk analysis progress is extended to entire life cycle of risk-prone events. • A typical

  13. Static Analysis for Event-Based XML Processing

    DEFF Research Database (Denmark)

    Møller, Anders

    2008-01-01

    Event-based processing of XML data - as exemplified by the popular SAX framework - is a powerful alternative to using W3C's DOM or similar tree-based APIs. The event-based approach processes documents in a streaming fashion with minimal memory consumption. This paper discusses challenges for creating program analyses... for SAX applications. In particular, we consider the problem of statically guaranteeing that a given SAX program always produces only well-formed and valid XML output. We propose an analysis technique based on existing analyses of Servlets, string operations, and XML graphs....

  14. Crystallographically-based analysis of the NMR spectra of maghemite

    International Nuclear Information System (INIS)

    Spiers, K.M.; Cashion, J.D.

    2012-01-01

    All possible iron environments with respect to nearest neighbour vacancies in vacancy-ordered and vacancy-disordered maghemite have been evaluated and used as the foundation for a crystallographically-based analysis of the published NMR spectra of maghemite. The spectral components have been assigned to particular configurations and excellent agreement obtained in comparing predicted spectra with published spectra taken in applied magnetic fields. The broadness of the published NMR lines has been explained by calculations of the magnetic dipole fields at the various iron sites and consideration of the supertransferred hyperfine fields. - Highlights: ► Analysis of 57Fe NMR of maghemite based on vacancy ordering and nearest neighbour vacancies. ► Assignment of NMR spectral components based on crystallographic analysis of unique iron sites. ► Strong agreement between predicted spectra and published spectra taken in applied magnetic fields. ► Maghemite NMR spectral broadening due to various iron sites and supertransferred hyperfine field.

  15. Discrete Discriminant analysis based on tree-structured graphical models

    DEFF Research Database (Denmark)

    Perez de la Cruz, Gonzalo; Eslava, Guillermina

    The purpose of this paper is to illustrate the potential use of discriminant analysis based on tree-structured graphical models for discrete variables. This is done by comparing its empirical performance using estimated error rates for real and simulated data. The results show that discriminant analysis based on tree-structured graphical models is a simple nonlinear method competitive with, and sometimes superior to, other well-known linear methods like those assuming mutual independence between variables and linear logistic regression.

  16. Similar words analysis based on POS-CBOW language model

    Directory of Open Access Journals (Sweden)

    Dongru RUAN

    2015-10-01

    Full Text Available Similar words analysis is one of the important aspects in the field of natural language processing, and it has important research and application values in text classification, machine translation and information recommendation. Focusing on the features of Sina Weibo's short text, this paper presents a language model named POS-CBOW, which is a continuous bag-of-words language model with a filtering layer and a part-of-speech tagging layer. The proposed approach can adjust word similarity according to the cosine similarity of the word vectors and their part-of-speech metrics. It can also filter the set of similar words on the basis of a statistical analysis model. The experimental results show that the similar words analysis algorithm based on the proposed POS-CBOW language model is better than that based on the traditional CBOW language model.
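
    As an illustration of the ranking step described above (not the authors' code), the sketch below computes the cosine similarity between word vectors with a toy, made-up vocabulary and invented 4-dimensional embeddings.

```python
# Illustrative sketch: ranking similar words by cosine similarity between
# word vectors, as a CBOW-style model would. Vocabulary and vectors are made up.
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between two word vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical 4-dimensional embeddings for a toy vocabulary.
embeddings = {
    "good":  np.array([0.9, 0.1, 0.3, 0.0]),
    "great": np.array([0.8, 0.2, 0.4, 0.1]),
    "bad":   np.array([-0.7, 0.1, 0.2, 0.0]),
}

query = "good"
ranking = sorted(
    ((w, cosine_similarity(embeddings[query], v))
     for w, v in embeddings.items() if w != query),
    key=lambda item: item[1], reverse=True,
)
print(ranking)  # "great" should rank above "bad"
```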

  17. CASAS: Cancer Survival Analysis Suite, a web based application.

    Science.gov (United States)

    Rupji, Manali; Zhang, Xinyan; Kowalski, Jeanne

    2017-01-01

    We present CASAS, a Shiny R-based tool for interactive survival analysis and visualization of results. The tool provides a web-based one-stop shop to perform the following types of survival analysis: quantile, landmark and competing risks, in addition to standard survival analysis. The interface makes it easy to perform such survival analyses and obtain results using the interactive Kaplan-Meier and cumulative incidence plots. Univariate analysis can be performed on one or several user-specified variable(s) simultaneously, the results of which are displayed in a single table that includes log rank p-values and hazard ratios along with their significance. For several quantile survival analyses from multiple cancer types, a single summary grid is constructed. The CASAS package has been implemented in R and is available via http://shinygispa.winship.emory.edu/CASAS/. The developmental repository is available at https://github.com/manalirupji/CASAS/.
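
    CASAS itself is an R/Shiny application; as a minimal, language-agnostic sketch of the standard Kaplan-Meier estimation that such tools expose interactively, the following Python snippet implements the product-limit estimator on invented survival data.

```python
# Minimal sketch of the Kaplan-Meier product-limit estimator; the data below
# are invented for illustration and this is not the CASAS implementation.
import numpy as np

def kaplan_meier(times, events):
    """Return event times and the estimated survival probability S(t)."""
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)   # 1 = event observed, 0 = censored
    order = np.argsort(times)
    times, events = times[order], events[order]

    unique_times = np.unique(times[events == 1])
    survival, s = [], 1.0
    for t in unique_times:
        at_risk = np.sum(times >= t)          # subjects still under observation at t
        died = np.sum((times == t) & (events == 1))
        s *= 1.0 - died / at_risk             # product-limit update
        survival.append(s)
    return unique_times, np.array(survival)

t, s = kaplan_meier([5, 8, 12, 12, 20, 23], [1, 0, 1, 1, 0, 1])
print(dict(zip(t, np.round(s, 3))))
```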

  18. Matrix-based introduction to multivariate data analysis

    CERN Document Server

    Adachi, Kohei

    2016-01-01

    This book enables readers who may not be familiar with matrices to understand a variety of multivariate analysis procedures in matrix forms. Another feature of the book is that it emphasizes what model underlies a procedure and what objective function is optimized for fitting the model to data. The author believes that the matrix-based learning of such models and objective functions is the fastest way to comprehend multivariate data analysis. The text is arranged so that readers can intuitively capture the purposes for which multivariate analysis procedures are utilized: plain explanations of the purposes with numerical examples precede mathematical descriptions in almost every chapter. This volume is appropriate for undergraduate students who already have studied introductory statistics. Graduate students and researchers who are not familiar with matrix-intensive formulations of multivariate data analysis will also find the book useful, as it is based on modern matrix formulations with a special emphasis on ...

  19. Cognitive Task Analysis Based Training for Cyber Situation Awareness

    OpenAIRE

    Huang , Zequn; Shen , Chien-Chung; Doshi , Sheetal; Thomas , Nimmi; Duong , Ha

    2015-01-01

    Part 1: Innovative Methods; International audience; Cyber attacks have been increasing significantly in both number and complexity, prompting the need for better training of cyber defense analysts. To conduct effective training for cyber situation awareness, it becomes essential to design realistic training scenarios. In this paper, we present a Cognitive Task Analysis based approach to address this training need. The technique of Cognitive Task Analysis is to capture and represent knowledge ...

  20. European Climate - Energy Security Nexus. A model based scenario analysis

    International Nuclear Information System (INIS)

    Criqui, Patrick; Mima, Silvana

    2011-01-01

    In this research, we have provided an overview of the climate-energy security nexus in Europe through a model-based scenario analysis with the POLES model. The analysis underlines that, under stringent climate policies, Europe takes advantage of a double dividend: the capacity to develop a new, cleaner energy model and a lower vulnerability to potential shocks on the international energy markets. (authors)

  1. Managerial Methods Based on Analysis, Recommended to a Boarding House

    Directory of Open Access Journals (Sweden)

    Solomia Andreş

    2015-06-01

    Full Text Available The paper presents a few theoretical and practical contributions regarding the implementation of analysis-based methods, namely a SWOT analysis and an economic analysis, from the perspective and demands of the management of a firm that operates profitably through a boarding house. The two types of managerial methods recommended to the firm offer the real and comprehensive information needed to understand the firm's status and to elaborate predictions for maintaining business viability.

  2. Fast Template-based Shape Analysis using Diffeomorphic Iterative Centroid

    OpenAIRE

    Cury , Claire; Glaunès , Joan Alexis; Chupin , Marie; Colliot , Olivier

    2014-01-01

    International audience; A common approach for the analysis of anatomical variability relies on the estimation of a representative template of the population, followed by the study of this population based on the parameters of the deformations going from the template to the population. The Large Deformation Diffeomorphic Metric Mapping framework is widely used for shape analysis of anatomical structures, but computing a template with such framework is computationally expensive. In this paper w...

  3. Reliability-Based Robustness Analysis for a Croatian Sports Hall

    DEFF Research Database (Denmark)

    Čizmar, Dean; Kirkegaard, Poul Henning; Sørensen, John Dalsgaard

    2011-01-01

    This paper presents a probabilistic approach for structural robustness assessment of a timber structure built a few years ago. The robustness analysis is based on a structural reliability-based framework for robustness and a simplified mechanical system modelling of a timber truss system... A complex timber structure with a large number of failure modes is modelled with only a few dominant failure modes. First, a component-based robustness analysis is performed based on the reliability indices of the remaining elements after the removal of selected critical elements. The robustness... is expressed and evaluated by a robustness index. Next, the robustness is assessed using system reliability indices where the probabilistic failure model is modelled by a series system of parallel systems...

  4. Digital Printing Quality Detection and Analysis Technology Based on CCD

    Science.gov (United States)

    He, Ming; Zheng, Liping

    2017-12-01

    With the help of CCD-based digital printing quality detection and analysis technology, printing quality can be evaluated rapidly and objectively, and a degree of control over printing quality can be exercised. Applied rationally, CCD-based digital print quality testing and analysis plays a very positive role in improving the quality of digital printing and printing materials across a variety of printing equipment. In this paper, we present an in-depth study and discussion of CCD-based digital print quality testing and analysis technology.

  5. Open access for ALICE analysis based on virtualization technology

    CERN Document Server

    Buncic, P; Schutz, Y

    2015-01-01

    Open access is one of the important leverages for long-term data preservation for a HEP experiment. To guarantee the usability of data analysis tools beyond the experiment lifetime it is crucial that third party users from the scientific community have access to the data and associated software. The ALICE Collaboration has developed a layer of lightweight components built on top of virtualization technology to hide the complexity and details of the experiment-specific software. Users can perform basic analysis tasks within CernVM, a lightweight generic virtual machine, paired with an ALICE specific contextualization. Once the virtual machine is launched, a graphical user interface is automatically started without any additional configuration. This interface allows downloading the base ALICE analysis software and running a set of ALICE analysis modules. Currently the available tools include fully documented tutorials for ALICE analysis, such as the measurement of strange particle production or the nuclear modi...

  6. Automatic Video-based Analysis of Human Motion

    DEFF Research Database (Denmark)

    Fihl, Preben

    Human motion contains valuable information in many situations, and people frequently perform an unconscious analysis of the motion of other people to understand their actions, intentions, and state of mind. An automatic analysis of human motion will facilitate many applications and thus has... received great interest from both industry and research communities. The focus of this thesis is on video-based analysis of human motion, and the thesis presents work within three overall topics, namely foreground segmentation, action recognition, and human pose estimation. Foreground segmentation is often... the first important step in the analysis of human motion. By separating foreground from background, the subsequent analysis can be focused and efficient. This thesis presents a robust background subtraction method that can be initialized with foreground objects in the scene and is capable of handling...

  7. Nucleonica. Web-based software tools for simulation and analysis

    International Nuclear Information System (INIS)

    Magill, J.; Dreher, R.; Soti, Z.

    2014-01-01

    The authors present a description of the Nucleonica web-based portal for simulation and analysis for a wide range of commonly encountered nuclear science applications. Advantages of a web-based approach include availability wherever there is internet access, intuitive user-friendly interface, remote access to high-power computing resources, and continual maintenance, improvement, and addition of tools and techniques common to the nuclear science industry. A description of the nuclear data resources, and some applications is given.

  8. Research of second harmonic generation images based on texture analysis

    Science.gov (United States)

    Liu, Yao; Li, Yan; Gong, Haiming; Zhu, Xiaoqin; Huang, Zufang; Chen, Guannan

    2014-09-01

    Texture analysis plays a crucial role in identifying objects or regions of interest in an image. It has been applied to a variety of medical image processing tasks, ranging from the detection of disease and the segmentation of specific anatomical structures to differentiation between healthy and pathological tissues. Second harmonic generation (SHG) microscopy, a potential noninvasive tool for imaging biological tissues, has been widely used in medicine, with reduced phototoxicity and photobleaching. In this paper, we clarified the principles of texture analysis, including statistical, transform, structural and model-based methods, and gave examples of its applications, reviewing studies of the technique. Moreover, we tried to apply texture analysis to SHG images for the differentiation of human skin scar tissues. A texture analysis method based on local binary pattern (LBP) and wavelet transform was used to extract texture features of SHG images from collagen in normal and abnormal scars, and the scar SHG images were then classified as normal or abnormal. Compared with other texture analysis methods with respect to receiver operating characteristic analysis, LBP combined with wavelet transform was demonstrated to achieve higher accuracy. It can provide a new way for the clinical diagnosis of scar types. Finally, future developments of texture analysis of SHG images are discussed.
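
    A rough sketch of the LBP feature-extraction step named above is given below; it assumes numpy and scikit-image are available, uses synthetic images in place of real SHG data, and omits the wavelet features and the classifier for brevity.

```python
# Illustrative sketch of LBP-based texture features similar in spirit to the
# analysis described above; requires numpy and scikit-image (assumption) and
# uses synthetic images instead of real SHG data.
import numpy as np
from skimage.feature import local_binary_pattern

def lbp_histogram(image, n_points=8, radius=1):
    """Uniform LBP codes summarised as a normalised histogram feature vector."""
    codes = local_binary_pattern(image, n_points, radius, method="uniform")
    n_bins = n_points + 2                     # uniform patterns + one "non-uniform" bin
    hist, _ = np.histogram(codes, bins=n_bins, range=(0, n_bins), density=True)
    return hist

rng = np.random.default_rng(0)
normal_like = rng.normal(0.5, 0.05, (64, 64))     # smooth texture stand-in
scar_like = rng.normal(0.5, 0.25, (64, 64))       # rougher texture stand-in
print(lbp_histogram(normal_like))
print(lbp_histogram(scar_like))
# In the paper, such feature vectors (plus wavelet features) feed a classifier.
```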

  9. Breath analysis based on micropreconcentrator for early cancer diagnosis

    Science.gov (United States)

    Lee, Sang-Seok

    2018-02-01

    We are developing micropreconcentrators based on micro/nanotechnology to detect trace levels of volatile organic compound (VOC) gases contained in human and canine exhaled breath. The possibility of using exhaled VOC gases as biomarkers for various cancer diagnoses has been previously discussed. For early cancer diagnosis, detection of trace levels of VOC gas is indispensable. Using micropreconcentrators based on MEMS technology or nanotechnology is very promising for detection of VOC gas. A micropreconcentrator-based breath analysis technique also has advantages from the viewpoints of cost performance and availability for various cancer diagnoses. In this paper, we introduce the design, fabrication and evaluation results of our MEMS- and nanotechnology-based micropreconcentrators. In the MEMS-based device, we propose a flower-leaf-type Si microstructure, and its shape and configuration are optimized quantitatively by finite element method simulation. The nanotechnology-based micropreconcentrator consists of carbon nanotube (CNT) structures. As a result, we achieve ppb-level VOC gas detection with our micropreconcentrators and a standard gas chromatography system that can detect on the order of ppm VOC in gas samples. In the performance evaluation, we also confirm that the CNT-based micropreconcentrator shows a 115 times better concentration ratio than the Si-based micropreconcentrator. Moreover, we discuss a commercialization idea for new cancer diagnosis using breath analysis. Future work and preliminary clinical testing in dogs are also discussed.

  10. EBprot: Statistical analysis of labeling-based quantitative proteomics data.

    Science.gov (United States)

    Koh, Hiromi W L; Swa, Hannah L F; Fermin, Damian; Ler, Siok Ghee; Gunaratne, Jayantha; Choi, Hyungwon

    2015-08-01

    Labeling-based proteomics is a powerful method for detection of differentially expressed proteins (DEPs). The current data analysis platform typically relies on protein-level ratios, which is obtained by summarizing peptide-level ratios for each protein. In shotgun proteomics, however, some proteins are quantified with more peptides than others, and this reproducibility information is not incorporated into the differential expression (DE) analysis. Here, we propose a novel probabilistic framework EBprot that directly models the peptide-protein hierarchy and rewards the proteins with reproducible evidence of DE over multiple peptides. To evaluate its performance with known DE states, we conducted a simulation study to show that the peptide-level analysis of EBprot provides better receiver-operating characteristic and more accurate estimation of the false discovery rates than the methods based on protein-level ratios. We also demonstrate superior classification performance of peptide-level EBprot analysis in a spike-in dataset. To illustrate the wide applicability of EBprot in different experimental designs, we applied EBprot to a dataset for lung cancer subtype analysis with biological replicates and another dataset for time course phosphoproteome analysis of EGF-stimulated HeLa cells with multiplexed labeling. Through these examples, we show that the peptide-level analysis of EBprot is a robust alternative to the existing statistical methods for the DE analysis of labeling-based quantitative datasets. The software suite is freely available on the Sourceforge website http://ebprot.sourceforge.net/. All MS data have been deposited in the ProteomeXchange with identifier PXD001426 (http://proteomecentral.proteomexchange.org/dataset/PXD001426/). © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Nuclear power company activity based costing management analysis

    International Nuclear Information System (INIS)

    Xu Dan

    2012-01-01

    As the nuclear energy industry develops, nuclear power companies face continual internal-management pressure to sustain their market operations. In view of this, it is imperative that a nuclear power company raise its cost management level and build a lower-cost competitive advantage grounded in nuclear safety. Activity-based costing management (ABCM) shifts the emphasis of cost management from the 'product' to the 'activity' using value chain analysis, cost driver analysis and related methods. By analysing the detailed activities and value chains, it cancels unnecessary activities, reduces the resource consumption of necessary activities, and manages cost at its source, thereby reducing cost, boosting efficiency and realizing management value. The paper draws its conclusions from a detailed analysis of nuclear power company procedures and activities, together with a selective piece-by-piece analysis of important cost-related projects in the nuclear power company. The conclusion is that the activities of a nuclear power company are clearly defined, so the ABC management method can be applied, and that managing by procedure and activity helps realize the nuclear-safety-based low-cost competitive advantage in the nuclear power company. (author)

  12. Modeling and Grid impedance Variation Analysis of Parallel Connected Grid Connected Inverter based on Impedance Based Harmonic Analysis

    DEFF Research Database (Denmark)

    Kwon, JunBum; Wang, Xiongfei; Bak, Claus Leth

    2014-01-01

    This paper addresses the harmonic compensation error problem that exists with parallel-connected inverters under the same grid interface conditions by means of impedance-based analysis and modeling. Unlike the single grid-connected inverter, it is found that multiple parallel-connected inverters and grid... impedance can influence each other if they each have a harmonic compensation function. The analysis method proposed in this paper is based on the relationship between the overall output impedance and the input impedance of the parallel-connected inverters, where the controller gain design method, which can...

  13. Digital Simulation-Based Training: A Meta-Analysis

    Science.gov (United States)

    Gegenfurtner, Andreas; Quesada-Pallarès, Carla; Knogler, Maximilian

    2014-01-01

    This study examines how design characteristics in digital simulation-based learning environments moderate self-efficacy and transfer of learning. Drawing on social cognitive theory and the cognitive theory of multimedia learning, the meta-analysis psychometrically cumulated k = 15 studies of 25 years of research with a total sample size of…

  14. PETRA - an Activity-based Approach to Travel Demand Analysis

    DEFF Research Database (Denmark)

    Fosgerau, Mogens

    2001-01-01

    This paper concerns the PETRA model developed by COWI in a project funded by the Danish Ministry of Transport, the Danish Transport Council and the Danish Energy Research Program. The model provides an alternative approach to activity based travel demand analysis that excludes the time dimension...

  15. Automatic Online Lecture Highlighting Based on Multimedia Analysis

    Science.gov (United States)

    Che, Xiaoyin; Yang, Haojin; Meinel, Christoph

    2018-01-01

    Textbook highlighting is widely considered to be beneficial for students. In this paper, we propose a comprehensive solution to highlight the online lecture videos in both sentence- and segment-level, just as is done with paper books. The solution is based on automatic analysis of multimedia lecture materials, such as speeches, transcripts, and…

  16. Analysis of Computer Network Information Based on "Big Data"

    Science.gov (United States)

    Li, Tianli

    2017-11-01

    With the development of the current era, computer networks and big data have gradually become part of people's lives. People use computers to make their lives more convenient, but at the same time many network information problems demand attention. This paper analyzes the information security of computer networks based on "big data" and puts forward some solutions.

  17. Numerical analysis of a polysilicon-based resistive memory device

    KAUST Repository

    Berco, Dan; Chand, Umesh

    2018-01-01

    This study investigates a conductive bridge resistive memory device based on a Cu top electrode, 10-nm polysilicon resistive switching layer and a TiN bottom electrode, by numerical analysis for 10³ programming and erase simulation cycles

  18. Prototype-based analysis of GAMA galaxy catalogue data

    NARCIS (Netherlands)

    Nolte, A.; Wang, L.; Biehl, M; Verleysen, Michel

    2018-01-01

    We present a prototype-based machine learning analysis of labeled galaxy catalogue data containing parameters from the Galaxy and Mass Assembly (GAMA) survey. Using both an unsupervised and supervised method, the Self-Organizing Map and Generalized Relevance Matrix Learning Vector Quantization, we

  19. GIS based analysis of future district heating potential in Denmark

    DEFF Research Database (Denmark)

    Nielsen, Steffen; Möller, Bernd

    2013-01-01

    in Denmark have been mapped in a heat atlas which includes all buildings and their heat demands. This article focuses on developing a method for assessing the costs associated with supplying these buildings with DH. The analysis is based on the existing DH areas in Denmark. By finding the heat production...

  20. Geopolitical E-Analysis Based on E-Learning Content

    Science.gov (United States)

    Dinicu, Anca; Oancea, Romana

    2017-01-01

    In a world of great complexity, understanding the manner states act and react becomes more and more an intriguing quest due to the multiple relations of dependence and interdependence that characterize "the global puzzle". Within this context, an analysis based on a geopolitical approach becomes a very useful means used to determine not…

  1. Time Series Analysis Based on Running Mann Whitney Z Statistics

    Science.gov (United States)

    A sensitive and objective time series analysis method based on the calculation of Mann Whitney U statistics is described. This method samples data rankings over moving time windows, converts those samples to Mann-Whitney U statistics, and then normalizes the U statistics to Z statistics using Monte-...
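
    The abstract above is truncated before the normalization detail, so the sketch below uses the standard normal approximation for U as a stand-in for the paper's Monte Carlo step; window size and data are invented for illustration.

```python
# Sketch of a running Mann-Whitney test over moving windows, assuming the
# standard normal approximation for U (the paper's Monte Carlo normalisation
# is truncated in the record above, so this is only an approximation of the idea).
import numpy as np
from scipy.stats import rankdata

def running_mann_whitney_z(series, window):
    """Compare each window against the following one and return Z statistics."""
    series = np.asarray(series, dtype=float)
    z_values = []
    for start in range(len(series) - 2 * window + 1):
        a = series[start:start + window]
        b = series[start + window:start + 2 * window]
        ranks = rankdata(np.concatenate([a, b]))
        u = ranks[:window].sum() - window * (window + 1) / 2.0   # U statistic for sample a
        mean_u = window * window / 2.0
        sd_u = np.sqrt(window * window * (2 * window + 1) / 12.0)
        z_values.append((u - mean_u) / sd_u)
    return np.array(z_values)

# Synthetic series with a mean shift halfway through; Z should rise near the shift.
data = np.concatenate([np.random.normal(0, 1, 60), np.random.normal(1, 1, 60)])
print(running_mann_whitney_z(data, window=20).round(2))
```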

  2. Automated analysis of security requirements through risk-based argumentation

    NARCIS (Netherlands)

    Yu, Yijun; Nunes Leal Franqueira, V.; Tun, Thein Tan; Wieringa, Roelf J.; Nuseibeh, Bashar

    2015-01-01

    Computer-based systems are increasingly being exposed to evolving security threats, which often reveal new vulnerabilities. A formal analysis of the evolving threats is difficult due to a number of practical considerations such as incomplete knowledge about the design, limited information about

  3. Likelihood-based Dynamic Factor Analysis for Measurement and Forecasting

    NARCIS (Netherlands)

    Jungbacker, B.M.J.P.; Koopman, S.J.

    2015-01-01

    We present new results for the likelihood-based analysis of the dynamic factor model. The latent factors are modelled by linear dynamic stochastic processes. The idiosyncratic disturbance series are specified as autoregressive processes with mutually correlated innovations. The new results lead to

  4. Phylogenetic diversity analysis of Trichoderma species based on ...

    African Journals Online (AJOL)

    vi-4177/CSAU be assigned as the type strains of a species of genus Trichoderma based on phylogenetic tree analysis together with the 18S rRNA gene sequence search in Ribosomal Database Project, small subunit rRNA and large subunit ...

  5. Model-based analysis and simulation of regenerative heat wheel

    DEFF Research Database (Denmark)

    Wu, Zhuang; Melnik, Roderick V. N.; Borup, F.

    2006-01-01

    The rotary regenerator (also called the heat wheel) is an important component of energy intensive sectors, which is used in many heat recovery systems. In this paper, a model-based analysis of a rotary regenerator is carried out with a major emphasis given to the development and implementation of...

  6. Vision-based human motion analysis: An overview

    NARCIS (Netherlands)

    Poppe, Ronald Walter

    2007-01-01

    Markerless vision-based human motion analysis has the potential to provide an inexpensive, non-obtrusive solution for the estimation of body poses. The significant research effort in this domain has been motivated by the fact that many application areas, including surveillance, Human-Computer

  7. Knowledge-based analysis of functional impacts of mutations in ...

    Indian Academy of Sciences (India)

    Knowledge-based analysis of functional impacts of mutations in microRNA seed regions. Supplementary figure 1. Summary of predicted miRNA targets from ... All naturally occurring SNPs in seed regions of human miRNAs. The information of the columns is given in the second sheet. Highly expressed miRNAs are ...

  8. Computer-Based Interaction Analysis with DEGREE Revisited

    Science.gov (United States)

    Barros, B.; Verdejo, M. F.

    2016-01-01

    We review our research with "DEGREE" and analyse how our work has impacted the collaborative learning community since 2000. Our research is framed within the context of computer-based interaction analysis and the development of computer-supported collaborative learning (CSCL) tools. We identify some aspects of our work which have been…

  9. Intelligent assembly time analysis, using a digital knowledge based approach

    NARCIS (Netherlands)

    Jin, Y.; Curran, R.; Butterfield, J.; Burke, R.; Welch, B.

    2009-01-01

    The implementation of effective time analysis methods fast and accurately in the era of digital manufacturing has become a significant challenge for aerospace manufacturers hoping to build and maintain a competitive advantage. This paper proposes a structure oriented, knowledge-based approach for

  10. Advancing School-Based Interventions through Economic Analysis

    Science.gov (United States)

    Olsson, Tina M.; Ferrer-Wreder, Laura; Eninger, Lilianne

    2014-01-01

    Commentators interested in school-based prevention programs point to the importance of economic issues for the future of prevention efforts. Many of the processes and aims of prevention science are dependent upon prevention resources. Although economic analysis is an essential tool for assessing resource use, the attention given economic analysis…

  11. An Analysis of Losses to the Southern Commercial Timberland Base

    Science.gov (United States)

    Ian A. Munn; David Cleaves

    1998-01-01

    Demographic and physical factors influencing the conversion of commercial timberland in the South to non-forestry uses between the last two Forest Inventory Analysis (FIA) surveys were investigated. GIS techniques linked Census data and FIA plot-level data. Multinomial logit regression identified factors associated with losses to the timberland base. Conversion to...

  12. System of gait analysis based on ground reaction force assessment

    Directory of Open Access Journals (Sweden)

    František Vaverka

    2015-12-01

    Full Text Available Background: Biomechanical analysis of gait employs various methods used in kinematic and kinetic analysis, EMG, and others. One of the most frequently used methods is kinetic analysis based on the assessment of the ground reaction forces (GRF) recorded on two force plates. Objective: The aim of the study was to present a method of gait analysis based on the assessment of the GRF recorded during the stance phase of two steps. Methods: The GRF recorded with a force plate on one leg during the stance phase has three components acting in directions: Fx - mediolateral, Fy - anteroposterior, and Fz - vertical. A custom-written MATLAB script was used for gait analysis in this study. This software displays instantaneous force data for both legs as Fx(t), Fy(t) and Fz(t) curves, automatically determines the extremes of the functions and sets visual markers defining the individual points of interest. Positions of these markers can be easily adjusted by the rater, which may be necessary if the GRF has an atypical pattern. The analysis is fully automated and analyzing one trial takes only 1-2 minutes. Results: The method allows quantification of temporal variables of the extremes of the Fx(t), Fy(t), Fz(t) functions, durations of the braking and propulsive phases, duration of the double support phase, the magnitudes of the reaction forces at the extremes of the measured functions, impulses of force, and indices of symmetry. The analysis results in a standardized set of 78 variables (temporal, force, indices of symmetry) which can serve as a basis for further research and diagnostics. Conclusions: The resulting set of variables offers a wide choice for selecting a specific group of variables with consideration to a particular research topic. The advantage of this method is the standardization of the GRF analysis, the low time requirements allowing rapid analysis of a large number of trials in a short time, and the comparability of the variables obtained during different research measurements.
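
    The study's analysis runs in MATLAB; as a minimal illustration of two of the GRF-derived variable types it describes (peak force and a symmetry index), the Python sketch below uses idealised synthetic Fz(t) curves rather than force-plate data.

```python
# Minimal sketch of GRF-based variables of the kind described above: peak
# vertical force per leg, force impulse, and a simple symmetry index,
# computed on synthetic Fz(t) curves (not real force-plate data).
import numpy as np

def symmetry_index(left_value, right_value):
    """Common symmetry index in percent: 0 means perfect left/right symmetry."""
    return 100.0 * (left_value - right_value) / (0.5 * (left_value + right_value))

t = np.linspace(0, 0.6, 600)                       # stance phase of one step, in s
fz_left = 800 * np.sin(np.pi * t / 0.6)            # idealised vertical GRF, in N
fz_right = 760 * np.sin(np.pi * t / 0.6)

peak_left, peak_right = fz_left.max(), fz_right.max()
impulse_left = np.trapz(fz_left, t)                # force impulse, in N*s
impulse_right = np.trapz(fz_right, t)

print(f"peak force SI: {symmetry_index(peak_left, peak_right):.1f} %")
print(f"impulse SI:    {symmetry_index(impulse_left, impulse_right):.1f} %")
```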

  13. Orthogonal Analysis Based Performance Optimization for Vertical Axis Wind Turbine

    Directory of Open Access Journals (Sweden)

    Lei Song

    2016-01-01

    Full Text Available The geometrical shape of a vertical axis wind turbine (VAWT) is composed of multiple structural parameters. Since there are interactions among the structural parameters, traditional research approaches, which usually focus on one parameter at a time, cannot assess the performance of the wind turbine accurately. In order to exploit the overall effect of a novel VAWT, we first use a single-parameter optimization method to obtain optimal values of the structural parameters, respectively, by the Computational Fluid Dynamics (CFD) method; based on the results, we then use an orthogonal analysis method to investigate the influence of interactions of the structural parameters on the performance of the wind turbine and to obtain an optimized combination of the structural parameters considering the interactions. Results of the analysis of variance indicate that interactions among the structural parameters influence the performance of the wind turbine, and that optimization results based on orthogonal analysis achieve higher wind energy utilization than those of traditional research approaches.

  14. Control volume based hydrocephalus research; analysis of human data

    Science.gov (United States)

    Cohen, Benjamin; Wei, Timothy; Voorhees, Abram; Madsen, Joseph; Anor, Tomer

    2010-11-01

    Hydrocephalus is a neuropathophysiological disorder primarily diagnosed by increased cerebrospinal fluid volume and pressure within the brain. To date, utilization of clinical measurements has been limited to understanding the relative amplitude and timing of flow, volume and pressure waveforms; qualitative approaches without a clear framework for meaningful quantitative comparison. Pressure-volume models and electric circuit analogs enforce volume conservation principles in terms of pressure. Control volume analysis, through the integral mass and momentum conservation equations, ensures that pressure and volume are accounted for using first-principles fluid physics. This approach is able to directly incorporate the diverse measurements obtained by clinicians into a simple, direct and robust mechanics-based framework. Clinical data obtained for analysis are discussed along with data processing techniques used to extract terms in the conservation equation. Control volume analysis provides a non-invasive, physics-based approach to extracting pressure information from magnetic resonance velocity data that cannot be measured directly by pressure instrumentation.
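
    For reference, the textbook integral form of mass conservation for a control volume, which this kind of analysis builds on, is sketched below; this is the standard relation, not a reproduction of the authors' specific equations.

```latex
% Standard integral mass conservation for a control volume (CV) bounded by a
% control surface (CS): the rate of change of mass inside the CV balances the
% net mass flux through the CS.
\frac{\mathrm{d}}{\mathrm{d}t}\int_{CV} \rho \,\mathrm{d}V
  + \int_{CS} \rho\,(\mathbf{v}\cdot\mathbf{n})\,\mathrm{d}A = 0
```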

  15. [Electroencephalogram Feature Selection Based on Correlation Coefficient Analysis].

    Science.gov (United States)

    Zhou, Jinzhi; Tang, Xiaofang

    2015-08-01

    In order to improve the accuracy of classification with a small amount of motor imagery training data in the development of brain-computer interface (BCI) systems, we proposed an analysis method to automatically select the characteristic parameters based on correlation coefficient analysis. Using the five sample datasets of dataset IVa from the 2005 BCI Competition, we utilized the short-time Fourier transform (STFT) and correlation coefficient calculation to reduce the dimensionality of the raw electroencephalogram, then introduced feature extraction based on common spatial patterns (CSP) and classified with linear discriminant analysis (LDA). Simulation results showed that the average classification accuracy was higher with the correlation coefficient feature selection method than without it. Compared with a support vector machine (SVM) based feature optimization algorithm, correlation coefficient analysis selects better parameters and improves the accuracy of classification.
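
    A rough sketch of this kind of pipeline is given below with synthetic data: correlation-based reduction of EEG channels followed by LDA classification. The CSP step is omitted for brevity, log band power stands in for the STFT features, and all data and parameters are invented.

```python
# Rough sketch (synthetic data, not the authors' pipeline): correlation-based
# channel selection followed by LDA classification; CSP is omitted for brevity.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
n_trials, n_channels, n_samples = 100, 16, 256
X = rng.normal(size=(n_trials, n_channels, n_samples))   # trials x channels x samples
y = rng.integers(0, 2, n_trials)                          # two motor imagery classes
X[y == 1, :4] *= 1.5                   # make the first 4 channels informative

power = np.log(np.var(X, axis=2))      # per-trial, per-channel log band power
corr = np.array([abs(np.corrcoef(power[:, c], y)[0, 1]) for c in range(n_channels)])
selected = np.argsort(corr)[-6:]       # keep the 6 most label-correlated channels

clf = LinearDiscriminantAnalysis().fit(power[:70][:, selected], y[:70])
print("selected channels:", sorted(selected))
print("held-out accuracy:", clf.score(power[70:][:, selected], y[70:]))
```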

  16. Content-based analysis and indexing of sports video

    Science.gov (United States)

    Luo, Ming; Bai, Xuesheng; Xu, Guang-you

    2001-12-01

    An explosion of on-line image and video data in digital form is already well underway. With the exponential rise in interactive information exploration and dissemination through the World-Wide Web, the major inhibitors of rapid access to on-line video data are the management of capture and storage, and content-based intelligent search and indexing techniques. This paper proposes an approach for content-based analysis and event-based indexing of sports video. It includes a novel method to organize shots - classifying shots as close shots and far shots, an original idea of blur extent-based event detection, and an innovative local mutation-based algorithm for caption detection and retrieval. Results on extensive real TV programs demonstrate the applicability of our approach.

  17. Protein expression based multimarker analysis of breast cancer samples

    International Nuclear Information System (INIS)

    Presson, Angela P; Horvath, Steve; Yoon, Nam K; Bagryanova, Lora; Mah, Vei; Alavi, Mohammad; Maresh, Erin L; Rajasekaran, Ayyappan K; Goodglick, Lee; Chia, David

    2011-01-01

    Tissue microarray (TMA) data are commonly used to validate the prognostic accuracy of tumor markers. For example, breast cancer TMA data have led to the identification of several promising prognostic markers of survival time. Several studies have shown that TMA data can also be used to cluster patients into clinically distinct groups. Here we use breast cancer TMA data to cluster patients into distinct prognostic groups. We apply weighted correlation network analysis (WGCNA) to TMA data consisting of 26 putative tumor biomarkers measured on 82 breast cancer patients. Based on this analysis we identify three groups of patients with low (5.4%), moderate (22%) and high (50%) mortality rates, respectively. We then develop a simple threshold rule using a subset of three markers (p53, Na-KATPase-β1, and TGF β receptor II) that can approximately define these mortality groups. We compare the results of this correlation network analysis with results from a standard Cox regression analysis. We find that the rule-based grouping variable (referred to as WGCNA*) is an independent predictor of survival time. While WGCNA* is based on protein measurements (TMA data), it was validated in two independent Affymetrix microarray gene expression datasets (which measure mRNA abundance). We find that the WGCNA patient groups differed by 35% from mortality groups defined by a more conventional stepwise Cox regression analysis approach. We show that correlation network methods, which are primarily used to analyze the relationships between gene products, are also useful for analyzing the relationships between patients and for defining distinct patient groups based on TMA data. We identify a rule based on three tumor markers for predicting breast cancer survival outcomes

  18. Effects of Interventions Based in Behavior Analysis on Motor Skill Acquisition: A Meta-Analysis

    Science.gov (United States)

    Alstot, Andrew E.; Kang, Minsoo; Alstot, Crystal D.

    2013-01-01

    Techniques based in applied behavior analysis (ABA) have been shown to be useful across a variety of settings to improve numerous behaviors. Specifically within physical activity settings, several studies have examined the effect of interventions based in ABA on a variety of motor skills, but the overall effects of these interventions are unknown.…

  19. Environmental Assessment: General Plan-Based Environmental Impact Analysis Process, Laughlin Air Force Base

    Science.gov (United States)

    2007-05-01

    BASED ENVIRONMENTAL IMPACT ANALYSIS PROCESS, LAUGHLIN AIR FORCE BASE, TEXAS. AGENCY: 47th Flying Training Wing (FTW), Laughlin Air Force Base (AFB), Texas... effects and annoyance in that very few flight operations and ground engine runs occur between 2200 hours and 0700 hours. BMPs include restricting the

  20. Thermodynamics-based Metabolite Sensitivity Analysis in metabolic networks.

    Science.gov (United States)

    Kiparissides, A; Hatzimanikatis, V

    2017-01-01

    The increasing availability of large metabolomics datasets enhances the need for computational methodologies that can organize the data in a way that can lead to the inference of meaningful relationships. Knowledge of the metabolic state of a cell and how it responds to various stimuli and extracellular conditions can offer significant insight in the regulatory functions and how to manipulate them. Constraint based methods, such as Flux Balance Analysis (FBA) and Thermodynamics-based flux analysis (TFA), are commonly used to estimate the flow of metabolites through genome-wide metabolic networks, making it possible to identify the ranges of flux values that are consistent with the studied physiological and thermodynamic conditions. However, unless key intracellular fluxes and metabolite concentrations are known, constraint-based models lead to underdetermined problem formulations. This lack of information propagates as uncertainty in the estimation of fluxes and basic reaction properties such as the determination of reaction directionalities. Therefore, knowledge of which metabolites, if measured, would contribute the most to reducing this uncertainty can significantly improve our ability to define the internal state of the cell. In the present work we combine constraint based modeling, Design of Experiments (DoE) and Global Sensitivity Analysis (GSA) into the Thermodynamics-based Metabolite Sensitivity Analysis (TMSA) method. TMSA ranks metabolites comprising a metabolic network based on their ability to constrain the gamut of possible solutions to a limited, thermodynamically consistent set of internal states. TMSA is modular and can be applied to a single reaction, a metabolic pathway or an entire metabolic network. This is, to our knowledge, the first attempt to use metabolic modeling in order to provide a significance ranking of metabolites to guide experimental measurements. Copyright © 2016 International Metabolic Engineering Society. Published by Elsevier
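
    To make the constraint-based setting concrete, the toy sketch below runs a flux balance analysis (FBA) style linear program with scipy; the network, bounds and objective are invented for illustration and this is not the TMSA method itself.

```python
# Toy flux balance analysis (FBA) sketch with scipy.optimize.linprog, to show
# the constraint-based setting referred to above; network, bounds and
# objective below are invented for illustration.
import numpy as np
from scipy.optimize import linprog

# Stoichiometric matrix S (metabolites x reactions) for a 3-reaction toy network:
#   v0: uptake -> A,   v1: A -> B,   v2: B -> biomass (objective)
S = np.array([
    [1, -1,  0],   # metabolite A balance
    [0,  1, -1],   # metabolite B balance
])
bounds = [(0, 10), (0, 8), (0, None)]   # flux bounds per reaction
c = np.array([0, 0, -1])                # maximise v2 <=> minimise -v2

res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds, method="highs")
print("optimal fluxes:", res.x)         # expected [8, 8, 8] given the bounds above
```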

  1. Model-based gene set analysis for Bioconductor.

    Science.gov (United States)

    Bauer, Sebastian; Robinson, Peter N; Gagneur, Julien

    2011-07-01

    Gene Ontology and other forms of gene-category analysis play a major role in the evaluation of high-throughput experiments in molecular biology. Single-category enrichment analysis procedures such as Fisher's exact test tend to flag large numbers of redundant categories as significant, which can complicate interpretation. We have recently developed an approach called model-based gene set analysis (MGSA), that substantially reduces the number of redundant categories returned by the gene-category analysis. In this work, we present the Bioconductor package mgsa, which makes the MGSA algorithm available to users of the R language. Our package provides a simple and flexible application programming interface for applying the approach. The mgsa package has been made available as part of Bioconductor 2.8. It is released under the conditions of the Artistic license 2.0. peter.robinson@charite.de; julien.gagneur@embl.de.
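
    MGSA itself is an R/Bioconductor package; the sketch below only illustrates the single-category baseline the abstract contrasts it with, a Fisher's exact enrichment test, using invented counts.

```python
# Sketch of the single-category enrichment test (Fisher's exact test) that MGSA
# is designed to improve upon; the counts below are invented for illustration.
from scipy.stats import fisher_exact

study_in_category, study_total = 30, 200          # "hit" genes annotated to the category
population_in_category, population_total = 400, 10000

# 2x2 contingency table: rows = in study set / not, columns = in category / not.
table = [
    [study_in_category, study_total - study_in_category],
    [population_in_category - study_in_category,
     (population_total - study_total) - (population_in_category - study_in_category)],
]
odds_ratio, p_value = fisher_exact(table, alternative="greater")
print(f"odds ratio = {odds_ratio:.2f}, one-sided p = {p_value:.3g}")
```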

  2. Kronecker Algebra-based Deadlock Analysis for Railway Systems

    Directory of Open Access Journals (Sweden)

    Robert Mittermayr

    2012-09-01

    Full Text Available Deadlock analysis for railway systems differs in several aspects from deadlock analysis in computer science. While the problem of deadlock analysis for standard computer systems is well-understood, multi-threaded embedded computer systems pose new challenges. A novel approach in this area can easily be applied to deadlock analysis in the domain of railway systems. The approach is based on Kronecker algebra. A lazy implementation of the matrix operations even allows analysing exponentially sized systems in a very efficient manner. The running time of the algorithm does not depend on the problem size but on the size of the solution. While other approaches suffer from the fact that additional constraints make the problem and its solution harder, our approach delivers its results faster if constraints are added. In addition, our approach is complete and sound for railway systems, i.e., it generates neither false positives nor false negatives.
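
    As a tiny, non-authoritative illustration of the Kronecker-product idea behind such matrix calculi (not the paper's lazy, semiring-labelled implementation), the snippet below composes the state spaces of two two-state components.

```python
# Tiny illustration of the Kronecker product used to compose the state spaces
# of two components; a minimal sketch, not the paper's full lazy implementation.
import numpy as np

# Adjacency matrices of two 2-state components (1 = transition possible).
component_a = np.array([[0, 1],
                        [1, 0]])
component_b = np.array([[0, 1],
                        [0, 0]])

# The Kronecker product yields the 4-state synchronous composition: entry
# ((i, k), (j, l)) is nonzero only if A can move i->j AND B can move k->l.
composed = np.kron(component_a, component_b)
print(composed)
print("composite states:", composed.shape[0])   # 2 * 2 = 4
```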

  3. The analysis phase in development of knowledge-based systems

    International Nuclear Information System (INIS)

    Brooking, A.G.

    1986-01-01

    Over the past twenty years computer scientists have realized that, in order to produce reliable software that is easily modifiable, a proven methodology is required. Unlike conventional systems, there is little knowledge of the life cycle of knowledge-based systems. However, if it resembles the life cycle of conventional systems, it is not unreasonable to assume that analysis will come first. With respect to the analysis task, there is an enormous difference in the types of analysis. Conventional systems analysis is predominantly concerned with what happens within the system. Typically, procedures will be noted in the way they relate to each other and the way data moves and changes within the system. There is often an example, on paper or machine, that can be observed

  4. A Knowledge-based Environment for Software Process Performance Analysis

    Directory of Open Access Journals (Sweden)

    Natália Chaves Lessa Schots

    2015-08-01

    Full Text Available Background: Process performance analysis is a key step for implementing continuous improvement in software organizations. However, the knowledge to execute such analysis is not trivial and the person responsible for executing it must be provided with appropriate support. Aim: This paper presents a knowledge-based environment, named SPEAKER, proposed for supporting software organizations during the execution of process performance analysis. SPEAKER comprises a body of knowledge and a set of activities and tasks for software process performance analysis, along with supporting tools for executing these activities and tasks. Method: We conducted an informal literature review and a systematic mapping study, which provided basic requirements for the proposed environment. We implemented the SPEAKER environment integrating supporting tools for the execution of activities and tasks of performance analysis and the knowledge necessary to execute them, in order to meet the variability presented by the characteristics of these activities. Results: In this paper, we describe each SPEAKER module and the individual evaluations of these modules, and also present an example of use comprising how the environment can guide the user through a specific performance analysis activity. Conclusion: Although we only conducted individual evaluations of SPEAKER’s modules, the example of use indicates the feasibility of the proposed environment. Therefore, the environment as a whole will be further evaluated to verify if it attains its goal of assisting in the execution of process performance analysis by non-specialist people.

  5. Cryptographic protocol security analysis based on bounded constructing algorithm

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    An efficient approach to analyzing cryptographic protocols is to develop automatic analysis tools based on formal methods. However, the approach faces a high computational complexity problem because protocol participants are arbitrary, their message structures are complex and their executions are concurrent. We propose an efficient automatic verifying algorithm for analyzing cryptographic protocols based on the recently proposed Cryptographic Protocol Algebra (CPA) model, in which algebraic techniques are used to simplify the description of cryptographic protocols and their executions. Redundant states generated in the analysis process are greatly reduced by introducing a new algebraic technique called the Universal Polynomial Equation, and the algorithm can be used to verify the correctness of protocols in an infinite state space. We have implemented an efficient automatic analysis tool for cryptographic protocols, called ACT-SPA, based on this algorithm, and used the tool to check more than 20 cryptographic protocols. The analysis results show that this tool is more efficient, and an attack instance not reported previously was found using this tool.

  6. Barcode server: a visualization-based genome analysis system.

    Directory of Open Access Journals (Sweden)

    Fenglou Mao

    Full Text Available We have previously developed a computational method for representing a genome as a barcode image, which makes various genomic features visually apparent. We have demonstrated that this visual capability has made some challenging genome analysis problems relatively easy to solve. We have applied this capability to a number of challenging problems, including (a) identification of horizontally transferred genes, (b) identification of genomic islands with special properties and (c) binning of metagenomic sequences, and achieved highly encouraging results. These application results inspired us to develop this barcode-based genome analysis server for public service, which supports the following capabilities: (a) calculation of the k-mer based barcode image for a provided DNA sequence; (b) detection of sequence fragments in a given genome with distinct barcodes from those of the majority of the genome; (c) clustering of provided DNA sequences into groups having similar barcodes; and (d) homology-based search using Blast against a genome database for any selected genomic regions deemed to have interesting barcodes. The barcode server provides a job management capability, allowing processing of a large number of analysis jobs for barcode-based comparative genome analyses. The barcode server is accessible at http://csbl1.bmb.uga.edu/Barcode.
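
    A minimal sketch of capability (a) above, computing per-window k-mer frequencies that could be rendered as a barcode image, is shown below; the sequence, k and window size are made up and this is not the server's implementation.

```python
# Minimal sketch of a k-mer "barcode": per-window frequencies of short k-mers,
# which a server like this renders as an image; sequence and parameters are made up.
from itertools import product

def kmer_barcode(sequence, k=2, window=50):
    """Return one k-mer frequency vector per non-overlapping window."""
    alphabet = "ACGT"
    kmers = ["".join(p) for p in product(alphabet, repeat=k)]
    rows = []
    for start in range(0, len(sequence) - window + 1, window):
        chunk = sequence[start:start + window]
        counts = {km: 0 for km in kmers}
        for i in range(len(chunk) - k + 1):
            km = chunk[i:i + k]
            if km in counts:
                counts[km] += 1
        total = max(sum(counts.values()), 1)
        rows.append([counts[km] / total for km in kmers])
    return rows   # a matrix; plotted as grey levels it becomes the barcode image

demo = "ACGTACGTGGGCCCATATATCGCGCGTTTTAAAACCCCGGGGACGTACGT" * 4
for row in kmer_barcode(demo, k=2, window=50):
    print(["%.2f" % v for v in row[:8]], "...")
```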

  7. Setting Standards for Medically-Based Running Analysis

    Science.gov (United States)

    Vincent, Heather K.; Herman, Daniel C.; Lear-Barnes, Leslie; Barnes, Robert; Chen, Cong; Greenberg, Scott; Vincent, Kevin R.

    2015-01-01

    Setting standards for medically based running analyses is necessary to ensure that runners receive a high-quality service from practitioners. Medical and training history, physical and functional tests, and motion analysis of running at self-selected and faster speeds are key features of a comprehensive analysis. Self-reported history and movement symmetry are critical factors that require follow-up therapy or long-term management. Pain or injury is typically the result of a functional deficit above or below the site along the kinematic chain. PMID:25014394

  8. A microprocessor based picture analysis system for automatic track measurements

    International Nuclear Information System (INIS)

    Heinrich, W.; Trakowski, W.; Beer, J.; Schucht, R.

    1982-01-01

    In the last few years picture analysis became a powerful technique for measurements of nuclear tracks in plastic detectors. For this purpose rather expensive commercial systems are available. Two inexpensive microprocessor based systems with different resolution were developed. The video pictures of particles seen through a microscope are digitized in real time and the picture analysis is done by software. The microscopes are equipped with stages driven by stepping motors, which are controlled by separate microprocessors. A PDP 11/03 supervises the operation of all microprocessors and stores the measured data on its mass storage devices. (author)

  9. Parametric Design and Mechanical Analysis of Beams based on SINOVATION

    Science.gov (United States)

    Xu, Z. G.; Shen, W. D.; Yang, D. Y.; Liu, W. M.

    2017-07-01

    In engineering practice, engineers need to carry out complicated calculations when the loads on a beam are complex. The processes of analysis and calculation take a lot of time and the results can be unreliable. Therefore, VS2005 and ADK are used to develop software for beam design based on the 3D CAD software SINOVATION, using the C++ programming language. The software realizes the mechanical analysis and parameterized design of various types of beams and outputs the design report in HTML format. The efficiency and reliability of beam design are thereby improved.

  10. FARO base case post-test analysis by COMETA code

    Energy Technology Data Exchange (ETDEWEB)

    Annunziato, A.; Addabbo, C. [Joint Research Centre, Ispra (Italy)

    1995-09-01

    The paper analyzes the COMETA (Core Melt Thermal-Hydraulic Analysis) post-test calculations of FARO Test L-11, the so-called Base Case Test. The FARO Facility, located at JRC Ispra, is used to simulate the consequences of severe accidents in nuclear power plants under a variety of conditions. The COMETA code has a six-equation two-phase flow field and a three-phase corium field: the jet, the droplets and the fused-debris bed. The analysis showed that the code is able to pick up all the major phenomena occurring during the fuel-coolant interaction pre-mixing phase.

  11. Data analysis using a data base driven graphics animation system

    International Nuclear Information System (INIS)

    Schwieder, D.H.; Stewart, H.D.; Curtis, J.N.

    1985-01-01

    A graphics animation system has been developed at the Idaho National Engineering Laboratory (INEL) to assist engineers in the analysis of large amounts of time series data. Most prior attempts at computer animation of data involve the development of large and expensive problem-specific systems. This paper discusses a generalized interactive computer animation system designed to be used in a wide variety of data analysis applications. By using relational data base storage of graphics and control information, considerable flexibility in design and development of animated displays is achieved

  12. Model based process-product design and analysis

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    This paper gives a perspective on modelling and the important role it has within product-process design and analysis. Different modelling issues related to the development and application of systematic model-based solution approaches for product-process design are discussed, and the need for a hybrid... model-based framework is highlighted. This framework should be able to manage knowledge-data, models, and associated methods and tools integrated with design work-flows and data-flows for specific product-process design problems. In particular, the framework needs to manage models of different types..., forms and complexity, together with their associated parameters. An example of a model-based system for the design of chemicals-based formulated products is also given....

  13. YersiniaBase: a genomic resource and analysis platform for comparative analysis of Yersinia.

    Science.gov (United States)

    Tan, Shi Yang; Dutta, Avirup; Jakubovics, Nicholas S; Ang, Mia Yang; Siow, Cheuk Chuen; Mutha, Naresh Vr; Heydari, Hamed; Wee, Wei Yee; Wong, Guat Jah; Choo, Siew Woh

    2015-01-16

    Yersinia is a genus of Gram-negative bacteria that includes serious pathogens such as Yersinia pestis, which causes plague, Yersinia pseudotuberculosis, and Yersinia enterocolitica. The remaining species are generally considered non-pathogenic to humans, although there is evidence that at least some of these species can cause occasional infections using distinct mechanisms from the more pathogenic species. With the advances in sequencing technologies, many genomes of Yersinia have been sequenced. However, there is currently no specialized platform to hold the rapidly-growing Yersinia genomic data and to provide analysis tools, particularly for comparative analyses, which are required to provide improved insights into their biology, evolution and pathogenicity. To facilitate ongoing and future research on Yersinia, especially the species generally considered non-pathogenic, a well-defined repository and analysis platform is needed to hold the Yersinia genomic data and analysis tools for the Yersinia research community. Hence, we have developed YersiniaBase, a robust and user-friendly Yersinia resource and analysis platform for the analysis of Yersinia genomic data. YersiniaBase has a total of twelve species and 232 genome sequences, of which the majority are Yersinia pestis. In order to smooth the process of searching genomic data in a large database, we implemented an Asynchronous JavaScript and XML (AJAX)-based real-time searching system in YersiniaBase. Besides incorporating existing tools, which include the JavaScript-based genome browser (JBrowse) and the Basic Local Alignment Search Tool (BLAST), YersiniaBase also has in-house developed tools: (1) Pairwise Genome Comparison tool (PGC) for comparing two user-selected genomes; (2) Pathogenomics Profiling Tool (PathoProT) for comparative pathogenomics analysis of Yersinia genomes; (3) YersiniaTree for constructing phylogenetic trees of Yersinia. We ran analyses based on the tools and genomic data in YersiniaBase and the

  14. Dynamic Chest Image Analysis: Model-Based Perfusion Analysis in Dynamic Pulmonary Imaging

    Directory of Open Access Journals (Sweden)

    Kiuru Aaro

    2003-01-01

    Full Text Available The "Dynamic Chest Image Analysis" project aims to develop model-based computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion based on a sequence of digital chest fluoroscopy frames collected with the dynamic pulmonary imaging technique. We have proposed and evaluated a multiresolutional method with an explicit ventilation model for ventilation analysis. This paper presents a new model-based method for pulmonary perfusion analysis. According to perfusion properties, we first devise a novel mathematical function to form a perfusion model. A simple yet accurate approach is further introduced to extract cardiac systolic and diastolic phases from the heart, so that this cardiac information may be utilized to accelerate the perfusion analysis and improve its sensitivity in detecting pulmonary perfusion abnormalities. This makes perfusion analysis not only fast but also robust in computation; consequently, perfusion analysis becomes computationally feasible without using contrast media. Our clinical case studies with 52 patients show that this technique is effective for pulmonary embolism even without using contrast media, demonstrating consistent correlations with computed tomography (CT and nuclear medicine (NM studies. This fluoroscopical examination takes only about 2 seconds for perfusion study with only low radiation dose to patient, involving no preparation, no radioactive isotopes, and no contrast media.

  15. Heart Sound Biometric System Based on Marginal Spectrum Analysis

    Science.gov (United States)

    Zhao, Zhidong; Shen, Qinqin; Ren, Fangqin

    2013-01-01

    This work presents a heart sound biometric system based on marginal spectrum analysis, which is a new feature extraction technique for identification purposes. The heart sound identification system comprises signal acquisition, pre-processing, feature extraction, training, and identification. Experiments on the selection of the optimal values for the system parameters are conducted. The results indicate that the new spectrum coefficients yield a significantly higher recognition rate of 94.40%, compared with 84.32% for the traditional Fourier spectrum, based on a database of 280 heart sounds from 40 participants. PMID:23429515
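    The marginal spectrum used above is normally obtained from the Hilbert-Huang transform (empirical mode decomposition followed by Hilbert spectral analysis). The sketch below is only a rough, single-component stand-in: it takes instantaneous amplitude and frequency from the analytic signal of a toy heart-sound burst and accumulates them into frequency bins. The sampling rate, signal and bin edges are invented for illustration and do not reproduce the cited system.

```python
# Rough, hedged sketch of a marginal-spectrum style feature (single component).
# A faithful implementation would first decompose the heart sound into intrinsic
# mode functions via EMD; that step is omitted here for brevity.
import numpy as np
from scipy.signal import hilbert

fs = 2000
t = np.arange(0, 1.0, 1 / fs)
heart_sound = np.sin(2 * np.pi * 40 * t) * np.exp(-5 * t)   # toy S1-like burst

analytic = hilbert(heart_sound)
amplitude = np.abs(analytic)
inst_freq = np.diff(np.unwrap(np.angle(analytic))) * fs / (2 * np.pi)

# accumulate amplitude over time per frequency bin: a crude marginal spectrum
bins = np.linspace(0, 200, 41)
marginal, _ = np.histogram(inst_freq, bins=bins, weights=amplitude[:-1])
peak_bin = bins[np.argmax(marginal)]
print("dominant marginal-spectrum bin starts at", peak_bin, "Hz")
```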

  16. Single base pair mutation analysis by PNA directed PCR clamping

    DEFF Research Database (Denmark)

    Ørum, H.; Nielsen, P.E.; Egholm, M.

    1993-01-01

    A novel method that allows direct analysis of single base mutations by the polymerase chain reaction (PCR) is described. The method utilizes the finding that PNAs (peptide nucleic acids) recognize and bind to their complementary nucleic acid sequences with higher thermal stability and specificity...... allows selective amplification/suppression of target sequences that differ by only one base pair. Finally, we show that PNAs can be designed in such a way that blockage can be accomplished when the PNA target sequence is located between the PCR primers....

  17. Adaptive Human aware Navigation based on Motion Pattern Analysis

    DEFF Research Database (Denmark)

    Tranberg, Søren; Svenstrup, Mikael; Andersen, Hans Jørgen

    2009-01-01

    Respecting people’s social spaces is an important prerequisite for acceptable and natural robot navigation in human environments. In this paper, we describe an adaptive system for mobile robot navigation based on estimates of whether a person seeks to interact with the robot or not. The estimates...... are based on run-time motion pattern analysis compared to stored experience in a database. Using a potential field centered around the person, the robot positions itself at the most appropriate place relative to the person and the interaction status. The system is validated through qualitative tests...

  18. Earthquake response analysis of a base isolated building

    International Nuclear Information System (INIS)

    Mazda, T.; Shiojiri, H.; Sawada, Y.; Harada, O.; Kawai, N.; Ontsuka, S.

    1989-01-01

    Recently, seismic isolation has become a popular method in the design of important structures and equipment against earthquakes. However, it is desirable to accumulate demonstration data on the reliability of seismically isolated structures and to establish analysis methods for such structures. Based on this recognition, vibration tests of a base-isolated building were carried out in Tsukuba Science City, and many earthquake records have since been obtained at the building. In order to examine the validity of the numerical models, earthquake response analyses have been executed using both a lumped-mass model and a finite element model.

  19. Linear feature selection in texture analysis - A PLS based method

    DEFF Research Database (Denmark)

    Marques, Joselene; Igel, Christian; Lillholm, Martin

    2013-01-01

    We present a texture analysis methodology that combines uncommitted machine-learning techniques and partial least squares (PLS) in a fully automatic framework. Our approach introduces a robust PLS-based dimensionality reduction (DR) step to specifically address outliers and high-dimensional feature...... and considering all CV groups, the methods selected 36% of the original features available. The diagnosis evaluation reached a generalization area under the ROC curve of 0.92, which was higher than established cartilage-based markers known to relate to OA diagnosis....

  20. Open access for ALICE analysis based on virtualization technology

    International Nuclear Information System (INIS)

    Buncic, P; Gheata, M; Schutz, Y

    2015-01-01

    Open access is one of the important levers for long-term data preservation for a HEP experiment. To guarantee the usability of data analysis tools beyond the experiment lifetime it is crucial that third party users from the scientific community have access to the data and associated software. The ALICE Collaboration has developed a layer of lightweight components built on top of virtualization technology to hide the complexity and details of the experiment-specific software. Users can perform basic analysis tasks within CernVM, a lightweight generic virtual machine, paired with an ALICE-specific contextualization. Once the virtual machine is launched, a graphical user interface is automatically started without any additional configuration. This interface allows downloading the base ALICE analysis software and running a set of ALICE analysis modules. Currently the available tools include fully documented tutorials for ALICE analysis, such as the measurement of strange particle production or the nuclear modification factor in Pb-Pb collisions. The interface can be easily extended to include an arbitrary number of additional analysis modules. We present the current status of the tools used by ALICE through the CERN open access portal, and the plans for future extensions of this system. (paper)

  1. Multiple Beta Spectrum Analysis Method Based on Spectrum Fitting

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Uk Jae; Jung, Yun Song; Kim, Hee Reyoung [UNIST, Ulsan (Korea, Republic of)

    2016-05-15

    When a sample containing several mixed radioactive nuclides is measured, it is difficult to separate the individual nuclides because their spectra overlap. For this reason, simple mathematical methods for spectrum analysis of mixed beta-ray sources have been studied, but existing approaches suffer from limited accuracy. This study describes separation methods for mixed beta-ray sources based on analysis of the beta spectrum slope by curve fitting, in order to resolve this problem. Among the fitting methods considered (Fourier, polynomial, Gaussian, and sum of sines), the sum-of-sines fit was found to be the best for obtaining an equation for the distribution of the mixed beta spectrum, and it proved the most appropriate for analyzing spectra with various ratios of mixed nuclides. It is expected that this method can be applied to rapid spectrum analysis of mixed beta-ray sources.
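    As a rough illustration of the curve-fitting idea (not the authors' calibrated procedure), the sketch below fits a two-term sum-of-sines model to a synthetic mixed beta spectrum with SciPy. The spectrum shape, model order and initial guesses are all assumptions made only for demonstration.

```python
# Illustrative sketch: fitting a sum-of-sines curve to a synthetic mixed beta
# spectrum; the fitted parameters have no physical calibration here.
import numpy as np
from scipy.optimize import curve_fit

def sum_of_sines(e, a1, b1, c1, a2, b2, c2):
    """Two-term sum-of-sines model for the spectrum shape."""
    return a1 * np.sin(b1 * e + c1) + a2 * np.sin(b2 * e + c2)

# Synthetic "measured" spectrum: two overlapping beta-like components plus noise.
energy = np.linspace(0.05, 2.0, 200)          # MeV
comp_a = energy * np.exp(-2.5 * energy)        # stand-in for nuclide A
comp_b = 0.6 * energy * np.exp(-1.2 * energy)  # stand-in for nuclide B
counts = comp_a + comp_b + np.random.normal(0, 0.005, energy.size)

p0 = [0.1, 2.0, 0.0, 0.05, 4.0, 0.0]           # rough initial guesses
params, _ = curve_fit(sum_of_sines, energy, counts, p0=p0, maxfev=20000)
residual = np.sqrt(np.mean((counts - sum_of_sines(energy, *params)) ** 2))
print("fitted parameters:", np.round(params, 3), "RMS residual:", round(residual, 5))
```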

  2. Transit Traffic Analysis Zone Delineating Method Based on Thiessen Polygon

    Directory of Open Access Journals (Sweden)

    Shuwei Wang

    2014-04-01

    Full Text Available A green transportation system composed of transit, buses and bicycles could play a significant role in alleviating traffic congestion. However, the inaccuracy of current transit ridership forecasting methods is having a negative impact on the development of urban transit systems. Traffic Analysis Zone (TAZ) delineation is a fundamental and essential step in ridership forecasting, but the existing delineation methods in four-step models have problems reflecting the travel characteristics of urban transit. This paper aims to provide a Transit Traffic Analysis Zone (TTAZ) delineation method as a supplement to traditional TAZs in transit service analysis. The deficiencies of current TAZ delineation methods were analyzed, and the requirements of TTAZs were summarized. Considering these requirements, Thiessen polygons were introduced into TTAZ delineation. In order to validate its feasibility, Beijing was then taken as an example for delineating TTAZs, followed by a spatial analysis of office buildings within a TTAZ and of transit station departure passengers. The analysis results show that Thiessen-polygon-based TTAZs can reflect transit travel characteristics and merit further research.
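    The geometric core of the delineation step, Thiessen (Voronoi) polygons around transit stations, can be sketched with SciPy as below. The station coordinates are invented; a real application would use projected station coordinates and clip the cells to the study-area boundary.

```python
# Minimal sketch of Thiessen (Voronoi) cells around hypothetical transit stations.
import numpy as np
from scipy.spatial import Voronoi

stations = np.array([
    [0.0, 0.0], [2.0, 0.5], [1.0, 2.0], [3.0, 2.5], [0.5, 3.0],
])

vor = Voronoi(stations)
for i, region_idx in enumerate(vor.point_region):
    region = vor.regions[region_idx]
    if -1 in region:                      # open cell on the convex hull edge
        print(f"station {i}: unbounded cell (clip to the study area in practice)")
    else:
        poly = vor.vertices[region]
        print(f"station {i}: Thiessen cell vertices\n{poly.round(2)}")
```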

  3. Job optimization in ATLAS TAG-based distributed analysis

    Science.gov (United States)

    Mambelli, M.; Cranshaw, J.; Gardner, R.; Maeno, T.; Malon, D.; Novak, M.

    2010-04-01

    The ATLAS experiment is projected to collect over one billion events/year during the first few years of operation. The efficient selection of events for various physics analyses across all appropriate samples presents a significant technical challenge. ATLAS computing infrastructure leverages the Grid to tackle the analysis across large samples by organizing data into a hierarchical structure and exploiting distributed computing to churn through the computations. This includes events at different stages of processing: RAW, ESD (Event Summary Data), AOD (Analysis Object Data), DPD (Derived Physics Data). Event Level Metadata Tags (TAGs) contain information about each event stored using multiple technologies accessible by POOL and various web services. This allows users to apply selection cuts on quantities of interest across the entire sample to compile a subset of events that are appropriate for their analysis. This paper describes new methods for organizing jobs using the TAGs criteria to analyze ATLAS data. It further compares different access patterns to the event data and explores ways to partition the workload for event selection and analysis. Here analysis is defined as a broader set of event processing tasks including event selection and reduction operations ("skimming", "slimming" and "thinning") as well as DPD making. Specifically it compares analysis with direct access to the events (AOD and ESD data) to access mediated by different TAG-based event selections. We then compare different ways of splitting the processing to maximize performance.

  4. Fuzzy-Set Based Sentiment Analysis of Big Social Data

    DEFF Research Database (Denmark)

    Mukkamala, Raghava Rao; Hussain, Abid; Vatrapu, Ravi

    2014-01-01

    Computational approaches to social media analytics are largely limited to graph theoretical approaches such as social network analysis (SNA) informed by the social philosophical approach of relational sociology. There are no other unified modelling approaches to social data that integrate...... the conceptual, formal, software, analytical and empirical realms. In this paper, we first present and discuss a theory and conceptual model of social data. Second, we outline a formal model based on fuzzy set theory and describe the operational semantics of the formal model with a real-world social data example...... from Facebook. Third, we briefly present and discuss the Social Data Analytics Tool (SODATO) that realizes the conceptual model in software and provisions social data analysis based on the conceptual and formal models. Fourth, we use SODATO to fetch social data from the Facebook wall of a global brand...

  5. Estimating Driving Performance Based on EEG Spectrum Analysis

    Directory of Open Access Journals (Sweden)

    Jung Tzyy-Ping

    2005-01-01

    Full Text Available The growing number of traffic accidents in recent years has become a serious concern to society. Accidents caused by drivers' drowsiness behind the steering wheel have a high fatality rate because of the marked decline in the driver's perception, recognition, and vehicle control abilities while sleepy. Preventing such accidents is highly desirable but requires techniques for continuously detecting, estimating, and predicting the level of alertness of drivers and delivering effective feedback to maintain their maximum performance. This paper proposes an EEG-based drowsiness estimation system that combines electroencephalogram (EEG) log subband power spectra, correlation analysis, principal component analysis, and linear regression models to indirectly estimate the driver's drowsiness level in a virtual-reality-based driving simulator. Our results demonstrate that it is feasible to accurately and quantitatively estimate driving performance, expressed as the deviation between the center of the vehicle and the center of the cruising lane, in a realistic driving simulator.
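    A hedged sketch of the processing chain described above (log subband power, PCA, linear regression onto lane deviation) is given below. All signals and the regression target are synthetic, and the band edges, epoch length and channel count are illustrative assumptions rather than the authors' settings.

```python
# Sketch only: log band power features -> PCA -> linear regression onto a
# synthetic lane-deviation target.
import numpy as np
from scipy.signal import welch
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
fs, n_epochs, n_channels = 250, 120, 4
eeg = rng.standard_normal((n_epochs, n_channels, fs * 2))   # 2 s epochs
lane_deviation = rng.random(n_epochs)                        # driving-error proxy

bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
features = []
for epoch in eeg:
    f, psd = welch(epoch, fs=fs, nperseg=fs)                 # PSD per channel
    row = [np.log(psd[:, (f >= lo) & (f < hi)].mean()) for lo, hi in bands.values()]
    features.append(row)
X = np.array(features)

scores = PCA(n_components=2).fit_transform(X)
model = LinearRegression().fit(scores, lane_deviation)
print("R^2 on the synthetic data:", round(model.score(scores, lane_deviation), 3))
```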

  6. Image-Analysis Based on Seed Phenomics in Sesame

    Directory of Open Access Journals (Sweden)

    Prasad R.

    2014-10-01

    Full Text Available The seed coat (testa) structure of twenty-three cultivated (Sesamum indicum L.) and six wild sesame (S. occidentale Regel & Heer., S. mulayanum Nair, S. prostratum Retz., S. radiatum Schumach. & Thonn., S. angustifolium (Oliv.) Engl. and S. schinzianum Asch.) germplasm was analyzed from digital and Scanning Electron Microscopy (SEM) images with dedicated software, using the descriptors for computer-based seed image analysis, to understand the diversity of seed morphometric traits; this can later be extended to screening and evaluating improved genotypes of sesame. Seeds of wild sesame species could conveniently be distinguished from cultivated varieties based on shape and architectural analysis. The results indicated discrete 'cut-off' values to identify the definite shape and contour of seed for a desirable sesame genotype, along with the conventional practice of selecting lighter-colored testa.

  7. Gold Nanoparticles-Based Barcode Analysis for Detection of Norepinephrine.

    Science.gov (United States)

    An, Jeung Hee; Lee, Kwon-Jai; Choi, Jeong-Woo

    2016-02-01

    Nanotechnology-based bio-barcode amplification analysis offers an innovative approach for detecting neurotransmitters. We evaluated the efficacy of this method for detecting norepinephrine in normal and oxidative-stress damaged dopaminergic cells. Our approach uses a combination of DNA barcodes and bead-based immunoassays for detecting neurotransmitters with surface-enhanced Raman spectroscopy (SERS), and provides polymerase chain reaction (PCR)-like sensitivity. This method relies on magnetic Dynabeads containing antibodies and nanoparticles that are loaded both with DNA barcodes and with antibodies that can sandwich the target protein captured by the Dynabead-bound antibodies. The aggregate sandwich structures are magnetically separated from the solution and treated to remove the conjugated barcode DNA. The DNA barcodes are then identified by SERS and PCR analysis. The concentration of norepinephrine in dopaminergic cells can be readily detected using the bio-barcode assay, which is a rapid, high-throughput screening tool for detecting neurotransmitters.

  8. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    Science.gov (United States)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  9. Model-Based Dependability Analysis of Physical Systems with Modelica

    Directory of Open Access Journals (Sweden)

    Andrea Tundis

    2017-01-01

    Full Text Available Modelica is an innovative, equation-based, and acausal language that allows modeling complex physical systems, which are made of mechanical, electrical, and electrotechnical components, and evaluates their design through simulation techniques. Unfortunately, the increasing complexity and accuracy of such physical systems require new, more powerful, and flexible tools and techniques for evaluating important system properties and, in particular, the dependability ones such as reliability, safety, and maintainability. In this context, the paper describes some extensions of the Modelica language to support the modeling of system requirements and their relationships. Such extensions enable the requirement verification analysis through native constructs in the Modelica language. Furthermore, they allow exporting a Modelica-based system design as a Bayesian Network in order to analyze its dependability by employing a probabilistic approach. The proposal is exemplified through a case study concerning the dependability analysis of a Tank System.

  10. INNOVATION ANALYSIS BASED ON SCORES AT THE FIRM LEVEL

    Directory of Open Access Journals (Sweden)

    Cătălin George ALEXE

    2014-04-01

    Full Text Available Innovation analysis based on scores (Innovation Scorecard) is a simple way to get a quick diagnosis of a firm's innovation potential in its pursuit of innovation capability. It aims to identify and remedy deficient aspects of innovation management and is used as a measuring tool for innovation initiatives over time within the innovation audit. The paper presents the advantages and disadvantages of using the method, and the three approaches developed over time. The model proposed by the consulting firm Arthur D. Little in collaboration with the European Business School, Eckelmann's model, and AGGB's local model are summarized and compared. At the end of the paper, several possible solutions are proposed to improve this score-based method of analysis.

  11. Voxel-based analysis of the diffusion tensor

    International Nuclear Information System (INIS)

    Abe, Osamu; Takao, Hidemasa; Gonoi, Wataru; Sasaki, Hiroki; Murakami, Mizuho; Ohtomo, Kuni; Kabasawa, Hiroyuki; Kawaguchi, Hiroshi; Goto, Masami; Yamada, Haruyasu; Yamasue, Hidenori; Kasai, Kiyoto; Aoki, Shigeki

    2010-01-01

    Diffusion tensor imaging (DTI) has provided important insights into the neurobiological basis for normal development and aging and various disease processes in the central nervous system. The aim of this article is to review the current protocols for DTI acquisition and preprocessing and statistical testing for a voxelwise analysis of DTI, focused on statistical parametric mapping (SPM) and tract-based spatial statistics (TBSS). We tested the effects of distortion correction induced by gradient nonlinearity on fractional anisotropy (FA) maps or FA skeletons processed via two SPM-based methods (coregistration and FA template methods), or TBSS-based method, respectively. With two SPM-based methods, we found similar results in some points (e.g., significant FA elevation for uncorrected images in anterior-dominant white matter and for corrected images in bilateral middle cerebellar peduncles) and different results in other points (e.g., significantly larger FA for corrected images with coregistration method, but significantly smaller with FA template method in bilateral internal capsules, extending to corona radiata, and semioval centers). In contrast, there was no area with significant difference between uncorrected and corrected FA skeletons with TBSS-based method. The discrepancy among these results was not explained in full, but possible explanations were misregistration and smoothing for the SPM-based methods and insensitivity to FA changes outside the local centers of white matter bundles for TBSS-based method. (orig.)

  12. Preprint WebVRGIS Based Traffic Analysis and Visualization System

    OpenAIRE

    Li, Xiaoming; Lv, Zhihan; Wang, Weixi; Zhang, Baoyun; Hu, Jinxing; Yin, Ling; Feng, Shengzhong

    2015-01-01

    This is the preprint version of our paper in Advances in Engineering Software. With several characteristics, such as large scale, diverse predictability and timeliness, city traffic data falls within the definition of Big Data. A Virtual Reality GIS based traffic analysis and visualization system is proposed as a promising and inspiring approach to managing and developing traffic big data. In addition to the basic GIS interaction functions, the proposed system also includes some intellige...

  13. A LITERATURE SURVEY ON RECOMMENDATION SYSTEM BASED ON SENTIMENTAL ANALYSIS

    OpenAIRE

    Achin Jain; Vanita Jain; Nidhi Kapoor

    2016-01-01

    Recommender systems have grown to be a critical research subject since the emergence of the first paper on collaborative filtering in the 1990s. Although academic studies on recommender systems have expanded extensively over the last 10 years, there are deficiencies in the comprehensive literature evaluation and classification of that research. Because of this, we reviewed articles on recommender systems and then classified them based on sentiment analysis. The articles are...

  14. Comparative analysis of some brushless motors based on catalog data

    Directory of Open Access Journals (Sweden)

    Anton Kalapish

    2005-10-01

    Full Text Available Brushless motors (polyphase AC induction, synchronous and brushless DC motors) have no alternatives in modern electric drives. They are highly efficient and cover a very wide range of speeds. The objective of this paper is to present some relations between the basic parameters and magnitudes of electrical machines. This allows a comparative analysis and a choice of motor for each particular case to be made, based not only on catalogue data or sale price.

  15. Computer based approach to fatigue analysis and design

    International Nuclear Information System (INIS)

    Comstock, T.R.; Bernard, T.; Nieb, J.

    1979-01-01

    An approach is presented which uses a mini-computer based system for data acquisition, analysis and graphic displays relative to fatigue life estimation and design. Procedures are developed for identifying and eliminating damaging events due to the overall duty cycle, forced vibration and structural dynamic characteristics. Two case histories, weld failures in heavy vehicles and low cycle fan blade failures, are discussed to illustrate the overall approach. (orig.)

  16. Semantic Indexing and Retrieval based on Formal Concept Analysis

    OpenAIRE

    Codocedo , Victor; Lykourentzou , Ioanna; Napoli , Amedeo

    2012-01-01

    Semantic indexing and retrieval has become an important research area, as the amount of information available on the Web keeps growing. In this paper, we introduce an original approach to semantic indexing and retrieval based on Formal Concept Analysis. The concept lattice is used as a semantic index and we propose an original algorithm for traversing the lattice and answering user queries. This framework has been used and evaluated on song datasets.

  17. An Analysis of Construction Accident Factors Based on Bayesian Network

    OpenAIRE

    Yunsheng Zhao; Jinyong Pei

    2013-01-01

    In this study, we analyze construction accident factors based on a Bayesian network. First, accident cases are analyzed using the fault tree method, which identifies all the factors causing the accidents; the factors are then analyzed qualitatively and quantitatively with the Bayesian network method; finally, a safety management program is determined to guide safety operations. The results of this study show that bad geological conditions have the largest posterio...

  18. Independent component analysis based filtering for penumbral imaging

    International Nuclear Information System (INIS)

    Chen Yenwei; Han Xianhua; Nozaki, Shinya

    2004-01-01

    We propose a filter based on independent component analysis (ICA) for Poisson noise reduction. In the proposed filtering, the image is first transformed to the ICA domain and the noise components are then removed by soft thresholding (shrinkage). The proposed filter, used as a preprocessing step before reconstruction, has been successfully applied to penumbral imaging. Both simulation results and experimental results show that the reconstructed image is dramatically improved in comparison to that obtained without the noise-removing filter.
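    The two-step idea, transform to the ICA domain and then soft-threshold, can be sketched as follows. This is a simplified stand-in rather than the authors' implementation: it learns an ICA basis from non-overlapping patches of a toy Poisson-noisy image and shrinks the source coefficients; the patch size, component count and threshold are assumptions.

```python
# Simplified sketch of ICA-domain soft thresholding for noise reduction.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
image = rng.poisson(lam=8.0, size=(64, 64)).astype(float)   # toy Poisson-noisy image

p = 8                                                        # patch edge length
patches = image.reshape(8, p, 8, p).swapaxes(1, 2).reshape(-1, p * p)

ica = FastICA(n_components=16, random_state=0, max_iter=1000)
sources = ica.fit_transform(patches)                         # patches in the ICA domain

tau = 0.5 * np.std(sources)                                  # illustrative threshold
shrunk = np.sign(sources) * np.maximum(np.abs(sources) - tau, 0.0)   # soft threshold

denoised_patches = ica.inverse_transform(shrunk)
denoised = denoised_patches.reshape(8, 8, p, p).swapaxes(1, 2).reshape(64, 64)
print("variance before/after:", image.var().round(2), denoised.var().round(2))
```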

  19. Quantitative analysis of the ATV data base, Stage 2

    International Nuclear Information System (INIS)

    Stenquist, C.; Kjellbert, N.A.

    1981-01-01

    A supplementary study of the Swedish ATV data base was carried out. The study was limited to an analysis of the quantitative coverage of component failures from 1979 through 1980. The results indicate that the coverage of component failures is about 75-80 per cent, relative to the failure reports and work order sheets at the reactor sites together with SKI's ''Safety Related Occurrences''. In general there has been an improvement compared to previous years. (Auth.)

  20. Experience based ageing analysis of NPP protection automation in Finland

    International Nuclear Information System (INIS)

    Simola, K.

    2000-01-01

    This paper describes three successive studies on the ageing of protection automation in nuclear power plants. These studies were aimed at developing a methodology for experience-based ageing analysis and applying it to identify the most critical components from the ageing and safety points of view. The analyses also resulted in suggestions for improving data collection systems for the purpose of further ageing analyses. (author)

  1. Research on Air Quality Evaluation based on Principal Component Analysis

    Science.gov (United States)

    Wang, Xing; Wang, Zilin; Guo, Min; Chen, Wei; Zhang, Huan

    2018-01-01

    Economic growth has led to a decline in environmental capacity and the deterioration of air quality. Air quality evaluation, as a fundamental part of environmental monitoring and air pollution control, has become increasingly important. Based on principal component analysis (PCA), this paper evaluates the air quality of a large city in the Beijing-Tianjin-Hebei Area over the recent 10 years and identifies influencing factors, in order to provide a reference for air quality management and air pollution control.
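    A minimal sketch of a PCA-based evaluation of this kind is shown below: standardize annual pollutant indicators, extract principal components, and weight the component scores by explained variance to form a composite index. The pollutant table is invented for illustration and does not reproduce the paper's data.

```python
# Minimal sketch of PCA-based air quality scoring on invented annual indicators.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# rows = years, columns = hypothetical annual mean PM2.5, PM10, SO2, NO2, O3
X = np.array([
    [95, 130, 28, 55, 160],
    [88, 121, 22, 52, 165],
    [80, 110, 18, 50, 170],
    [72, 100, 14, 48, 175],
    [60,  90, 10, 45, 180],
])

Z = StandardScaler().fit_transform(X)
pca = PCA(n_components=2).fit(Z)
scores = pca.transform(Z)
weights = pca.explained_variance_ratio_ / pca.explained_variance_ratio_.sum()
composite = scores @ weights                      # one evaluation score per year
print("explained variance:", pca.explained_variance_ratio_.round(3))
print("composite scores by year:", composite.round(3))
```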

  2. AB019. Erectile dysfunction: analysis based on age stratification

    OpenAIRE

    Kai, Zhang

    2015-01-01

    Objective To establish the profile of erectile dysfunction in different age groups, and to analyze the effect of sildenafil based on age stratification. Subjects and Methods From 2007 to 2008, a total of 4,507 men diagnosed with erectile dysfunction (ED) were enrolled from 46 centers in China; 4,039 of these patients were treated with sildenafil and asked to complete the Erectile Function domain of the International Index of Erectile Function, the Erection Hardness Score, and the Quality of Erection Qu...

  3. Analysis Of Japans Economy Based On 2014 From Macroeconomics Prospects

    Directory of Open Access Journals (Sweden)

    Dr Mohammad Rafiqul Islam

    2015-02-01

    Full Text Available Japan is the world's third largest economy. However, Japan's current economic situation is not stable, and the economy is not growing as expected. As of 2013 it was the world's second largest economy, but Japan lost its place to China in 2014 due to the slow growth of important economic indicators. Using the basic Keynesian model, we provide a detailed analysis of the short- and long-run impacts of these changes on Japan's real GDP, rate of unemployment and inflation rate. We demonstrate in detail the use of the 45-degree diagram (the AD-IA model) and other economic analysis of the macroeconomic principles that underlie the model and concepts. Finally, we recommend a change in fiscal policy to the government based on this analysis, by considering what might be achieved with a fiscal policy response and the extent to which any impact on the stock of public debt might be a consideration.

  4. [Model-based biofuels system analysis: a review].

    Science.gov (United States)

    Chang, Shiyan; Zhang, Xiliang; Zhao, Lili; Ou, Xunmin

    2011-03-01

    Model-based system analysis is an important tool for evaluating the potential and impacts of biofuels, and for drafting biofuels technology roadmaps and targets. The broad reach of the biofuels supply chain requires that biofuels system analyses span a range of disciplines, including agriculture/forestry, energy, economics, and the environment. Here we reviewed various models developed for or applied to modeling biofuels, and presented a critical analysis of Agriculture/Forestry System Models, Energy System Models, Integrated Assessment Models, Micro-level Cost, Energy and Emission Calculation Models, and Specific Macro-level Biofuel Models. We focused on the models' strengths, weaknesses, and applicability, facilitating the selection of a suitable type of model for specific issues. Such an analysis was a prerequisite for future biofuels system modeling, and represented a valuable resource for researchers and policy makers.

  5. SVM-based glioma grading. Optimization by feature reduction analysis

    International Nuclear Information System (INIS)

    Zoellner, Frank G.; Schad, Lothar R.; Emblem, Kyrre E.; Harvard Medical School, Boston, MA; Oslo Univ. Hospital

    2012-01-01

    We investigated the predictive power of feature reduction analysis approaches in support vector machine (SVM)-based classification of glioma grade. In 101 untreated glioma patients, three analytic approaches were evaluated to derive an optimal reduction in features: (i) Pearson's correlation coefficients (PCC), (ii) principal component analysis (PCA) and (iii) independent component analysis (ICA). Tumor grading was performed using a previously reported SVM approach including whole-tumor cerebral blood volume (CBV) histograms and patient age. The best classification accuracy was found using PCA at 85% (sensitivity = 89%, specificity = 84%) when reducing the feature vector from 101 (100-bin rCBV histogram + age) to 3 principal components. In comparison, classification accuracy by PCC was 82% (89%, 77%, 2 dimensions) and 79% by ICA (87%, 75%, 9 dimensions). For improved speed (up to 30%) and simplicity, feature reduction by all three methods provided similar classification accuracy to literature values (≈87%) while reducing the number of features by up to 98%. (orig.)
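    The reported pipeline (feature standardization, PCA reduction to a few components, SVM grading) can be sketched with scikit-learn as below. The feature matrix is a random placeholder shaped like the 100-bin rCBV histogram plus age; the kernel and regularization settings are assumptions, not the authors' tuned values.

```python
# Hedged sketch: standardize -> PCA(3) -> SVM, evaluated by cross-validation
# on synthetic placeholder data.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
X = rng.random((101, 101))                 # 101 patients x (100-bin histogram + age)
y = rng.integers(0, 2, size=101)           # 0 = low grade, 1 = high grade (synthetic)

clf = make_pipeline(StandardScaler(), PCA(n_components=3), SVC(kernel="rbf", C=1.0))
acc = cross_val_score(clf, X, y, cv=5).mean()
print("cross-validated accuracy on synthetic data:", round(acc, 3))
```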

  6. Diagnosis methods based on noise analysis at Cernavoda NPP, Romania

    International Nuclear Information System (INIS)

    Banica, Constantin; Dobrea, Dumitru

    1999-01-01

    This paper describes a recent noise analysis of the neutronic signals provided by in-core flux detectors (ICFDs) and ion chambers (ICs). This analysis is part of an on-going program developed for Unit 1 of the Cernavoda NPP, Romania, with the following main objectives: - prediction of detector failures based on pattern recognition; - determination of fast excursions from steady states; - detection of abnormal mechanical vibrations in the reactor core. The introduction briefly presents the reactor and the location of the ICFDs and ICs. The second section presents the data acquisition systems and their capabilities. The paper continues with a brief presentation of the numerical methods used for the analysis (section 3). The most significant results can be found in section 4, while section 5 draws conclusions about the useful information that can be obtained from the neutronic signals during high-power steady-state operation. (authors)

  7. Content-based TV sports video retrieval using multimodal analysis

    Science.gov (United States)

    Yu, Yiqing; Liu, Huayong; Wang, Hongbin; Zhou, Dongru

    2003-09-01

    In this paper, we propose content-based video retrieval, which is retrieval based on semantic content. Because video data is composed of multimodal information streams such as visual, auditory and textual streams, we describe a strategy that uses multimodal analysis for automatically parsing sports video. The paper first defines the basic structure of the sports video database system, and then introduces a new approach that integrates visual stream analysis, speech recognition, speech signal processing and text extraction to realize video retrieval. The experimental results for TV sports video of football games indicate that multimodal analysis is effective for video retrieval by quickly browsing tree-like video clips or inputting keywords within a predefined domain.

  8. Reliability analysis of digital based I and C system

    Energy Technology Data Exchange (ETDEWEB)

    Kang, I. S.; Cho, B. S.; Choi, M. J. [KOPEC, Yongin (Korea, Republic of)

    1999-10-01

    Digital technology is rapidly and widely being applied to replace the analog components installed in existing plants and in the design of control and monitoring systems for new nuclear power plants, in Korea as well as in other countries. Despite the many merits of digital technology, it faces a new problem of reliability assurance. Studies to solve this problem are being performed vigorously in foreign countries. The reliability of the KNGR Engineered Safety Features Component Control System (ESF-CCS), a digital-based I and C system, was analyzed to verify fulfillment of the ALWR EPRI-URD requirement for reliability analysis and to eliminate hazards in a design applying new technology. A qualitative analysis using FMEA and a quantitative analysis using reliability block diagrams were performed. The results of the analyses are shown in this paper.

  9. Methodology for risk-based analysis of technical specifications

    International Nuclear Information System (INIS)

    Vesely, W.E.; Gaertner, J.P.; Wagner, D.P.

    1985-01-01

    Part of the effort by EPRI to apply probabilistic risk assessment methods and results to the solution of utility problems involves the investigation of methods for risk-based analysis of technical specifications. The culmination of this investigation is the SOCRATES computer code developed by Battelle's Columbus Laboratories to assist in the evaluation of technical specifications of nuclear power plants. The program is designed to use information found in PRAs to re-evaluate risk for changes in component allowed outage times (AOTs) and surveillance test intervals (STIs). The SOCRATES program is a unique and important tool for technical specification evaluations. The detailed component unavailability model allows a detailed analysis of AOT and STI contributions to risk. Explicit equations allow fast and inexpensive calculations. Because the code is designed to accept ranges of parameters and to save results of calculations that do not change during the analysis, sensitivity studies are efficiently performed and results are clearly displayed

  10. GOMA: functional enrichment analysis tool based on GO modules

    Institute of Scientific and Technical Information of China (English)

    Qiang Huang; Ling-Yun Wu; Yong Wang; Xiang-Sun Zhang

    2013-01-01

    Analyzing the function of gene sets is a critical step in interpreting the results of high-throughput experiments in systems biology. A variety of enrichment analysis tools have been developed in recent years, but most output a long list of significantly enriched terms that are often redundant, making it difficult to extract the most meaningful functions. In this paper, we present GOMA, a novel enrichment analysis method based on the new concept of enriched functional Gene Ontology (GO) modules. With this method, we systematically revealed functional GO modules, i.e., groups of functionally similar GO terms, via an optimization model and then ranked them by enrichment scores. Our new method simplifies enrichment analysis results by reducing redundancy, thereby preventing inconsistent enrichment results among functionally similar terms and providing more biologically meaningful results.
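    For orientation, the basic enrichment statistic underlying GO-term analysis is a hypergeometric test of over-representation, sketched below with invented counts. This illustrates only the per-term score; GOMA's contribution, grouping terms into modules via an optimization model, is not reproduced here.

```python
# Generic per-term enrichment test (hypergeometric), with made-up counts.
from scipy.stats import hypergeom

population = 20000      # genes in the background
term_genes = 150        # background genes annotated with the GO term
study_set = 300         # genes from the experiment
overlap = 12            # study genes carrying the annotation

# P(X >= overlap) under sampling without replacement
p_value = hypergeom.sf(overlap - 1, population, term_genes, study_set)
print(f"enrichment p-value: {p_value:.3e}")
```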

  11. A Machine Learning Based Intrusion Impact Analysis Scheme for Clouds

    Directory of Open Access Journals (Sweden)

    Junaid Arshad

    2012-01-01

    Full Text Available Clouds represent a major paradigm shift, inspiring the contemporary approach to computing. They present fascinating opportunities to address dynamic user requirements with the provision of on-demand expandable computing infrastructures. However, Clouds introduce novel security challenges which need to be addressed to facilitate widespread adoption. This paper is focused on one such challenge - intrusion impact analysis. In particular, we highlight the significance of intrusion impact analysis for the overall security of Clouds. Additionally, we present a machine learning based scheme to address this challenge in accordance with the specific requirements of Clouds for intrusion impact analysis. We also present the rigorous evaluation performed to assess the effectiveness and feasibility of the proposed method for Clouds. The evaluation results demonstrate a high degree of effectiveness in correctly determining the impact of an intrusion, along with a significant reduction in intrusion response time.

  12. Facial Affect Recognition Using Regularized Discriminant Analysis-Based Algorithms

    Directory of Open Access Journals (Sweden)

    Cheng-Yuan Shih

    2010-01-01

    Full Text Available This paper presents a novel and effective method for facial expression recognition including happiness, disgust, fear, anger, sadness, surprise, and neutral state. The proposed method utilizes a regularized discriminant analysis-based boosting algorithm (RDAB with effective Gabor features to recognize the facial expressions. Entropy criterion is applied to select the effective Gabor feature which is a subset of informative and nonredundant Gabor features. The proposed RDAB algorithm uses RDA as a learner in the boosting algorithm. The RDA combines strengths of linear discriminant analysis (LDA and quadratic discriminant analysis (QDA. It solves the small sample size and ill-posed problems suffered from QDA and LDA through a regularization technique. Additionally, this study uses the particle swarm optimization (PSO algorithm to estimate optimal parameters in RDA. Experiment results demonstrate that our approach can accurately and robustly recognize facial expressions.

  13. Plug-in Based Analysis Framework for LHC Post-Mortem Analysis

    CERN Document Server

    Gorbonosov, R; Zerlauth, M; Baggiolini, V

    2014-01-01

    Plug-in based software architectures [1] are extensible, enforce modularity and allow several teams to work in parallel. But they have certain technical and organizational challenges, which we discuss in this paper. We gained our experience when developing the Post-Mortem Analysis (PMA) system, which is a mission critical system for the Large Hadron Collider (LHC). We used a plugin-based architecture with a general-purpose analysis engine, for which physicists and equipment experts code plugins containing the analysis algorithms. We have over 45 analysis plugins developed by a dozen domain experts. This paper focuses on the design challenges we faced in order to mitigate the risks of executing third-party code: assurance that even a badly written plugin doesn't perturb the work of the overall application; plugin execution control which allows plugin misbehaviour to be detected and acted upon; a robust communication mechanism between plugins; diagnostics facilitation in case of plugin failure; testing of the plugins be...

  14. Improved nowcasting of precipitation based on convective analysis fields

    Directory of Open Access Journals (Sweden)

    T. Haiden

    2007-04-01

    Full Text Available The high-resolution analysis and nowcasting system INCA (Integrated Nowcasting through Comprehensive Analysis developed at the Austrian national weather service provides three-dimensional fields of temperature, humidity, and wind on an hourly basis, and two-dimensional fields of precipitation rate in 15 min intervals. The system operates on a horizontal resolution of 1 km and a vertical resolution of 100–200 m. It combines surface station data, remote sensing data (radar, satellite, forecast fields of the numerical weather prediction model ALADIN, and high-resolution topographic data. An important application of the INCA system is nowcasting of convective precipitation. Based on fine-scale temperature, humidity, and wind analyses a number of convective analysis fields are routinely generated. These fields include convective boundary layer (CBL flow convergence and specific humidity, lifted condensation level (LCL, convective available potential energy (CAPE, convective inhibition (CIN, and various convective stability indices. Based on the verification of areal precipitation nowcasts it is shown that the pure translational forecast of convective cells can be improved by using a decision algorithm which is based on a subset of the above fields, combined with satellite products.

  15. Geographic Object-Based Image Analysis - Towards a new paradigm.

    Science.gov (United States)

    Blaschke, Thomas; Hay, Geoffrey J; Kelly, Maggi; Lang, Stefan; Hofmann, Peter; Addink, Elisabeth; Queiroz Feitosa, Raul; van der Meer, Freek; van der Werff, Harald; van Coillie, Frieke; Tiede, Dirk

    2014-01-01

    The amount of scientific literature on (Geographic) Object-based Image Analysis - GEOBIA has been and still is sharply increasing. These approaches to analysing imagery have antecedents in earlier research on image segmentation and use GIS-like spatial analysis within classification and feature extraction approaches. This article investigates these developments and their implications and asks whether or not this is a new paradigm in remote sensing and Geographic Information Science (GIScience). We first discuss several limitations of prevailing per-pixel methods when applied to high resolution images. Then we explore the paradigm concept developed by Kuhn (1962) and discuss whether GEOBIA can be regarded as a paradigm according to this definition. We crystallize core concepts of GEOBIA, including the role of objects, of ontologies and the multiplicity of scales, and we discuss how these conceptual developments support important methods in remote sensing such as change detection and accuracy assessment. The ramifications of the different theoretical foundations between the 'per-pixel paradigm' and GEOBIA are analysed, as are some of the challenges along this path from pixels, to objects, to geo-intelligence. Based on several paradigm indications as defined by Kuhn and based on an analysis of peer-reviewed scientific literature we conclude that GEOBIA is a new and evolving paradigm.

  16. Sensitivity analysis practices: Strategies for model-based inference

    International Nuclear Information System (INIS)

    Saltelli, Andrea; Ratto, Marco; Tarantola, Stefano; Campolongo, Francesca

    2006-01-01

    Fourteen years after Science's review of sensitivity analysis (SA) methods in 1989 (System analysis at molecular scale, by H. Rabitz) we search Science Online to identify and then review all recent articles having 'sensitivity analysis' as a keyword. In spite of the considerable developments which have taken place in this discipline, of the good practices which have emerged, and of existing guidelines for SA issued on both sides of the Atlantic, we could not find in our review other than very primitive SA tools, based on 'one-factor-at-a-time' (OAT) approaches. In the context of model corroboration or falsification, we demonstrate that this use of OAT methods is illicit and unjustified, unless the model under analysis is proved to be linear. We show that available good practices, such as variance based measures and others, are able to overcome OAT shortcomings and easy to implement. These methods also allow the concept of factors importance to be defined rigorously, thus making the factors importance ranking univocal. We analyse the requirements of SA in the context of modelling, and present best available practices on the basis of an elementary model. We also point the reader to available recipes for a rigorous SA.
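    To make the contrast with one-factor-at-a-time scans concrete, the sketch below estimates a variance-based first-order sensitivity index for a toy non-additive model by brute-force nested Monte Carlo. The model, sample sizes and uniform input ranges are assumptions chosen only for illustration, not the authors' recipes.

```python
# Toy variance-based first-order sensitivity indices, S_i = Var(E[Y|x_i]) / Var(Y).
import numpy as np

rng = np.random.default_rng(7)

def model(x1, x2, x3):
    # simple non-additive test function with an x1*x3 interaction
    return x1 + 2.0 * x2 + 5.0 * x1 * x3

def first_order_index(which, n_outer=200, n_inner=200):
    """Brute-force nested Monte Carlo estimate of S_i for uniform(0,1) inputs."""
    cond_means = []
    for _ in range(n_outer):
        fixed = rng.random()
        xs = rng.random((3, n_inner))
        xs[which, :] = fixed                # hold factor i fixed, vary the rest
        cond_means.append(model(*xs).mean())
    total = model(*rng.random((3, 20000)))
    return np.var(cond_means) / np.var(total)

for i, name in enumerate(["x1", "x2", "x3"]):
    print(f"first-order index of {name}: {first_order_index(i):.2f}")
```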

  17. Sensitivity analysis practices: Strategies for model-based inference

    Energy Technology Data Exchange (ETDEWEB)

    Saltelli, Andrea [Institute for the Protection and Security of the Citizen (IPSC), European Commission, Joint Research Centre, TP 361, 21020 Ispra (VA) (Italy)]. E-mail: andrea.saltelli@jrc.it; Ratto, Marco [Institute for the Protection and Security of the Citizen (IPSC), European Commission, Joint Research Centre, TP 361, 21020 Ispra (VA) (Italy); Tarantola, Stefano [Institute for the Protection and Security of the Citizen (IPSC), European Commission, Joint Research Centre, TP 361, 21020 Ispra (VA) (Italy); Campolongo, Francesca [Institute for the Protection and Security of the Citizen (IPSC), European Commission, Joint Research Centre, TP 361, 21020 Ispra (VA) (Italy)

    2006-10-15

    Fourteen years after Science's review of sensitivity analysis (SA) methods in 1989 (System analysis at molecular scale, by H. Rabitz) we search Science Online to identify and then review all recent articles having 'sensitivity analysis' as a keyword. In spite of the considerable developments which have taken place in this discipline, of the good practices which have emerged, and of existing guidelines for SA issued on both sides of the Atlantic, we could not find in our review other than very primitive SA tools, based on 'one-factor-at-a-time' (OAT) approaches. In the context of model corroboration or falsification, we demonstrate that this use of OAT methods is illicit and unjustified, unless the model under analysis is proved to be linear. We show that available good practices, such as variance based measures and others, are able to overcome OAT shortcomings and easy to implement. These methods also allow the concept of factors importance to be defined rigorously, thus making the factors importance ranking univocal. We analyse the requirements of SA in the context of modelling, and present best available practices on the basis of an elementary model. We also point the reader to available recipes for a rigorous SA.

  18. Frame-based safety analysis approach for decision-based errors

    International Nuclear Information System (INIS)

    Fan, Chin-Feng; Yihb, Swu

    1997-01-01

    A frame-based approach is proposed to analyze decision-based errors made by automatic controllers or human operators due to erroneous reference frames. An integrated framework, the Two Frame Model (TFM), is first proposed to model the dynamic interaction between the physical process and the decision-making process. Two important issues, consistency and competing processes, are raised. Consistency between the physical and logic frames makes a TFM-based system work properly. Loss of consistency refers to the failure mode in which the logic frame does not accurately reflect the state of the controlled processes. Once such a failure occurs, hazards may arise. Among potential hazards, the competing effect between the controller and the controlled process is the most severe one, which may jeopardize a defense-in-depth design. When the logic and physical frames are inconsistent, conventional safety analysis techniques are inadequate. We propose Frame-based Fault Tree Analysis (FFTA) and Frame-based Event Tree Analysis (FETA) under TFM to deduce the context for decision errors and to separately generate the evolution of the logical frame as opposed to that of the physical frame. This multi-dimensional analysis approach, different from the conventional correctness-centred approach, provides a panoramic view in scenario generation. Case studies using the proposed techniques are also given to demonstrate their usage and feasibility.

  19. Ontology-based specification, identification and analysis of perioperative risks.

    Science.gov (United States)

    Uciteli, Alexandr; Neumann, Juliane; Tahar, Kais; Saleh, Kutaiba; Stucke, Stephan; Faulbrück-Röhr, Sebastian; Kaeding, André; Specht, Martin; Schmidt, Tobias; Neumuth, Thomas; Besting, Andreas; Stegemann, Dominik; Portheine, Frank; Herre, Heinrich

    2017-09-06

    Medical personnel in hospitals often work under great physical and mental strain. In medical decision-making, errors can never be completely ruled out. Several studies have shown that between 50 and 60% of adverse events could have been avoided through better organization, more attention or more effective security procedures. Critical situations especially arise during interdisciplinary collaboration and the use of complex medical technology, for example during surgical interventions and in perioperative settings (the period of time before, during and after surgical intervention). In this paper, we present an ontology and an ontology-based software system which can identify risks across medical processes and support the avoidance of errors, in particular in the perioperative setting. We developed a practicable definition of the risk notion, which is easily understandable by the medical staff and is usable by the software tools. Based on this definition, we developed a Risk Identification Ontology (RIO) and used it for the specification and identification of perioperative risks. An agent system was developed which gathers risk-relevant data during the whole perioperative treatment process from various sources and provides it for risk identification and analysis in a centralized fashion. The results of such an analysis are provided to the medical personnel in the form of context-sensitive hints and alerts. For the identification of the ontologically specified risks, we developed an ontology-based software module, called the Ontology-based Risk Detector (OntoRiDe). About 20 risks relating to cochlear implantation (CI) have already been implemented. Comprehensive testing has indicated the correctness of the data acquisition, risk identification and analysis components, as well as the web-based visualization of results.

  20. Web-Based Virtual Laboratory for Food Analysis Course

    Science.gov (United States)

    Handayani, M. N.; Khoerunnisa, I.; Sugiarti, Y.

    2018-02-01

    Implementation of learning in the food analysis course in the Agro-industrial Technology Education study program faced problems. These problems include laboratory space and tools that are not commensurate with the number of students, as well as a lack of interactive learning tools. On the other hand, students' information technology literacy is quite high and the internet is easily accessible on campus. This presents both a challenge and an opportunity for developing learning media that can help optimize learning in the laboratory. This study aims to develop a web-based virtual laboratory as an alternative learning medium for the food analysis course. This research is R & D (research and development) following the Borg & Gall model. The results showed that, according to expert assessment of the software engineering aspects, visual communication, material relevance, usefulness, and language used, the web-based virtual laboratory developed is feasible as a learning medium. The results of the small-scale and wide-scale tests show that students strongly agree with the development of the web-based virtual laboratory. Student responses to this virtual laboratory were positive, and their suggestions provide further opportunities for improving the web-based virtual laboratory that should be considered in future research.

  1. Phosphoproteomics-based systems analysis of signal transduction networks

    Directory of Open Access Journals (Sweden)

    Hiroko Kozuka-Hata

    2012-01-01

    Full Text Available Signal transduction systems coordinate complex cellular information to regulate biological events such as cell proliferation and differentiation. Although the accumulating evidence on widespread association of signaling molecules has revealed essential contribution of phosphorylation-dependent interaction networks to cellular regulation, their dynamic behavior is mostly yet to be analyzed. Recent technological advances regarding mass spectrometry-based quantitative proteomics have enabled us to describe the comprehensive status of phosphorylated molecules in a time-resolved manner. Computational analyses based on the phosphoproteome dynamics accelerate generation of novel methodologies for mathematical analysis of cellular signaling. Phosphoproteomics-based numerical modeling can be used to evaluate regulatory network elements from a statistical point of view. Integration with transcriptome dynamics also uncovers regulatory hubs at the transcriptional level. These omics-based computational methodologies, which have firstly been applied to representative signaling systems such as the epidermal growth factor receptor pathway, have now opened up a gate for systems analysis of signaling networks involved in immune response and cancer.

  2. Matching biomedical ontologies based on formal concept analysis.

    Science.gov (United States)

    Zhao, Mengyi; Zhang, Songmao; Li, Weizhuo; Chen, Guowei

    2018-03-19

    The goal of ontology matching is to identify correspondences between entities from different yet overlapping ontologies so as to facilitate semantic integration, reuse and interoperability. As a well developed mathematical model for analyzing individuals and structuring concepts, Formal Concept Analysis (FCA) has been applied to ontology matching (OM) tasks since the beginning of OM research, whereas ontological knowledge exploited in FCA-based methods is limited. This motivates the study in this paper, i.e., to empower FCA with as much as ontological knowledge as possible for identifying mappings across ontologies. We propose a method based on Formal Concept Analysis to identify and validate mappings across ontologies, including one-to-one mappings, complex mappings and correspondences between object properties. Our method, called FCA-Map, incrementally generates a total of five types of formal contexts and extracts mappings from the lattices derived. First, the token-based formal context describes how class names, labels and synonyms share lexical tokens, leading to lexical mappings (anchors) across ontologies. Second, the relation-based formal context describes how classes are in taxonomic, partonomic and disjoint relationships with the anchors, leading to positive and negative structural evidence for validating the lexical matching. Third, the positive relation-based context can be used to discover structural mappings. Afterwards, the property-based formal context describes how object properties are used in axioms to connect anchor classes across ontologies, leading to property mappings. Last, the restriction-based formal context describes co-occurrence of classes across ontologies in anonymous ancestors of anchors, from which extended structural mappings and complex mappings can be identified. Evaluation on the Anatomy, the Large Biomedical Ontologies, and the Disease and Phenotype track of the 2016 Ontology Alignment Evaluation Initiative campaign
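    As a highly simplified illustration of the first step described above (the token-based formal context), the sketch below tokenizes toy class labels from two ontologies and reads off lexical anchor candidates whose token sets coincide. The labels are invented, and the real system additionally uses synonyms, the relation-, property- and restriction-based contexts, and derived concept lattices.

```python
# Toy token-based formal context (class labels x shared tokens) and lexical anchors.
from itertools import product

onto_a = {"A:1": "heart valve", "A:2": "mitral valve", "A:3": "cardiac muscle"}
onto_b = {"B:7": "valve of heart", "B:8": "muscle of heart", "B:9": "mitral valve"}

def tokens(label: str) -> frozenset:
    # crude normalization: lowercase, drop the connective "of", split on whitespace
    return frozenset(label.lower().replace(" of ", " ").split())

context_a = {c: tokens(lbl) for c, lbl in onto_a.items()}
context_b = {c: tokens(lbl) for c, lbl in onto_b.items()}

# lexical anchors: class pairs whose token sets coincide across the two ontologies
anchors = [(a, b) for (a, ta), (b, tb) in product(context_a.items(), context_b.items())
           if ta == tb]
print("anchor candidates:", anchors)
```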

  3. Physics-based deformable organisms for medical image analysis

    Science.gov (United States)

    Hamarneh, Ghassan; McIntosh, Chris

    2005-04-01

    Previously, "Deformable organisms" were introduced as a novel paradigm for medical image analysis that uses artificial life modelling concepts. Deformable organisms were designed to complement the classical bottom-up deformable models methodologies (geometrical and physical layers), with top-down intelligent deformation control mechanisms (behavioral and cognitive layers). However, a true physical layer was absent and in order to complete medical image segmentation tasks, deformable organisms relied on pure geometry-based shape deformations guided by sensory data, prior structural knowledge, and expert-generated schedules of behaviors. In this paper we introduce the use of physics-based shape deformations within the deformable organisms framework yielding additional robustness by allowing intuitive real-time user guidance and interaction when necessary. We present the results of applying our physics-based deformable organisms, with an underlying dynamic spring-mass mesh model, to segmenting and labelling the corpus callosum in 2D midsagittal magnetic resonance images.

  4. Reliability analysis of offshore structures using OMA based fatigue stresses

    DEFF Research Database (Denmark)

    Silva Nabuco, Bruna; Aissani, Amina; Glindtvad Tarpø, Marius

    2017-01-01

    focus is on the uncertainty observed on the different stresses used to predict the damage. This uncertainty can be reduced by Modal Based Fatigue Monitoring which is a technique based on continuously measuring of the accelerations in few points of the structure with the use of accelerometers known...... points of the structure, the stress history can be calculated in any arbitrary point of the structure. The accuracy of the estimated actual stress is analyzed by experimental tests on a scale model where the obtained stresses are compared to strain gauges measurements. After evaluating the fatigue...... stresses directly from the operational response of the structure, a reliability analysis is performed in order to estimate the reliability of using Modal Based Fatigue Monitoring for long term fatigue studies....

  5. Train integrity detection risk analysis based on PRISM

    Science.gov (United States)

    Wen, Yuan

    2018-04-01

    GNSS based Train Integrity Monitoring System (TIMS) is an effective and low-cost detection scheme for train integrity detection. However, as an external auxiliary system of CTCS, GNSS may be influenced by external environments, such as uncertainty of wireless communication channels, which may lead to the failure of communication and positioning. In order to guarantee the reliability and safety of train operation, a risk analysis method of train integrity detection based on PRISM is proposed in this article. First, we analyze the risk factors (in GNSS communication process and the on-board communication process) and model them. Then, we evaluate the performance of the model in PRISM based on the field data. Finally, we discuss how these risk factors influence the train integrity detection process.
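
    PRISM models of this kind are typically probabilistic state machines; the toy calculation below (plain Python, with an invented per-step loss probability) illustrates the flavour of a reachability property such as "probability of communication failure before a deadline", without reproducing the authors' model.

    ```python
    # Hedged sketch: a tiny discrete-time failure model of GNSS message delivery,
    # evaluated by direct arithmetic rather than by PRISM. Values are invented.
    p_loss_step = 0.02        # assumed probability that the radio link drops in one step
    n_steps = 10              # steps before the integrity-check deadline

    # Probability that at least one loss occurs within the deadline
    # (complement of surviving every step).
    p_fail = 1.0 - (1.0 - p_loss_step) ** n_steps
    print(f"P(communication failure within {n_steps} steps) = {p_fail:.4f}")
    ```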

  6. Economical analysis based on a simplified model of micro distillery

    International Nuclear Information System (INIS)

    Bristoti, A.; Adams, R.

    1987-01-01

    The investment costs of a hydrated-alcohol distillery, as well as the energy balance of its production, have made its diffusion on a small scale inviable. The present economic analysis is based on a simplified model of a micro distillery, in which the reduction in investment costs rests on the possibility of utilizing hydrated alcohol with a higher water content than usual, i.e. around 85°GL. The engineering design of this plant eliminates all pumps; all the liquids (water, sugar cane, syrup and wine) are fed by gravity. The bagasse is considered a noble by-product and is utilized in the poultry industry as a substitute for wood dust. All the heat needs of the micro distillery are supplied by wood. (author)

  7. Cost analysis of simulated base-catalyzed biodiesel production processes

    International Nuclear Information System (INIS)

    Tasić, Marija B.; Stamenković, Olivera S.; Veljković, Vlada B.

    2014-01-01

    Highlights: • Two semi-continuous biodiesel production processes from sunflower oil are simulated. • Simulations were based on the kinetics of base-catalyzed methanolysis reactions. • The total energy consumption was influenced by the kinetic model. • Heterogeneous base-catalyzed process is a preferable industrial technology. - Abstract: The simulation and economic feasibility evaluation of semi-continuous biodiesel production from sunflower oil were based on the kinetics of homogeneously (Process I) and heterogeneously (Process II) base-catalyzed methanolysis reactions. The annual plant’s capacity was determined to be 8356 tonnes of biodiesel. The total energy consumption was influenced by the unit model describing the methanolysis reaction kinetics. The energy consumption of the Process II was more than 2.5 times lower than that of the Process I. Also, the simulation showed the Process I had more and larger process equipment units, compared with the Process II. Based on lower total capital investment costs and biodiesel selling price, the Process II was economically more feasible than the Process I. Sensitivity analysis was conducted using variable sunflower oil and biodiesel prices. Using a biodiesel selling price of 0.990 $/kg, Processes I and II were shown to be economically profitable if the sunflower oil price was 0.525 $/kg and 0.696 $/kg, respectively

  8. Gating treatment delivery QA based on a surrogate motion analysis

    International Nuclear Information System (INIS)

    Chojnowski, J.; Simpson, E.

    2011-01-01

    Full text: To develop a methodology to estimate intrafractional target position error during a phase-based gated treatment. Westmead Cancer Care Centre is using respiratory-correlated, phase-based gated beam delivery in the treatment of lung cancer. The gating technique is managed by the Varian Real-time Position Management (RPM) system, version 1.7.5. A 6-dot block is placed on the abdomen of the patient and acts as a surrogate for the target motion. During a treatment session, the motion of the surrogate can be recorded by the RPM application. Analysis of the surrogate motion file by in-house developed software allows the intrafractional error of the treatment session to be computed. To validate the computed error, a simple test involving the introduction of deliberate errors is performed. Errors of up to 1.1 cm are introduced to a metal marker placed on a surrogate using the Varian Breathing Phantom. The moving marker was scanned in prospective mode using a GE Lightspeed 16 CT scanner. Using the CT images, the difference between the marker position with and without introduced errors is compared to the errors calculated from the surrogate motion. The average and standard deviation of the difference between the calculated target position errors and the measured, deliberately introduced errors of the marker position are 0.02 cm and 0.07 cm, respectively. Conclusion: The calculated target positional error based on surrogate motion analysis provides a quantitative measure of intrafractional target positional errors during treatment. Routine QA for gated treatment using surrogate motion analysis is relatively quick and simple.
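
    The validation arithmetic amounts to comparing calculated offsets with the deliberately introduced ones; a minimal sketch, with illustrative numbers rather than the study's data:

    ```python
    # Differences between errors calculated from the surrogate motion trace and the
    # deliberately introduced marker offsets, summarized by mean and standard deviation.
    import numpy as np

    introduced_cm = np.array([0.0, 0.2, 0.5, 0.8, 1.1])       # known offsets on the phantom
    calculated_cm = np.array([0.03, 0.18, 0.55, 0.74, 1.16])  # from surrogate-motion analysis

    diff = calculated_cm - introduced_cm
    print(f"mean difference = {diff.mean():.2f} cm, std = {diff.std(ddof=1):.2f} cm")
    ```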

  9. NeoAnalysis: a Python-based toolbox for quick electrophysiological data processing and analysis.

    Science.gov (United States)

    Zhang, Bo; Dai, Ji; Zhang, Tao

    2017-11-13

    In a typical electrophysiological experiment, especially one that includes studying animal behavior, the data collected normally contain spikes, local field potentials, behavioral responses and other associated data. In order to obtain informative results, the data must be analyzed simultaneously with the experimental settings. However, most open-source toolboxes currently available for data analysis were developed to handle only a portion of the data and did not take into account the sorting of experimental conditions. Additionally, these toolboxes require that the input data be in a specific format, which can be inconvenient to users. Therefore, the development of a highly integrated toolbox that can process multiple types of data regardless of input data format and perform basic analysis for general electrophysiological experiments is incredibly useful. Here, we report the development of a Python based open-source toolbox, referred to as NeoAnalysis, to be used for quick electrophysiological data processing and analysis. The toolbox can import data from different data acquisition systems regardless of their formats and automatically combine different types of data into a single file with a standardized format. In cases where additional spike sorting is needed, NeoAnalysis provides a module to perform efficient offline sorting with a user-friendly interface. Then, NeoAnalysis can perform regular analog signal processing, spike train, and local field potentials analysis, behavioral response (e.g. saccade) detection and extraction, with several options available for data plotting and statistics. Particularly, it can automatically generate sorted results without requiring users to manually sort data beforehand. In addition, NeoAnalysis can organize all of the relevant data into an informative table on a trial-by-trial basis for data visualization. Finally, NeoAnalysis supports analysis at the population level. With the multitude of general-purpose functions provided

  10. Cloud-Based Orchestration of a Model-Based Power and Data Analysis Toolchain

    Science.gov (United States)

    Post, Ethan; Cole, Bjorn; Dinkel, Kevin; Kim, Hongman; Lee, Erich; Nairouz, Bassem

    2016-01-01

    The proposed Europa Mission concept contains many engineering and scientific instruments that consume varying amounts of power and produce varying amounts of data throughout the mission. System-level power and data usage must be well understood and analyzed to verify design requirements. Numerous cross-disciplinary tools and analysis models are used to simulate the system-level spacecraft power and data behavior. This paper addresses the problem of orchestrating a consistent set of models, tools, and data in a unified analysis toolchain when ownership is distributed among numerous domain experts. An analysis and simulation environment was developed as a way to manage the complexity of the power and data analysis toolchain and to reduce the simulation turnaround time. A system model data repository is used as the trusted store of high-level inputs and results while other remote servers are used for archival of larger data sets and for analysis tool execution. Simulation data passes through numerous domain-specific analysis tools and end-to-end simulation execution is enabled through a web-based tool. The use of a cloud-based service facilitates coordination among distributed developers and enables scalable computation and storage needs, and ensures a consistent execution environment. Configuration management is emphasized to maintain traceability between current and historical simulation runs and their corresponding versions of models, tools and data.

  11. Resilience Analysis of Countries under Disasters Based on Multisource Data.

    Science.gov (United States)

    Zhang, Nan; Huang, Hong

    2018-01-01

    Disasters occur almost daily in the world. Because emergencies frequently have no precedent, are highly uncertain, and can be very destructive, improving a country's resilience is an efficient way to reduce risk. In this article, we collected more than 20,000 historical data points from disasters from 207 countries to enable us to calculate the severity of disasters and the danger they pose to countries. In addition, 6 primary indices (disaster, personal attribute, infrastructure, economics, education, and occupation) including 38 secondary influencing factors are considered in analyzing the resilience of countries. Using these data, we obtained the danger, expected number of deaths, and resilience of all 207 countries. We found that a country covering a large area is more likely to have a low resilience score. Through sensitivity analysis of all secondary indices, we found that population density, frequency of disasters, and GDP are the three most critical factors affecting resilience. Based on broad-spectrum resilience analysis of the different continents, Oceania and South America have the highest resilience, while Asia has the lowest. Over the past 50 years, the resilience of many countries has been improved sharply, especially in developing countries. Based on our results, we analyze the comprehensive resilience and provide some optimal suggestions to efficiently improve resilience. © 2017 Society for Risk Analysis.

  12. Techno-Economic Analysis of Biofuels Production Based on Gasification

    Energy Technology Data Exchange (ETDEWEB)

    Swanson, R. M.; Platon, A.; Satrio, J. A.; Brown, R. C.; Hsu, D. D.

    2010-11-01

    This study compares capital and production costs of two biomass-to-liquid production plants based on gasification. The first biorefinery scenario is an oxygen-fed, low-temperature (870°C), non-slagging, fluidized bed gasifier. The second scenario is an oxygen-fed, high-temperature (1,300°C), slagging, entrained flow gasifier. Both are followed by catalytic Fischer-Tropsch synthesis and hydroprocessing to naphtha-range (gasoline blend stock) and distillate-range (diesel blend stock) liquid fractions. Process modeling software (Aspen Plus) is utilized to organize the mass and energy streams and cost estimation software is used to generate equipment costs. Economic analysis is performed to estimate the capital investment and operating costs. Results show that the total capital investment required for nth plant scenarios is $610 million and $500 million for high-temperature and low-temperature scenarios, respectively. Product value (PV) for the high-temperature and low-temperature scenarios is estimated to be $4.30 and $4.80 per gallon of gasoline equivalent (GGE), respectively, based on a feedstock cost of $75 per dry short ton. Sensitivity analysis is also performed on process and economic parameters. This analysis shows that total capital investment and feedstock cost are among the most influential parameters affecting the PV.

  13. FDTD technique based crosstalk analysis of bundled SWCNT interconnects

    International Nuclear Information System (INIS)

    Duksh, Yograj Singh; Kaushik, Brajesh Kumar; Agarwal, Rajendra P.

    2015-01-01

    The equivalent electrical circuit model of bundled single-walled carbon nanotube based distributed RLC interconnects is employed for the crosstalk analysis. Accurate time-domain analysis of crosstalk effects in VLSI interconnects has emerged as an essential design criterion. This paper presents a brief description of the numerical finite difference time domain (FDTD) technique intended for estimation of voltages and currents on coupled transmission lines. For the FDTD implementation, the stability of the proposed model is strictly restricted by the Courant condition. This method is used for the estimation of crosstalk-induced propagation delay and peak voltage in lossy RLC interconnects. Both functional and dynamic crosstalk effects are analyzed in the coupled transmission line. The effect of line resistance on crosstalk-induced delay and peak voltage under dynamic and functional crosstalk is also evaluated. The FDTD analysis and the SPICE simulations are carried out at the 32 nm technology node for global interconnects. It is observed that the analytical results obtained using the FDTD technique are in good agreement with the SPICE simulation results. The crosstalk-induced delay, propagation delay, and peak voltage obtained using the FDTD technique show average errors of 4.9%, 3.4% and 0.46%, respectively, in comparison to SPICE. (paper)
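
    As a hedged sketch of the underlying numerics, the snippet below applies a leapfrog FDTD update to the telegrapher's equations for a single lossy line with a Courant-limited time step; the paper's bundled SWCNT analysis adds coupled-line (mutual L and C) terms and CNT-specific parameters, and all values here are illustrative.

    ```python
    # 1D FDTD (leapfrog) solution of the telegrapher's equations for a lossy line.
    import numpy as np

    L, C, R = 1e-6, 1e-10, 10.0          # per-unit-length inductance, capacitance, resistance
    nx, dx = 200, 1e-3                   # spatial cells and cell size
    dt = 0.9 * dx * np.sqrt(L * C)       # time step restricted by the Courant condition

    V = np.zeros(nx)                     # node voltages
    I = np.zeros(nx - 1)                 # branch currents (staggered grid)

    for n in range(2000):
        V[0] = 1.0                                        # unit-step source at the near end
        I -= dt / (L * dx) * (V[1:] - V[:-1]) + dt * R / L * I
        V[1:-1] -= dt / (C * dx) * (I[1:] - I[:-1])
        V[-1] = V[-2]                                     # crude open-end boundary

    print(f"far-end voltage after {n + 1} steps: {V[-1]:.3f}")
    ```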

  14. Complexity analysis based on generalized deviation for financial markets

    Science.gov (United States)

    Li, Chao; Shang, Pengjian

    2018-03-01

    In this paper, a new modified method is proposed as a measure to investigate the correlation between past price and future volatility for financial time series, referred to as complexity analysis based on generalized deviation. In comparison with the earlier retarded volatility model, the new approach is both simple and computationally efficient. The method, based on the generalized deviation function, provides an exhaustive way of quantifying the rules of the financial market. Robustness of this method is verified by numerical experiments with both artificial and financial time series. Results show that the generalized deviation complexity analysis method not only identifies the volatility of financial time series, but also provides a comprehensive way of distinguishing the different characteristics of stock indices and individual stocks. Exponential functions can be used to successfully fit the volatility curves and quantify the changes of complexity for stock market data. We then study the influence of the negative domain of the deviation coefficient and the differences between volatile periods and calm periods. After analyzing the experimental model, we found that the generalized deviation model has definite advantages in exploring the relationship between historical returns and future volatility.

  15. Principle-based concept analysis: Caring in nursing education.

    Science.gov (United States)

    Salehian, Maryam; Heydari, Abbas; Aghebati, Nahid; Karimi Moonaghi, Hossein; Mazloom, Seyed Reza

    2016-03-01

    The aim of this principle-based concept analysis was to analyze caring in nursing education and to explain the current state of the science based on epistemologic, pragmatic, linguistic, and logical philosophical principles. A principle-based concept analysis method was used to analyze the nursing literature. The dataset included 46 English language studies, published from 2005 to 2014, and they were retrieved through PROQUEST, MEDLINE, CINAHL, ERIC, SCOPUS, and SID scientific databases. The key dimensions of the data were collected using a validated data-extraction sheet. The four principles of assessing pragmatic utility were used to analyze the data. The data were managed by using MAXQDA 10 software. The scientific literature that deals with caring in nursing education relies on implied meaning. Caring in nursing education refers to student-teacher interactions that are formed on the basis of human values and focused on the unique needs of the students (epistemological principle). The result of student-teacher interactions is the development of both the students and the teachers. Numerous applications of the concept of caring in nursing education are available in the literature (pragmatic principle). There is consistency in the meaning of the concept, as a central value of the faculty-student interaction (linguistic principle). Compared with other related concepts, such as "caring pedagogy," "value-based education," and "teaching excellence," caring in nursing education does not have exact and clear conceptual boundaries (logic principle). Caring in nursing education was identified as an approach to teaching and learning, and it is formed based on teacher-student interactions and sustainable human values. A greater understanding of the conceptual basis of caring in nursing education will improve the caring behaviors of teachers, create teaching-learning environments, and help experts in curriculum development.

  16. A methodology for strain-based fatigue reliability analysis

    International Nuclear Information System (INIS)

    Zhao, Y.X.

    2000-01-01

    A significant scatter of the cyclic stress-strain (CSS) responses should be noted for a nuclear reactor material, 1Cr18Ni9Ti pipe-weld metal. The existence of this scatter implies that a random applied cyclic strain history is introduced under any loading mode, even a deterministic loading history. In practice, a non-conservative evaluation might result if the scatter is not considered. A methodology for strain-based fatigue reliability analysis, which takes the scatter into account, is developed. The responses are approximately modeled by probability-based CSS curves of the Ramberg-Osgood relation. The strain-life data are modeled, similarly, by probability-based strain-life curves of the Coffin-Manson law. The reliability assessment is constructed by considering the interference of the random applied fatigue strain and capacity histories. Probability density functions of the applied and capacity histories are given analytically. The methodology can be conveniently reduced to the case of a deterministic CSS relation, as existing methods assume. The non-conservatism of the deterministic CSS relation and the applicability of the present methodology are indicated by an analysis of the material test results
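
    A minimal sketch of the deterministic backbone of the approach, assuming generic Ramberg-Osgood and Coffin-Manson/Basquin forms with invented coefficients (not the 1Cr18Ni9Ti values); the paper's contribution is to make such curves probability-based:

    ```python
    # Deterministic cyclic stress-strain (Ramberg-Osgood) and strain-life
    # (Coffin-Manson/Basquin) relations with illustrative coefficients.
    import numpy as np

    E, K, n = 193e3, 1200.0, 0.12            # MPa; cyclic strength coefficient and exponent
    sf, b, ef, c = 900.0, -0.09, 0.4, -0.55  # fatigue strength/ductility coefficients, exponents

    def ramberg_osgood_strain(stress_amp):
        """Total strain amplitude (elastic + plastic) for a given stress amplitude."""
        return stress_amp / E + (stress_amp / K) ** (1.0 / n)

    def coffin_manson_life(strain_amp):
        """Reversals to failure 2N for a given strain amplitude, found by log-space bisection."""
        lo, hi = 1.0, 1e9
        for _ in range(200):
            mid = np.sqrt(lo * hi)
            eps = sf / E * mid ** b + ef * mid ** c
            lo, hi = (mid, hi) if eps > strain_amp else (lo, mid)
        return lo

    eps_a = ramberg_osgood_strain(300.0)      # strain amplitude at 300 MPa
    print(f"strain amplitude = {eps_a:.4f}, reversals to failure ~ {coffin_manson_life(eps_a):.3g}")
    ```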

  17. Visualization-based analysis of multiple response survey data

    Science.gov (United States)

    Timofeeva, Anastasiia

    2017-11-01

    During a survey, respondents are often allowed to tick more than one answer option for a question. Analysis and visualization of such data are difficult because of the need to process multiple response variables. With standard representations such as pie and bar charts, information about the association between different answer options is lost. The author proposes a visualization approach for multiple response variables based on Venn diagrams. For a more informative representation with a large number of overlapping groups, it is suggested to use similarity and association matrices. Some aggregate indicators of dissimilarity (similarity) are proposed based on the determinant of the similarity matrix and the maximum eigenvalue of the association matrix. The application of the proposed approaches is illustrated by the example of the analysis of advertising sources. Intersection of sets indicates that the same consumer audience is covered by several advertising sources. This information is very important for the allocation of the advertising budget. The differences between target groups in advertising sources are of interest. To identify such differences, hypotheses of homogeneity and independence are tested. Recent approaches to the problem are briefly reviewed and compared. An alternative procedure is suggested. It is based on partitioning a consumer audience into pairwise disjoint subsets and includes hypothesis testing of the difference between the population proportions. It turned out to be more suitable for the real problem being solved.
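
    A small sketch of the matrix summaries mentioned, assuming a binary respondent-by-option matrix, Jaccard similarity between options, and a co-occurrence association matrix; the paper's exact definitions of the aggregate indicators may differ, and the data are invented.

    ```python
    # Respondent-by-option indicator matrix, pairwise Jaccard similarity matrix,
    # and the two aggregate indicators mentioned in the abstract.
    import numpy as np

    # rows = respondents, columns = advertising sources ticked (multiple responses allowed)
    X = np.array([[1, 1, 0],
                  [1, 0, 0],
                  [0, 1, 1],
                  [1, 1, 1],
                  [0, 0, 1]], dtype=float)

    n_opts = X.shape[1]
    S = np.zeros((n_opts, n_opts))
    for i in range(n_opts):
        for j in range(n_opts):
            inter = np.sum((X[:, i] == 1) & (X[:, j] == 1))
            union = np.sum((X[:, i] == 1) | (X[:, j] == 1))
            S[i, j] = inter / union if union else 0.0

    A = X.T @ X / X.shape[0]                      # co-occurrence (association) matrix

    print("similarity matrix:\n", S.round(2))
    print("det(S) =", round(np.linalg.det(S), 3))
    print("max eigenvalue of A =", round(np.linalg.eigvalsh(A).max(), 3))
    ```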

  18. Parameter uncertainty effects on variance-based sensitivity analysis

    International Nuclear Information System (INIS)

    Yu, W.; Harris, T.J.

    2009-01-01

    In the past several years there has been considerable commercial and academic interest in methods for variance-based sensitivity analysis. The industrial focus is motivated by the importance of attributing variance contributions to input factors. A more complete understanding of these relationships enables companies to achieve goals related to quality, safety and asset utilization. In a number of applications, it is possible to distinguish between two types of input variables: regressive variables and model parameters. Regressive variables are those that can be influenced by process design or by a control strategy. With model parameters, there are typically no opportunities to directly influence their variability. In this paper, we propose a new method to perform sensitivity analysis through a partitioning of the input variables into these two groupings: regressive variables and model parameters. A sequential analysis is proposed, where first a sensitivity analysis is performed with respect to the regressive variables. In the second step, the uncertainty effects arising from the model parameters are included. This strategy can be quite useful in understanding process variability and in developing strategies to reduce overall variability. When this method is used for nonlinear models which are linear in the parameters, analytical solutions can be utilized. In the more general case of models that are nonlinear in both the regressive variables and the parameters, either first-order approximations can be used, or numerically intensive methods must be used
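
    A minimal sketch of a first-order variance-based index for a toy model that is linear in its parameters, estimated by brute-force double-loop Monte Carlo; this illustrates the general idea of conditioning on the regressive variable first, not the authors' estimator. Model and distributions are invented.

    ```python
    # First-order variance contribution Var_x(E[Y|x]) / Var(Y) for a toy model
    # y = a*x + b*x**2 with regressive variable x and uncertain parameters a, b.
    import numpy as np

    rng = np.random.default_rng(0)

    def model(x, a, b):
        return a * x + b * x ** 2

    N_out, N_in = 400, 400

    def first_order_x():
        cond = []
        for _ in range(N_out):
            x = rng.uniform(0.0, 1.0)                 # fix the regressive variable
            a = rng.normal(1.0, 0.2, N_in)            # parameters vary
            b = rng.normal(0.5, 0.1, N_in)
            cond.append(model(x, a, b).mean())        # conditional expectation given x
        x = rng.uniform(0.0, 1.0, N_out * N_in)
        a = rng.normal(1.0, 0.2, N_out * N_in)
        b = rng.normal(0.5, 0.1, N_out * N_in)
        return np.var(cond) / np.var(model(x, a, b))  # share of total output variance

    print(f"first-order index of x = {first_order_x():.2f}")
    ```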

  19. Beyond the GUM: variance-based sensitivity analysis in metrology

    International Nuclear Information System (INIS)

    Lira, I

    2016-01-01

    Variance-based sensitivity analysis is a well established tool for evaluating the contribution of the uncertainties in the inputs to the uncertainty in the output of a general mathematical model. While the literature on this subject is quite extensive, it has not found widespread use in metrological applications. In this article we present a succinct review of the fundamentals of sensitivity analysis, in a form that should be useful to most people familiarized with the Guide to the Expression of Uncertainty in Measurement (GUM). Through two examples, it is shown that in linear measurement models, no new knowledge is gained by using sensitivity analysis that is not already available after the terms in the so-called ‘law of propagation of uncertainties’ have been computed. However, if the model behaves non-linearly in the neighbourhood of the best estimates of the input quantities—and if these quantities are assumed to be statistically independent—sensitivity analysis is definitely advantageous for gaining insight into how they can be ranked according to their importance in establishing the uncertainty of the measurand. (paper)

  20. ICWorld: An MMOG-Based Approach to Analysis

    Directory of Open Access Journals (Sweden)

    Wyatt Wong

    2008-01-01

    Full Text Available Intelligence analysts routinely work with "wicked" problems—critical, time-sensitive problems where analytical errors can lead to catastrophic consequences for the nation's security. In the analyst's world, important decisions are often made quickly, and are made based on consuming, understanding, and piecing together enormous volumes of data. The data is not only voluminous, but often fragmented, subjective, inaccurate and fluid. Why does multi-player on-line gaming (MMOG) technology matter to the IC? Fundamentally, there are two reasons. The first is technological: stripping away the gamelike content, MMOGs are dynamic systems that represent a physical world, where users are presented with (virtual) life-and-death challenges that can only be overcome through planning, collaboration and communication. The second is cultural: the emerging generation of analysts is part of what is sometimes called the "Digital Natives" (Prensky 2001) and is fluent with interactive media. MMOGs enable faster visualization, data manipulation, collaboration and analysis than traditional text and imagery. ICWorld is an MMOG approach to intelligence analysis that fuses ideas from experts in the fields of gaming and data visualization with knowledge of current and future intelligence analysis processes and tools. The concept has evolved over the last year as a result of evaluations by all-source analysts from around the IC. When fully developed, the Forterra team believes that ICWorld will fundamentally address major shortcomings of intelligence analysis, and dramatically improve the effectiveness of intelligence products.

  1. Classification of scintigrams on the base of an automatic analysis

    International Nuclear Information System (INIS)

    Vidyukov, V.I.; Kasatkin, Yu.N.; Kal'nitskaya, E.F.; Mironov, S.P.; Rotenberg, E.M.

    1980-01-01

    The stages of constructing a discriminative system based on self-education for automatic analysis of scintigrams are considered. The results of the classification of 240 scintigrams of the liver into ''normal'', ''diffuse lesions'' and ''focal lesions'' have been evaluated by medical experts and by computer. The accuracy of the computerized classification was 91.7%, that of the experts 85%. The automatic analysis methods for scintigrams of the liver have been realized using the specialized MDS data processing system. The quality of the discriminative system has been assessed on 125 scintigrams. The accuracy of the classification is equal to 89.6%. The employment of the self-education methods permitted two subclasses to be singled out depending on the severity of diffuse lesions

  2. GIS-based poverty and population distribution analysis in China

    Science.gov (United States)

    Cui, Jing; Wang, Yingjie; Yan, Hong

    2009-07-01

    Geographically, poverty status is not only related to social-economic factors but is also strongly affected by the geographical environment. In this paper, a GIS-based poverty and population distribution analysis method is introduced for revealing their regional differences. More than 100000 poor villages and 592 national key poor counties are chosen for the analysis. The results show that poverty distribution tends to concentrate in most of west China and the mountainous rural areas of mid China. Furthermore, the fifth census data are overlaid onto those poor areas in order to capture the internal diversity of their social-economic characteristics. By overlaying poverty-related social-economic parameters, such as sex ratio, illiteracy, education level, percentage of ethnic minorities, and family composition, the findings show that poverty distribution is strongly correlated with a high illiteracy rate, a high percentage of ethnic minorities, and larger family size.

  3. Numerical analysis of a polysilicon-based resistive memory device

    KAUST Repository

    Berco, Dan

    2018-03-08

    This study investigates a conductive bridge resistive memory device based on a Cu top electrode, 10-nm polysilicon resistive switching layer and a TiN bottom electrode, by numerical analysis for 10³ programming and erase simulation cycles. The low and high resistive state values in each cycle are calculated, and the analysis shows that the structure has excellent retention reliability properties. The presented Cu species density plot indicates that Cu insertion occurs almost exclusively along grain boundaries resulting in a confined isomorphic conductive filament that maintains its overall shape and electric properties during cycling. The superior reliability of this structure may thus be attributed to the relatively low amount of Cu migrating into the RSL during initial formation. In addition, the results show a good match and help to confirm experimental measurements done over a previously demonstrated device.

  4. Psychoacoustic Music Analysis Based on the Discrete Wavelet Packet Transform

    Directory of Open Access Journals (Sweden)

    Xing He

    2008-01-01

    Full Text Available Psychoacoustical computational models are necessary for the perceptual processing of acoustic signals and have contributed significantly to the development of highly efficient audio analysis and coding. In this paper, we present an approach for the psychoacoustic analysis of musical signals based on the discrete wavelet packet transform. The proposed method mimics the multiresolution properties of the human ear more closely than other techniques and it includes simultaneous and temporal auditory masking. Experimental results show that this method provides better masking capabilities and it reduces the signal-to-masking ratio substantially more than other approaches, without introducing audible distortion. This model can lead to greater audio compression by permitting further bit rate reduction and more secure watermarking by providing greater signal space for information hiding.
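
    A minimal sketch of a full wavelet packet decomposition, built recursively from pywt.dwt (assuming the PyWavelets package is available); the psychoacoustic masking model itself is not reproduced here.

    ```python
    # Full wavelet packet tree built by splitting both approximation and detail
    # bands at every level, so the subband tree has non-uniform frequency resolution.
    import numpy as np
    import pywt

    def wavelet_packet(signal, wavelet="db4", levels=3):
        """Return the list of leaf subbands of a full wavelet packet tree."""
        bands = [np.asarray(signal, dtype=float)]
        for _ in range(levels):
            next_bands = []
            for band in bands:
                approx, detail = pywt.dwt(band, wavelet)   # split each band in two
                next_bands.extend([approx, detail])
            bands = next_bands
        return bands

    fs = 44100
    t = np.arange(fs) / fs
    tone = np.sin(2 * np.pi * 440 * t) + 0.1 * np.random.randn(fs)

    subbands = wavelet_packet(tone, levels=3)              # 2**3 = 8 subbands
    energies = [float(np.sum(b ** 2)) for b in subbands]
    print([round(e, 1) for e in energies])                 # energy per subband
    ```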

  5. Automated migration analysis based on cell texture: method & reliability

    Directory of Open Access Journals (Sweden)

    Chittenden Thomas W

    2005-03-01

    Full Text Available Abstract Background In this paper, we present and validate a way to automatically measure the extent of cell migration based on automated examination of a series of digital photographs. It was designed specifically to identify the impact of Second Hand Smoke (SHS) on endothelial cell migration but has broader applications. The analysis has two stages: (1) preprocessing of image texture, and (2) migration analysis. Results The output is a graphic overlay that indicates the front lines of cell migration superimposed on each original image, with automated reporting of the distance traversed vs. time. Expert comparison with manual placement of the leading edge shows complete equivalence of automated vs. manual leading edge definition for cell migration measurement. Conclusion Our method is indistinguishable from careful manual determinations of cell front lines, with the advantages of full automation, objectivity, and speed.

  6. Histogram analysis for smartphone-based rapid hematocrit determination

    Science.gov (United States)

    Jalal, Uddin M.; Kim, Sang C.; Shim, Joon S.

    2017-01-01

    A novel and rapid analysis technique using histograms has been proposed for the colorimetric quantification of blood hematocrits. A smartphone-based “Histogram” app for the detection of hematocrits has been developed, integrating the smartphone's embedded camera with a microfluidic chip via a custom-made optical platform. The developed histogram analysis shows its effectiveness in the automatic detection of the sample channel, including auto-calibration, and can analyze single-channel as well as multi-channel images. Furthermore, the analysis method is advantageous for quantifying blood hematocrit under both equal and varying optical conditions. The rapid determination of blood hematocrits carries a wealth of information regarding physiological disorders, and the use of such reproducible, cost-effective, and standard techniques may effectively help with the diagnosis and prevention of a number of human diseases. PMID:28717569

  7. Model-based human reliability analysis: prospects and requirements

    International Nuclear Information System (INIS)

    Mosleh, A.; Chang, Y.H.

    2004-01-01

    Major limitations of the conventional methods for human reliability analysis (HRA), particularly those developed for operator response analysis in probabilistic safety assessments (PSA) of nuclear power plants, are summarized as a motivation for, and a basis for developing requirements for, the next generation of HRA methods. It is argued that a model-based approach that provides explicit cognitive causal links between operator behaviors and directly or indirectly measurable causal factors should be at the core of the advanced methods. An example of such a causal model is briefly reviewed; due to the model's complexity and input requirements, it can currently be implemented only in a dynamic PSA environment. The computer simulation code developed for this purpose is also described briefly, together with current limitations in the models, data, and the computer implementation

  8. Robust LOD scores for variance component-based linkage analysis.

    Science.gov (United States)

    Blangero, J; Williams, J T; Almasy, L

    2000-01-01

    The variance component method is now widely used for linkage analysis of quantitative traits. Although this approach offers many advantages, the importance of the underlying assumption of multivariate normality of the trait distribution within pedigrees has not been studied extensively. Simulation studies have shown that traits with leptokurtic distributions yield linkage test statistics that exhibit excessive Type I error when analyzed naively. We derive analytical formulae relating the deviation from the expected asymptotic distribution of the lod score to the kurtosis and total heritability of the quantitative trait. A simple correction constant yields a robust lod score for any deviation from normality and for any pedigree structure, and effectively eliminates the problem of inflated Type I error due to misspecification of the underlying probability model in variance component-based linkage analysis.

  9. Microcomputer-based pneumatic controller for neutron activation analysis

    International Nuclear Information System (INIS)

    Byrd, J.S.; Sand, R.J.

    1976-10-01

    A microcomputer-based pneumatic controller for neutron activation analysis was designed and built at the Savannah River Laboratory for analysis of large numbers of geologic samples for locating potential supplies of uranium ore for the National Uranium Resource Evaluation program. In this system, commercially available microcomputer logic modules are used to transport sample capsules through a network of pressurized air lines. The logic modules are interfaced to pneumatic valves, solenoids, and photo-optical detectors. The system operates from programs stored in firmware (permanent software). It also commands a minicomputer and a hard-wired pulse height analyzer for data collection and bookkeeping tasks. The advantage of the system is that major system changes can be implemented in the firmware with no hardware changes. This report describes the hardware, firmware, and software for the electronics system

  10. Man-machine interfaces analysis system based on computer simulation

    International Nuclear Information System (INIS)

    Chen Xiaoming; Gao Zuying; Zhou Zhiwei; Zhao Bingquan

    2004-01-01

    The paper depicts a software assessment system, Dynamic Interaction Analysis Support (DIAS), based on computer simulation technology for man-machine interfaces (MMI) of a control room. It employs a computer to simulate operating procedures on the man-machine interfaces in a control room, provides quantified assessment, and at the same time analyzes the operational error rate of operators by means of techniques for human error rate prediction. Problems in placing man-machine interfaces in a control room and in arranging instruments can be detected from the simulation results. The DIAS system can provide good technical support to the design and improvement of man-machine interfaces of the main control room of a nuclear power plant

  11. Carbon nanotube based VLSI interconnects analysis and design

    CERN Document Server

    Kaushik, Brajesh Kumar

    2015-01-01

    The brief primarily focuses on the performance analysis of CNT based interconnects in current research scenario. Different CNT structures are modeled on the basis of transmission line theory. Performance comparison for different CNT structures illustrates that CNTs are more promising than Cu or other materials used in global VLSI interconnects. The brief is organized into five chapters which mainly discuss: (1) an overview of current research scenario and basics of interconnects; (2) unique crystal structures and the basics of physical properties of CNTs, and the production, purification and applications of CNTs; (3) a brief technical review, the geometry and equivalent RLC parameters for different single and bundled CNT structures; (4) a comparative analysis of crosstalk and delay for different single and bundled CNT structures; and (5) various unique mixed CNT bundle structures and their equivalent electrical models.

  12. IMMAN: free software for information theory-based chemometric analysis.

    Science.gov (United States)

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis denominated IMMAN (acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches in each case. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as well as the introduction of Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty are incorporated to the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing values processing, dataset partitioning, and browsing. Moreover, single parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks like dimensionality reduction, feature ranking, as well as comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA
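
    Two of the supervised scores named above have standard textbook definitions; the sketch below computes information gain and symmetrical uncertainty for a discretized toy descriptor, and mirrors those general definitions rather than IMMAN's implementation. Data are invented.

    ```python
    # Information gain and symmetrical uncertainty of a discretized descriptor
    # with respect to a class label.
    import numpy as np
    from collections import Counter

    def entropy(values):
        counts = np.array(list(Counter(values).values()), dtype=float)
        p = counts / counts.sum()
        return float(-(p * np.log2(p)).sum())

    def information_gain(feature, label):
        h_label = entropy(label)
        cond = 0.0
        for v in set(feature):
            idx = [i for i, f in enumerate(feature) if f == v]
            cond += len(idx) / len(feature) * entropy([label[i] for i in idx])
        return h_label - cond

    def symmetrical_uncertainty(feature, label):
        ig = information_gain(feature, label)
        return 2.0 * ig / (entropy(feature) + entropy(label))

    feature = ["low", "low", "high", "high", "mid", "mid"]   # discretized descriptor
    label   = ["inactive", "inactive", "active", "active", "active", "inactive"]

    print(f"IG = {information_gain(feature, label):.3f}, "
          f"SU = {symmetrical_uncertainty(feature, label):.3f}")
    ```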

  13. Cloud-based Jupyter Notebooks for Water Data Analysis

    Science.gov (United States)

    Castronova, A. M.; Brazil, L.; Seul, M.

    2017-12-01

    The development and adoption of technologies by the water science community to improve our ability to openly collaborate and share workflows will have a transformative impact on how we address the challenges associated with collaborative and reproducible scientific research. Jupyter notebooks offer one solution by providing an open-source platform for creating metadata-rich toolchains for modeling and data analysis applications. Adoption of this technology within the water sciences, coupled with publicly available datasets from agencies such as USGS, NASA, and EPA enables researchers to easily prototype and execute data intensive toolchains. Moreover, implementing this software stack in a cloud-based environment extends its native functionality to provide researchers a mechanism to build and execute toolchains that are too large or computationally demanding for typical desktop computers. Additionally, this cloud-based solution enables scientists to disseminate data processing routines alongside journal publications in an effort to support reproducibility. For example, these data collection and analysis toolchains can be shared, archived, and published using the HydroShare platform or downloaded and executed locally to reproduce scientific analysis. This work presents the design and implementation of a cloud-based Jupyter environment and its application for collecting, aggregating, and munging various datasets in a transparent, sharable, and self-documented manner. The goals of this work are to establish a free and open source platform for domain scientists to (1) conduct data intensive and computationally intensive collaborative research, (2) utilize high performance libraries, models, and routines within a pre-configured cloud environment, and (3) enable dissemination of research products. This presentation will discuss recent efforts towards achieving these goals, and describe the architectural design of the notebook server in an effort to support collaborative

  14. Analysis of manufacturing based on object oriented discrete event simulation

    Directory of Open Access Journals (Sweden)

    Eirik Borgen

    1990-01-01

    Full Text Available This paper describes SIMMEK, a computer-based tool for performing analysis of manufacturing systems, developed at the Production Engineering Laboratory, NTH-SINTEF. Its main use will be in the analysis of job-shop type manufacturing, but certain facilities make it suitable for FMS as well as production line manufacturing. This type of simulation is very useful in the analysis of any type of change that occurs in a manufacturing system. These changes may be investments in new machines or equipment, a change in layout, a change in product mix, use of late shifts, etc. The effects these changes have on, for instance, the throughput, the amount of WIP, the costs or the net profit can be analysed, and this can be done before the changes are made and without disturbing the real system. Unlike other tools for analysis of manufacturing systems, simulation takes into consideration uncertainty in arrival rates, process and operation times, and machine availability. It also shows the interaction effects that a job which is late at one machine has on the remaining machines in its route through the layout. It is these effects that cause every production plan not to be fulfilled completely. SIMMEK is based on discrete event simulation, and the modeling environment is object oriented. The object-oriented models are transformed by an object linker into data structures executable by the simulation kernel. The processes of the entity objects, i.e. the products, are broken down into events and put into an event list. The user-friendly graphical modeling environment makes it possible for end users to build models in a quick and reliable way, using terms from manufacturing. Various tests and a check of model logic are helpful functions when testing the validity of the models. Integration with software packages for business graphics and statistical functions is convenient in the result presentation phase.
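
    The event-list mechanism at the core of such a tool can be sketched in a few lines; the following single-machine queue with exponential times is purely illustrative and far simpler than SIMMEK's modeling layer.

    ```python
    # Minimal discrete-event kernel: a time-ordered event list processed in a loop,
    # driving a single machine that serves arriving jobs.
    import heapq, random

    random.seed(1)
    events = []                                    # (time, tie-breaker, kind, job_id)
    seq = 0

    def schedule(t, kind, job):
        global seq
        heapq.heappush(events, (t, seq, kind, job))
        seq += 1

    t = 0.0
    for job in range(20):
        t += random.expovariate(1.0 / 5.0)         # mean inter-arrival time of 5
        schedule(t, "arrival", job)

    busy, queue, completed, now = False, [], 0, 0.0
    while events:
        now, _, kind, job = heapq.heappop(events)
        if kind == "arrival":
            if busy:
                queue.append(job)                  # machine occupied: job waits
            else:
                busy = True
                schedule(now + random.expovariate(1.0 / 4.0), "departure", job)
        else:                                      # departure: free the machine, pull next job
            completed += 1
            if queue:
                schedule(now + random.expovariate(1.0 / 4.0), "departure", queue.pop(0))
            else:
                busy = False

    print(f"completed {completed} jobs by t = {now:.1f}")
    ```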

  15. Regional Analysis of Remote Sensing Based Evapotranspiration Information

    Science.gov (United States)

    Geli, H. M. E.; Hain, C.; Anderson, M. C.; Senay, G. B.

    2017-12-01

    Recent research findings on modeling actual evapotranspiration (ET) using remote sensing data and methods have proven the ability of these methods to address a wide range of hydrological and water resources issues, including river basin water balance for improved water resources management, drought monitoring, drought impact and socioeconomic responses, agricultural water management, optimization of land use for water conservation, and water allocation agreements, among others. However, there is still a critical need to identify the appropriate type of ET information that can address each of these issues. The current trend of increasing demand for water due to population growth, coupled with variable and limited water supply due to drought, especially in arid and semiarid regions, has highlighted the need for such information. To properly address these issues, different spatial and temporal resolutions of ET information will need to be used. For example, agricultural water management applications require ET information at field (30-m) and daily time scales, while for river basin hydrologic analysis relatively coarser spatial and temporal scales can be adequate for such regional applications. The objective of this analysis is to evaluate the potential of using integrated ET information that can be used to address some of these issues collectively. This analysis will highlight efforts to address some of the issues that are applicable to New Mexico, including assessment of the statewide water budget as well as drought impact and socioeconomic responses, which all require ET information but at different spatial and temporal scales. This analysis will provide an evaluation of four remote sensing based ET models including ALEXI, DisALEXI, SSEBop, and SEBAL3.0. The models will be compared with ground-based observations from eddy covariance towers and water balance calculations. Remote sensing data from Landsat, MODIS, and VIIRS sensors will be used to provide ET

  16. A Match Method Based on Latent Semantic Analysis for Earthquake Hazard Emergency Plan

    Science.gov (United States)

    Sun, D.; Zhao, S.; Zhang, Z.; Shi, X.

    2017-09-01

    The structure of an emergency plan for earthquakes is complex, and it is difficult for a decision maker to make a decision in a short time. To solve this problem, this paper presents a match method based on Latent Semantic Analysis (LSA). After the word segmentation preprocessing of the emergency plan, we carry out keyword extraction according to the part-of-speech and the frequency of words. Then, through LSA, we map the documents and query information to the semantic space, and calculate the correlation of documents and queries by the relation between vectors. The experimental results indicate that LSA can improve the accuracy of emergency plan retrieval efficiently.
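
    A hedged sketch of the pipeline described (term-document matrix, truncated SVD, cosine similarity), assuming scikit-learn is available and using invented stand-in documents rather than real emergency plans:

    ```python
    # TF-IDF term-document matrix, truncated SVD into a low-dimensional semantic
    # space, and cosine-similarity ranking of plans against a query.
    import numpy as np
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import TruncatedSVD

    plans = [
        "earthquake shelter evacuation route school district",
        "flood rescue boat deployment river district",
        "earthquake medical triage hospital surge plan",
    ]
    query = ["earthquake evacuation of school buildings"]

    vec = TfidfVectorizer()
    X = vec.fit_transform(plans)                    # term-document matrix
    svd = TruncatedSVD(n_components=2, random_state=0)
    doc_vecs = svd.fit_transform(X)                 # documents in semantic space
    q_vec = svd.transform(vec.transform(query))     # query mapped to the same space

    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

    scores = [cosine(q_vec[0], d) for d in doc_vecs]
    print(sorted(zip(scores, plans), reverse=True)[0])   # best-matching plan
    ```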

  17. Knowledge based analysis of radiology reports using conceptual graphs

    International Nuclear Information System (INIS)

    Schroeder, M.

    1992-07-01

    The telegraphic language found in radiological reports can be well understood by a natural language system using the underlying domain knowledge. We present the METEXA system, which emphasizes the use of radiological domain knowledge to determine the semantics of utterances. Syntactic and semantic analysis, lexical semantics and the structure of the domain model are described in some detail. A resolution-based inference engine answers relevant questions concerning the contents of the reports. As the knowledge representation formalism, the Conceptual Graph Theory of John Sowa has been chosen. (orig.)

  18. Principal Component Analysis Based Measure of Structural Holes

    Science.gov (United States)

    Deng, Shiguo; Zhang, Wenqing; Yang, Huijie

    2013-02-01

    Based upon principal component analysis, a new measure called the compressibility coefficient is proposed to evaluate structural holes in networks. This measure incorporates a new effect from identical patterns in networks. It is found that the compressibility coefficient for Watts-Strogatz small-world networks increases monotonically with the rewiring probability and saturates to that of the corresponding shuffled networks, while the compressibility coefficient for extended Barabasi-Albert scale-free networks decreases monotonically with the preferential effect and is significantly large compared with that of the corresponding shuffled networks. This measure is helpful in diverse research fields for evaluating the global efficiency of networks.
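
    The abstract does not give the formula for the compressibility coefficient, so the sketch below uses only one plausible reading: PCA on the node connection patterns of an adjacency matrix, with compressibility read off the variance captured by a few leading components (assuming networkx is available for graph generation). It is illustrative, not the paper's definition.

    ```python
    # PCA on adjacency rows of a small-world graph; identical connection patterns
    # concentrate variance in a few components and so "compress" well.
    import numpy as np
    import networkx as nx

    G = nx.watts_strogatz_graph(n=100, k=6, p=0.1, seed=0)
    A = nx.to_numpy_array(G)

    eigvals = np.linalg.eigvalsh(np.cov(A, rowvar=False))[::-1]   # PCA spectrum
    explained = np.cumsum(eigvals) / eigvals.sum()

    k = 10
    print(f"variance captured by the top {k} components: {explained[k - 1]:.2f}")
    ```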

  19. Analysis advanced methods of data bases of industrial experience return

    International Nuclear Information System (INIS)

    Lannoy, A.; Procaccia, H.

    1994-05-01

    This is a presentation, through different conceptions of databases on industrial experience feedback, of the principal methods for treatment and analysis of the collected data, ranging from frequency statistics and factorial analysis to the Bayesian statistical decision theory, which is a real decision-assistance tool for decision makers, designers and operators. Examples in various fields are given (OREDA: Offshore REliability DAta bank for marine drilling platforms, CEDB: Component Event Data Bank for the European electric power industry, RDF 93: reliability of electronic components of ''France Telecom'', EVT: failure EVenTs data bank in the French nuclear power plants by ''EDF''). (A.B.). refs., figs., tabs

  20. Lossless droplet transfer of droplet-based microfluidic analysis

    Science.gov (United States)

    Kelly, Ryan T [West Richland, WA; Tang, Keqi [Richland, WA; Page, Jason S [Kennewick, WA; Smith, Richard D [Richland, WA

    2011-11-22

    A transfer structure for droplet-based microfluidic analysis is characterized by a first conduit containing a first stream having at least one immiscible droplet of aqueous material and a second conduit containing a second stream comprising an aqueous fluid. The interface between the first conduit and the second conduit can define a plurality of apertures, wherein the apertures are sized to prevent exchange of the first and second streams between conduits while allowing lossless transfer of droplets from the first conduit to the second conduit through contact between the first and second streams.

  1. Seismic Response Analysis and Design of Structure with Base Isolation

    International Nuclear Information System (INIS)

    Rosko, Peter

    2010-01-01

    The paper reports a study on the seismic response and energy distribution of a multi-story civil structure. The nonlinear analysis used the 2003 Bam earthquake acceleration record as the excitation input to the structural model. The displacement response was analyzed in the time domain and in the frequency domain. The displacement and its derivatives yield the energy components. The energy distribution in each story provides useful information for the structural upgrade with the help of added devices. The objective is minimization of the structural displacement response. The application of the structural seismic response research is presented in a base-isolation example.

  2. A MATCH METHOD BASED ON LATENT SEMANTIC ANALYSIS FOR EARTHQUAKE HAZARD EMERGENCY PLAN

    Directory of Open Access Journals (Sweden)

    D. Sun

    2017-09-01

    Full Text Available The structure of an emergency plan for earthquakes is complex, and it is difficult for a decision maker to make a decision in a short time. To solve this problem, this paper presents a match method based on Latent Semantic Analysis (LSA). After the word segmentation preprocessing of the emergency plan, we carry out keyword extraction according to the part-of-speech and the frequency of words. Then, through LSA, we map the documents and query information to the semantic space, and calculate the correlation of documents and queries by the relation between vectors. The experimental results indicate that LSA can improve the accuracy of emergency plan retrieval efficiently.

  3. Virtual Estimator for Piecewise Linear Systems Based on Observability Analysis

    Science.gov (United States)

    Morales-Morales, Cornelio; Adam-Medina, Manuel; Cervantes, Ilse; Vela-Valdés and, Luis G.; García Beltrán, Carlos Daniel

    2013-01-01

    This article proposes a virtual sensor for piecewise linear systems based on observability analysis, as a function of a commutation law related to the system's output. This virtual sensor is also known as a state estimator. In addition, it presents an active-mode detector for the case where the commutation sequences of each linear subsystem are arbitrary and unknown. To this end, the article proposes a set of virtual estimators that discern the commutation paths of the system and allow their output to be estimated. In this work, a methodology to test observability for discrete-time piecewise linear systems is proposed. An academic example is presented to show the obtained results. PMID:23447007

  4. Electromagnetic fields from mobile phone base station - variability analysis.

    Science.gov (United States)

    Bienkowski, Pawel; Zubrzak, Bartlomiej

    2015-09-01

    The article describes the character of the electromagnetic field (EMF) in the surroundings of mobile phone base stations (BS) and its variability in time, with an emphasis on the measurement difficulties related to its pulsed and multi-frequency nature. The work also presents long-term monitoring measurements performed recently at different locations in Poland: a small city with dispersed building development and a major Polish city with a dense urban area. The authors tried to determine trends in the changing EMF spectrum by analyzing daily changes of the measured EMF levels at those locations. The research was performed using selective electromagnetic field meters and an EMF meter with spectrum analysis.

  5. Pressure Control in Distillation Columns: A Model-Based Analysis

    DEFF Research Database (Denmark)

    Mauricio Iglesias, Miguel; Bisgaard, Thomas; Kristensen, Henrik

    2014-01-01

    A comprehensive assessment of pressure control in distillation columns is presented, including the consequences for composition control and energy consumption. Two types of representative control structures are modeled, analyzed, and benchmarked. A detailed simulation test, based on a real...... industrial distillation column, is used to assess the differences between the two control structures and to demonstrate the benefits of pressure control in the operation. In the second part of the article, a thermodynamic analysis is carried out to establish the influence of pressure on relative volatility...

  6. Content and user-based music visual analysis

    Science.gov (United States)

    Guo, Xiaochun; Tang, Lei

    2015-12-01

    In recent years, people's ability to collect music has been greatly enhanced. Many people who prefer listening to music offline store thousands of songs on their local storage or portable devices. However, their ability to manage music information has not improved accordingly, which leads to two problems. One is how to find favourite songs in a large music dataset while satisfying different individuals. The other is how to compose a playlist quickly. To solve these problems, the authors propose a content and user-based music visual analysis approach. We first develop a new recommendation algorithm based on the content of the music and the user's behaviour, which accommodates individual preferences. Then, we use visualization and interaction tools to illustrate the relationships between songs and help people compose a suitable playlist. At the end of this paper, a survey is presented to show that our system is usable and effective.

  7. Quantitative analysis of lead in polysulfide-based impression material

    Directory of Open Access Journals (Sweden)

    Aparecida Silva Braga

    2007-06-01

    Full Text Available Permlastic® is a polysulfide-based impression material widely used by dentists in Brazil. It is composed of a base paste and a catalyzer containing lead dioxide. The high toxicity of lead to humans is grounds for much concern, since it can attack various systems and organs. The present study involved a quantitative analysis of the concentration of lead in the material Permlastic®. The lead was determined by plasma-induced optical emission spectrometry (Varian model Vista). The percentages of lead found in the two analyzed lots were 38.1% and 40.8%. The lead concentrations in the material under study were high, but the product's packaging contained no information about these concentrations.

  8. Structural analysis of nanocomposites based on HDPE/EPDM blends.

    Science.gov (United States)

    Zitzumbo, Roberto; Alonso, Sergio; Avalos, Felipe; Ortiz, José C; López-Manchado, Miguel A; Arroyo, Miguel

    2006-02-01

    Intercalated and exfoliated nanocomposites based on HDPE/EPDM blends with an organoclay have been obtained through the addition of EPDM-g-MA as a compatibilizer. The combined effect of the clay and EPDM-g-MA on the rheological behaviour is very noticeable, with a marked increase in viscosity that suggests the formation of a percolated structural network induced by the presence of intercalated and exfoliated silicate layers. As deduced from the rheological studies, a morphology based on nanostructured micro-domains dispersed in a continuous HDPE phase is proposed for the EPDM/HDPE blend nanocomposites. XRD and SEM analysis suggest that two different transport phenomena take place simultaneously during the intercalation process in the melt: one due to diffusion of HDPE chains into the tactoids and the other due to diffusion of EPDM-g-MA into the silicate galleries.

  9. Live face detection based on the analysis of Fourier spectra

    Science.gov (United States)

    Li, Jiangwei; Wang, Yunhong; Tan, Tieniu; Jain, Anil K.

    2004-08-01

    Biometrics is a rapidly developing technology for identifying a person based on his or her physiological or behavioral characteristics. To ensure the correctness of authentication, a biometric system must be able to detect and reject the use of a copy of a biometric instead of the live biometric. This function is usually termed "liveness detection". This paper describes a new method for live face detection. Using the structure and movement information of a live face, an effective live face detection algorithm is presented. Compared to existing approaches, which concentrate on the measurement of 3D depth information, this method is based on the analysis of the Fourier spectra of a single face image or of face image sequences. Experimental results show that the proposed method has an encouraging performance.
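
    The frequency-domain cue behind such an approach can be illustrated with a short sketch. The snippet below is only an assumed reading, not the paper's algorithm: it computes the fraction of 2D Fourier spectral energy outside a low-frequency disc, on synthetic stand-in images, since flat recaptured photographs tend to retain less high-frequency content than live captures. The cutoff fraction and the synthetic images are illustrative choices.

        # Sketch: high-frequency energy ratio of a face crop as a liveness cue (assumed reading).
        import numpy as np

        def high_freq_ratio(img: np.ndarray, cutoff_frac: float = 0.25) -> float:
            """Fraction of 2D FFT energy outside a low-frequency disc of radius cutoff_frac*min(H,W)/2."""
            spec = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
            h, w = spec.shape
            yy, xx = np.ogrid[:h, :w]
            r = np.hypot(yy - h / 2.0, xx - w / 2.0)
            cutoff = cutoff_frac * min(h, w) / 2.0
            return float(spec[r > cutoff].sum() / spec.sum())

        # Hypothetical comparison: a coarse, photo-like copy keeps less high-frequency energy.
        rng = np.random.default_rng(0)
        live_like = rng.normal(size=(128, 128))                             # stand-in for a detailed live crop
        photo_like = np.repeat(np.repeat(live_like[::4, ::4], 4, 0), 4, 1)  # coarse copy of the same content
        print("live-like :", round(high_freq_ratio(live_like), 3))
        print("photo-like:", round(high_freq_ratio(photo_like), 3))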

  10. Harmonic analysis of electrified railway based on improved HHT

    Science.gov (United States)

    Wang, Feng

    2018-04-01

    In this paper, the causes and harms of harmonics in the electrical systems of electric locomotives are first studied and analyzed. Based on the characteristics of these harmonics, the Hilbert-Huang transform (HHT) method is introduced. Following an in-depth analysis of the empirical mode decomposition method and the Hilbert transform, the causes of, and solutions to, the endpoint effect and the modal aliasing problem in the HHT method are explored. For the endpoint effect, this paper uses a point-symmetric extension method to extend the collected data; for the modal aliasing problem, it preprocesses the signal with a high-frequency auxiliary harmonic and gives an empirical formula for this auxiliary harmonic. Finally, combining the suppression of the HHT endpoint effect and of modal aliasing, an improved HHT method is proposed and simulated in MATLAB. The simulation results show that the improved HHT is effective for the electric locomotive power supply system.
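
    A point-symmetric (odd) endpoint extension of the kind mentioned above can be sketched as follows. This is a generic illustration, not the paper's code; the extension length and the test signal (a fundamental plus one harmonic) are arbitrary assumptions.

        # Sketch: point-symmetric (odd) extension of a signal at both ends,
        # a common way to suppress the EMD endpoint effect (generic illustration).
        import numpy as np

        def point_symmetric_extend(x: np.ndarray, n_ext: int) -> np.ndarray:
            """Extend x by n_ext samples at each end, reflecting point-symmetrically about the endpoints."""
            left = 2 * x[0] - x[n_ext:0:-1]              # reflect about the first sample
            right = 2 * x[-1] - x[-2:-n_ext - 2:-1]      # reflect about the last sample
            return np.concatenate([left, x, right])

        t = np.linspace(0, 1, 200)
        sig = np.sin(2 * np.pi * 50 * t) + 0.3 * np.sin(2 * np.pi * 250 * t)  # fundamental + 5th harmonic
        ext = point_symmetric_extend(sig, n_ext=20)
        print(len(sig), "->", len(ext))  # 200 -> 240; the extension is discarded after EMD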

  11. Geospatial analysis based on GIS integrated with LADAR.

    Science.gov (United States)

    Fetterman, Matt R; Freking, Robert; Fernandez-Cull, Christy; Hinkle, Christopher W; Myne, Anu; Relyea, Steven; Winslow, Jim

    2013-10-07

    In this work, we describe multi-layered analyses of a high-resolution broad-area LADAR data set in support of expeditionary activities. High-level features are extracted from the LADAR data, such as the presence and location of buildings and cars, and then these features are used to populate a GIS (geographic information system) tool. We also apply line-of-sight (LOS) analysis to develop a path-planning module. Finally, visualization is addressed and enhanced with a gesture-based control system that allows the user to navigate through the enhanced data set in a virtual immersive experience. This work has operational applications including military, security, disaster relief, and task-based robotic path planning.

  12. Analysis of large fault trees based on functional decomposition

    International Nuclear Information System (INIS)

    Contini, Sergio; Matuzas, Vaidas

    2011-01-01

    With the advent of the Binary Decision Diagrams (BDD) approach in fault tree analysis, a significant enhancement has been achieved with respect to previous approaches, both in terms of efficiency and accuracy of the overall outcome of the analysis. However, the exponential increase of the number of nodes with the complexity of the fault tree may prevent the construction of the BDD. In these cases, the only way to complete the analysis is to reduce the complexity of the BDD by applying the truncation technique, which nevertheless implies the problem of estimating the truncation error or upper and lower bounds of the top-event unavailability. This paper describes a new method to analyze large coherent fault trees which can be advantageously applied when the working memory is not sufficient to construct the BDD. It is based on the decomposition of the fault tree into simpler disjoint fault trees containing a lower number of variables. The analysis of each simple fault tree is performed by using all the computational resources. The results from the analysis of all simpler fault trees are re-combined to obtain the results for the original fault tree. Two decomposition methods are herewith described: the first aims at determining the minimal cut sets (MCS) and the upper and lower bounds of the top-event unavailability; the second can be applied to determine the exact value of the top-event unavailability. Potentialities, limitations and possible variations of these methods will be discussed with reference to the results of their application to some complex fault trees.
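
    As a numerical illustration of the quantities discussed above, the short sketch below computes the rare-event approximation and the min-cut upper bound of the top-event unavailability from a set of minimal cut sets. The three-MCS example and the basic-event unavailabilities are hypothetical and are not taken from the paper.

        # Sketch: top-event unavailability estimates from minimal cut sets (hypothetical example).
        from math import prod

        basic = {"A": 1e-3, "B": 2e-3, "C": 5e-4, "D": 1e-3}   # basic-event unavailabilities
        mcs = [{"A", "B"}, {"C"}, {"B", "D"}]                  # minimal cut sets

        p_mcs = [prod(basic[e] for e in cut) for cut in mcs]
        rare_event = sum(p_mcs)                                # first-order (rare-event) approximation
        upper_bound = 1.0 - prod(1.0 - p for p in p_mcs)       # min-cut upper bound

        print(f"rare-event approximation: {rare_event:.3e}")
        print(f"min-cut upper bound:      {upper_bound:.3e}")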

  13. Semiconductor product analysis challenges based on the 1999 ITRS

    International Nuclear Information System (INIS)

    Joseph, Thomas W.; Anderson, Richard E.; Gilfeather, Glen; LeClaire, Carole; Yim, Daniel

    2001-01-01

    One of the most significant challenges for technology characterization and failure analysis is to keep instrumentation and techniques in step with the development of technology itself. Not only are dimensions shrinking and new materials being employed, but the rate of change is increasing. According to the 1999 International Technology Roadmap for Semiconductors, 'The number and difficulty of the technical challenges continue to increase as technology moves forward'. It could be argued that technology cannot be developed without appropriate analytical techniques; nevertheless while much effort is being directed at materials and processes, only a small proportion is being directed at analysis. Whereas previous versions of the Semiconductor Industry Association roadmap contained a small number of implicit references to characterization and analysis, the 1999 ITRS contains many explicit references. It is clear that characterization is now woven through the roadmap, and technology developers in all areas appreciate the fact that new instrumentation and techniques will be required to sustain the rate of development the semiconductor industry has seen in recent years. Late in 1999, a subcommittee of the Sematech Product Analysis Forum (PAF) reviewed the ITRS and identified a 'top-ten' list of challenges which the failure analysis community will face as present technologies are extended and future technologies are developed. This paper discusses the PAF top-ten list of challenges, which is based primarily on the Difficult Challenges tables from each ITRS working group. Eight of the top-ten are challenges of significant technical magnitude; only two could be considered non-technical in nature. Most of these challenges cut across several working group areas and could be considered common threads in the roadmap, ranging from fault simulation and modeling to imaging small features, from electrical defect isolation to deprocessing. While evolutionary changes can be anticipated

  14. Poisson-event-based analysis of cell proliferation.

    Science.gov (United States)

    Summers, Huw D; Wills, John W; Brown, M Rowan; Rees, Paul

    2015-05-01

    A protocol for the assessment of cell proliferation dynamics is presented. This is based on the measurement of cell division events and their subsequent analysis using Poisson probability statistics. Detailed analysis of proliferation dynamics in heterogeneous populations requires single cell resolution within a time series analysis and so is technically demanding to implement. Here, we show that by focusing on the events during which cells undergo division, rather than directly on the cells themselves, a simplified image acquisition and analysis protocol can be followed, which maintains single cell resolution and reports on the key metrics of cell proliferation. The technique is demonstrated using a microscope with 1.3 μm spatial resolution to track mitotic events within A549 and BEAS-2B cell lines, over a period of up to 48 h. Automated image processing of the bright field images using standard algorithms within the ImageJ software toolkit yielded 87% accurate recording of the manually identified temporal and spatial positions of the mitotic event series. Analysis of the statistics of the interevent times (i.e., times between observed mitoses in a field of view) showed that cell division conformed to a nonhomogeneous Poisson process in which the rate of occurrence of mitotic events, λ, increased exponentially over time, and provided values of the mean intermitotic time of 21.1 ± 1.2 h for the A549 cells and 25.0 ± 1.1 h for the BEAS-2B cells. Comparison of the mitotic event series for the BEAS-2B cell line to that predicted by random Poisson statistics indicated that temporal synchronisation of the cell division process was occurring within 70% of the population and that this could be increased to 85% through serum starvation of the cell culture. © 2015 International Society for Advancement of Cytometry.
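
    The nonhomogeneous Poisson model described above can be simulated with a standard thinning (Lewis-Shedler) scheme. The sketch below is illustrative only: the baseline rate and growth constant are hypothetical values chosen so that the rate increases exponentially over a 48 h window, as in the reported experiments.

        # Sketch: simulate mitotic event times from a nonhomogeneous Poisson process
        # with an exponentially increasing rate, using Lewis-Shedler thinning.
        import numpy as np

        rng = np.random.default_rng(1)
        T = 48.0                          # observation window in hours
        lam0, k = 0.5, 0.05               # hypothetical: lambda(t) = lam0 * exp(k * t) events/hour
        lam_max = lam0 * np.exp(k * T)    # upper bound of the rate on [0, T]

        events, t = [], 0.0
        while True:
            t += rng.exponential(1.0 / lam_max)            # candidate from homogeneous process at lam_max
            if t > T:
                break
            if rng.random() < lam0 * np.exp(k * t) / lam_max:  # accept with prob lambda(t)/lam_max
                events.append(t)

        inter = np.diff(events)
        print(f"{len(events)} events, mean inter-event time {inter.mean():.2f} h")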

  15. Analysis of large fault trees based on functional decomposition

    Energy Technology Data Exchange (ETDEWEB)

    Contini, Sergio, E-mail: sergio.contini@jrc.i [European Commission, Joint Research Centre, Institute for the Protection and Security of the Citizen, 21020 Ispra (Italy); Matuzas, Vaidas [European Commission, Joint Research Centre, Institute for the Protection and Security of the Citizen, 21020 Ispra (Italy)

    2011-03-15

    With the advent of the Binary Decision Diagrams (BDD) approach in fault tree analysis, a significant enhancement has been achieved with respect to previous approaches, both in terms of efficiency and accuracy of the overall outcome of the analysis. However, the exponential increase of the number of nodes with the complexity of the fault tree may prevent the construction of the BDD. In these cases, the only way to complete the analysis is to reduce the complexity of the BDD by applying the truncation technique, which nevertheless implies the problem of estimating the truncation error or upper and lower bounds of the top-event unavailability. This paper describes a new method to analyze large coherent fault trees which can be advantageously applied when the working memory is not sufficient to construct the BDD. It is based on the decomposition of the fault tree into simpler disjoint fault trees containing a lower number of variables. The analysis of each simple fault tree is performed by using all the computational resources. The results from the analysis of all simpler fault trees are re-combined to obtain the results for the original fault tree. Two decomposition methods are herewith described: the first aims at determining the minimal cut sets (MCS) and the upper and lower bounds of the top-event unavailability; the second can be applied to determine the exact value of the top-event unavailability. Potentialities, limitations and possible variations of these methods will be discussed with reference to the results of their application to some complex fault trees.

  16. Mobile applications for weight management: theory-based content analysis.

    Science.gov (United States)

    Azar, Kristen M J; Lesser, Lenard I; Laing, Brian Y; Stephens, Janna; Aurora, Magi S; Burke, Lora E; Palaniappan, Latha P

    2013-11-01

    The use of smartphone applications (apps) to assist with weight management is increasingly prevalent, but the quality of these apps is not well characterized. The goal of the study was to evaluate diet/nutrition and anthropometric tracking apps based on incorporation of features consistent with theories of behavior change. A comparative, descriptive assessment was conducted of the top-rated free apps in the Health and Fitness category available in the iTunes App Store. Health and Fitness apps (N=200) were evaluated using predetermined inclusion/exclusion criteria and categorized based on commonality in functionality, features, and developer description. Four researchers then evaluated the two most popular apps in each category using two instruments: one based on traditional behavioral theory (score range: 0-100) and the other on the Fogg Behavioral Model (score range: 0-6). Data collection and analysis occurred in November 2012. Eligible apps (n=23) were divided into five categories: (1) diet tracking; (2) healthy cooking; (3) weight/anthropometric tracking; (4) grocery decision making; and (5) restaurant decision making. The mean behavioral theory score was 8.1 (SD=4.2); the mean persuasive technology score was 1.9 (SD=1.7). The top-rated app on both scales was Lose It! by Fitnow Inc. All apps received low overall scores for inclusion of behavioral theory-based strategies. © 2013 American Journal of Preventive Medicine.

  17. Phishing Detection: Analysis of Visual Similarity Based Approaches

    Directory of Open Access Journals (Sweden)

    Ankit Kumar Jain

    2017-01-01

    Full Text Available Phishing is one of the major problems faced by the cyber world and leads to financial losses for both industries and individuals. Detecting phishing attacks with high accuracy has always been a challenging issue. At present, visual-similarity-based techniques are very useful for detecting phishing websites efficiently. A phishing website looks very similar in appearance to its corresponding legitimate website in order to deceive users into believing that they are browsing the correct website. Visual-similarity-based phishing detection techniques use features such as text content, text format, HTML tags, Cascading Style Sheets (CSS), images, and so forth to make the decision. These approaches compare the suspicious website with the corresponding legitimate website using various features, and if the similarity is greater than a predefined threshold value the site is declared phishing. This paper presents a comprehensive analysis of phishing attacks and their exploitation, a review of some of the recent visual-similarity-based approaches for phishing detection, and a comparative study. Our survey provides a better understanding of the problem, the current solution space, and the scope of future research for dealing with phishing attacks efficiently using visual-similarity-based approaches.
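
    The threshold decision that these approaches share can be illustrated with a toy sketch that compares two pages by the Jaccard similarity of their HTML tag and text-token sets. The feature choice, the example pages, and the 0.8 threshold are all hypothetical and only mimic the general comparison scheme surveyed in the paper.

        # Toy sketch: flag a page as a phishing suspect when its feature set is very similar
        # to a known legitimate page's (hypothetical feature set and threshold).
        import re

        def features(html: str) -> set:
            tags = set(re.findall(r"<(\w+)", html.lower()))
            words = set(re.findall(r"[a-z]{3,}", html.lower()))
            return tags | words

        def jaccard(a: set, b: set) -> float:
            return len(a & b) / len(a | b) if a | b else 0.0

        legit = "<html><body><h1>MyBank login</h1><form>user password</form></body></html>"
        suspect = "<html><body><h1>MyBank login</h1><form>user password verify</form></body></html>"

        sim = jaccard(features(legit), features(suspect))
        THRESHOLD = 0.8  # hypothetical decision threshold
        print(f"similarity = {sim:.2f} ->", "phishing suspect" if sim > THRESHOLD else "dissimilar")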

  18. Echo-waveform classification using model and model free techniques: Experimental study results from central western continental shelf of India

    Digital Repository Service at National Institute of Oceanography (India)

    Chakraborty, B.; Navelkar, G.S.; Desai, R.G.P.; Janakiraman, G.; Mahale, V.; Fernandes, W.A.; Rao, N.

    seafloor of India, but unable to provide a suitable means for seafloor classification. This paper also suggests a hybrid artificial neural network (ANN) architecture i.e. Learning Vector Quantisation (LVQ) for seafloor classification. An analysis...

  19. Flood Risk Assessment Based On Security Deficit Analysis

    Science.gov (United States)

    Beck, J.; Metzger, R.; Hingray, B.; Musy, A.

    Risk is a human perception: a given risk may be considered as acceptable or unacceptable depending on the group that has to face that risk. Flood risk analysis often estimates economic losses from damages, but neglects the question of acceptable/unacceptable risk. With input from land use managers, politicians and other stakeholders, risk assessment based on security deficit analysis determines objects with unacceptable risk and their degree of security deficit. Such a risk assessment methodology, initially developed by the Swiss federal authorities, is illustrated by its application on a reach of the Alzette River (Luxembourg) in the framework of the IRMA-SPONGE FRHYMAP project. Flood risk assessment always involves a flood hazard analysis, an exposed object vulnerability analysis, and an analysis combining the results of these two previous analyses. The flood hazard analysis was done with the quasi-2D hydraulic model FldPln to produce flood intensity maps. Flood intensity was determined by the water height and velocity. Object data for the vulnerability analysis, provided by the Luxembourg government, were classified according to their potential damage. Potential damage is expressed in terms of direct, human life and secondary losses. A thematic map was produced to show the object classification. Protection goals were then attributed to the object classes. Protection goals are assigned in terms of an acceptable flood intensity for a certain flood frequency. This is where input from land use managers and politicians comes into play. The perception of risk in the region or country influences the protection goal assignment. Protection goals as used in Switzerland were used in this project. Thematic maps showing the protection goals of each object in the case study area for a given flood frequency were produced. Comparison between an object's protection goal and the intensity of the flood that touched the object determines the acceptability of the risk and the

  20. Reliability analysis of cluster-based ad-hoc networks

    International Nuclear Information System (INIS)

    Cook, Jason L.; Ramirez-Marquez, Jose Emmanuel

    2008-01-01

    The mobile ad-hoc wireless network (MAWN) is a new and emerging network scheme that is being employed in a variety of applications. The MAWN varies from traditional networks because it is a self-forming and dynamic network. The MAWN is free of infrastructure and, as such, only the mobile nodes comprise the network. Pairs of nodes communicate either directly or through other nodes. To do so, each node acts, in turn, as a source, destination, and relay of messages. The virtue of a MAWN is the flexibility this provides; however, the challenge for reliability analyses is also brought about by this unique feature. The variability and volatility of the MAWN configuration makes typical reliability methods (e.g. reliability block diagram) inappropriate because no single structure or configuration represents all manifestations of a MAWN. For this reason, new methods are being developed to analyze the reliability of this new networking technology. New published methods adapt to this feature by treating the configuration probabilistically or by inclusion of embedded mobility models. This paper joins both methods together and expands upon these works by modifying the problem formulation to address the reliability analysis of a cluster-based MAWN. The cluster-based MAWN is deployed in applications with constraints on networking resources such as bandwidth and energy. This paper presents the problem's formulation, a discussion of applicable reliability metrics for the MAWN, and illustration of a Monte Carlo simulation method through the analysis of several example networks
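
    A Monte Carlo estimate of two-terminal reliability of the kind used in such analyses can be sketched as follows. The small five-node topology, the node reliability and the trial count are hypothetical, and the sketch ignores mobility and cluster formation for brevity.

        # Sketch: Monte Carlo two-terminal reliability of an ad-hoc network with unreliable nodes
        # (hypothetical topology; mobility and clustering are not modelled here).
        import random
        from collections import deque

        links = {0: {1, 2}, 1: {0, 3}, 2: {0, 3}, 3: {1, 2, 4}, 4: {3}}  # relay graph
        p_node_up = 0.9          # probability that each node is operational in a trial
        source, target = 0, 4
        trials = 20000

        def connected(up: set) -> bool:
            if source not in up or target not in up:
                return False
            seen, queue = {source}, deque([source])
            while queue:
                n = queue.popleft()
                if n == target:
                    return True
                for m in links[n] - seen:
                    if m in up:
                        seen.add(m)
                        queue.append(m)
            return False

        random.seed(0)
        hits = sum(connected({n for n in links if random.random() < p_node_up}) for _ in range(trials))
        print(f"estimated source-target reliability: {hits / trials:.3f}")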

  1. Principle-based concept analysis: intentionality in holistic nursing theories.

    Science.gov (United States)

    Aghebati, Nahid; Mohammadi, Eesa; Ahmadi, Fazlollah; Noaparast, Khosrow Bagheri

    2015-03-01

    This is a report of a principle-based concept analysis of intentionality in holistic nursing theories. A principle-based concept analysis method was used to analyze seven holistic theories. The data included eight books and 31 articles (1998-2011), which were retrieved through MEDLINE and CINAHL. Erickson, Kriger, Parse, Watson, and Zahourek define intentionality as a capacity, a focused consciousness, and a pattern of human being. Rogers and Newman do not explicitly mention intentionality; however, they do explain pattern and consciousness (epistemology). Intentionality has been operationalized as a core concept of nurse-client relationships (pragmatic). The theories are consistent on intentionality as a noun and as an attribute of the person-intentionality is different from intent and intention (linguistic). There is ambiguity concerning the boundaries between intentionality and consciousness (logic). Theoretically, intentionality is an evolutionary capacity to integrate human awareness and experience. Because intentionality is an individualized concept, we introduced it as "a matrix of continuous known changes" that emerges in two forms: as a capacity of human being and as a capacity of transpersonal caring. This study has produced a theoretical definition of intentionality and provides a foundation for future research to further investigate intentionality to better delineate its boundaries. © The Author(s) 2014.

  2. Streaming support for data intensive cloud-based sequence analysis.

    Science.gov (United States)

    Issa, Shadi A; Kienzler, Romeo; El-Kalioby, Mohamed; Tonellato, Peter J; Wall, Dennis; Bruggmann, Rémy; Abouelhoda, Mohamed

    2013-01-01

    Cloud computing provides a promising solution to the genomics data deluge problem resulting from the advent of next-generation sequencing (NGS) technology. Based on the concepts of "resources-on-demand" and "pay-as-you-go", scientists with no or limited infrastructure can have access to scalable and cost-effective computational resources. However, the large size of NGS data causes a significant data transfer latency from the client's site to the cloud, which presents a bottleneck for using cloud computing services. In this paper, we provide a streaming-based scheme to overcome this problem, where the NGS data is processed while being transferred to the cloud. Our scheme targets the wide class of NGS data analysis tasks, where the NGS sequences can be processed independently from one another. We also provide the elastream package that supports the use of this scheme with individual analysis programs or with workflow systems. Experiments presented in this paper show that our solution mitigates the effect of data transfer latency and saves both time and cost of computation.

  3. Streaming Support for Data Intensive Cloud-Based Sequence Analysis

    Directory of Open Access Journals (Sweden)

    Shadi A. Issa

    2013-01-01

    Full Text Available Cloud computing provides a promising solution to the genomics data deluge problem resulting from the advent of next-generation sequencing (NGS) technology. Based on the concepts of “resources-on-demand” and “pay-as-you-go”, scientists with no or limited infrastructure can have access to scalable and cost-effective computational resources. However, the large size of NGS data causes a significant data transfer latency from the client’s site to the cloud, which presents a bottleneck for using cloud computing services. In this paper, we provide a streaming-based scheme to overcome this problem, where the NGS data is processed while being transferred to the cloud. Our scheme targets the wide class of NGS data analysis tasks, where the NGS sequences can be processed independently from one another. We also provide the elastream package that supports the use of this scheme with individual analysis programs or with workflow systems. Experiments presented in this paper show that our solution mitigates the effect of data transfer latency and saves both time and cost of computation.

  4. Analysis of Android Device-Based Solutions for Fall Detection

    Directory of Open Access Journals (Sweden)

    Eduardo Casilari

    2015-07-01

    Full Text Available Falls are a major cause of health and psychological problems as well as hospitalization costs among older adults. Thus, the investigation of automatic Fall Detection Systems (FDSs) has received special attention from the research community during the last decade. In this area, the widespread popularity, decreasing price, computing capabilities, built-in sensors and multiplicity of wireless interfaces of Android-based devices (especially smartphones) have fostered the adoption of this technology to deploy wearable and inexpensive architectures for fall detection. This paper presents a critical and thorough analysis of those existing fall detection systems that are based on Android devices. The review systematically classifies and compares the proposals in the literature, taking into account different criteria such as the system architecture, the employed sensors, the detection algorithm or the response in case of a fall alarm. The study emphasizes the analysis of the evaluation methods that are employed to assess the effectiveness of the detection process. The review reveals the complete lack of a reference framework to validate and compare the proposals. In addition, the study also shows that most research works do not evaluate the actual applicability of the Android devices (with limited battery and computing resources) to fall detection solutions.

  5. Analysis of Android Device-Based Solutions for Fall Detection.

    Science.gov (United States)

    Casilari, Eduardo; Luque, Rafael; Morón, María-José

    2015-07-23

    Falls are a major cause of health and psychological problems as well as hospitalization costs among older adults. Thus, the investigation of automatic Fall Detection Systems (FDSs) has received special attention from the research community during the last decade. In this area, the widespread popularity, decreasing price, computing capabilities, built-in sensors and multiplicity of wireless interfaces of Android-based devices (especially smartphones) have fostered the adoption of this technology to deploy wearable and inexpensive architectures for fall detection. This paper presents a critical and thorough analysis of those existing fall detection systems that are based on Android devices. The review systematically classifies and compares the proposals in the literature, taking into account different criteria such as the system architecture, the employed sensors, the detection algorithm or the response in case of a fall alarm. The study emphasizes the analysis of the evaluation methods that are employed to assess the effectiveness of the detection process. The review reveals the complete lack of a reference framework to validate and compare the proposals. In addition, the study also shows that most research works do not evaluate the actual applicability of the Android devices (with limited battery and computing resources) to fall detection solutions.

  6. Image-Based 3d Reconstruction and Analysis for Orthodontia

    Science.gov (United States)

    Knyaz, V. A.

    2012-08-01

    Among the main tasks of orthodontia are the analysis of teeth arches and treatment planning to provide the correct position for every tooth. The treatment plan is based on measurement of teeth parameters and on designing the perfect teeth arch curve which the teeth are to form after treatment. The most common technique for moving teeth uses standard brackets, which are put on the teeth, and a wire of given shape, which is clamped by these brackets to produce the forces needed to move every tooth in a given direction. The disadvantages of the standard bracket technique are the low accuracy of tooth dimension measurements and the difficulty of applying the standard approach to a wide variety of complex orthodontic cases. An image-based technique for orthodontic planning, treatment and documentation aimed at overcoming these disadvantages is proposed. The proposed approach provides accurate measurements of the teeth parameters needed for adequate planning, designing the correct teeth position and monitoring the treatment process. The developed technique applies photogrammetric means for teeth arch 3D model generation, bracket position determination and teeth shifting analysis.

  7. Nonlinear Process Fault Diagnosis Based on Serial Principal Component Analysis.

    Science.gov (United States)

    Deng, Xiaogang; Tian, Xuemin; Chen, Sheng; Harris, Chris J

    2018-03-01

    Many industrial processes contain both linear and nonlinear parts, and kernel principal component analysis (KPCA), widely used in nonlinear process monitoring, may not offer the most effective means for dealing with these nonlinear processes. This paper proposes a new hybrid linear-nonlinear statistical modeling approach for nonlinear process monitoring by closely integrating linear principal component analysis (PCA) and nonlinear KPCA using a serial model structure, which we refer to as serial PCA (SPCA). Specifically, PCA is first applied to extract PCs as linear features, and to decompose the data into the PC subspace and residual subspace (RS). Then, KPCA is performed in the RS to extract the nonlinear PCs as nonlinear features. Two monitoring statistics are constructed for fault detection, based on both the linear and nonlinear features extracted by the proposed SPCA. To effectively perform fault identification after a fault is detected, an SPCA similarity factor method is built for fault recognition, which fuses both the linear and nonlinear features. Unlike PCA and KPCA, the proposed method takes into account both linear and nonlinear PCs simultaneously, and therefore, it can better exploit the underlying process's structure to enhance fault diagnosis performance. Two case studies involving a simulated nonlinear process and the benchmark Tennessee Eastman process demonstrate that the proposed SPCA approach is more effective than the existing state-of-the-art approach based on KPCA alone, in terms of nonlinear process fault detection and identification.
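
    A minimal sketch of the serial PCA idea is given below, assuming scikit-learn: linear PCs are extracted first, kernel PCA is then fitted on the linear residuals, and both feature sets are available for building monitoring statistics. The synthetic data, the component counts and the kernel width are arbitrary placeholders, not values from the paper.

        # Sketch of serial PCA (SPCA): linear PCA followed by kernel PCA on the residual subspace.
        import numpy as np
        from sklearn.decomposition import PCA, KernelPCA

        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 10))
        X[:, 3] = np.sin(X[:, 0]) + 0.1 * rng.normal(size=500)      # inject a nonlinear relation

        pca = PCA(n_components=4).fit(X)              # step 1: linear features
        T_lin = pca.transform(X)                      # linear scores (PC subspace)
        residual = X - pca.inverse_transform(T_lin)   # residual subspace (RS)

        kpca = KernelPCA(n_components=3, kernel="rbf", gamma=0.5).fit(residual)
        T_nonlin = kpca.transform(residual)           # step 2: nonlinear features from the RS

        # Monitoring statistics could then be built from both T_lin and T_nonlin (e.g., T^2-type sums).
        print(T_lin.shape, T_nonlin.shape)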

  8. Analysis of Android Device-Based Solutions for Fall Detection

    Science.gov (United States)

    Casilari, Eduardo; Luque, Rafael; Morón, María-José

    2015-01-01

    Falls are a major cause of health and psychological problems as well as hospitalization costs among older adults. Thus, the investigation of automatic Fall Detection Systems (FDSs) has received special attention from the research community during the last decade. In this area, the widespread popularity, decreasing price, computing capabilities, built-in sensors and multiplicity of wireless interfaces of Android-based devices (especially smartphones) have fostered the adoption of this technology to deploy wearable and inexpensive architectures for fall detection. This paper presents a critical and thorough analysis of those existing fall detection systems that are based on Android devices. The review systematically classifies and compares the proposals in the literature, taking into account different criteria such as the system architecture, the employed sensors, the detection algorithm or the response in case of a fall alarm. The study emphasizes the analysis of the evaluation methods that are employed to assess the effectiveness of the detection process. The review reveals the complete lack of a reference framework to validate and compare the proposals. In addition, the study also shows that most research works do not evaluate the actual applicability of the Android devices (with limited battery and computing resources) to fall detection solutions. PMID:26213928

  9. Nondeducibility-Based Analysis of Cyber-Physical Systems

    Science.gov (United States)

    Gamage, Thoshitha; McMillin, Bruce

    Controlling information flow in a cyber-physical system (CPS) is challenging because cyber domain decisions and actions manifest themselves as visible changes in the physical domain. This paper presents a nondeducibility-based observability analysis for CPSs. In many CPSs, the capacity of a low-level (LL) observer to deduce high-level (HL) actions ranges from limited to none. However, a collaborative set of observers strategically located in a network may be able to deduce all the HL actions. This paper models a distributed power electronics control device network using a simple DC circuit in order to understand the effect of multiple observers in a CPS. The analysis reveals that the number of observers required to deduce all the HL actions in a system increases linearly with the number of configurable units. A simple definition of nondeducibility based on the uniqueness of low-level projections is also presented. This definition is used to show that a system with two security domain levels could be considered “nondeducibility secure” if no unique LL projections exist.

  10. Thermogravimetric and Kinetic Analysis of Cassava Starch Based Bioplastic

    Directory of Open Access Journals (Sweden)

    Nanang Eko Wahyuningtyas

    2017-11-01

    Full Text Available Cassava starch based bioplastic for packaging applications has great potential because of the variety of starch-producing plants in Indonesia. Bioplastic can contribute to reducing the dependence on fossil fuels and petroleum, helping to solve an environmental problem. The purpose of this research is to determine the thermal decomposition behaviour and the activation energy of cassava starch based bioplastic. The method was to synthesize bioplastic with cassava starch as the main component and glycerol as the plasticizer. Thermogravimetric analysis was conducted to obtain the decomposition mechanism of the bioplastic, and the heating value of the bioplastic was measured using an adiabatic bomb calorimeter. Data analysis was conducted using a model-fitting approach with the Acikalin method to determine the activation energy. The results of the thermogravimetric analysis showed that the bioplastic gradually decomposes into moisture, volatile matter, fixed carbon, and ash in a four-stage mechanism. Decomposition of the bioplastic was complete at 530 °C, at which point only ash remained. The activation energies in the early and primary thermal decomposition stages are 1.27 kJ/mol and 22.62 kJ/mol, respectively, and the heating value of the bioplastic is 15.16 MJ/kg.

  11. Research on cloud-based remote measurement and analysis system

    Science.gov (United States)

    Gao, Zhiqiang; He, Lingsong; Su, Wei; Wang, Can; Zhang, Changfan

    2015-02-01

    The promising potential of cloud computing and its convergence with technologies such as cloud storage, cloud push and mobile computing allows for the creation and delivery of new types of cloud service. Based on the ideas of cloud computing, this paper presents a cloud-based remote measurement and analysis system. The system mainly consists of three parts: a signal acquisition client, a web server deployed as a cloud service, and a remote client. The system is a dedicated website developed using ASP.NET and Flex RIA technology, which resolves the trade-off between the two monitoring modes, B/S and C/S. The platform, deployed on a cloud server, supplies condition monitoring and data analysis services to customers over the Internet. The signal acquisition device is responsible for data collection (sensor data, audio, video, etc.) and pushes the monitoring data to the cloud storage database regularly. Data acquisition equipment in this system only needs data collection and networking capabilities, as found, for example, in smartphones and smart sensors. The system's scale can adjust dynamically according to the number of applications and users, so resources are not wasted. As a representative case study, we developed a prototype system based on the Ali cloud service using a rotor test rig as the research object. Experimental results demonstrate that the proposed system architecture is feasible.

  12. Streaming Support for Data Intensive Cloud-Based Sequence Analysis

    Science.gov (United States)

    Issa, Shadi A.; Kienzler, Romeo; El-Kalioby, Mohamed; Tonellato, Peter J.; Wall, Dennis; Bruggmann, Rémy; Abouelhoda, Mohamed

    2013-01-01

    Cloud computing provides a promising solution to the genomics data deluge problem resulting from the advent of next-generation sequencing (NGS) technology. Based on the concepts of “resources-on-demand” and “pay-as-you-go”, scientists with no or limited infrastructure can have access to scalable and cost-effective computational resources. However, the large size of NGS data causes a significant data transfer latency from the client's site to the cloud, which presents a bottleneck for using cloud computing services. In this paper, we provide a streaming-based scheme to overcome this problem, where the NGS data is processed while being transferred to the cloud. Our scheme targets the wide class of NGS data analysis tasks, where the NGS sequences can be processed independently from one another. We also provide the elastream package that supports the use of this scheme with individual analysis programs or with workflow systems. Experiments presented in this paper show that our solution mitigates the effect of data transfer latency and saves both time and cost of computation. PMID:23710461

  13. A Web-Based Development Environment for Collaborative Data Analysis

    Science.gov (United States)

    Erdmann, M.; Fischer, R.; Glaser, C.; Klingebiel, D.; Komm, M.; Müller, G.; Rieger, M.; Steggemann, J.; Urban, M.; Winchen, T.

    2014-06-01

    Visual Physics Analysis (VISPA) is a web-based development environment addressing high energy and astroparticle physics. It covers the entire analysis spectrum from the design and validation phase to the execution of analyses and the visualization of results. VISPA provides a graphical steering of the analysis flow, which consists of self-written, re-usable Python and C++ modules for more demanding tasks. All common operating systems are supported since a standard internet browser is the only software requirement for users. Even access via mobile and touch-compatible devices is possible. In this contribution, we present the most recent developments of our web application concerning technical, state-of-the-art approaches as well as practical experiences. One of the key features is the use of workspaces, i.e. user-configurable connections to remote machines supplying resources and local file access. Thereby, workspaces enable the management of data, computing resources (e.g. remote clusters or computing grids), and additional software either centralized or individually. We further report on the results of an application with more than 100 third-year students using VISPA for their regular particle physics exercises during the winter term 2012/13. Besides the ambition to support and simplify the development cycle of physics analyses, new use cases such as fast, location-independent status queries, the validation of results, and the ability to share analyses within worldwide collaborations with a single click become conceivable.

  14. A Web-Based Development Environment for Collaborative Data Analysis

    International Nuclear Information System (INIS)

    Erdmann, M; Fischer, R; Glaser, C; Klingebiel, D; Müller, G; Rieger, M; Urban, M; Winchen, T; Komm, M; Steggemann, J

    2014-01-01

    Visual Physics Analysis (VISPA) is a web-based development environment addressing high energy and astroparticle physics. It covers the entire analysis spectrum from the design and validation phase to the execution of analyses and the visualization of results. VISPA provides a graphical steering of the analysis flow, which consists of self-written, re-usable Python and C++ modules for more demanding tasks. All common operating systems are supported since a standard internet browser is the only software requirement for users. Even access via mobile and touch-compatible devices is possible. In this contribution, we present the most recent developments of our web application concerning technical, state-of-the-art approaches as well as practical experiences. One of the key features is the use of workspaces, i.e. user-configurable connections to remote machines supplying resources and local file access. Thereby, workspaces enable the management of data, computing resources (e.g. remote clusters or computing grids), and additional software either centralized or individually. We further report on the results of an application with more than 100 third-year students using VISPA for their regular particle physics exercises during the winter term 2012/13. Besides the ambition to support and simplify the development cycle of physics analyses, new use cases such as fast, location-independent status queries, the validation of results, and the ability to share analyses within worldwide collaborations with a single click become conceivable

  15. Graph-Based Analysis of Nuclear Smuggling Data

    International Nuclear Information System (INIS)

    Cook, Diane; Holder, Larry; Thompson, Sandra E.; Whitney, Paul D.; Chilton, Lawrence

    2009-01-01

    Much of the data that is collected and analyzed today is structural, consisting not only of entities but also of relationships between the entities. As a result, analysis applications rely upon automated structural data mining approaches to find patterns and concepts of interest. This ability to analyze structural data has become a particular challenge in many security-related domains. In these domains, focusing on the relationships between entities in the data is critical to detect important underlying patterns. In this study we apply structural data mining techniques to automate analysis of nuclear smuggling data. In particular, we choose to model the data as a graph and use graph-based relational learning to identify patterns and concepts of interest in the data. In this paper, we identify the analysis questions that are of importance to security analysts and describe the knowledge representation and data mining approach that we adopt for this challenge. We analyze the results using the Russian nuclear smuggling event database.

  16. Asymptotic performance of regularized quadratic discriminant analysis based classifiers

    KAUST Repository

    Elkhalil, Khalil

    2017-12-13

    This paper carries out a large dimensional analysis of the standard regularized quadratic discriminant analysis (QDA) classifier designed on the assumption that data arise from a Gaussian mixture model. The analysis relies on fundamental results from random matrix theory (RMT) when both the number of features and the cardinality of the training data within each class grow large at the same pace. Under some mild assumptions, we show that the asymptotic classification error converges to a deterministic quantity that depends only on the covariances and means associated with each class as well as the problem dimensions. Such a result permits a better understanding of the performance of regularized QDA and can be used to determine the optimal regularization parameter that minimizes the misclassification error probability. Despite being valid only for Gaussian data, our theoretical findings are shown to yield a high accuracy in predicting the performances achieved with real data sets drawn from popular real databases, thereby making an interesting connection between theory and practice.
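
    For orientation, a plain-numpy sketch of a regularized QDA decision rule is given below. The shrinkage form (1 - gamma) * Sigma_k + gamma * I and the synthetic two-class Gaussian data are illustrative assumptions; the sketch does not reproduce the random-matrix analysis or the optimal regularization of the paper.

        # Sketch: regularized QDA on synthetic Gaussian-mixture data (illustrative only).
        import numpy as np

        rng = np.random.default_rng(0)
        n, p, gamma = 200, 10, 0.1
        X0 = rng.multivariate_normal(np.zeros(p), np.eye(p), n)
        X1 = rng.multivariate_normal(0.8 * np.ones(p), 1.5 * np.eye(p), n)

        def fit(X):
            mu = X.mean(axis=0)
            S = np.cov(X, rowvar=False)
            S_reg = (1 - gamma) * S + gamma * np.eye(p)       # shrinkage toward the identity
            return mu, np.linalg.inv(S_reg), np.linalg.slogdet(S_reg)[1]

        params = [fit(X0), fit(X1)]

        def predict(x):
            scores = [-0.5 * logdet - 0.5 * (x - mu) @ Sinv @ (x - mu)
                      for mu, Sinv, logdet in params]          # equal class priors assumed
            return int(np.argmax(scores))

        X_test = np.vstack([rng.multivariate_normal(np.zeros(p), np.eye(p), 100),
                            rng.multivariate_normal(0.8 * np.ones(p), 1.5 * np.eye(p), 100)])
        y_test = np.array([0] * 100 + [1] * 100)
        preds = np.array([predict(x) for x in X_test])
        print(f"test accuracy: {(preds == y_test).mean():.2f}")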

  17. A Web-Based Development Environment for Collaborative Data Analysis

    CERN Document Server

    Erdmann, M; Glaser, C; Klingebiel, D; Komm, M; Müller, G; Rieger, M; Steggemann, J; Urban, M; Winchen, T

    2014-01-01

    Visual Physics Analysis (VISPA) is a web-based development environment addressing high energy and astroparticle physics. It covers the entire analysis spectrum from the design and validation phase to the execution of analyses and the visualization of results. VISPA provides a graphical steering of the analysis flow, which consists of self-written, re-usable Python and C++ modules for more demanding tasks. All common operating systems are supported since a standard internet browser is the only software requirement for users. Even access via mobile and touch-compatible devices is possible. In this contribution, we present the most recent developments of our web application concerning technical, state-of-the-art approaches as well as practical experiences. One of the key features is the use of workspaces, i.e. user-configurable connections to remote machines supplying resources and local file access. Thereby, workspaces enable the management of data, computing resources (e.g. remote clusters or computing grids), and a...

  18. Experimental investigation of thermal neutron analysis based landmine detection technology

    International Nuclear Information System (INIS)

    Zeng Jun; Chu Chengsheng; Ding Ge; Xiang Qingpei; Hao Fanhua; Luo Xiaobing

    2013-01-01

    Background: Recently, the prompt gamma-ray neutron activation analysis method has been widely used in coal analysis and explosive detection; however, there have been few applications of neutron methods to landmine detection, especially in domestic research. Purpose: To verify the feasibility of the Thermal Neutron Analysis (TNA) method for landmine detection and to explore the characteristics of this technology. Methods: An experimental TNA landmine detection system was built based on a LaBr3(Ce) fast scintillator detector and a 252Cf isotope neutron source. The system comprises a thermal neutron transition system, a shield system, and a detector system. Results: On the basis of the TNA, a wide-energy-range calibration method, with emphasis on the high-energy region, was investigated, and the minimum detection time for a typical mine was determined. In this study, a 72-type anti-tank mine, a 500 g TNT sample and several interfering objects were tested in loess, red soil, magnetic soil and sand, respectively. Conclusions: The experimental results indicate that TNA is a reliable demining method, and it can be used to confirm the existence of Anti-Tank Mines (ATM) and large Anti-Personnel Mines (APM) in complicated conditions. (authors)

  19. Glyph-Based Video Visualization for Semen Analysis

    KAUST Repository

    Duffy, Brian

    2015-08-01

    © 2013 IEEE. The existing efforts in computer assisted semen analysis have been focused on high speed imaging and automated image analysis of sperm motility. This results in a large amount of data, and it is extremely challenging for both clinical scientists and researchers to interpret, compare and correlate the multidimensional and time-varying measurements captured from video data. In this work, we use glyphs to encode a collection of numerical measurements taken at a regular interval and to summarize spatio-temporal motion characteristics using static visual representations. The design of the glyphs addresses the needs for (a) encoding some 20 variables using separable visual channels, (b) supporting scientific observation of the interrelationships between different measurements and comparison between different sperm cells and their flagella, and (c) facilitating the learning of the encoding scheme by making use of appropriate visual abstractions and metaphors. As a case study, we focus this work on video visualization for computer-aided semen analysis, which has a broad impact on both biological sciences and medical healthcare. We demonstrate that glyph-based visualization can serve as a means of external memorization of video data as well as an overview of a large set of spatiotemporal measurements. It enables domain scientists to make scientific observation in a cost-effective manner by reducing the burden of viewing videos repeatedly, while providing them with a new visual representation for conveying semen statistics.

  20. SVM-based glioma grading. Optimization by feature reduction analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zoellner, Frank G.; Schad, Lothar R. [University Medical Center Mannheim, Heidelberg Univ., Mannheim (Germany). Computer Assisted Clinical Medicine; Emblem, Kyrre E. [Massachusetts General Hospital, Charlestown, A.A. Martinos Center for Biomedical Imaging, Boston MA (United States). Dept. of Radiology; Harvard Medical School, Boston, MA (United States); Oslo Univ. Hospital (Norway). The Intervention Center

    2012-11-01

    We investigated the predictive power of feature reduction analysis approaches in support vector machine (SVM)-based classification of glioma grade. In 101 untreated glioma patients, three analytic approaches were evaluated to derive an optimal reduction in features: (i) Pearson's correlation coefficients (PCC), (ii) principal component analysis (PCA) and (iii) independent component analysis (ICA). Tumor grading was performed using a previously reported SVM approach including whole-tumor cerebral blood volume (CBV) histograms and patient age. Best classification accuracy was found using PCA at 85% (sensitivity = 89%, specificity = 84%) when reducing the feature vector from 101 (100-bin rCBV histogram + age) to 3 principal components. In comparison, classification accuracy by PCC was 82% (89%, 77%, 2 dimensions) and 79% by ICA (87%, 75%, 9 dimensions). For improved speed (up to 30%) and simplicity, feature reduction by all three methods provided similar classification accuracy to literature values (~87%) while reducing the number of features by up to 98%. (orig.)
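
    The feature-reduction-plus-SVM pipeline evaluated in the study can be sketched generically with scikit-learn. The synthetic "100-bin histogram plus age" feature vectors, the three retained components and the SVM settings below are placeholders, not the study's data or tuned parameters.

        # Sketch: PCA feature reduction (101 features -> 3 components) feeding an SVM, on synthetic data.
        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n = 101                                            # number of patients (synthetic stand-in)
        grade = rng.integers(0, 2, n)                      # 0 = low grade, 1 = high grade (synthetic)
        histograms = rng.gamma(2.0 + grade[:, None], 1.0, size=(n, 100))   # fake rCBV histograms
        age = (50 + 10 * grade + rng.normal(0, 8, n))[:, None]
        X = np.hstack([histograms, age])                   # 101 features per patient

        model = make_pipeline(StandardScaler(), PCA(n_components=3), SVC(kernel="rbf", C=1.0))
        scores = cross_val_score(model, X, grade, cv=5)
        print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")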

  1. Cost Risk Analysis Based on Perception of the Engineering Process

    Science.gov (United States)

    Dean, Edwin B.; Wood, Darrell A.; Moore, Arlene A.; Bogart, Edward H.

    1986-01-01

    In most cost estimating applications at the NASA Langley Research Center (LaRC), it is desirable to present predicted cost as a range of possible costs rather than a single predicted cost. A cost risk analysis generates a range of cost for a project and assigns a probability level to each cost value in the range. Constructing a cost risk curve requires a good estimate of the expected cost of a project. It must also include a good estimate of expected variance of the cost. Many cost risk analyses are based upon an expert's knowledge of the cost of similar projects in the past. In a common scenario, a manager or engineer, asked to estimate the cost of a project in his area of expertise, will gather historical cost data from a similar completed project. The cost of the completed project is adjusted using the perceived technical and economic differences between the two projects. This allows errors from at least three sources. The historical cost data may be in error by some unknown amount. The managers' evaluation of the new project and its similarity to the old project may be in error. The factors used to adjust the cost of the old project may not correctly reflect the differences. Some risk analyses are based on untested hypotheses about the form of the statistical distribution that underlies the distribution of possible cost. The usual problem is not just to come up with an estimate of the cost of a project, but to predict the range of values into which the cost may fall and with what level of confidence the prediction is made. Risk analysis techniques that assume the shape of the underlying cost distribution and derive the risk curve from a single estimate plus and minus some amount usually fail to take into account the actual magnitude of the uncertainty in cost due to technical factors in the project itself. This paper addresses a cost risk method that is based on parametric estimates of the technical factors involved in the project being costed. The engineering

  2. Technoeconomic analysis of a biomass based district heating system

    International Nuclear Information System (INIS)

    Zhang, H.; Ugursal, V.I.; Fung, A.

    2005-01-01

    This paper discussed a proposed biomass-based district heating system to be built for the Pictou Landing First Nation Community in Nova Scotia. The community centre consists of 6 buildings and a connecting arcade. The methodology used to size and design heating, ventilating and air conditioning (HVAC) systems, as well as biomass district energy systems (DES) were discussed. Annual energy requirements and biomass fuel consumption predictions were presented, along with cost estimates. A comparative assessment of the system with that of a conventional oil fired system was also conducted. It was suggested that the design and analysis methodology could be used for any similar application. The buildings were modelled and simulated using the Hourly Analysis Program (HAP), a detailed 2-in-1 software program which can be used both for HVAC system sizing and building energy consumption estimation. A techno-economics analysis was conducted to justify the viability of the biomass combustion system. Heating load calculations were performed assuming that the thermostat was set constantly at 22 degrees C. Community centre space heating loads due to individual envelope components for 3 different scenarios were summarized, as the design architecture for the buildings was not yet finalized. It was suggested that efforts should be made to ensure air-tightness and insulation levels of the interior arcade glass wall. A hydronic distribution system with baseboard space heating units was selected, comprising of a woodchip boiler, hot water distribution system, convective heating units and control systems. The community has its own logging operation which will provide the wood fuel required by the proposed system. An outline of the annual allowable harvest covered by the Pictou Landing Forestry Management Plan was presented, with details of proposed wood-chippers for the creation of biomass. It was concluded that the woodchip combustion system is economically preferable to the

  3. Harmonic analysis in integrated energy system based on compressed sensing

    International Nuclear Information System (INIS)

    Yang, Ting; Pen, Haibo; Wang, Dan; Wang, Zhaoxia

    2016-01-01

    Highlights: • We propose a harmonic/inter-harmonic analysis scheme based on compressed sensing theory. • The sparseness of harmonic signals in the electrical power system is proved. • A ratio formula for the sparsity of the fundamental and harmonic components is presented. • A Spectral Projected Gradient-Fundamental Filter reconstruction algorithm is proposed. • SPG-FF enhances the precision of harmonic detection and signal reconstruction. - Abstract: The advent of Integrated Energy Systems has enabled various distributed energy resources to access the system through different power electronic devices, which has made the harmonic environment more complex. Low-complexity, high-precision harmonic detection and analysis methods are needed to improve power quality. To overcome the large data storage requirements and the high compression complexity of sampling under the Nyquist framework, this paper presents a harmonic analysis scheme based on compressed sensing theory. The proposed scheme performs compressive sampling, signal reconstruction and harmonic detection simultaneously. In the proposed scheme, the sparsity of the harmonic signals in the basis of the Discrete Fourier Transform (DFT) is first calculated numerically. This is followed by a proof that the necessary conditions for compressed sensing are satisfied. Binary sparse measurement is then leveraged to reduce the storage space in the sampling unit. In the recovery process, the scheme proposes a novel reconstruction algorithm, the Spectral Projected Gradient with Fundamental Filter (SPG-FF) algorithm, to enhance the reconstruction precision. An actual microgrid system is used as the simulation example. The experimental results show that the proposed scheme effectively enhances the precision of harmonic and inter-harmonic detection with low computing complexity, and has good
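
    To make the compressive-sampling idea concrete, the sketch below recovers a harmonic-sparse signal from a small number of random binary (±1) measurements using orthogonal matching pursuit as a stand-in for the SPG-FF algorithm proposed in the paper. The test signal, measurement ratio and sparsity level are hypothetical.

        # Sketch: recovering a harmonic-sparse signal from binary compressive measurements
        # (orthogonal matching pursuit is used here in place of the paper's SPG-FF solver).
        import numpy as np

        rng = np.random.default_rng(0)
        N, M, K = 256, 80, 6                       # signal length, measurements, sparsity in the DFT basis
        t = np.arange(N) / N
        x = np.sin(2*np.pi*50*t) + 0.3*np.sin(2*np.pi*150*t) + 0.1*np.sin(2*np.pi*250*t)

        F = np.fft.fft(np.eye(N)) / np.sqrt(N)     # unitary DFT basis: x = F^H s, with s sparse
        Phi = rng.choice([-1.0, 1.0], size=(M, N)) # random binary (+/-1) measurement matrix
        y = (Phi @ x).astype(complex)              # compressive samples
        A = Phi @ F.conj().T                       # sensing matrix in the sparse (DFT) domain

        # Orthogonal matching pursuit
        support, residual = [], y.copy()
        for _ in range(K):
            k = int(np.argmax(np.abs(A.conj().T @ residual)))
            support.append(k)
            As = A[:, support]
            s_hat, *_ = np.linalg.lstsq(As, y, rcond=None)
            residual = y - As @ s_hat

        s = np.zeros(N, dtype=complex)
        s[support] = s_hat
        x_rec = np.real(F.conj().T @ s)
        err = np.linalg.norm(x - x_rec) / np.linalg.norm(x)
        print(f"relative reconstruction error: {err:.3e}")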

  4. Overview description of the base scenario derived from FEP analysis

    International Nuclear Information System (INIS)

    Locke, J.; Bailey, L.

    1998-01-01

    , subsequent evolution and the processes affecting radionuclide transport for the groundwater and gas pathways. This report uses the conceptual models developed from the FEP analysis to present a description of the base scenario, in terms of the processes to be represented in detailed models. This report does not present an assessment of the base scenario, but rather seeks to provide a summary of those features, events and processes that should be represented, at an appropriate level of detail, within numerical models. The requirements for the development of appropriate models for representing the base scenario are described in an underlying report within the model development document suite. (author)

  5. Diagnostic markers of urothelial cancer based on DNA methylation analysis

    International Nuclear Information System (INIS)

    Chihara, Yoshitomo; Hirao, Yoshihiko; Kanai, Yae; Fujimoto, Hiroyuki; Sugano, Kokichi; Kawashima, Kiyotaka; Liang, Gangning; Jones, Peter A; Fujimoto, Kiyohide; Kuniyasu, Hiroki

    2013-01-01

    Early detection and risk assessment are crucial for treating urothelial cancer (UC), which is characterized by a high recurrence rate, and necessitates frequent and invasive monitoring. We aimed to establish diagnostic markers for UC based on DNA methylation. In this multi-center study, three independent sample sets were prepared. First, DNA methylation levels at CpG loci were measured in the training sets (tumor samples from 91 UC patients, corresponding normal-appearing tissue from these patients, and 12 normal tissues from age-matched bladder cancer-free patients) using the Illumina Golden Gate methylation assay to identify differentially methylated loci. Next, these methylated loci were validated by quantitative DNA methylation analysis using pyrosequencing, in another cohort of tissue samples (Tissue validation set). Lastly, methylation of these markers was analyzed in independent urine samples (Urine validation set). ROC analysis was performed to evaluate the diagnostic accuracy of these 12 selected markers. Of the 1303 CpG sites, 158 were hypermethylated and 356 were hypomethylated in tumor tissues compared to normal tissues. In the panel analysis, 12 loci showed remarkable alterations between tumor and normal samples, with 94.3% sensitivity and 97.8% specificity. Similarly, corresponding normal tissue could be distinguished from normal tissues with 76.0% sensitivity and 100% specificity. Furthermore, the diagnostic accuracy for UC of these markers determined in urine samples was high, with 100% sensitivity and 100% specificity. Based on these preliminary findings, diagnostic markers based on differential DNA methylation at specific loci can be useful for non-invasive and reliable detection of UC and epigenetic field defect
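
    As a hedged illustration of the final validation step, the snippet below computes ROC statistics for a single hypothetical methylation marker using scikit-learn; the methylation levels are simulated and do not correspond to any locus reported in the study.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(1)

# Simulated per-sample methylation levels (0-1) at one hypothetical marker locus:
# tumours are assumed hypermethylated relative to normal tissue (illustrative only).
normal = rng.beta(2, 8, size=100)     # low methylation levels
tumour = rng.beta(8, 2, size=90)      # high methylation levels

levels = np.concatenate([normal, tumour])
labels = np.concatenate([np.zeros(normal.size), np.ones(tumour.size)])

auc = roc_auc_score(labels, levels)
fpr, tpr, thresholds = roc_curve(labels, levels)

# Sensitivity/specificity at the cut-off maximising Youden's J statistic.
best = np.argmax(tpr - fpr)
print(f"AUC = {auc:.3f}, sensitivity = {tpr[best]:.2f}, "
      f"specificity = {1 - fpr[best]:.2f} at cut-off {thresholds[best]:.2f}")
```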

  6. Variance-based sensitivity analysis for wastewater treatment plant modelling.

    Science.gov (United States)

    Cosenza, Alida; Mannina, Giorgio; Vanrolleghem, Peter A; Neumann, Marc B

    2014-02-01

    Global sensitivity analysis (GSA) is a valuable tool to support the use of mathematical models that characterise technical or natural systems. In the field of wastewater modelling, most of the recent applications of GSA use either regression-based methods, which require close to linear relationships between the model outputs and model factors, or screening methods, which only yield qualitative results. However, due to the characteristics of membrane bioreactors (MBR) (non-linear kinetics, complexity, etc.) there is an interest to adequately quantify the effects of non-linearity and interactions. This can be achieved with variance-based sensitivity analysis methods. In this paper, the Extended Fourier Amplitude Sensitivity Testing (Extended-FAST) method is applied to an integrated activated sludge model (ASM2d) for an MBR system including microbial product formation and physical separation processes. Twenty-one model outputs located throughout the different sections of the bioreactor and 79 model factors are considered. Significant interactions among the model factors are found. Contrary to previous GSA studies for ASM models, we find the relationship between variables and factors to be non-linear and non-additive. By analysing the pattern of the variance decomposition along the plant, the model factors having the highest variance contributions were identified. This study demonstrates the usefulness of variance-based methods in membrane bioreactor modelling where, due to the presence of membranes and different operating conditions than those typically found in conventional activated sludge systems, several highly non-linear effects are present. Further, the obtained results highlight the relevant role played by the modelling approach for MBR taking into account simultaneously biological and physical processes. © 2013.

  7. Feature-Based Statistical Analysis of Combustion Simulation Data

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, J; Krishnamoorthy, V; Liu, S; Grout, R; Hawkes, E; Chen, J; Pascucci, V; Bremer, P T

    2011-11-18

    We present a new framework for feature-based statistical analysis of large-scale scientific data and demonstrate its effectiveness by analyzing features from Direct Numerical Simulations (DNS) of turbulent combustion. Turbulent flows are ubiquitous and account for transport and mixing processes in combustion, astrophysics, fusion, and climate modeling among other disciplines. They are also characterized by coherent structure or organized motion, i.e. nonlocal entities whose geometrical features can directly impact molecular mixing and reactive processes. While traditional multi-point statistics provide correlative information, they lack nonlocal structural information, and hence, fail to provide mechanistic causality information between organized fluid motion and mixing and reactive processes. Hence, it is of great interest to capture and track flow features and their statistics together with their correlation with relevant scalar quantities, e.g. temperature or species concentrations. In our approach we encode the set of all possible flow features by pre-computing merge trees augmented with attributes, such as statistical moments of various scalar fields, e.g. temperature, as well as length-scales computed via spectral analysis. The computation is performed in an efficient streaming manner in a pre-processing step and results in a collection of meta-data that is orders of magnitude smaller than the original simulation data. This meta-data is sufficient to support a fully flexible and interactive analysis of the features, allowing for arbitrary thresholds, providing per-feature statistics, and creating various global diagnostics such as Cumulative Density Functions (CDFs), histograms, or time-series. We combine the analysis with a rendering of the features in a linked-view browser that enables scientists to interactively explore, visualize, and analyze the equivalent of one terabyte of simulation data. We highlight the utility of this new framework for combustion

  8. Design analysis and microprocessor based control of a nuclear reactor

    International Nuclear Information System (INIS)

    Sabbakh, N.J.

    1988-01-01

    The object of this thesis is to design and test a microprocessor-based controller for a simulated nuclear reactor system. A mathematical model describing the dynamics of a typical nuclear reactor, using a one-group delayed-neutron approximation with temperature feedback, was chosen. A digital computer program has been developed for the design and analysis of a simulated model based on the concept of state-variable feedback, in order to meet a desired system response with a maximum overshoot of 3.4% and a settling time of 4 s. The state-variable feedback coefficients are designed for the continuous system; an approximation is then used to obtain the state-variable feedback vector for the discrete system. System control was implemented utilizing Direct Digital Control (DDC) of the simulated nuclear reactor model through a control algorithm executed on a microprocessor-based system. The controller performance was satisfactorily tested by exciting the reactor system with a transient reactivity disturbance and by a step change in power demand. Direct digital control, when implemented on a microprocessor, adds versatility and flexibility to the system design, with the added advantage of the possible use of optimal control algorithms. 6 tabs.; 30 figs.; 46 refs.; 6 apps

  9. Syndrome identification based on 2D analysis software.

    Science.gov (United States)

    Boehringer, Stefan; Vollmar, Tobias; Tasse, Christiane; Wurtz, Rolf P; Gillessen-Kaesbach, Gabriele; Horsthemke, Bernhard; Wieczorek, Dagmar

    2006-10-01

    Clinical evaluation of children with developmental delay continues to present a challenge to the clinicians. In many cases, the face provides important information to diagnose a condition. However, database support with respect to facial traits is limited at present. Computer-based analyses of 2D and 3D representations of faces have been developed, but it is unclear how well a larger number of conditions can be handled by such systems. We have therefore analysed 2D pictures of patients each being affected with one of 10 syndromes (fragile X syndrome; Cornelia de Lange syndrome; Williams-Beuren syndrome; Prader-Willi syndrome; Mucopolysaccharidosis type III; Cri-du-chat syndrome; Smith-Lemli-Opitz syndrome; Sotos syndrome; Microdeletion 22q11.2; Noonan syndrome). We can show that a classification accuracy of >75% can be achieved for a computer-based diagnosis among the 10 syndromes, which is about the same accuracy achieved for five syndromes in a previous study. Pairwise discrimination of syndromes ranges from 80 to 99%. Furthermore, we can demonstrate that the criteria used by the computer decisions match clinical observations in many cases. These findings indicate that computer-based picture analysis might be a helpful addition to existing database systems, which are meant to assist in syndrome diagnosis, especially as data acquisition is straightforward and involves off-the-shelf digital camera equipment.

  10. A Research Roadmap for Computation-Based Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  11. A Novel Multiobjective Evolutionary Algorithm Based on Regression Analysis

    Directory of Open Access Journals (Sweden)

    Zhiming Song

    2015-01-01

    Full Text Available As is known, the Pareto set of a continuous multiobjective optimization problem with m objective functions is a piecewise continuous (m-1)-dimensional manifold in the decision space under some mild conditions. However, how to utilize this regularity to design multiobjective optimization algorithms has become a research focus. In this paper, based on this regularity, a model-based multiobjective evolutionary algorithm with regression analysis (MMEA-RA) is put forward to solve continuous multiobjective optimization problems with variable linkages. In the algorithm, the optimization problem is modelled as a promising area in the decision space by a probability distribution, and the centroid of the probability distribution is an (m-1)-dimensional piecewise continuous manifold. The least squares method is used to construct such a model. A selection strategy based on nondominated sorting is used to choose the individuals for the next generation. The new algorithm is tested and compared with NSGA-II and RM-MEDA. The result shows that MMEA-RA outperforms RM-MEDA and NSGA-II on the test instances with variable linkages. At the same time, MMEA-RA has higher efficiency than the other two algorithms. A few shortcomings of MMEA-RA have also been identified and discussed in this paper.

  12. A Research Roadmap for Computation-Based Human Reliability Analysis

    International Nuclear Information System (INIS)

    Boring, Ronald; Mandelli, Diego; Joe, Jeffrey; Smith, Curtis; Groth, Katrina

    2015-01-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  13. Aerodynamic flight evaluation analysis and data base update

    Science.gov (United States)

    Boyle, W. W.; Miller, M. S.; Wilder, G. O.; Reheuser, R. D.; Sharp, R. S.; Bridges, G. I.

    1989-01-01

    Research was conducted to determine the feasibility of replacing the Solid Rocket Boosters on the existing Space Shuttle Launch Vehicle (SSLV) with Liquid Rocket Boosters (LRB). As a part of the LRB selection process, a series of wind tunnel tests were conducted along with aero studies to determine the effects of different LRB configurations on the SSLV. Final results were tabulated into increments and added to the existing SSLV data base. The research conducted in this study was taken from a series of wind tunnel tests conducted at Marshall's 14-inch Trisonic Wind Tunnel. The effects on the axial force (CAF), normal force (CNF), pitching moment (CMF), side force (CY), wing shear force (CSR), wing torque moment (CTR), and wing bending moment (CBR) coefficients were investigated for a number of candidate LRB configurations. The aero effects due to LRB protuberances, ET/LRB separation distance, and aft skirts were also gathered from the tests. Analysis was also conducted to investigate the base pressure and plume effects due to the new booster geometries. The test results found in Phases 1 and 2 of wind tunnel testing are discussed and compared. Preliminary LRB lateral/directional data results and trends are given. The protuberance and gap/skirt effects are discussed. The base pressure/plume effects study is discussed and results are given.

  14. An Application of Monte-Carlo-Based Sensitivity Analysis on the Overlap in Discriminant Analysis

    Directory of Open Access Journals (Sweden)

    S. Razmyan

    2012-01-01

    Full Text Available Discriminant analysis (DA) is used to estimate a discriminant function by minimizing group misclassifications in order to predict the group membership of newly sampled data. A major source of misclassification in DA is the overlapping of groups. The uncertainty in the input variables and model parameters needs to be properly characterized in decision making. This study combines DEA-DA with a sensitivity analysis approach to assess the influence of banks' variables on the overall variance of the overlap in a DA, in order to determine which variables are most significant. A Monte-Carlo-based sensitivity analysis is considered for computing the set of first-order sensitivity indices of the variables, so as to estimate the contribution of each uncertain variable. The results show that the uncertainties in the loans granted and in the different deposit variables are more significant than uncertainties in other banks' variables in decision making.
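
    A minimal sketch of the Monte-Carlo estimation of first-order sensitivity indices is given below (NumPy only). It uses the pick-freeze (Saltelli-type) estimator on a standard test function as a stand-in for the DEA-DA overlap measure; the model and sample sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

def model(x):
    """Toy stand-in for the quantity of interest (Ishigami-like test function)."""
    return (np.sin(x[:, 0]) + 7 * np.sin(x[:, 1]) ** 2
            + 0.1 * x[:, 2] ** 4 * np.sin(x[:, 0]))

d, n = 3, 100_000
A = rng.uniform(-np.pi, np.pi, size=(n, d))   # two independent input samples
B = rng.uniform(-np.pi, np.pi, size=(n, d))
fA, fB = model(A), model(B)
total_var = np.var(np.concatenate([fA, fB]))

# First-order Sobol' indices via the pick-freeze (Saltelli) estimator.
for i in range(d):
    AB_i = A.copy()
    AB_i[:, i] = B[:, i]                      # replace only the i-th column
    S_i = np.mean(fB * (model(AB_i) - fA)) / total_var
    print(f"S_{i + 1} ≈ {S_i:.3f}")
```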

  15. Canonical correlation analysis for gene-based pleiotropy discovery.

    Directory of Open Access Journals (Sweden)

    Jose A Seoane

    2014-10-01

    Full Text Available Genome-wide association studies have identified a wealth of genetic variants involved in complex traits and multifactorial diseases. There is now considerable interest in testing variants for association with multiple phenotypes (pleiotropy) and for testing multiple variants for association with a single phenotype (gene-based association tests). Such approaches can increase statistical power by combining evidence for association over multiple phenotypes or genetic variants respectively. Canonical Correlation Analysis (CCA) measures the correlation between two sets of multidimensional variables, and thus offers the potential to combine these two approaches. To apply CCA, we must restrict the number of attributes relative to the number of samples. Hence we consider modules of genetic variation that can comprise a gene, a pathway or another biologically relevant grouping, and/or a set of phenotypes. In order to do this, we use an attribute selection strategy based on a binary genetic algorithm. Applied to a UK-based prospective cohort study of 4286 women (the British Women's Heart and Health Study), we find improved statistical power in the detection of previously reported genetic associations, and identify a number of novel pleiotropic associations between genetic variants and phenotypes. New discoveries include gene-based association of NSF with triglyceride levels and several genes (ACSM3, ERI2, IL18RAP, IL23RAP and NRG1) with left ventricular hypertrophy phenotypes. In multiple-phenotype analyses we find association of NRG1 with left ventricular hypertrophy phenotypes, fibrinogen and urea and pleiotropic relationships of F7 and F10 with Factor VII, Factor IX and cholesterol levels.
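
    The core CCA step can be sketched as follows with scikit-learn. The "module" of variants and the phenotype block below are synthetic stand-ins; the sizes, names and the injected shared signal are assumptions made purely for illustration.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(3)

# Hypothetical module: SNP dosages (0/1/2) for one gene vs. a block of phenotypes.
n_samples, n_snps, n_phenos = 500, 12, 4
latent = rng.normal(size=n_samples)                      # shared signal
X = rng.binomial(2, 0.3, size=(n_samples, n_snps)).astype(float)
X[:, 0] += latent                                        # one variant carries the signal
Y = rng.normal(size=(n_samples, n_phenos))
Y[:, 0] += 0.8 * latent                                  # one phenotype carries the signal

cca = CCA(n_components=1).fit(X, Y)
U, V = cca.transform(X, Y)
r = np.corrcoef(U[:, 0], V[:, 0])[0, 1]
print(f"first canonical correlation ≈ {r:.3f}")
```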

  16. Functional analysis-based interventions for challenging behaviour in dementia.

    Science.gov (United States)

    Moniz Cook, Esme D; Swift, Katie; James, Ian; Malouf, Reem; De Vugt, Marjolein; Verhey, Frans

    2012-02-15

    Functional analysis (FA) for the management of challenging behaviour is a promising behavioural intervention that involves exploring the meaning or purpose of an individual's behaviour. It extends the 'ABC' approach of behavioural analysis, to overcome the restriction of having to derive a single explanatory hypothesis for the person's behaviour. It is seen as a first line alternative to traditional pharmacological management for agitation and aggression. FA typically requires the therapist to develop and evaluate hypotheses-driven strategies that aid family and staff caregivers to reduce or resolve a person's distress and its associated behavioural manifestations. To assess the effects of functional analysis-based interventions for people with dementia (and their caregivers) living in their own home or in other settings. We searched ALOIS: the Cochrane Dementia and Cognitive Improvement Group's Specialized Register on 3 March 2011 using the terms: FA, behaviour (intervention, management, modification), BPSD, psychosocial and Dementia. Randomised controlled trials (RCTs) with reported behavioural outcomes that could be associated with functional analysis for the management of challenging behaviour in dementia. Four reviewers selected trials for inclusion. Two reviewers worked independently to extract data and assess trial quality, including bias. Meta-analyses for reported incidence, frequency, severity of care recipient challenging behaviour and mood (primary outcomes) and caregiver reaction, burden and mood were performed. Details of adverse effects were noted. Eighteen trials are included in the review. The majority were in family care settings. For fourteen studies, FA was just one aspect of a broad multi-component programme of care. Assessing the effect of FA was compromised by ill-defined protocols for the duration of component parts of these programmes (i.e. frequency of the intervention or actual time spent). Therefore, establishing the real effect of the

  17. Rasch model based analysis of the Force Concept Inventory

    Directory of Open Access Journals (Sweden)

    Maja Planinic

    2010-03-01

    Full Text Available The Force Concept Inventory (FCI) is an important diagnostic instrument which is widely used in the field of physics education research. It is therefore very important to evaluate and monitor its functioning using different tools for statistical analysis. One such tool is the stochastic Rasch model, which enables construction of linear measures for persons and items from raw test scores and which can provide important insight into the structure and functioning of the test (how item difficulties are distributed within the test, how well the items fit the model, and how well the items work together to define the underlying construct). The data for the Rasch analysis come from the large-scale research conducted in 2006-07, which investigated Croatian high school students' conceptual understanding of mechanics on a representative sample of 1676 students (age 17-18 years). The instrument used in the research was the FCI. The average FCI score for the whole sample was found to be (27.7±0.4)%, indicating that most of the students were still non-Newtonians at the end of high school, despite the fact that physics is a compulsory subject in Croatian schools. The large set of obtained data was analyzed with the Rasch measurement computer software WINSTEPS 3.66. Since the FCI is routinely used as pretest and post-test on two very different types of population (non-Newtonian and predominantly Newtonian), an additional predominantly Newtonian sample (N=141, average FCI score of 64.5%) of first-year students enrolled in an introductory physics course at the University of Zagreb was also analyzed. The Rasch model based analysis suggests that the FCI has succeeded in defining a sufficiently unidimensional construct for each population. The analysis of fit of data to the model found no grossly misfitting items which would degrade measurement. Some items with larger misfit and items with significantly different difficulties in the two samples of students do require further
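
    To make the Rasch machinery concrete, here is a rough NumPy sketch that simulates dichotomous FCI-like responses and recovers item difficulties by a crude joint maximum-likelihood procedure; the simulated parameters and the estimation routine are illustrative and are not the WINSTEPS analysis used in the study.

```python
import numpy as np

rng = np.random.default_rng(8)

# Simulate dichotomous FCI-like responses under an assumed Rasch model.
n_persons, n_items = 1676, 30
theta_true = rng.normal(0, 1, n_persons)        # person abilities (logits)
beta_true = rng.normal(0, 1, n_items)           # item difficulties (logits)
p_true = 1 / (1 + np.exp(-(theta_true[:, None] - beta_true[None, :])))
X = (rng.uniform(size=p_true.shape) < p_true).astype(float)

# Drop zero and perfect scores, which have no finite ability estimate.
score = X.sum(axis=1)
X = X[(score > 0) & (score < n_items)]

# Crude joint maximum-likelihood estimation with damped Newton updates.
theta = np.zeros(X.shape[0])
beta = np.zeros(n_items)
for _ in range(200):
    p = 1 / (1 + np.exp(-(theta[:, None] - beta[None, :])))
    theta += 0.5 * (X - p).sum(axis=1) / (p * (1 - p)).sum(axis=1)
    p = 1 / (1 + np.exp(-(theta[:, None] - beta[None, :])))
    beta -= 0.5 * (X - p).sum(axis=0) / (p * (1 - p)).sum(axis=0)
    beta -= beta.mean()                         # anchor the scale origin

print("correlation of estimated vs. simulated item difficulties:",
      round(float(np.corrcoef(beta, beta_true)[0, 1]), 3))
```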

  18. Identification and annotation of erotic film based on content analysis

    Science.gov (United States)

    Wang, Donghui; Zhu, Miaoliang; Yuan, Xin; Qian, Hui

    2005-02-01

    The paper brings forward a new method for identifying and annotating erotic films based on content analysis. First, the film is decomposed into video and audio streams. Then, the video stream is segmented into shots and key frames are extracted from each shot. We filter the shots that include potential erotic content by finding the nude human body in key frames. A Gaussian model in YCbCr color space for detecting skin regions is presented. An external polygon that covers the skin regions is used as an approximation of the human body. Finally, the degree of nudity is given by calculating the ratio of skin area to whole body area with weighted parameters. The experimental results show the effectiveness of our method.
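
    The skin-detection step can be sketched with a simple Gaussian chrominance model. In the snippet below (NumPy only), the mean vector and covariance of the Cb/Cr skin model are assumed values for demonstration, not the parameters fitted by the authors.

```python
import numpy as np

# Assumed Gaussian skin model in (Cb, Cr); mean and covariance are illustrative,
# not the values fitted in the paper.
SKIN_MEAN = np.array([110.0, 155.0])
SKIN_COV_INV = np.linalg.inv(np.array([[160.0, 20.0], [20.0, 120.0]]))

def rgb_to_cbcr(img):
    """Return the Cb/Cr chrominance planes of an RGB uint8 image (ITU-R BT.601)."""
    r, g, b = (img[..., c].astype(float) for c in range(3))
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return cb, cr

def skin_mask(img, threshold=4.0):
    """Pixels whose squared Mahalanobis distance to the skin model is below threshold."""
    cb, cr = rgb_to_cbcr(img)
    d = np.stack([cb - SKIN_MEAN[0], cr - SKIN_MEAN[1]], axis=-1)
    m2 = np.einsum('...i,ij,...j->...', d, SKIN_COV_INV, d)
    return m2 < threshold

if __name__ == "__main__":
    frame = np.random.randint(0, 256, size=(120, 160, 3), dtype=np.uint8)  # stand-in key frame
    mask = skin_mask(frame)
    print(f"skin-coloured pixel ratio: {mask.mean():.3f}")
```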

  19. Analysis of rocket flight stability based on optical image measurement

    Science.gov (United States)

    Cui, Shuhua; Liu, Junhu; Shen, Si; Wang, Min; Liu, Jun

    2018-02-01

    Based on the abundant optical image measurement data available from optical measurement stations, this paper puts forward a method for evaluating rocket flight stability using measurements of the carrier rocket's imaged characteristics. On the basis of these measured characteristics, the attitude parameters of the rocket body in the coordinate system are calculated from the measurement data of multiple high-speed television systems; the parameters are then converted to the rocket body angle of attack, and it is assessed whether the rocket exhibits good flight stability by flying with a small angle of attack. The measurement method and the mathematical algorithm were verified through a data processing test, in which the rocket flight stability state can be observed intuitively and guidance system faults can be identified visually for failure analysis.

  20. Analysis of Environmental Law Enforcement Mechanism Based on Economic Principle

    Science.gov (United States)

    Cao, Hongjun; Shao, Haohao; Cai, Xuesen

    2017-11-01

    Strengthening and improving the environmental law enforcement mechanism is an important way to protect the ecological environment. Based on economic principles, this paper analyses how the marginal management costs (following Pigou) and the marginal transaction costs (following Coase) vary with the growth in the number of pollutant-discharging enterprises. From this analysis, the following conclusions were drawn. In the process of strengthening the environmental law enforcement mechanism, all parties involved in environmental law enforcement, such as legislative bodies and law enforcement agencies, public welfare organizations, television, newspapers, enterprises and the public, should first be fully mobilized and formed into a reasonable, organic structural system; various management means, such as government regulation, legal sanctions, fines, persuasion and public censure, should then be used and likewise organized into an organic structural system.

  1. Maintenance management of railway infrastructures based on reliability analysis

    International Nuclear Information System (INIS)

    Macchi, Marco; Garetti, Marco; Centrone, Domenico; Fumagalli, Luca; Piero Pavirani, Gian

    2012-01-01

    Railway infrastructure maintenance plays a crucial role in rail transport. It aims at guaranteeing the safety of operations and the availability of railway tracks and related equipment for traffic regulation. Moreover, it is a major cost of rail transport operations. Thus, increased competition in the traffic market calls for maintenance improvement, aiming at reducing maintenance expenditures while keeping the safety of operations. This issue is addressed by the methodology presented in the paper. The first step of the methodology consists of a family-based approach for the equipment reliability analysis; its purpose is the identification of families of railway items which can be given the same reliability targets. The second step builds the reliability model of the railway system for identifying the most critical items, given a required service level for the transportation system. The two methods have been implemented and tested in practical case studies, in the context of Rete Ferroviaria Italiana, the Italian public limited company for railway transportation.

  2. Aeromagnetic Compensation Algorithm Based on Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    Peilin Wu

    2018-01-01

    Full Text Available Aeromagnetic exploration is an important exploration method in geophysics. The data are typically measured by an optically pumped magnetometer mounted on an aircraft. But any aircraft produces significant levels of magnetic interference. Therefore, aeromagnetic compensation is important in aeromagnetic exploration. However, multicollinearity of the aeromagnetic compensation model degrades the performance of the compensation. To address this issue, a novel aeromagnetic compensation method based on principal component analysis is proposed. Using the algorithm, the correlation in the feature matrix is eliminated and the principal components are used to construct the hyperplane that compensates for the platform-generated magnetic fields. The algorithm was tested using a helicopter, and the obtained improvement ratio is 9.86. The compensation quality is almost the same as, or slightly better than, that of ridge regression. The validity of the proposed method was experimentally demonstrated.
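
    A minimal sketch of the idea is given below, assuming NumPy and a synthetic feature matrix with deliberately collinear columns standing in for the real Tolles-Lawson terms: the interference is regressed on the leading principal-component scores rather than on the raw features, which removes the multicollinearity before compensation. All quantities are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic feature matrix with deliberately collinear columns (direction cosines
# and near-duplicates stand in for the real Tolles-Lawson terms).
n = 2000
u = rng.normal(size=(n, 3))
features = np.hstack([u, u + 0.01 * rng.normal(size=(n, 3))])
interference = features @ rng.normal(size=6) + 0.05 * rng.normal(size=n)
measured = 50_000.0 + interference              # total magnetic field (nT)

# PCA on the standardized feature matrix removes the multicollinearity.
Z = (features - features.mean(axis=0)) / features.std(axis=0)
_, s, Vt = np.linalg.svd(Z, full_matrices=False)
keep = s > 0.05 * s[0]                          # drop near-duplicate directions
scores = Z @ Vt[keep].T

# Least-squares fit of the measured field on the principal-component scores.
coef, *_ = np.linalg.lstsq(scores, measured - measured.mean(), rcond=None)
compensated = measured - scores @ coef

print(f"improvement ratio ≈ {np.std(measured) / np.std(compensated):.1f}")
```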

  3. Role based access control design using Triadic concept analysis

    Institute of Scientific and Technical Information of China (English)

    Ch Aswani Kumar; S Chandra Mouliswaran; LI Jin-hai; C Chandrasekar

    2016-01-01

    Role based access control is one of the widely used access control models. There are investigations in the literature that use knowledge representation mechanisms such as formal concept analysis (FCA), description logics, and Ontology for representing access control mechanism. However, while using FCA, investigations reported in the literature so far work on the logic that transforms the three dimensional access control matrix into dyadic formal contexts. This transformation is mainly to derive the formal concepts, lattice structure and implications to represent role hierarchy and constraints of RBAC. In this work, we propose a methodology that models RBAC using triadic FCA without transforming the triadic access control matrix into dyadic formal contexts. Our discussion is on two lines of inquiry. We present how triadic FCA can provide a suitable representation of RBAC policy and we demonstrate how this representation follows role hierarchy and constraints of RBAC on sample healthcare network available in the literature.

  4. Structural Optimization based on the Concept of First Order Analysis

    International Nuclear Information System (INIS)

    Shinji, Nishiwaki; Hidekazu, Nishigaki; Yasuaki, Tsurumi; Yoshio, Kojima; Noboru, Kikuchi

    2002-01-01

    Computer Aided Engineering (CAE) has been successfully utilized in mechanical industries such as the automotive industry. It is, however, difficult for most mechanical design engineers to directly use CAE due to the sophisticated nature of the operations involved. In order to mitigate this problem, a new type of CAE, First Order Analysis (FOA) has been proposed. This paper presents the outcome of research concerning the development of a structural topology optimization methodology within FOA. This optimization method is constructed based on discrete and function-oriented elements such as beam and panel elements, and sequential convex programming. In addition, examples are provided to show the utility of the methodology presented here for mechanical design engineers

  5. Knowledge-based requirements analysis for automating software development

    Science.gov (United States)

    Markosian, Lawrence Z.

    1988-01-01

    We present a new software development paradigm that automates the derivation of implementations from requirements. In this paradigm, informally-stated requirements are expressed in a domain-specific requirements specification language. This language is machine-understandable, and requirements expressed in it are captured in a knowledge base. Once the requirements are captured, more detailed specifications and eventually implementations are derived by the system using transformational synthesis. A key characteristic of the process is that the required human intervention is in the form of providing problem- and domain-specific engineering knowledge, not in writing detailed implementations. We describe a prototype system that applies the paradigm in the realm of communication engineering: the prototype automatically generates implementations of buffers following analysis of the requirements on each buffer.

  6. Multivariate analysis of eigenvalues and eigenvectors in tensor based morphometry

    Science.gov (United States)

    Rajagopalan, Vidya; Schwartzman, Armin; Hua, Xue; Leow, Alex; Thompson, Paul; Lepore, Natasha

    2015-01-01

    We develop a new algorithm to compute voxel-wise shape differences in tensor-based morphometry (TBM). As in standard TBM, we non-linearly register brain T1-weighted MRI data from a patient and control group to a template, and compute the Jacobian of the deformation fields. In standard TBM, the determinants of the Jacobian matrix at each voxel are statistically compared between the two groups. More recently, a multivariate extension of the statistical analysis involving the deformation tensors derived from the Jacobian matrices has been shown to improve statistical detection power. However, multivariate methods comprising large numbers of variables are computationally intensive and may be subject to noise. In addition, the anatomical interpretation of results is sometimes difficult. Here instead, we analyze the eigenvalues and the eigenvectors of the Jacobian matrices. Our method is validated on brain MRI data from Alzheimer's patients and healthy elderly controls from the Alzheimer's Disease Neuroimaging Database.

  7. Analysis of valve failures from the NUCLARR data base

    International Nuclear Information System (INIS)

    Moore, L.M.

    1997-11-01

    The Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR) contains data on component failures with categorical and qualifying information such as component design, normal operating state, system application and safety grade information which is important to the development of risk-based component surveillance testing requirements. This report presents descriptions and results of analyses of valve component failure data and covariate information available in the document Nuclear Computerized Library for Assessing Reactor Reliability Data Manual, Part 3: Hardware Component Failure Data (NUCLARR Data Manual). Although there are substantial records on valve performance, there are many categories of the corresponding descriptors and qualifying information for which specific values are missing. Consequently, this limits the data available for analysis of covariate effects. This report presents cross tabulations by different covariate categories and limited modeling of covariate effects for data subsets with substantive non-missing covariate information

  8. Analysis and research of PWR instrument commissioning based on Simulink

    International Nuclear Information System (INIS)

    Luan Zhenhua; Liu Daoguang; Qiu Shaoshuai; Yang Zongwei; Feng Guangyu

    2013-01-01

    Based on the Simulink platform, mathematical models of the lead-lag element, differential unit and arithmetic logic module are built. Considering the specific problems encountered in field debugging work, the model is applied to the analysis of key modules and controllers and to the resolution of eddy current problems. The dynamic control characteristics of the test process are simulated to analyze the trend of the actual response, and a simulation study is conducted to propose concrete solutions. The actual debugging process proved that using simulation technology to find problems, optimize the control data and adjust the control strategy is very important for early problem detection, and helps to speed up the test process, shorten the debugging duration and increase debugging quality. (authors)

  9. Invariant moments based convolutional neural networks for image analysis

    Directory of Open Access Journals (Sweden)

    Vijayalakshmi G.V. Mahesh

    2017-01-01

    Full Text Available The paper proposes a method using convolutional neural networks to effectively evaluate discrimination between face and non-face patterns, gender classification using facial images, and facial expression recognition. The novelty of the method lies in the utilization of initial trainable convolution kernel coefficients derived from the Zernike moments by varying the moment order. The performance of the proposed method was compared with a convolutional neural network architecture that used random kernels as initial training parameters. The multilevel configuration of Zernike moments was significant in extracting the shape information suitable for hierarchical feature learning to carry out image analysis and classification. Furthermore, the results showed outstanding performance of the Zernike-moment-based kernels in terms of computation time and classification accuracy.

  10. Web Pages Content Analysis Using Browser-Based Volunteer Computing

    Directory of Open Access Journals (Sweden)

    Wojciech Turek

    2013-01-01

    Full Text Available Existing solutions to the problem of finding valuable information on the Web suffer from several limitations like simplified query languages, out-of-date information or arbitrary results sorting. In this paper a different approach to this problem is described. It is based on the idea of distributed processing of Web pages content. To provide sufficient performance, the idea of browser-based volunteer computing is utilized, which requires the implementation of text processing algorithms in JavaScript. In this paper the architecture of the Web pages content analysis system is presented, details concerning the implementation of the system and the text processing algorithms are described, and test results are provided.

  11. Viscoelastic Plate Analysis Based on Gâteaux Differential

    Directory of Open Access Journals (Sweden)

    Kadıoğlu Fethi

    2016-01-01

    Full Text Available This study aims to analyze the quasi-static response of viscoelastic Kirchhoff plates with a mixed finite element formulation based on the Gâteaux differential. Although the static response of elastic plate, beam and shell structures is a widely studied topic, few studies in the literature pertain to the analysis of viscoelastic structural elements, especially those with complex geometries, loading conditions and constitutive relations. The developed mixed finite element model in the transformed Laplace-Carson space has four unknowns, namely the displacement and the bending and twisting moments, in addition to the dynamic and geometric boundary condition terms. A four-parameter solid model is employed for modelling the viscoelastic behaviour. For transformation of the solutions obtained in the Laplace-Carson domain to the time domain, different numerical inverse transform techniques are employed. The developed solution technique is applied to several quasi-static example problems to verify the suggested numerical procedure.

  12. Analysis of equivalent antenna based on FDTD method

    Directory of Open Access Journals (Sweden)

    Yun-xing Yang

    2014-09-01

    Full Text Available An equivalent microstrip antenna used in a radio proximity fuse is presented. The design of this antenna is based on a multilayer multi-permittivity dielectric substrate, which is analyzed by the finite difference time domain (FDTD) method. The equivalent iterative formula is modified for the cylindrical coordinate system. The mixed substrate, which contains two kinds of media (one of them is air), takes the place of the original single substrate. The results of the equivalent antenna simulation show that the resonant frequency of the equivalent antenna is similar to that of the original antenna. The analysis can be validated by means of the antenna resonant frequency formula. The two antennas have the same radiation pattern and similar gain. This method can be used to reduce the weight of the antenna, which is significant for the design of missile-borne antennas.

  13. Case studies: Risk-based analysis of technical specifications

    International Nuclear Information System (INIS)

    Wagner, D.P.; Minton, L.A.; Gaertner, J.P.

    1987-01-01

    The SOCRATES computer program uses the results of a Probabilistic Risk Assessment (PRA) or a system level risk analysis to calculate changes in risk due to changes in the surveillance test interval and/or the allowed outage time stated in the technical specification. The computer program can accommodate various testing strategies (such as staggered or simultaneous testing) to allow modeling of component testing as it is carried out at a plant. The methods and computer program are an integral part of a larger decision process aimed at determining benefits from technical specification changes. These benefits can include cost savings to the utilities by reducing forced shutdowns with no adverse impacts on risk. Three summaries of case study applications are included to demonstrate the types of results that can be achieved through risk-based evaluation of technical specifications. (orig.)

  14. Image Registration Algorithm Based on Parallax Constraint and Clustering Analysis

    Science.gov (United States)

    Wang, Zhe; Dong, Min; Mu, Xiaomin; Wang, Song

    2018-01-01

    To resolve the problems of slow computation speed and low matching accuracy in image registration, a new image registration algorithm based on a parallax constraint and clustering analysis is proposed. Firstly, the Harris corner detection algorithm is used to extract the feature points of the two images. Secondly, the Normalized Cross Correlation (NCC) function is used to perform approximate matching of the feature points, yielding the initial feature pairs. Then, according to the parallax constraint condition, the initial feature pairs are preprocessed by the K-means clustering algorithm, which removes feature point pairs with obvious errors introduced in the approximate matching process. Finally, the Random Sample Consensus (RANSAC) algorithm is adopted to optimize the feature points and obtain the final feature point matching result, realizing fast and accurate image registration. The experimental results show that the image registration algorithm proposed in this paper can improve the accuracy of image matching while ensuring the real-time performance of the algorithm.

  15. Experimental Bifurcation Analysis Using Control-Based Continuation

    DEFF Research Database (Denmark)

    Bureau, Emil; Starke, Jens

    The focus of this thesis is developing and implementing techniques for performing experimental bifurcation analysis on nonlinear mechanical systems. The research centers around the newly developed control-based continuation method, which allows to systematically track branches of stable and unstable equilibria under variation of parameters. As a test case we demonstrate that it is possible to track the complete frequency response, including the unstable branches, for a harmonically forced impact oscillator with hardening spring nonlinearity, controlled by electromagnetic actuators. The method... the resulting behavior, we propose and test three different methods for assessing stability of equilibrium states during experimental continuation. We show that it is possible to determine the stability without allowing unbounded divergence, and that it is under certain circumstances possible to quantify...

  16. Road Network Vulnerability Analysis Based on Improved Ant Colony Algorithm

    Directory of Open Access Journals (Sweden)

    Yunpeng Wang

    2014-01-01

    Full Text Available We present an improved ant colony algorithm-based approach to assess the vulnerability of a road network and identify the critical infrastructures. This approach improves computational efficiency and allows for its applications in large-scale road networks. This research involves defining the vulnerability conception, modeling the traffic utility index and the vulnerability of the road network, and identifying the critical infrastructures of the road network. We apply the approach to a simple test road network and a real road network to verify the methodology. The results show that vulnerability is directly related to traffic demand and increases significantly when the demand approaches capacity. The proposed approach reduces the computational burden and may be applied in large-scale road network analysis. It can be used as a decision-supporting tool for identifying critical infrastructures in transportation planning and management.

  17. Iris recognition based on robust principal component analysis

    Science.gov (United States)

    Karn, Pradeep; He, Xiao Hai; Yang, Shuai; Wu, Xiao Hong

    2014-11-01

    Iris images acquired under different conditions often suffer from blur, occlusion due to eyelids and eyelashes, specular reflection, and other artifacts. Existing iris recognition systems do not perform well on these types of images. To overcome these problems, we propose an iris recognition method based on robust principal component analysis. The proposed method decomposes all training images into a low-rank matrix and a sparse error matrix, where the low-rank matrix is used for feature extraction. The sparsity concentration index approach is then applied to validate the recognition result. Experimental results using the CASIA V4 and IIT Delhi V1 iris image databases showed that the proposed method achieved competitive performances in both recognition accuracy and computational efficiency.

  18. Skull base chordomas: analysis of dose-response characteristics

    International Nuclear Information System (INIS)

    Niemierko, Andrzej; Terahara, Atsuro; Goitein, Michael

    1997-01-01

    Objective: To extract dose-response characteristics from dose-volume histograms and corresponding actuarial survival statistics for 115 patients with skull base chordomas. Materials and Methods: We analyzed data for 115 patients with skull base chordoma treated with combined photon and proton conformal radiotherapy to doses in the range 66.6Gy - 79.2Gy. Data set for each patient included gender, histology, age, tumor volume, prescribed dose, overall treatment time, time to recurrence or time to last observation, target dose-volume histogram, and several dosimetric parameters (minimum/mean/median/maximum target dose, percent of the target volume receiving the prescribed dose, dose to 90% of the target volume, and the Equivalent Uniform Dose (EUD). Data were analyzed using the Kaplan-Meier survivor function estimate, the proportional hazards (Cox) model, and parametric modeling of the actuarial probability of recurrence. Parameters of dose-response characteristics were obtained using the maximum likelihood method. Results: Local failure developed in 42 (36%) of patients, with actuarial local control rates at 5 years of 59.2%. The proportional hazards model revealed significant dependence of gender on the probability of recurrence, with female patients having significantly poorer prognosis (hazard ratio of 2.3 with the p value of 0.008). The Wilcoxon and the log-rank tests of the corresponding Kaplan-Meier recurrence-free survival curves confirmed statistical significance of this effect. The Cox model with stratification by gender showed significance of tumor volume (p=0.01), the minimum target dose (p=0.02), and the EUD (p=0.02). Other parameters were not significant at the α level of significance of 0.05, including the prescribed dose (p=0.21). Parametric analysis using a combined model of tumor control probability (to account for non-uniformity of target dose distribution) and the Weibull failure time model (to account for censoring) allowed us to estimate

  19. Cross-covariance based global dynamic sensitivity analysis

    Science.gov (United States)

    Shi, Yan; Lu, Zhenzhou; Li, Zhao; Wu, Mengmeng

    2018-02-01

    To identify the cross-covariance source of the dynamic output at each time instant for structural systems involving both input random variables and stochastic processes, a global dynamic sensitivity (GDS) technique is proposed. The GDS considers the effect of time-history inputs on the dynamic output. In the GDS, a cross-covariance decomposition is first developed to measure the contribution of the inputs to the output at different time instants, and an integration of the cross-covariance change over a specific time interval is employed to measure the whole contribution of an input to the cross-covariance of the output. The GDS main effect indices and the GDS total effect indices can then be easily defined after the integration, and they are effective in identifying, respectively, the important inputs and the non-influential inputs for the cross-covariance of the output at each time instant. The established GDS analysis model has the same form as the classical ANOVA when it degenerates to the static case. After degeneration, the first-order partial effect reflects the individual effects of the inputs on the output variance, and the second-order partial effect reflects the interaction effects on the output variance, which illustrates the consistency of the proposed GDS indices with the classical variance-based sensitivity indices. An MCS procedure and the Kriging surrogate method are developed to compute the proposed GDS indices. Several examples are introduced to illustrate the significance of the proposed GDS analysis technique and the effectiveness of the proposed solution.

  20. Structural reliability analysis based on the cokriging technique

    International Nuclear Information System (INIS)

    Zhao Wei; Wang Wei; Dai Hongzhe; Xue Guofeng

    2010-01-01

    Approximation methods are widely used in structural reliability analysis because they are simple to create and provide explicit functional relationships between the responses and variables instead of the implicit limit state function. Recently, the kriging method, a semi-parametric interpolation technique that can be used for deterministic optimization and structural reliability, has gained popularity. However, to fully exploit the kriging method, especially in high-dimensional problems, a large number of sample points should be generated to fill the design space, and this can be very expensive and even impractical in practical engineering analysis. Therefore, in this paper a new method, the cokriging method, which is an extension of kriging, is proposed to calculate the structural reliability. The cokriging approximation incorporates secondary information such as the values of the gradients of the function being approximated. This paper explores the use of the cokriging method for structural reliability problems by comparing it with the kriging method on some numerical examples. The results indicate that the cokriging procedure described in this work can generate approximation models that improve the accuracy and efficiency of structural reliability analysis, and is a viable alternative to kriging.

  1. Hydrochemical analysis of groundwater using a tree-based model

    Science.gov (United States)

    Litaor, M. Iggy; Brielmann, H.; Reichmann, O.; Shenker, M.

    2010-06-01

    Hydrochemical indices are commonly used to ascertain aquifer characteristics, salinity problems, anthropogenic inputs and resource management, among others. This study was conducted to test the applicability of a binary decision tree model to aquifer evaluation using hydrochemical indices as input. The main advantage of the tree-based model compared to other commonly used statistical procedures such as cluster and factor analyses is the ability to classify groundwater samples with assigned probability and the reduction of a large data set into a few significant variables without creating new factors. We tested the model using data sets collected from headwater springs of the Jordan River, Israel. The model evaluation consisted of several levels of complexity, from simple separation between the calcium-magnesium-bicarbonate water type of karstic aquifers to the more challenging separation of calcium-sodium-bicarbonate water type flowing through perched and regional basaltic aquifers. In all cases, the model assigned measures for goodness of fit in the form of misclassification errors and singled out the most significant variable in the analysis. The model proceeded through a sequence of partitions providing insight into different possible pathways and changing lithology. The model results were extremely useful in constraining the interpretation of geological heterogeneity and constructing a conceptual flow model for a given aquifer. The tree model clearly identified the hydrochemical indices that were excluded from the analysis, thus providing information that can lead to a decrease in the number of routinely analyzed variables and a significant reduction in laboratory cost.
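
    A compact sketch of a binary decision tree applied to synthetic hydrochemical indices is shown below (scikit-learn). The feature names, class labels and distributions are invented stand-ins for the indices used in the study; the point is only to show how the tree yields both a partition sequence and class probabilities.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(5)

# Synthetic hydrochemical indices; feature names and distributions are
# invented stand-ins for the indices used in the study.
features = ["Ca_Mg_ratio", "Na_Cl_ratio", "HCO3", "EC"]
n = 300
karst = np.column_stack([rng.normal(2.5, 0.4, n), rng.normal(0.6, 0.2, n),
                         rng.normal(4.0, 0.6, n), rng.normal(450, 60, n)])
basalt = np.column_stack([rng.normal(1.2, 0.3, n), rng.normal(1.4, 0.3, n),
                          rng.normal(2.5, 0.5, n), rng.normal(650, 80, n)])
X = np.vstack([karst, basalt])
y = np.array(["karstic"] * n + ["basaltic"] * n)

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=features))          # the partition sequence
print("class probabilities for one sample:", tree.predict_proba(X[:1]))
```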

  2. Multi-spectrometer calibration transfer based on independent component analysis.

    Science.gov (United States)

    Liu, Yan; Xu, Hao; Xia, Zhenzhen; Gong, Zhiyong

    2018-02-26

    Calibration transfer is indispensable for practical applications of near infrared (NIR) spectroscopy due to the need for precise and consistent measurements across different spectrometers. In this work, a method for multi-spectrometer calibration transfer is described based on independent component analysis (ICA). A spectral matrix is first obtained by aligning the spectra measured on different spectrometers. Then, by using independent component analysis, the aligned spectral matrix is decomposed into the mixing matrix and the independent components of the different spectrometers. The differing measurements between spectrometers can then be standardized by correcting the coefficients within the independent components. Two NIR datasets of corn and edible oil samples, measured with three and four spectrometers respectively, were used to test the reliability of this method. The results for both datasets reveal that spectral measurements across different spectrometers can be transferred simultaneously, and that partial least squares (PLS) models built with the measurements from one spectrometer can make correct predictions on spectra transferred from another.
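
    The decomposition-and-correction idea can be sketched as follows with scikit-learn's FastICA on synthetic two-instrument spectra; the gain/offset instrument mismatch and the per-component standardization used here are assumptions for illustration, not the authors' exact procedure.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(6)

# Synthetic spectra of the same samples on a master and a slave instrument;
# the slave adds an assumed gain error and baseline offset.
n_samples, n_wavelengths = 40, 200
wl = np.linspace(0, 1, n_wavelengths)
pure = np.vstack([np.exp(-((wl - 0.3) / 0.05) ** 2),
                  np.exp(-((wl - 0.7) / 0.08) ** 2)])
conc = rng.uniform(size=(n_samples, 2))
master = conc @ pure + 0.01 * rng.normal(size=(n_samples, n_wavelengths))
slave = 1.05 * master + 0.02 + 0.01 * rng.normal(size=(n_samples, n_wavelengths))

# Decompose the aligned (stacked) spectral matrix into independent components.
ica = FastICA(n_components=2, random_state=0)
ica.fit(np.vstack([master, slave]))
S_master, S_slave = ica.transform(master), ica.transform(slave)

# Standardize the slave coefficients against the master ones, per component.
scale = S_master.std(axis=0) / S_slave.std(axis=0)
shift = S_master.mean(axis=0) - S_slave.mean(axis=0) * scale
slave_corrected = ica.inverse_transform(S_slave * scale + shift)

print("RMS mismatch before:", round(float(np.sqrt(np.mean((slave - master) ** 2))), 4))
print("RMS mismatch after: ", round(float(np.sqrt(np.mean((slave_corrected - master) ** 2))), 4))
```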

  3. CNN Based Retinal Image Upscaling Using Zero Component Analysis

    Science.gov (United States)

    Nasonov, A.; Chesnakov, K.; Krylov, A.

    2017-05-01

    The aim of the paper is to obtain high-quality image upscaling for noisy images, which are typical in medical image processing. A new training scenario for a convolutional neural network based image upscaling method is proposed. Its main idea is a novel dataset preparation method for deep learning. The dataset contains pairs of noisy low-resolution images and corresponding noiseless high-resolution images. To achieve better results at edges and textured areas, Zero Component Analysis is applied to these images. The upscaling results are compared with other state-of-the-art methods like DCCI, SI-3 and SRCNN on noisy medical ophthalmological images. Objective evaluation of the results confirms the high quality of the proposed method. Visual analysis shows that fine details and structures like blood vessels are preserved, the noise level is reduced, and no artifacts or non-existing details are added. These properties are essential for establishing a retinal diagnosis, so the proposed algorithm is recommended to be used in real medical applications.
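
    A minimal NumPy sketch of the Zero Component Analysis (ZCA) whitening step applied to training patches is given below; the patch data are random stand-ins, and the epsilon regularizer is an assumed value.

```python
import numpy as np

def zca_whitening_matrix(patches, eps=1e-5):
    """ZCA whitening matrix for flattened image patches of shape (n_patches, n_pixels)."""
    X = patches - patches.mean(axis=0)
    cov = X.T @ X / X.shape[0]
    U, S, _ = np.linalg.svd(cov)
    return U @ np.diag(1.0 / np.sqrt(S + eps)) @ U.T

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    # Random stand-in for 8x8 training patches with correlated pixels.
    patches = rng.normal(size=(1000, 64)) @ rng.normal(size=(64, 64))
    W = zca_whitening_matrix(patches)
    whitened = (patches - patches.mean(axis=0)) @ W
    cov_w = np.cov(whitened, rowvar=False)
    off_diag = np.abs(cov_w - np.diag(np.diag(cov_w))).max()
    print(f"max off-diagonal covariance after ZCA: {off_diag:.4f}")
```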

  4. Validation test case generation based on safety analysis ontology

    International Nuclear Information System (INIS)

    Fan, Chin-Feng; Wang, Wen-Shing

    2012-01-01

    Highlights: ► Current practice in validation test case generation for nuclear system is mainly ad hoc. ► This study designs a systematic approach to generate validation test cases from a Safety Analysis Report. ► It is based on a domain-specific ontology. ► Test coverage criteria have been defined and satisfied. ► A computerized toolset has been implemented to assist the proposed approach. - Abstract: Validation tests in the current nuclear industry practice are typically performed in an ad hoc fashion. This study presents a systematic and objective method of generating validation test cases from a Safety Analysis Report (SAR). A domain-specific ontology was designed and used to mark up a SAR; relevant information was then extracted from the marked-up document for use in automatically generating validation test cases that satisfy the proposed test coverage criteria; namely, single parameter coverage, use case coverage, abnormal condition coverage, and scenario coverage. The novelty of this technique is its systematic rather than ad hoc test case generation from a SAR to achieve high test coverage.

  5. Comparison of wind turbines based on power curve analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-02-01

    In the study, measured power curves for 46 wind turbines were analyzed with the purpose of establishing the basis for a consistent comparison of the efficiency of the wind turbines. Emphasis is on wind turbines above 500 kW rated power, with power curves measured after 1994 according to international recommendations. The available power curves fulfilling these requirements were smoothened according to a procedure developed for the purpose, in such a way that the smoothened power curves are equally as representative as the measured curves. The resulting smoothened power curves are presented in a standardized format for the subsequent processing. Using wind turbine data from the power curve documentation, the analysis results in curves for specific energy production (kWh/m²/yr) versus specific rotor load (kW/m²) for a range of mean wind speeds. On this basis, generalized curves for specific annual energy production versus specific rotor load are established for a number of generalized wind turbine concepts. The 46 smoothened standardized power curves presented in the report, the procedure developed to establish them, and the results of the analysis based on them are aimed at providers of measured power curves as well as users of them, including manufacturers, advisors and decision makers. (au)
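
    As an illustration of the kind of quantity tabulated (a hedged sketch, not the report's procedure or data), specific annual energy production can be estimated by integrating a power curve over an assumed Rayleigh wind-speed distribution and dividing by rotor area. The turbine parameters below are hypothetical.

        import numpy as np

        rotor_diameter = 44.0                            # m (hypothetical 600 kW turbine)
        rotor_area = np.pi * (rotor_diameter / 2) ** 2   # m^2
        v = np.arange(0.0, 26.0, 1.0)                    # wind speed bins, m/s
        # hypothetical smoothed power curve, kW (cut-in 4 m/s, rated 600 kW, cut-out 25 m/s)
        p = np.clip((v - 4) / (14 - 4), 0, 1) ** 3 * 600.0
        p[v >= 25] = 0.0

        def rayleigh_pdf(v, v_mean):
            # Rayleigh wind-speed distribution parameterized by the site mean wind speed
            return (np.pi * v / (2 * v_mean ** 2)) * np.exp(-np.pi * v ** 2 / (4 * v_mean ** 2))

        for v_mean in (5.0, 7.0, 9.0):
            aep_kwh = np.trapz(p * rayleigh_pdf(v, v_mean), v) * 8760.0   # kWh per year
            print(f"mean wind {v_mean} m/s: {aep_kwh / rotor_area:.0f} kWh/m^2/yr "
                  f"at {600.0 / rotor_area:.3f} kW/m^2 specific rotor load")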

  6. Entropy-based model for miRNA isoform analysis.

    Directory of Open Access Journals (Sweden)

    Shengqin Wang

    Full Text Available MiRNAs have been widely studied due to their important post-transcriptional regulatory roles in gene expression. Many reports have demonstrated the evidence of miRNA isoform products (isomiRs) in high-throughput small RNA sequencing data. However, the biological function of these molecules is still not well investigated. Here, we developed a Shannon entropy-based model to estimate isomiR expression profiles of high-throughput small RNA sequencing data extracted from the miRBase webserver. By using the Kolmogorov-Smirnov statistical test (KS test), we demonstrated that the 5p and 3p miRNAs present more variants than the single-arm miRNAs. We also found that the isomiR variants, except the 3' isomiR variant, are strongly correlated with the Minimum Free Energy (MFE) of the pre-miRNA, suggesting that intrinsic features of the pre-miRNA are important factors in miRNA regulation. The functional enrichment analysis showed that the miRNAs with high variation, particularly 5' end variation, are enriched in a set of critical functions, supporting the view that these molecules are not randomly produced. Our results provide a probabilistic framework for miRNA isoform analysis, and give functional insights into pre-miRNA processing.
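
    A minimal sketch of the two computational ingredients (hypothetical read counts, not the published pipeline): the Shannon entropy of an isomiR read-count distribution, and a two-sample KS test comparing entropy values between two groups of miRNAs.

        import numpy as np
        from scipy.stats import entropy, ks_2samp

        def isomir_entropy(counts):
            """Shannon entropy (bits) of the isoform read-count distribution of one miRNA."""
            p = np.asarray(counts, dtype=float)
            return entropy(p / p.sum(), base=2)

        # toy read counts per isoform for a 5p/3p miRNA and a single-arm miRNA
        mirna_5p3p = [120, 80, 40, 30, 10, 5]
        mirna_single = [500, 20, 5]
        print(isomir_entropy(mirna_5p3p), isomir_entropy(mirna_single))

        # KS test comparing entropy values of two groups of miRNAs (toy values)
        group_5p3p = [2.1, 1.9, 2.3, 2.0, 1.8]
        group_single = [1.0, 1.2, 0.9, 1.1, 1.3]
        print(ks_2samp(group_5p3p, group_single))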

  7. Artistic image analysis using graph-based learning approaches.

    Science.gov (United States)

    Carneiro, Gustavo

    2013-08-01

    We introduce a new methodology for the problem of artistic image analysis, which, among other tasks, involves the automatic identification of visual classes present in an art work. In this paper, we advocate the idea that artistic image analysis must explore a graph that captures the network of artistic influences by computing the similarities in terms of appearance and manual annotation. One of the novelties of our methodology is the proposed formulation that is a principled way of combining these two similarities in a single graph. Using this graph, we show that an efficient random walk algorithm based on an inverted label propagation formulation produces more accurate annotation and retrieval results compared with the following baseline algorithms: bag of visual words, label propagation, matrix completion, and structural learning. We also show that the proposed approach leads to more efficient inference and training procedures. This experiment is run on a database containing 988 artistic images (with 49 visual classification problems divided into a multiclass problem with 27 classes and 48 binary problems), where we report the inference and training running times, and quantitative comparisons with respect to several retrieval and annotation performance measures.

  8. A network-based analysis of CMIP5 "historical" experiments

    Science.gov (United States)

    Bracco, A.; Foudalis, I.; Dovrolis, C.

    2012-12-01

    In computer science, "complex network analysis" refers to a set of metrics, modeling tools and algorithms commonly used in the study of complex nonlinear dynamical systems. Its main premise is that the underlying topology or network structure of a system has a strong impact on its dynamics and evolution. By allowing local and non-local statistical interactions to be investigated, network analysis provides a powerful, but only marginally explored, framework to validate climate models and investigate teleconnections, assessing their strength, range, and impacts on the climate system. In this work we propose a new, fast, robust and scalable methodology to examine, quantify, and visualize climate sensitivity, while constraining general circulation models (GCMs) outputs with observations. The goal of our novel approach is to uncover relations in the climate system that are not (or not fully) captured by more traditional methodologies used in climate science and often adopted from nonlinear dynamical systems analysis, and to explain known climate phenomena in terms of the network structure or its metrics. Our methodology is based on a solid theoretical framework and employs mathematical and statistical tools, exploited only tentatively in climate research so far. Suitably adapted to the climate problem, these tools can assist in visualizing the trade-offs in representing global links and teleconnections among different data sets. Here we present the methodology, and compare network properties for different reanalysis data sets and a suite of CMIP5 coupled GCM outputs. With an extensive model intercomparison in terms of the climate network that each model leads to, we quantify how each model reproduces major teleconnections, rank model performances, and identify common or specific errors in comparing model outputs and observations.
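
    A minimal sketch of the general idea (illustrative only; the paper's metrics and data are not reproduced here): build a correlation-based climate network from gridded anomaly time series and compute a basic per-node metric. Repeating this for a reanalysis data set and for each CMIP5 model yields comparable network properties.

        import numpy as np

        rng = np.random.default_rng(2)
        n_nodes, n_time = 500, 240                  # grid points x monthly time steps (toy sizes)
        anomalies = rng.normal(size=(n_nodes, n_time))

        corr = np.corrcoef(anomalies)               # node-to-node correlation matrix
        adjacency = np.abs(corr) > 0.3              # threshold correlations into network links
        np.fill_diagonal(adjacency, False)          # no self-links

        degree = adjacency.sum(axis=1)              # a simple network metric per grid point
        print("mean degree:", degree.mean(), " max degree:", degree.max())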

  9. Consumer’s market analysis of products based on cassava

    Science.gov (United States)

    Unteawati, Bina; Fitriani; Fatih, Cholid

    2018-03-01

    Cassava products play an important role in enhancing household income in rural areas. Cassava as a raw food material is plentiful as a local food in Lampung. Cassava processing is one of the strategic value-addition activities, and value addition is a key to enriching income sources in rural areas. Households produce cassava snacks or additional foods in small-scale, traditional, and discontinuous operations, and they lack technology, capital, and market access. Measuring the sustainability of their business is important, since the market drives the business globally. This research aims to (1) describe the demand for locally produced cassava products in rural areas and (2) analyse consumers' perception of cassava products. The research took place in Lampung Province, covering Bandar Lampung and Metro City and the Pringsewu, Pesawaran, Central Lampung, and East Lampung districts, and was held from February until April 2017. Data were analyzed by descriptive statistics and multidimensional scaling. Based on the analysis it is concluded that (1) the demand for cassava products from rural areas is massive in volume and regularity, with substantial transactions. This fact is very important to the business cycle: continuous consumer demand will sustain the production of cassava products, producers of cassava products will take up fresh cassava from farmers, and regular consumption of fresh cassava by rural home industries will help balance the fresh cassava price at the farm gate. (2) Consumers' perception of cassava products in different markets showed that they most prefer cassava chips among cassava products, followed by crackers, opak, and tiwul rice. Urban consumers prefer cassava products as snacks (chips, crumbs, and opak), with a consumption frequency of 2-5 times per week and purchase volumes of 1-3 kg. Consumers in rural areas buy more frequently, with daily consumption. Multidimensional scaling

  10. An ion beam analysis software based on ImageJ

    International Nuclear Information System (INIS)

    Udalagama, C.; Chen, X.; Bettiol, A.A.; Watt, F.

    2013-01-01

    The suite of techniques (RBS, STIM, ERDS, PIXE, IL, IF,…) available in ion beam analysis yields a variety of rich information. Typically, after the initial challenge of acquiring data we are then faced with the task of having to extract relevant information or to present the data in a format with the greatest impact. This process sometimes requires developing new software tools. When faced with such situations the usual practice at the Centre for Ion Beam Applications (CIBA) in Singapore has been to use our computational expertise to develop ad hoc software tools as and when we need them. It then became apparent that the whole ion beam community can benefit from such tools; specifically from a common software toolset that can be developed and maintained by everyone with freedom to use and allowance to modify. In addition to the benefits of readymade tools and sharing the onus of development, this also opens up the possibility for collaborators to access and analyse ion beam data without having to depend on an ion beam lab. This has the virtue of making the ion beam techniques more accessible to a broader scientific community. We have identified ImageJ as an appropriate software base to develop such a common toolset. In addition to being in the public domain and being set up for collaborative tool development, ImageJ is accompanied by hundreds of modules (plugins) that allow great breadth in analysis. The present work is the first step towards integrating ion beam analysis into ImageJ. Some of the features of the current version of the ImageJ ‘ion beam’ plugin are: (1) reading list mode or event-by-event files, (2) energy gates/sorts, (3) sort stacks, (4) colour function, (5) real time map updating, (6) real time colour updating and (7) median and average map creation.

  11. An ion beam analysis software based on ImageJ

    Energy Technology Data Exchange (ETDEWEB)

    Udalagama, C., E-mail: chammika@nus.edu.sg [Centre for Ion Beam Applications (CIBA), Department of Physics, National University of Singapore, 2 Science Drive 3, Singapore 117 542 (Singapore); Chen, X.; Bettiol, A.A.; Watt, F. [Centre for Ion Beam Applications (CIBA), Department of Physics, National University of Singapore, 2 Science Drive 3, Singapore 117 542 (Singapore)

    2013-07-01

    The suite of techniques (RBS, STIM, ERDS, PIXE, IL, IF,…) available in ion beam analysis yields a variety of rich information. Typically, after the initial challenge of acquiring data we are then faced with the task of having to extract relevant information or to present the data in a format with the greatest impact. This process sometimes requires developing new software tools. When faced with such situations the usual practice at the Centre for Ion Beam Applications (CIBA) in Singapore has been to use our computational expertise to develop ad hoc software tools as and when we need them. It then became apparent that the whole ion beam community can benefit from such tools; specifically from a common software toolset that can be developed and maintained by everyone with freedom to use and allowance to modify. In addition to the benefits of readymade tools and sharing the onus of development, this also opens up the possibility for collaborators to access and analyse ion beam data without having to depend on an ion beam lab. This has the virtue of making the ion beam techniques more accessible to a broader scientific community. We have identified ImageJ as an appropriate software base to develop such a common toolset. In addition to being in the public domain and being set up for collaborative tool development, ImageJ is accompanied by hundreds of modules (plugins) that allow great breadth in analysis. The present work is the first step towards integrating ion beam analysis into ImageJ. Some of the features of the current version of the ImageJ ‘ion beam’ plugin are: (1) reading list mode or event-by-event files, (2) energy gates/sorts, (3) sort stacks, (4) colour function, (5) real time map updating, (6) real time colour updating and (7) median and average map creation.

  12. Ca analysis: an Excel based program for the analysis of intracellular calcium transients including multiple, simultaneous regression analysis.

    Science.gov (United States)

    Greensmith, David J

    2014-01-01

    Here I present an Excel based program for the analysis of intracellular Ca transients recorded using fluorescent indicators. The program can perform all the necessary steps which convert recorded raw voltage changes into meaningful physiological information. The program performs two fundamental processes. (1) It can prepare the raw signal by several methods. (2) It can then be used to analyze the prepared data to provide information such as absolute intracellular Ca levels. Also, the rates of change of Ca can be measured using multiple, simultaneous regression analysis. I demonstrate that this program performs as well as commercially available software, but has numerous advantages, namely creating a simplified, self-contained analysis workflow. Copyright © 2013 The Author. Published by Elsevier Ireland Ltd. All rights reserved.

  13. Neural Representation. A Survey-Based Analysis of the Notion

    Directory of Open Access Journals (Sweden)

    Oscar Vilarroya

    2017-08-01

    Full Text Available The word representation (as in “neural representation”), and many of its related terms, such as to represent, representational and the like, play a central explanatory role in the neuroscience literature. For instance, in “place cell” literature, place cells are extensively associated with their role in “the representation of space.” In spite of its extended use, we still lack a clear, universal and widely accepted view on what it means for a nervous system to represent something, on what makes a neural activity a representation, and on what is re-presented. The lack of a theoretical foundation and definition of the notion has not hindered actual research. My aim here is to identify how active scientists use the notion of neural representation, and eventually to list a set of criteria, based on actual use, that can help in distinguishing between genuine and non-genuine neural-representation candidates. In order to attain this objective, I first present the results of a survey of authors within two domains, place-cell and multivariate pattern analysis (MVPA) research. Based on the authors’ replies, and on a review of neuroscientific research, I outline a set of common properties that an account of neural representation seems to require. I then apply these properties to assess the use of the notion in two domains of the survey, place-cell and MVPA studies. I conclude by exploring a shift in the notion of representation suggested by recent literature.

  14. Perceptual security of encrypted images based on wavelet scaling analysis

    Science.gov (United States)

    Vargas-Olmos, C.; Murguía, J. S.; Ramírez-Torres, M. T.; Mejía Carlos, M.; Rosu, H. C.; González-Aguilar, H.

    2016-08-01

    The scaling behavior of the pixel fluctuations of encrypted images is evaluated by using the detrended fluctuation analysis (DFA) based on wavelets, a modern technique that has been successfully used recently for a wide range of natural phenomena and technological processes. As encryption algorithms, we use the Advanced Encryption Standard (AES) in RBT mode and two versions of a cryptosystem based on cellular automata, with the encryption process applied both fully and partially by selecting different bitplanes. In all cases, the results show that the encrypted images in which no understandable information can be visually appreciated and whose pixels look totally random present a persistent scaling behavior with the scaling exponent α close to 0.5, implying no correlation between pixels when the DFA with wavelets is applied. This suggests that the scaling exponents of the encrypted images can be used as a perceptual security criterion in the sense that when their values are close to 0.5 (the white noise value) the encrypted images are more secure also from the perceptual point of view.
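
    A minimal sketch of ordinary detrended fluctuation analysis (not the wavelet-based variant used in the paper): the scaling exponent α is the log-log slope of the fluctuation function versus window size, and α near 0.5 indicates uncorrelated, white-noise-like pixel fluctuations. The data below are a random stand-in for an encrypted-image pixel sequence.

        import numpy as np

        def dfa_alpha(x, scales=(8, 16, 32, 64, 128)):
            y = np.cumsum(x - np.mean(x))                    # integrated profile
            F = []
            for s in scales:
                n_seg = len(y) // s
                rms = []
                for i in range(n_seg):
                    seg = y[i * s:(i + 1) * s]
                    t = np.arange(s)
                    trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrending
                    rms.append(np.mean((seg - trend) ** 2))
                F.append(np.sqrt(np.mean(rms)))
            slope, _ = np.polyfit(np.log(scales), np.log(F), 1)
            return slope

        rng = np.random.default_rng(3)
        encrypted_row = rng.integers(0, 256, size=4096).astype(float)
        print("alpha (white noise expected ~0.5):", dfa_alpha(encrypted_row))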

  15. Risk-based configuration control system: Analysis and approaches

    International Nuclear Information System (INIS)

    Samanta, P.K.; Kim, I.S.; Lofgren, E.V.; Vesely, W.E.

    1990-01-01

    This paper presents an analysis of risks associated with component outage configurations during power operation of a nuclear power plant and discusses approaches and strategies for developing a risk-based configuration control system. A configuration, as used here, is a set of component states. The objective of risk-based configuration control is to detect and control plant configurations using a risk-perspective. The configuration contributions to core-melt frequency and core-melt probability are studied for two plants. Large core-melt frequency can be caused by configurations and there are a number of such configurations that are not currently controlled by technical specifications. However, the expected frequency of occurrence of the impacting configurations is small and the actual core-melt probability contributions are also generally small. Effective strategies and criteria for controlling configuration risks are presented. Such control strategies take into consideration the risks associated with configurations, the nature and characteristics of the configuration risks, and also the practical considerations such as adequate repair times and/or options to transfer to low risk configurations. Alternative types of criteria are discussed that are not so overly restrictive as to result in unnecessary plant shutdown, but rather motivate effective test and maintenance practices that control risk-significant configurations to allow continued operation with an adequate margin to meet challenges to safety.

  16. Object-Based Image Analysis in Wetland Research: A Review

    Directory of Open Access Journals (Sweden)

    Iryna Dronova

    2015-05-01

    Full Text Available The applications of object-based image analysis (OBIA in remote sensing studies of wetlands have been growing over recent decades, addressing tasks from detection and delineation of wetland bodies to comprehensive analyses of within-wetland cover types and their change. Compared to pixel-based approaches, OBIA offers several important benefits to wetland analyses related to smoothing of the local noise, incorporating meaningful non-spectral features for class separation and accounting for landscape hierarchy of wetland ecosystem organization and structure. However, there has been little discussion on whether unique challenges of wetland environments can be uniformly addressed by OBIA across different types of data, spatial scales and research objectives, and to what extent technical and conceptual aspects of this framework may themselves present challenges in a complex wetland setting. This review presents a synthesis of 73 studies that applied OBIA to different types of remote sensing data, spatial scale and research objectives. It summarizes the progress and scope of OBIA uses in wetlands, key benefits of this approach, factors related to accuracy and uncertainty in its applications and the main research needs and directions to expand the OBIA capacity in the future wetland studies. Growing demands for higher-accuracy wetland characterization at both regional and local scales together with advances in very high resolution remote sensing and novel tasks in wetland restoration monitoring will likely continue active exploration of the OBIA potential in these diverse and complex environments.

  17. IMU-Based Joint Angle Measurement for Gait Analysis

    Directory of Open Access Journals (Sweden)

    Thomas Seel

    2014-04-01

    Full Text Available This contribution is concerned with joint angle calculation based on inertial measurement data in the context of human motion analysis. Unlike most robotic devices, the human body lacks even surfaces and right angles. Therefore, we focus on methods that avoid assuming certain orientations in which the sensors are mounted with respect to the body segments. After a review of available methods that may cope with this challenge, we present a set of new methods for: (1) joint axis and position identification; and (2) flexion/extension joint angle measurement. In particular, we propose methods that use only gyroscopes and accelerometers and, therefore, do not rely on a homogeneous magnetic field. We provide results from gait trials of a transfemoral amputee in which we compare the inertial measurement unit (IMU)-based methods to an optical 3D motion capture system. Unlike most authors, we place the optical markers on anatomical landmarks instead of attaching them to the IMUs. Root mean square errors of the knee flexion/extension angles are found to be less than 1° on the prosthesis and about 3° on the human leg. For the plantar/dorsiflexion of the ankle, both deviations are about 1°.
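
    A heavily simplified sketch of the flexion/extension idea (not the authors' full method, which also identifies the joint axis and corrects drift): project each segment's gyroscope reading onto a known joint axis and integrate the difference of the projected rates. The joint axes, motion and data below are toy assumptions.

        import numpy as np

        fs = 100.0                                   # sample rate, Hz
        t = np.arange(0, 2.0, 1.0 / fs)
        j_thigh = np.array([0.0, 1.0, 0.0])          # joint axis in each sensor frame
        j_shank = np.array([0.0, 1.0, 0.0])          # (assumed already identified)

        true_angle = 30.0 * np.sin(2 * np.pi * 1.0 * t)          # deg, toy knee motion
        rate = np.gradient(np.deg2rad(true_angle), 1.0 / fs)     # rad/s about the joint axis
        gyro_thigh = np.zeros((len(t), 3))                        # thigh assumed stationary
        gyro_shank = np.outer(rate, j_shank)                      # shank rotates relative to thigh

        rel_rate = gyro_shank @ j_shank - gyro_thigh @ j_thigh    # rad/s about the joint axis
        angle = np.rad2deg(np.cumsum(rel_rate) / fs)              # integrate to an angle (deg)
        print("max abs error vs. toy ground truth (deg):", np.max(np.abs(angle - true_angle)))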

  18. Multicriteria Similarity-Based Anomaly Detection Using Pareto Depth Analysis.

    Science.gov (United States)

    Hsiao, Ko-Jen; Xu, Kevin S; Calder, Jeff; Hero, Alfred O

    2016-06-01

    We consider the problem of identifying patterns in a data set that exhibits anomalous behavior, often referred to as anomaly detection. Similarity-based anomaly detection algorithms detect abnormally large amounts of similarity or dissimilarity, e.g., as measured by the nearest neighbor Euclidean distances between a test sample and the training samples. In many application domains, there may not exist a single dissimilarity measure that captures all possible anomalous patterns. In such cases, multiple dissimilarity measures can be defined, including nonmetric measures, and one can test for anomalies by scalarizing using a nonnegative linear combination of them. If the relative importance of the different dissimilarity measures is not known in advance, as in many anomaly detection applications, the anomaly detection algorithm may need to be executed multiple times with different choices of weights in the linear combination. In this paper, we propose a method for similarity-based anomaly detection using a novel multicriteria dissimilarity measure, the Pareto depth. The proposed Pareto depth analysis (PDA) anomaly detection algorithm uses the concept of Pareto optimality to detect anomalies under multiple criteria without having to run an algorithm multiple times with different choices of weights. The proposed PDA approach is provably better than using linear combinations of the criteria, and shows superior performance on experiments with synthetic and real data sets.

  19. Risk-based configuration control system: Analysis and approaches

    International Nuclear Information System (INIS)

    Samanta, P.K.; Vesely, W.E.; Kim, I.S.; Lofgren, E.V.

    1989-01-01

    This paper presents an analysis of risks associated with component outage configurations during power operation of a nuclear power plant and discusses approaches and strategies for developing a risk-based configuration control system. A configuration, as used here, is a set of component states. The objective of risk-based configuration control is to detect and control plant configurations using a risk-perspective. The configuration contributions to core-melt frequency and core-melt probability are studied for two plants. Large core-melt frequency can be caused by configurations and there are a number of such configurations that are not currently controlled by technical specifications. However, the expected frequency of occurrence of the impacting configurations is small and the actual core-melt probability contributions are also generally small. Effective strategies and criteria for controlling configuration risks are presented. Such control strategies take into consideration the risks associated with configurations, the nature and characteristics of the configuration risks, and also the practical considerations such as adequate repair times and/or options to transfer to low risk configurations. Alternative types of criteria are discussed that are not so overly restrictive as to result in unnecessary plant shutdown, but rather motivate effective test and maintenance practices that control risk-significant configurations to allow continued operation with an adequate margin to meet challenges to safety. 3 refs., 7 figs., 2 tabs

  20. Agent-based simulation for human-induced hazard analysis.

    Science.gov (United States)

    Bulleit, William M; Drewek, Matthew W

    2011-02-01

    Terrorism could be treated as a hazard for design purposes. For instance, the terrorist hazard could be analyzed in a manner similar to the way that seismic hazard is handled. No matter how terrorism is dealt with in the design of systems, the need for predictions of the frequency and magnitude of the hazard will be required. And, if the human-induced hazard is to be designed for in a manner analogous to natural hazards, then the predictions should be probabilistic in nature. The model described in this article is a prototype model that used agent-based modeling (ABM) to analyze terrorist attacks. The basic approach in this article of using ABM to model human-induced hazards has been preliminarily validated in the sense that the attack magnitudes seem to be power-law distributed and attacks occur mostly in regions where high levels of wealth pass through, such as transit routes and markets. The model developed in this study indicates that ABM is a viable approach to modeling socioeconomic-based infrastructure systems for engineering design to deal with human-induced hazards. © 2010 Society for Risk Analysis.

  1. A Flocking Based algorithm for Document Clustering Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Xiaohui [ORNL]; Gao, Jinzhu [ORNL]; Potok, Thomas E [ORNL]

    2006-01-01

    Social animals or insects in nature often exhibit a form of emergent collective behavior known as flocking. In this paper, we present a novel Flocking based approach for document clustering analysis. Our Flocking clustering algorithm uses stochastic and heuristic principles discovered from observing bird flocks or fish schools. Unlike other partition clustering algorithms such as K-means, the Flocking based algorithm does not require initial partitional seeds. The algorithm generates a clustering of a given set of data through the embedding of the high-dimensional data items on a two-dimensional grid for easy clustering result retrieval and visualization. Inspired by the self-organized behavior of bird flocks, we represent each document object with a flock boid. The simple local rules followed by each flock boid result in the entire document flock generating complex global behaviors, which eventually result in a clustering of the documents. We evaluate the efficiency of our algorithm with both a synthetic dataset and a real document collection that includes 100 news articles collected from the Internet. Our results show that the Flocking clustering algorithm achieves better performance compared to the K-means and the Ant clustering algorithms for real document clustering.

  2. Reliability analysis for new technology-based transmitters

    Energy Technology Data Exchange (ETDEWEB)

    Brissaud, Florent, E-mail: florent.brissaud.2007@utt.f [Institut National de l' Environnement Industriel et des Risques (INERIS), Parc Technologique Alata, BP 2, 60550 Verneuil-en-Halatte (France); Universite de Technologie de Troyes (UTT), Institut Charles Delaunay (ICD) and STMR UMR CNRS 6279, 12 rue Marie Curie, BP 2060, 10010 Troyes cedex (France); Barros, Anne; Berenguer, Christophe [Universite de Technologie de Troyes (UTT), Institut Charles Delaunay (ICD) and STMR UMR CNRS 6279, 12 rue Marie Curie, BP 2060, 10010 Troyes cedex (France); Charpentier, Dominique [Institut National de l' Environnement Industriel et des Risques (INERIS), Parc Technologique Alata, BP 2, 60550 Verneuil-en-Halatte (France)

    2011-02-15

    The reliability analysis of new technology-based transmitters has to deal with specific issues: various interactions between both material elements and functions, undefined behaviours under faulty conditions, several transmitted data, and little reliability feedback. To handle these particularities, a '3-step' model is proposed, based on goal tree-success tree (GTST) approaches to represent both the functional and material aspects, and includes the faults and failures as a third part for supporting reliability analyses. The behavioural aspects are provided by relationship matrices, also denoted master logic diagrams (MLD), with stochastic values which represent direct relationships between system elements. Relationship analyses are then proposed to assess the effect of any fault or failure on any material element or function. Taking these relationships into account, the probabilities of malfunction and failure modes are evaluated according to time. Furthermore, uncertainty analyses tend to show that even if the input data and system behaviour are not well known, these previous results can be obtained in a relatively precise way. An illustration is provided by a case study on an infrared gas transmitter. These properties make the proposed model and corresponding reliability analyses especially suitable for intelligent transmitters (or 'smart sensors').

  3. Performance of Water-Based Liquid Scintillator: An Independent Analysis

    Directory of Open Access Journals (Sweden)

    D. Beznosko

    2014-01-01

    Full Text Available The water-based liquid scintillator (WbLS) is a new material currently under development. It is based on the idea of dissolving the organic scintillator in water using special surfactants. This material strives to enable novel detection techniques by combining the Cerenkov rings and scintillation light, as well as a total cost reduction compared to pure liquid scintillator (LS). An independent analysis of the light yield measurements using three different proton beam energies (210 MeV, 475 MeV, and 2000 MeV) for water, two different WbLS formulations (0.4% and 0.99%), and pure LS, conducted at Brookhaven National Laboratory, USA, is presented. The results show that a goal of ~100 optical photons/MeV, indicated by the simulation to be an optimal light yield for observing both the Cerenkov ring and the scintillation light from the proton decay in a large water detector, has been achieved.

  4. Poka Yoke system based on image analysis and object recognition

    Science.gov (United States)

    Belu, N.; Ionescu, L. M.; Misztal, A.; Mazăre, A.

    2015-11-01

    Poka Yoke is a method of quality management which is aimed at preventing faults from arising during production processes. It deals with “fail-safing” or “mistake-proofing”. The Poka Yoke concept was generated and developed by Shigeo Shingo for the Toyota Production System. Poka Yoke is used in many fields, especially in monitoring production processes. In many cases, identifying faults in a production process involves a higher cost than the cost of disposal. Usually, Poka Yoke solutions are based on multiple sensors that identify nonconformities, which implies the presence of additional equipment (mechanical, electronic) on the production line. As a consequence, the method itself becomes invasive, affecting the production process and increasing the cost of diagnostics, and the machines by which a Poka Yoke system can be implemented become bulky and more sophisticated. In this paper we propose a solution for a Poka Yoke system based on image analysis and identification of faults. The solution consists of a module for image acquisition, mid-level processing and an object recognition module using associative memory (Hopfield network type). All are integrated into an embedded system with an AD (Analog to Digital) converter and a Zynq 7000 (22 nm technology).

  5. Comprehensive Study on Lexicon-based Ensemble Classification Sentiment Analysis

    Directory of Open Access Journals (Sweden)

    Łukasz Augustyniak

    2015-12-01

    Full Text Available We propose a novel method for determining sentiment orientation that outperforms supervised learning approaches in time and memory complexity and is not statistically significantly different from them in accuracy. Our method consists of a novel approach to generating unigram, bigram and trigram lexicons. The proposed method, called frequentiment, is based on calculating the frequency of features (words) in the document and averaging their impact on the sentiment score as opposed to documents that do not contain these features. Afterwards, we use ensemble classification to improve the overall accuracy of the method. What is important is that the frequentiment-based lexicons with sentiment threshold selection outperform other popular lexicons and some supervised learners, while being 3–5 times faster than the supervised approach. We compare 37 methods (lexicons, ensembles with lexicon’s predictions as input, and supervised learners) applied to 10 Amazon review data sets and provide the first statistical comparison of the sentiment annotation methods that include ensemble approaches. It is one of the most comprehensive comparisons of domain sentiment analysis in the literature.
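
    One plausible reading of the lexicon-generation idea, sketched with toy reviews (this is an interpretation, not the authors' exact formula): score a unigram by the difference in average rating between documents that contain it and documents that do not, then score new text with the resulting lexicon.

        reviews = [("great battery and great screen", 5),
                   ("battery died after a week", 1),
                   ("screen is great, battery is poor", 4),
                   ("poor build, poor value", 1)]

        docs = [(set(text.split()), rating) for text, rating in reviews]
        vocab = set().union(*(words for words, _ in docs))

        lexicon = {}
        for word in vocab:
            with_w = [r for words, r in docs if word in words]
            without_w = [r for words, r in docs if word not in words]
            if with_w and without_w:
                # impact of the word: mean rating with it minus mean rating without it
                lexicon[word] = sum(with_w) / len(with_w) - sum(without_w) / len(without_w)

        def score(text):
            words = text.split()
            return sum(lexicon.get(w, 0.0) for w in words) / len(words)

        print(sorted(lexicon.items(), key=lambda kv: kv[1], reverse=True))
        print("score('great screen'):", score("great screen"))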

  6. Nonlinear Resonance Analysis of Slender Portal Frames under Base Excitation

    Directory of Open Access Journals (Sweden)

    Luis Fernando Paullo Muñoz

    2017-01-01

    Full Text Available The dynamic nonlinear response and stability of slender structures in the main resonance regions are a topic of importance in structural analysis. In complex problems, determining the response in the frequency domain indirectly, through analyses in the time domain, can lead to huge computational effort for large systems. In nonlinear cases, the response in the frequency domain becomes even more cumbersome because of the possibility of multiple solutions for certain forcing frequencies. Those solutions can be stable or unstable, with saddle-node bifurcations in particular occurring at the turning points along the resonance curves. In this work, an incremental technique for direct calculation of the nonlinear response in the frequency domain of plane frames subjected to base excitation is proposed. The transformation of the equations of motion to the frequency domain is made through the harmonic balance method in conjunction with the Galerkin method. The resulting system of nonlinear equations in terms of the modal amplitudes and forcing frequency is solved by the Newton-Raphson method together with an arc-length procedure to obtain the nonlinear resonance curves. Suitable examples are presented, and the influence of the frame geometric parameters and base motion on the nonlinear resonance curves is investigated.

  7. Molecular-based recursive partitioning analysis model for glioblastoma in the temozolomide era: a correlative analysis based on NRG Oncology RTOG 0525

    NARCIS (Netherlands)

    Bell, Erica Hlavin; Pugh, Stephanie L.; McElroy, Joseph P.; Gilbert, Mark R.; Mehta, Minesh; Klimowicz, Alexander C.; Magliocco, Anthony; Bredel, Markus; Robe, Pierre; Grosu, Anca L.; Stupp, Roger; Curran, Walter; Becker, Aline P.; Salavaggione, Andrea L.; Barnholtz-Sloan, Jill S.; Aldape, Kenneth; Blumenthal, Deborah T.; Brown, Paul D.; Glass, Jon; Souhami, Luis; Lee, R. Jeffrey; Brachman, David; Flickinger, John; Won, Minhee; Chakravarti, Arnab

    2017-01-01

    IMPORTANCE: There is a need for a more refined, molecularly based classification model for glioblastoma (GBM) in the temozolomide era. OBJECTIVE: To refine the existing clinically based recursive partitioning analysis (RPA) model by incorporating molecular variables. DESIGN, SETTING, AND

  8. Active Desiccant-Based Preconditioning Market Analysis and Product Development

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, J.

    2001-01-11

    The Phase 1 report (ORNL/Sub/94-SVO44/1), completed earlier in this program, involved a comprehensive field survey and market analysis comparing various specialized outdoor air handling units. This initial investigation included conventional cooling and reheat, conventional cooling with sensible recovery, total energy recovery systems (passive desiccant technology) and various active desiccant systems. The report concluded that several markets do promise a significant sales opportunity for a Climate Changer-based active desiccant system offering. (Climate Changer is a registered trademark of Trane Company.) This initial market analysis defined the wants and needs of the end customers (design engineers and building owners), which, along with subsequent information included in this report, have been used to guide the determination of the most promising active desiccant system configurations. This Phase 2 report begins with a summary of a more thorough investigation of those specific markets identified as most promising for active desiccant systems. Table 1 estimates the annual sales potential for a cost-effective product line of active desiccant systems, such as that built from Climate Changer modules. The Product Development Strategy section describes the active desiccant system configurations chosen to best fit the needs of the marketplace while minimizing system options. Key design objectives based on market research are listed in this report for these active desiccant systems. Corresponding performance goals for the dehumidification wheel required to meet the overall system design objectives are also defined. The Performance Modeling section describes the strategy used by SEMCO to design the dehumidification wheels integrated into the prototype systems currently being tested as part of the U.S. Department of Energy's Advanced Desiccant Technology Program. Actual performance data from wheel testing was used to revise the system performance and energy analysis

  9. Analysis, Simulation, and Verification of Knowledge-Based, Rule-Based, and Expert Systems

    Science.gov (United States)

    Hinchey, Mike; Rash, James; Erickson, John; Gracanin, Denis; Rouff, Chris

    2010-01-01

    Mathematically sound techniques are used to view a knowledge-based system (KBS) as a set of processes executing in parallel and being enabled in response to specific rules being fired. The set of processes can be manipulated, examined, analyzed, and used in a simulation. The tool that embodies this technology may warn developers of errors in their rules, but may also highlight rules (or sets of rules) in the system that are underspecified (or overspecified) and need to be corrected for the KBS to operate as intended. The rules embodied in a KBS specify the allowed situations, events, and/or results of the system they describe. In that sense, they provide a very abstract specification of a system. The system is implemented through the combination of the system specification together with an appropriate inference engine, independent of the algorithm used in that inference engine. Viewing the rule base as a major component of the specification, and choosing an appropriate specification notation to represent it, reveals how additional power can be derived from an approach to the knowledge-base system that involves analysis, simulation, and verification. This innovative approach requires no special knowledge of the rules, and allows a general approach where standardized analysis, verification, simulation, and model checking techniques can be applied to the KBS.

  10. TRANSPARENCY IN ELECTRONIC BUSINESS NEGOTIATIONS – EVIDENCE BASED ANALYSIS

    Directory of Open Access Journals (Sweden)

    Radoslav Delina

    2014-12-01

    Full Text Available Purpose: In the current economy, where ICT plays a crucial role in being competitive and effective, businesses are facing higher pressures of flexibility and efficiency than ever before. Transparency is often considered a suitable mechanism for better market prices and a more efficient market environment. An electronic business environment provides the possibility to set up a more transparent environment and bring higher competitiveness and efficiency to the market. The paper analyses the impact of transparency on prices in e-procurement. Methodology: Reverse auctions are considered a transparent tool that partially simulates real competition. They also allow several levels of transparency set up in the auction negotiation process to be examined. The impact of transparency on final prices was analysed on real data using relation-based analysis, where different transparency set-ups are compared against the achieved final price. Findings: The results, based on real data, show that in general the transparency in electronic reverse auctions can lead to worse prices agreed by purchasers than current scientific and commercial promotion suggests. Research limitation: The significance of the research results is limited due to the still low readiness and skills of e-procurers. The validation of the results needs to be carried out over a longer period of time and in environments with different levels of e-readiness. The findings also reveal that transparency is a more complex issue whose significance becomes apparent only in specific market and negotiation situations. Value of paper: The evidence-based research reveals some controversial results which support new scientific efforts in microeconomics and in the field of the socio-economic impact of ICT. It also affects practitioners in how they use and perceive the claimed impact of reverse auction solutions.

  11. The heat is on: thermodynamic analysis in fragment-based drug discovery

    NARCIS (Netherlands)

    Edink, E.S.; Jansen, C.J.W.; Leurs, R.; De Esch, I.J.

    2010-01-01

    Thermodynamic analysis provides access to the determinants of binding affinity, enthalpy and entropy. In fragment-based drug discovery (FBDD), thermodynamic analysis provides a powerful tool to discriminate fragments based on their potential for successful optimization. The thermodynamic data

  12. The study of the thermal behavior of solid mixtures of metakaolin and sodium hydroxide by isoconversional model-free analyses

    Science.gov (United States)

    Gordina, Natalya E.; Prokof'ev, Valery Yu; Smirnov, Nikolay N.; Khramtsova, Alexandra P.

    2017-11-01

    Interactions in solid mixtures of 6Al2Si2O7:12NaOH and 6Al2Si2O7:12NaOH:2Al2O3 under thermal treatment were studied. Ultrasonic pre-treatment of the suspensions with a frequency of 22 kHz was applied. X-ray phase analysis, scanning electron microscopy, and simultaneous thermal analysis were used in this research. It was shown that after evaporation of the suspensions, the hydrated LTA zeolite and sodium hydroaluminate are synthesized. During thermal treatment up to 500 °C, the adsorption water is removed first and then the dehydration of the zeolite and hydroaluminate occurs. Calcination at temperatures above 500 °C leads to the synthesis of Na6Al4Si4O17 and Na8Al4Si4O18 by the interaction of metakaolin and zeolite with sodium hydroxide. At temperatures above 700 °C, the formation of mullite and nepheline occurs in the temperature range of 500-800 °C. The kinetic parameters have been calculated using Friedman analysis (differential method) and the Kissinger-Akahira-Sunose and Ozawa-Flynn-Wall analyses (integral methods). It was shown that all analyses give similar dependences of the apparent activation energy on the conversion extent, and the values of E are in the range of 70-350 kJ mol-1. It was established that ultrasonic pre-treatment makes it possible to reduce the values of the apparent activation energy. It was discovered that the Al2O3 excess over the stoichiometry of the LTA zeolite synthesis is an inhibitor of the mullite and nepheline formation reactions.
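
    A minimal sketch of the Kissinger-Akahira-Sunose (KAS) isoconversional step with hypothetical data (not the paper's measurements): at each fixed conversion extent, the slope of ln(beta/T^2) versus 1/T across several heating rates equals -E/R, giving the apparent activation energy without assuming a reaction model.

        import numpy as np

        R = 8.314                                    # gas constant, J mol^-1 K^-1
        betas = np.array([5.0, 10.0, 20.0])          # heating rates, K/min

        # Hypothetical temperatures (K) at which each conversion level is reached
        # for each heating rate (columns follow the order of `betas`).
        T_at = {
            0.2: np.array([690.0, 702.0, 715.0]),
            0.5: np.array([735.0, 748.0, 762.0]),
            0.8: np.array([780.0, 794.0, 809.0]),
        }

        for alpha, T in T_at.items():
            slope, _ = np.polyfit(1.0 / T, np.log(betas / T ** 2), 1)   # KAS regression
            E_app = -slope * R / 1000.0                                  # kJ/mol
            print(f"alpha = {alpha}: apparent activation energy ~ {E_app:.0f} kJ/mol")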

  13. Reliability and risk analysis data base development: an historical perspective

    International Nuclear Information System (INIS)

    Fragola, Joseph R.

    1996-01-01

    Collection of empirical data and data base development for use in the prediction of the probability of future events has a long history. Dating back at least to the 17th century, safe passage events and mortality events were collected and analyzed to uncover prospective underlying classes and associated class attributes. Tabulations of these developed classes and associated attributes formed the underwriting basis for the fledgling insurance industry. Much earlier, master masons and architects used design rules of thumb to capture the experience of the ages and thereby produce structures of incredible longevity and reliability (Antona, E., Fragola, J. and Galvagni, R. Risk based decision analysis in design. Fourth SRA Europe Conference Proceedings, Rome, Italy, 18-20 October 1993). These rules served so well in producing robust designs that it was not until almost the 19th century that the analysis (Charlton, T.M., A History Of Theory Of Structures In The 19th Century, Cambridge University Press, Cambridge, UK, 1982) of masonry voussoir arches, begun by Galileo some two centuries earlier (Galilei, G. Discorsi e dimostrazioni mathematiche intorno a due nuove science, (Discourses and mathematical demonstrations concerning two new sciences, Leiden, The Netherlands, 1638), was placed on a sound scientific basis. Still, with the introduction of new materials (such as wrought iron and steel) and the lack of theoretical knowledge and computational facilities, approximate methods of structural design abounded well into the second half of the 20th century. To this day structural designers account for material variations and gaps in theoretical knowledge by employing factors of safety (Benvenuto, E., An Introduction to the History of Structural Mechanics, Part II: Vaulted Structures and Elastic Systems, Springer-Verlag, NY, 1991) or codes of practice (ASME Boiler and Pressure Vessel Code, ASME, New York) originally developed in the 19th century (Antona, E., Fragola, J. and

  14. Evidence-based Neuro Linguistic Psychotherapy: a meta-analysis.

    Science.gov (United States)

    Zaharia, Cătălin; Reiner, Melita; Schütz, Peter

    2015-12-01

    Neuro Linguistic Programming (NLP) Framework has enjoyed enormous popularity in the field of applied psychology. NLP has been used in business, education, law, medicine and psychotherapy to identify people's patterns and alter their responses to stimuli, so they are better able to regulate their environment and themselves. NLP looks at achieving goals, creating stable relationships, eliminating barriers such as fears and phobias, building self-confidence, and self-esteem, and achieving peak performance. Neuro Linguistic Psychotherapy (NLPt) encompasses NLP as framework and set of interventions in the treatment of individuals with different psychological and/or social problems. We aimed systematically to analyse the available data regarding the effectiveness of Neuro Linguistic Psychotherapy (NLPt). The present work is a meta-analysis of studies, observational or randomized controlled trials, for evaluating the efficacy of Neuro Linguistic Programming in individuals with different psychological and/or social problems. The databases searched to identify studies in English and German language: CENTRAL in the Cochrane Library; PubMed; ISI Web of Knowledge (include results also from Medline and the Web of Science); PsycINFO (including PsycARTICLES); Psyndex; Deutschsprachige Diplomarbeiten der Psychologie (database of theses in Psychology in German language), Social SciSearch; National library of health and two NLP-specific research databases: one from the NLP Community (http://www.nlp.de/cgi-bin/research/nlprdb.cgi?action=res_entries) and one from the NLP Group (http://www.nlpgrup.com/bilimselarastirmalar/bilimsel-arastirmalar-4.html#Zweig154). From a total number of 425 studies, 350 were removed and considered not relevant based on the title and abstract. Included, in the final analysis, are 12 studies with numbers of participants ranging between 12 and 115 subjects. The vast majority of studies were prospective observational. The actual paper represents the first

  15. GaitaBase: Web-based repository system for gait analysis.

    Science.gov (United States)

    Tirosh, Oren; Baker, Richard; McGinley, Jenny

    2010-02-01

    The need to share gait analysis data to improve clinical decision support has been recognised since the early 1990s. GaitaBase has been established to provide a web-accessible repository system of gait analysis data to improve the sharing of data across local and international clinical and research community. It is used by several clinical and research groups across the world providing cross-group access permissions to retrieve and analyse the data. The system is useful for bench-marking and quality assurance, clinical consultation, and collaborative research. It has the capacity to increase the population sample size and improve the quality of 'normative' gait data. In addition the accumulated stored data may facilitate clinicians in comparing their own gait data with others, and give a valuable insight into how effective specific interventions have been for others. 2009 Elsevier Ltd. All rights reserved.

  16. Distance Based Root Cause Analysis and Change Impact Analysis of Performance Regressions

    Directory of Open Access Journals (Sweden)

    Junzan Zhou

    2015-01-01

    Full Text Available Performance regression testing is applied to uncover both performance and functional problems of software releases. A performance problem revealed by performance testing can be high response time, low throughput, or even being out of service. A mature performance testing process helps systematically detect software performance problems. However, it is difficult to identify the root cause and evaluate the potential change impact. In this paper, we present an approach leveraging server side logs for identifying root causes of performance problems. Firstly, server side logs are used to recover the call tree of each business transaction. We define a novel distance based metric computed from call trees for root cause analysis and apply an inverted index from methods to business transactions for change impact analysis. Empirical studies show that our approach can effectively and efficiently help developers diagnose the root cause of performance problems.
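
    A minimal sketch of the two ingredients (illustrative only; the metric and log format below are assumptions, not the paper's exact definitions): compare the call tree of a business transaction between a baseline run and a regressed run via a simple count-difference distance, and use an inverted index from methods to business transactions for change impact.

        from collections import Counter

        # call trees flattened to method-call counts (hypothetical log-derived data)
        baseline = {"checkout": Counter({"validateCart": 1, "priceItems": 3, "queryDB": 3}),
                    "search":   Counter({"parseQuery": 1, "queryDB": 1})}
        regressed = {"checkout": Counter({"validateCart": 1, "priceItems": 3, "queryDB": 12}),
                     "search":   Counter({"parseQuery": 1, "queryDB": 1})}

        def distance(a, b):
            # sum of absolute differences in call counts over all methods
            return sum(abs(a[m] - b[m]) for m in set(a) | set(b))

        for txn in baseline:
            print(txn, "distance:", distance(baseline[txn], regressed[txn]))

        # inverted index: method -> business transactions that call it (change impact)
        index = {}
        for txn, calls in baseline.items():
            for method in calls:
                index.setdefault(method, set()).add(txn)
        print("transactions impacted by a change to queryDB:", index["queryDB"])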

  17. Comparative Analysis of Investment Funds Stocks-based Portfolios and BET Stocks-based Portfolios

    Directory of Open Access Journals (Sweden)

    Ion STANCU

    2010-04-01

    Full Text Available In this paper we intend to find out which stock-based portfolio is the best choice. The major goal is to find out whether it is more efficient to invest the whole capital in a single sector, such as financial investments, or to create a diversified portfolio, taking into account assets from various economic sectors. Capital allocation will be based on the concept of cointegration. We have chosen this method because it can be applied to non-stationary data series, and, besides, it has the advantage of using the whole set of information provided by the financial assets. Another goal is to study how the portfolio structure adjusts if a shock occurs during the period under analysis, so as to preserve a certain return or minimize a potential loss. The study will result in an investment solution for the Romanian capital market, even in the context of the financial crisis.

  18. Comparison of stress-based and strain-based creep failure criteria for severe accident analysis

    International Nuclear Information System (INIS)

    Chavez, S.A.; Kelly, D.L.; Witt, R.J.; Stirn, D.P.

    1995-01-01

    We conducted a parametric analysis of stress-based and strain-based creep failure criteria to determine if there is a significant difference between the two criteria for SA533B vessel steel under severe accident conditions. Parametric variables include debris composition, system pressure, and creep strain histories derived from different testing programs and mathematically fit, with and without tertiary creep. Results indicate significant differences between the two criteria. Stress gradient plays an important role in determining which criterion will predict failure first. Creep failure was not very sensitive to different creep strain histories, except near the transition temperature of the vessel steel (900K to 1000K). Statistical analyses of creep failure data from four independent sources indicate that these data may be pooled, with a spline point at 1000K. We found the Manson-Haferd parameter to have better failure predictive capability than the Larson-Miller parameter for the data studied. (orig.)

  19. Cluster-based analysis of multi-model climate ensembles

    Science.gov (United States)

    Hyde, Richard; Hossaini, Ryan; Leeson, Amber A.

    2018-06-01

    Clustering - the automated grouping of similar data - can provide powerful and unique insight into large and complex data sets, in a fast and computationally efficient manner. While clustering has been used in a variety of fields (from medical image processing to economics), its application within atmospheric science has been fairly limited to date, and the potential benefits of the application of advanced clustering techniques to climate data (both model output and observations) have yet to be fully realised. In this paper, we explore the specific application of clustering to a multi-model climate ensemble. We hypothesise that clustering techniques can provide (a) a flexible, data-driven method of testing model-observation agreement and (b) a mechanism with which to identify model development priorities. We focus our analysis on chemistry-climate model (CCM) output of tropospheric ozone - an important greenhouse gas - from the recent Atmospheric Chemistry and Climate Model Intercomparison Project (ACCMIP). Tropospheric column ozone from the ACCMIP ensemble was clustered using the Data Density based Clustering (DDC) algorithm. We find that a multi-model mean (MMM) calculated using members of the most-populous cluster identified at each location offers a reduction of up to ˜ 20 % in the global absolute mean bias between the MMM and an observed satellite-based tropospheric ozone climatology, with respect to a simple, all-model MMM. On a spatial basis, the bias is reduced at ˜ 62 % of all locations, with the largest bias reductions occurring in the Northern Hemisphere - where ozone concentrations are relatively large. However, the bias is unchanged at 9 % of all locations and increases at 29 %, particularly in the Southern Hemisphere. The latter demonstrates that although cluster-based subsampling acts to remove outlier model data, such data may in fact be closer to observed values in some locations. We further demonstrate that clustering can provide a viable and
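
    A minimal sketch of the cluster-based multi-model mean idea, using KMeans as a stand-in for the DDC algorithm actually used in the paper (data, cluster count and "observations" below are toy placeholders): at each grid point, cluster the ensemble members' values and average only the most populous cluster.

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(4)
        n_models, n_points = 15, 100
        ozone = rng.normal(30.0, 3.0, size=(n_models, n_points))   # toy column ozone (DU)
        ozone[-2:] += 10.0                                          # two outlier models

        mmm_all = ozone.mean(axis=0)                                # simple all-model MMM
        mmm_cluster = np.empty(n_points)
        for j in range(n_points):
            labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(ozone[:, [j]])
            biggest = np.bincount(labels).argmax()                  # most populous cluster
            mmm_cluster[j] = ozone[labels == biggest, j].mean()

        obs = rng.normal(30.0, 1.0, size=n_points)                  # toy "observed" climatology
        print("mean |bias|, all-model MMM:    ", np.mean(np.abs(mmm_all - obs)))
        print("mean |bias|, cluster-based MMM:", np.mean(np.abs(mmm_cluster - obs)))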

  20. Evaluation of nuclear reactor based activation analysis techniques

    International Nuclear Information System (INIS)

    Obrusnik, I.; Kucera, J.

    1977-09-01

    A survey is presented of the basic types of activation analysis applied in environmental control. Reactor neutron activation analysis is described (including the reactor as a neutron source, sample activation in the reactor, methodology of neutron activation analysis, sample transport into the reactor and sample packaging after irradiation, instrumental activation analysis with radiochemical separation, data measurement and evaluation, sampling and sample preparation). Sources of environmental contamination with trace elements, sampling and sample analysis by neutron activation are described. The analysis is described of soils, waters and biological materials. Methods are shown of evaluating neutron activation analysis results and of their interpretation for purposes of environmental control. (J.B.)

  1. Conceptual bases of Christian, faith-based substance abuse rehabilitation programs: qualitative analysis of staff interviews.

    Science.gov (United States)

    McCoy, Lisa K; Hermos, John A; Bokhour, Barbara G; Frayne, Susan M

    2004-09-01

    Faith-based substance abuse rehabilitation programs provide residential treatment for many substance abusers. To determine key governing concepts of such programs, we conducted semi-structured interviews with a sample of eleven clinical and administrative staff referred to us by program directors at six Evangelical Christian, faith-based, residential rehabilitation programs representing two large, nationwide networks. Qualitative analysis using grounded theory methods examined how spirituality is incorporated into treatment and elicited key theories of addiction and recovery. Although containing comprehensive secular components, the core activities are strongly rooted in a Christian belief system that informs their understanding of addiction and recovery and drives the treatment format. These governing conceptions - that addiction stems from attempts to fill a spiritual void through substance use, and that recovery comes through salvation and a long-term relationship with God - provide an explicit, theory-driven model upon which they base their core treatment activities. Knowledge of these core concepts and practices should be helpful to clinicians in considering referrals to faith-based recovery programs.

  2. Feature-Based Analysis of Plasma-Based Particle Acceleration Data

    Energy Technology Data Exchange (ETDEWEB)

    Rubel, Oliver [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Geddes, Cameron G. R. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Chen, Min [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Cormier-Michel, Estelle [Tech-X Corp., Boulder, CO (United States); Bethel, E. Wes [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2014-02-01

    Plasma-based particle accelerators can produce and sustain thousands of times stronger acceleration fields than conventional particle accelerators, providing a potential solution to the problem of the growing size and cost of conventional particle accelerators. To facilitate scientific knowledge discovery from the ever growing collections of accelerator simulation data generated by accelerator physicists to investigate next-generation plasma-based particle accelerator designs, we describe a novel approach for automatic detection and classification of particle beams and beam substructures due to temporal differences in the acceleration process, here called acceleration features. The automatic feature detection in combination with a novel visualization tool for fast, intuitive, query-based exploration of acceleration features enables an effective top-down data exploration process, starting from a high-level, feature-based view down to the level of individual particles. We describe the application of our analysis in practice to analyze simulations of single pulse and dual and triple colliding pulse accelerator designs, and to study the formation and evolution of particle beams, to compare substructures of a beam and to investigate transverse particle loss.

  3. Disposal criticality analysis for aluminum-based DOE fuels

    International Nuclear Information System (INIS)

    Davis, J.W.; Gottlieb, P.

    1997-11-01

    This paper describes the disposal criticality analysis for canisters containing aluminum-based Department of Energy fuels from research reactors. Different canisters were designed for disposal of highly enriched uranium (HEU) and medium enriched uranium (MEU) fuel. In addition to the standard criticality concerns in storage and transportation, such as flooding, the disposal criticality analysis must consider the degradation of the fuel and components within the waste package. Massachusetts Institute of Technology (MIT) U-Al fuel with 93.5% enriched uranium and Oak Ridge Research Reactor (ORR) U-Si-Al fuel with 21% enriched uranium are representative of the HEU and MEU fuel inventories, respectively. Conceptual canister designs with 64 MIT assemblies (16/layer, 4 layers) or 40 ORR assemblies (10/layer, 4 layers) were developed for these fuel types. Borated stainless steel plates were incorporated into a stainless steel internal basket structure within a 439 mm OD, 15 mm thick XM-19 canister shell. The Codisposal waste package contains 5 HLW canisters (represented by 5 Defense Waste Processing Facility canisters from the Savannah River Site) with the fuel canister placed in the center. It is concluded that without the presence of a fairly insoluble neutron absorber, the long-term action of infiltrating water can lead to a small, but significant, probability of criticality for both the HEU and MEU fuels. The use of 1.5kg of Gd distributed throughout the MIT fuel and the use of carbon steels for the structural basket or 1.1 kg of Gd distributed in the ORR fuel will reduce the probability of criticality to virtually zero for both fuels

  4. Geography-based structural analysis of the Internet

    Energy Technology Data Exchange (ETDEWEB)

    Kasiviswanathan, Shiva [Los Alamos National Laboratory]; Eidenbenz, Stephan [Los Alamos National Laboratory]; Yan, Guanhua [Los Alamos National Laboratory]

    2010-01-01

    In this paper, we study some geographic aspects of the Internet. We base our analysis on a large set of geolocated IP hop-level session data (including about 300,000 backbone routers, 150 million end hosts, and 1 billion sessions) that we synthesized from a variety of different input sources such as US census data, computer usage statistics, Internet market share data, IP geolocation data sets, CAIDA's Skitter data set for backbone connectivity, and BGP routing tables. We use this model to perform a nationwide and statewide geographic analysis of the Internet. Our main observations are: (1) There is a dominant coast-to-coast pattern in the US Internet traffic. In fact, in many instances even if the end-devices are not near either coast, still the traffic between them takes a long detour through the coasts. (2) More than half of the Internet paths are inflated by 100% or more compared to their corresponding geometric straight-line distance. This circuitousness makes the average ratio between the routing distance and geometric distance large (around 10). (3) The weighted mean hop count is around 5, but the hop counts are very loosely correlated with the distances. The weighted mean AS count (number of ASes traversed) is around 3. (4) The AS size and the AS location number distributions are heavy-tailed and strongly correlated. Most of the ASes are medium sized and there is a wide variability in the geographic dispersion size (measured in terms of the convex hull area) of these ASes.

  5. Design of process displays based on risk analysis techniques

    International Nuclear Information System (INIS)

    Lundtang Paulsen, J.

    2004-05-01

    This thesis deals with the problems of designing display systems for process plants. We state the reasons why it is important to discuss information systems for operators in a control room, especially in view of the enormous amount of information available in computer-based supervision systems. The state of the art is discussed: How are supervision systems designed today and why? Which strategies are used? What kind of research is going on? Four different plants and their display systems, designed by the author, are described and discussed. Next we outline different methods for eliciting knowledge of a plant, particularly the risks, which is necessary information for the display designer. A chapter presents an overview of the various types of operation references: constitutive equations, set points, design parameters, component characteristics etc., and their validity in different situations. On the basis of her experience with the design of display systems, with risk analysis methods, and from 8 years as an engineer-on-shift at a research reactor, the author developed a method to elicit the information necessary to the operator. The method, a combination of a Goal-Tree and a Fault-Tree, is described in some detail. Finally we address the problem of where to put the dot and the lines: when all information is on the table, how should it be presented most adequately. Included as an appendix is a paper concerning the analysis of maintenance reports and visualization of their information. The purpose was to develop a software tool for maintenance supervision of components in a nuclear power plant. (au)

  6. Improved mesh based photon sampling techniques for neutron activation analysis

    International Nuclear Information System (INIS)

    Relson, E.; Wilson, P. P. H.; Biondo, E. D.

    2013-01-01

    The design of fusion power systems requires analysis of neutron activation of large, complex volumes, and the resulting particles emitted from these volumes. Structured mesh-based discretization of these problems allows for improved modeling in these activation analysis problems. Finer discretization of these problems results in large computational costs, which drives the investigation of more efficient methods. Within an ad hoc subroutine of the Monte Carlo transport code MCNP, we implement sampling of voxels and photon energies for volumetric sources using the alias method. The alias method enables efficient sampling of a discrete probability distribution, and operates in O(1) time, whereas the simpler direct discrete method requires O(log(n)) time. By using the alias method, voxel sampling becomes a viable alternative to sampling space with the O(1) approach of uniformly sampling the problem volume. Additionally, with voxel sampling it is straightforward to introduce biasing of volumetric sources, and we implement this biasing of voxels as an additional variance reduction technique that can be applied. We verify our implementation and compare the alias method, with and without biasing, to direct discrete sampling of voxels, and to uniform sampling. We study the behavior of source biasing in a second set of tests and find trends between improvements and source shape, material, and material density. Overall, however, the magnitude of improvements from source biasing appears to be limited. Future work will benefit from the implementation of efficient voxel sampling - particularly with conformal unstructured meshes where the uniform sampling approach cannot be applied. (authors)
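
    A minimal sketch of the alias method itself (Vose's construction) is given below; the voxel weights are hypothetical and the code is illustrative only, not the MCNP subroutine described above.

```python
# Minimal sketch of the alias method for O(1) sampling of a discrete
# probability distribution (e.g., relative voxel source strengths).
import random

def build_alias_table(probs):
    n = len(probs)
    scaled = [p * n for p in probs]
    prob, alias = [0.0] * n, [0] * n
    small = [i for i, p in enumerate(scaled) if p < 1.0]
    large = [i for i, p in enumerate(scaled) if p >= 1.0]
    while small and large:
        s, l = small.pop(), large.pop()
        prob[s], alias[s] = scaled[s], l
        scaled[l] -= 1.0 - scaled[s]
        (small if scaled[l] < 1.0 else large).append(l)
    for leftover in small + large:        # numerical remainders
        prob[leftover] = 1.0
    return prob, alias

def alias_sample(prob, alias):
    i = random.randrange(len(prob))       # one uniform integer
    return i if random.random() < prob[i] else alias[i]   # one uniform float

weights = [0.1, 0.4, 0.3, 0.2]            # hypothetical voxel weights
prob, alias = build_alias_table(weights)
draws = [alias_sample(prob, alias) for _ in range(10000)]
print([round(draws.count(k) / len(draws), 3) for k in range(4)])
```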

  7. Retrospective analysis of 'gamma distribution' based IMRT QA criteria

    International Nuclear Information System (INIS)

    Wen, C.; Chappell, R.A.

    2010-01-01

    Full text: IMRT has been implemented into clinical practice at Royal Hobart Hospital (RHH) since mid 2006 for treating patients with Head and Neck (H and N) or prostate tumours. A local quality assurance (QA) acceptance criterion based on the 'gamma distribution' for approving IMRT plans was developed and implemented in early 2007. A retrospective analysis of this criterion over 194 clinical cases will be presented. The RHH IMRT criterion was established with the assumption that the gamma distribution obtained through inter-comparison of 2D dose maps between planned and delivered doses is governed by a positive-half normal distribution. A commercial system, MapCheck, was used for 2D dose map comparison with a built-in gamma analysis tool. A gamma distribution histogram was generated and recorded for all cases. By retrospectively analysing those distributions using a curve fitting technique, a statistical gamma distribution can be obtained and evaluated. This analytical result can be used for future IMRT planning and treatment delivery. The analyses indicate that the gamma distributions obtained through MapCheck conform well to the assumed distribution, particularly for prostate cases. The applied pass/fail criterion is not overly sensitive to 'false fails' but can be further tightened for smaller fields, while for the larger fields found in both H and N and prostate cases the criterion was correctly applied. Non-uniform distribution of detectors in MapCheck and the experience level of planners are two major factors contributing to variation in gamma distributions among clinical cases. This criterion, derived from clinical statistics, is superior to and more accurate than a single-valued criterion for the IMRT QA acceptance procedure. (author)
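
    As an illustration of the statistical idea above (a positive-half normal model for gamma values), the sketch below fits a half-normal distribution to synthetic per-plan gamma data with SciPy; the numbers are placeholders, not RHH clinical data.

```python
# Minimal sketch: fit a half-normal distribution to gamma values and turn
# the fit into a statistical acceptance criterion. Data are synthetic.
import numpy as np
from scipy.stats import halfnorm

gamma_values = np.abs(np.random.default_rng(5).normal(0, 0.4, 2000))
loc, scale = halfnorm.fit(gamma_values, floc=0.0)
pass_fraction = halfnorm.cdf(1.0, loc=loc, scale=scale)  # gamma <= 1 criterion
print(f"scale={scale:.2f}, predicted pass rate at gamma<=1: {pass_fraction:.1%}")
```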

  8. Markov analysis of different standby computer based systems

    International Nuclear Information System (INIS)

    Srinivas, G.; Guptan, Rajee; Mohan, Nalini; Ghadge, S.G.; Bajaj, S.S.

    2006-01-01

    As against the conventional triplicated systems of hardware and the generation of control signals for the actuator elements by means of redundant hardwired median circuits, employed in the early Indian PHWRs, a new approach of generating control signals based on software by a redundant system of computers is introduced in the advanced/current generation of Indian PHWRs. Reliability is increased by fault diagnostics and automatic switch-over of all the loads to one computer in case of total failure of the other computer. Independent processing by a redundant CPU in each system enables inter-comparison to quickly identify system failure, in addition to the other self-diagnostic features provided. Combinatorial models such as reliability block diagrams and fault trees are frequently used to predict the reliability, maintainability and safety of complex systems. Unfortunately, these methods cannot accurately model dynamic system behavior. Because of its unique ability to handle dynamic cases, Markov analysis can be a powerful tool in the reliability, maintainability and safety (RMS) analyses of dynamic systems. A Markov model breaks the system configuration into a number of states. Each of these states is connected to all other states by transition rates. It then utilizes transition matrices to evaluate the reliability and safety of the systems, either through matrix manipulation or other analytical solution methods, such as Laplace transforms. Thus, Markov analysis is a powerful reliability, maintainability and safety analysis tool. It allows the analyst to model complex, dynamic, highly distributed, fault-tolerant systems that would otherwise be very difficult to model using classical techniques like the fault tree method. The Dual Processor Hot Standby Process Control System (DPHS-PCS) and the Computerized Channel Temperature Monitoring System (CCTM) are typical examples of hot standby systems in the Indian PHWRs. While such systems currently in use in Indian PHWR
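
    A minimal sketch of the kind of state-transition model described above is shown below: a dual hot-standby system with assumed failure and repair rates, solved with a matrix exponential. It is illustrative only and not the actual DPHS-PCS or CCTM model.

```python
# Minimal sketch: continuous-time Markov model of a dual hot-standby system.
# States: 0 = both units up, 1 = one unit up, 2 = system down.
import numpy as np
from scipy.linalg import expm

lam, mu = 1e-4, 1e-2          # assumed per-hour failure and repair rates
Q = np.array([
    [-2 * lam,       2 * lam,  0.0],   # both up -> one up
    [      mu, -(lam + mu),    lam],   # one up  -> repaired / down
    [     0.0,          mu,    -mu],   # down    -> one up after repair
])

p0 = np.array([1.0, 0.0, 0.0])         # start with both units healthy
for t in (10, 100, 1000, 8760):        # hours
    pt = p0 @ expm(Q * t)              # state probabilities at time t
    print(f"t={t:5d} h  P(system down) = {pt[2]:.3e}")
```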

  9. Application of intelligence based uncertainty analysis for HLW disposal

    International Nuclear Information System (INIS)

    Kato, Kazuyuki

    2003-01-01

    Safety assessment for geological disposal of high level radioactive waste inevitably involves factors that cannot be specified in a deterministic manner. These are namely: (1) 'variability' that arises from the stochastic nature of the processes and features considered, e.g., distribution of canister corrosion times and spatial heterogeneity of a host geological formation; (2) 'ignorance' due to incomplete or imprecise knowledge of the processes and conditions expected in the future, e.g., uncertainty in the estimation of solubilities and sorption coefficients for important nuclides. In many cases, a decision in assessment, e.g., selection among model options or determination of a parameter value, is subject to both variability and ignorance in a combined form. It is clearly important to evaluate the influences of both variability and ignorance on the result of a safety assessment in a consistent manner. We developed a unified methodology to handle variability and ignorance by using probabilistic and possibilistic techniques, respectively. The methodology has been applied to safety assessment of geological disposal of high level radioactive waste. Uncertainties associated with scenarios, models and parameters were defined in terms of fuzzy membership functions derived through a series of interviews with the experts, while variability was formulated by means of probability density functions (pdfs) based on the available data set. The exercise demonstrated the applicability of the new methodology and, in particular, its advantage in quantifying uncertainties based on expert opinion and in providing information on the dependence of the assessment result on the level of conservatism. In addition, it was also shown that sensitivity analysis could identify key parameters in reducing uncertainties associated with the overall assessment. The above information can be used to support the judgment process and guide the process of disposal system development in optimization of protection against

  10. Survey of sampling-based methods for uncertainty and sensitivity analysis

    International Nuclear Information System (INIS)

    Helton, J.C.; Johnson, J.D.; Sallaberry, C.J.; Storlie, C.B.

    2006-01-01

    Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (i) definition of probability distributions to characterize epistemic uncertainty in analysis inputs, (ii) generation of samples from uncertain analysis inputs, (iii) propagation of sampled inputs through an analysis, (iv) presentation of uncertainty analysis results, and (v) determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two-dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top down coefficient of concordance, and variance decomposition.
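
    The sketch below illustrates steps (i), (iii) and (v) of this workflow on a toy response function, using Monte Carlo sampling and Spearman rank correlations as the sensitivity measure; the distributions and model are assumptions for illustration.

```python
# Minimal sketch of a sampling-based uncertainty/sensitivity workflow:
# sample uncertain inputs, propagate through a toy model, then rank the
# inputs by Spearman rank correlation with the output.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n = 1000
# (i) characterize input uncertainty with simple (assumed) distributions
x1 = rng.uniform(0.5, 1.5, n)
x2 = rng.normal(10.0, 2.0, n)
x3 = rng.lognormal(0.0, 0.5, n)

# (iii) propagate the samples through the analysis (toy response function)
y = x1 ** 2 + 0.1 * x2 + np.sqrt(x3)

# (v) sensitivity: rank correlation of each input with the output
for name, x in (("x1", x1), ("x2", x2), ("x3", x3)):
    rho, _ = spearmanr(x, y)
    print(f"{name}: Spearman rho = {rho:+.2f}")
```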

  11. Survey of sampling-based methods for uncertainty and sensitivity analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Jay Dean; Helton, Jon Craig; Sallaberry, Cedric J.; Storlie, Curt B. (Colorado State University, Fort Collins, CO)

    2006-06-01

    Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (1) Definition of probability distributions to characterize epistemic uncertainty in analysis inputs, (2) Generation of samples from uncertain analysis inputs, (3) Propagation of sampled inputs through an analysis, (4) Presentation of uncertainty analysis results, and (5) Determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top down coefficient of concordance, and variance decomposition.

  12. Voxel-based morphometry and voxel-based diffusion tensor analysis in amyotrophic lateral sclerosis

    International Nuclear Information System (INIS)

    Chen Zhiye; Ma Lin; Lou Xin; Wang Yan

    2010-01-01

    Objective: To evaluate gray matter volume, white matter volume and FA value changes in amyotrophic lateral sclerosis (ALS) patients by voxel-based morphometry (VBM) and voxel-based diffusion tensor analysis (VBDTA). Methods: Thirty-nine definite or probable ALS patients diagnosed by the El Escorial standard and 39 healthy controls were recruited and underwent conventional MR scans and neuropsychological evaluation. The 3D FSPGR T1WI and DTI data were collected on a GE Medical 3.0 T MRI system. The 3D T1 structural images were normalized, segmented and smoothed, and then VBM analysis was performed. DTI data were acquired from 76 healthy controls, and an FA map template was made. FA maps generated from the DTI data of ALS patients and healthy controls were normalized to the FA map template for voxel-based analysis. ANCOVA was applied, controlling for age and total intracranial volume for VBM and for age for VBDTA. A statistical threshold of P<0.01 (uncorrected) and a cluster-level threshold of more than 20 contiguous voxels determined significance. Results: Statistical results showed no significant difference in the global volumes of gray matter and white matter, total intracranial volumes and gray matter fraction between ALS patients and healthy controls, but the white matter fraction of ALS patients (0.29 ± 0.02) was significantly smaller than that of healthy controls (0.30 ± 0.02) (P=0.003). There was a significant reduction of gray matter volumes in bilateral superior frontal gyri and precentral gyri, right middle frontal gyrus, right middle and inferior temporal gyrus, left superior occipital gyrus and cuneus and left insula in ALS patients when compared with healthy controls; the regional reduction of white matter volumes in ALS patients was mainly located in the genu of the corpus callosum, bilateral medial frontal gyri, paracentral lobule and insula, right superior and middle frontal gyrus and left postcentral gyrus. VBDTA showed decrease in FA values in bilateral

  13. Developing a Self-Report-Based Sequential Analysis Method for Educational Technology Systems: A Process-Based Usability Evaluation

    Science.gov (United States)

    Lin, Yi-Chun; Hsieh, Ya-Hui; Hou, Huei-Tse

    2015-01-01

    The development of a usability evaluation method for educational systems or applications, called the self-report-based sequential analysis, is described herein. The method aims to extend the current practice by proposing self-report-based sequential analysis as a new usability method, which integrates the advantages of self-report in survey…

  14. Analysis of 137Cs in fission based neutron dosimetry

    International Nuclear Information System (INIS)

    Peltonen, T.

    1995-11-01

    137Cs analysis is based on dissolving an irradiated fission dosimeter and chemically separating the cesium from the rest of the fission material. The samples consisted of uranium and neptunium in the form of metal or oxide. The uranium samples were dissolved in nitric acid and the neptunium samples in a mixture of nitric acid and chloric acid with addition of hydrogen peroxide. Cs was precipitated into a mixture of ammonium molybdophosphate and cellulose powder. A measurement preparation was made from the precipitate and covered with polyethylene plastic. Since fission products other than cesium were also precipitated from the more recently irradiated samples, the activity measurements could not be carried out with a NaI(Tl) cavity crystal, but had to be made with a less efficient but more selective germanium semiconductor crystal. The method is well suited for 137Cs determination, especially for older dosimeters where the more short-lived fission products have decayed. (orig.) (6 refs., 7 figs., 7 tabs.)

  15. CAD-Based Shielding Analysis for ITER Port Diagnostics

    Directory of Open Access Journals (Sweden)

    Serikov Arkady

    2017-01-01

    Full Text Available Radiation shielding analysis conducted in support of design development of the contemporary diagnostic systems integrated inside the ITER ports relies on the use of CAD models. This paper presents the CAD-based MCNP Monte Carlo radiation transport and activation analyses for the Diagnostic Upper and Equatorial Port Plugs (UPP #3 and EPP #8, #17). The creation process of the complicated 3D MCNP models of the diagnostic systems was substantially accelerated by application of the CAD-to-MCNP converter programs MCAM and McCad. High performance computing resources of the Helios supercomputer made it possible to speed up the MCNP parallel transport calculations with the MPI/OpenMP interface. The shielding solutions found could be universal, reducing ports R&D costs. The shield block behind the Tritium and Deposit Monitor (TDM) optical box was added to study its influence on the Shut-Down Dose Rate (SDDR) in the Port Interspace (PI) of EPP#17. The influence of neutron streaming along the Lost Alpha Monitor (LAM) on the neutron energy spectra calculated in the Tangential Neutron Spectrometer (TNS) of EPP#8 was also studied. For the UPP#3 with Charge eXchange Recombination Spectroscopy (CXRS-core), an excessive neutron streaming along the CXRS shutter was found, which should be prevented in a further design iteration.

  16. Automatic comic page image understanding based on edge segment analysis

    Science.gov (United States)

    Liu, Dong; Wang, Yongtao; Tang, Zhi; Li, Luyuan; Gao, Liangcai

    2013-12-01

    Comic page image understanding aims to analyse the layout of the comic page images by detecting the storyboards and identifying the reading order automatically. It is the key technique to produce the digital comic documents suitable for reading on mobile devices. In this paper, we propose a novel comic page image understanding method based on edge segment analysis. First, we propose an efficient edge point chaining method to extract Canny edge segments (i.e., contiguous chains of Canny edge points) from the input comic page image; second, we propose a top-down scheme to detect line segments within each obtained edge segment; third, we develop a novel method to detect the storyboards by selecting the border lines and further identify the reading order of these storyboards. The proposed method is performed on a data set consisting of 2000 comic page images from ten printed comic series. The experimental results demonstrate that the proposed method achieves satisfactory results on different comics and outperforms the existing methods.

  17. LSD-based analysis of high-resolution stellar spectra

    Science.gov (United States)

    Tsymbal, V.; Tkachenko, A.; Van Reeth, T.

    2014-11-01

    We present a generalization of the method of least-squares deconvolution (LSD), a powerful tool for extracting high S/N average line profiles from stellar spectra. The method is generalized by extending it towards multiprofile LSD and by introducing the possibility to correct the line strengths from the initial mask. We illustrate the new approach with two examples: (a) the detection of asteroseismic signatures from low S/N spectra of single stars, and (b) the disentangling of spectra of multiple stellar objects. The analysis is applied to spectra obtained with 2-m class telescopes in the course of spectroscopic ground-based support for space missions such as CoRoT and Kepler. Usually, rather high S/N is required, so smaller telescopes can only compete successfully with more advanced ones when one can apply a technique that enables a remarkable increase in the S/N of the spectra which they observe. Since the LSD profiles have the potential to reconstruct what is common to all the spectral line profiles, the technique should have a particular practical application to faint stars observed with 2-m class telescopes and whose spectra show remarkable LPVs.
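
    A minimal sketch of the basic (single-profile) LSD step is given below: the observed spectrum is modelled as a line mask convolved with one mean profile, which is recovered by linear least squares. Line positions, weights and the synthetic profile are assumptions for illustration; the multiprofile and mask-correction extensions of the paper are not included.

```python
# Minimal sketch of least-squares deconvolution (LSD) on toy data: recover
# the common mean line profile Z from a spectrum built as mask * Z + noise.
import numpy as np

n_pix, n_prof = 400, 41                       # spectrum pixels, profile bins
line_pos = [60, 150, 230, 310]                # assumed line positions (pixels)
line_wgt = [0.8, 0.5, 1.0, 0.3]               # assumed line strengths

# Mask matrix M (n_pix x n_prof): each line copies Z at its position
M = np.zeros((n_pix, n_prof))
for pos, w in zip(line_pos, line_wgt):
    for k in range(n_prof):
        M[pos - n_prof // 2 + k, k] += w

# Synthesize a noisy spectrum from a Gaussian "true" profile
v = np.arange(n_prof) - n_prof // 2
z_true = -0.4 * np.exp(-(v / 5.0) ** 2)
spectrum = M @ z_true + np.random.default_rng(2).normal(0, 0.02, n_pix)

# LSD: Z = argmin ||M Z - spectrum||^2
z_lsd, *_ = np.linalg.lstsq(M, spectrum, rcond=None)
print(np.round(z_lsd[::8], 3))                # recovered mean profile samples
```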

  18. Design Criteria for Suspended Pipelines Based on Structural Analysis

    Directory of Open Access Journals (Sweden)

    Mariana Simão

    2016-06-01

    Full Text Available Mathematical models have become the target of numerous attempts to obtain results that can be extrapolated to the study of hydraulic pressure infrastructures associated with different engineering demands. Simulation analyses based on finite element method (FEM) models are used to determine the vulnerability of hydraulic systems under different types of actions (e.g., natural events and pressure variation). As part of the numerical simulation of a suspended pipeline, the adequacy of existing supports to sustain the pressure loads is verified. With a certain value of load application, the pipeline is forced to sway sideways, possibly lifting up off its deadweight supports. Thus, identifying the frequency, consequences and predictability of accidental events is of extreme importance. This study focuses on the stability of vertical supports associated with extreme transient loads and on how a pipeline design can be improved using FEM simulations, in the design stage, to avoid accidents. Distributions of bending moments, axial forces, displacements and deformations along the pipeline and supports are studied for a set of important parametric variations. A good representation of the pipeline displacements is obtained using FEM.

  19. MESA: Message-Based System Analysis Using Runtime Verification

    Science.gov (United States)

    Shafiei, Nastaran; Tkachuk, Oksana; Mehlitz, Peter

    2017-01-01

    In this paper, we present a novel approach and framework for run-time verification of large, safety-critical messaging systems. This work was motivated by verifying the System Wide Information Management (SWIM) project of the Federal Aviation Administration (FAA). SWIM provides live air traffic, site and weather data streams for the whole National Airspace System (NAS), which can easily amount to several hundred messages per second. Such safety-critical systems cannot be instrumented; therefore, verification and monitoring have to happen using a nonintrusive approach, by connecting to a variety of network interfaces. Due to the large number of potential properties to check, the verification framework needs to support efficient formulation of properties with a suitable Domain Specific Language (DSL). Our approach is to utilize a distributed system that is geared towards connectivity and scalability and to interface it at the message queue level to a powerful verification engine. We implemented our approach in the tool called MESA: Message-Based System Analysis, which leverages the open source projects RACE (Runtime for Airspace Concept Evaluation) and TraceContract. RACE is a platform for instantiating and running highly concurrent and distributed systems and enables connectivity to SWIM and scalability. TraceContract is a runtime verification tool that allows for checking traces against properties specified in a powerful DSL. We applied our approach to verify a SWIM service against several requirements. We found errors such as duplicate and out-of-order messages.
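
    As a plain illustration of two of the properties mentioned above (duplicate and out-of-order messages), the sketch below checks a tapped message stream in ordinary Python; it is not the TraceContract DSL, and the message format is hypothetical.

```python
# Minimal sketch of a non-intrusive trace monitor for duplicate and
# out-of-order messages tapped from a queue. Message format is assumed.
def check_stream(messages):
    """messages: iterable of (msg_id, timestamp) pairs."""
    seen, prev_ts, errors = set(), None, []
    for msg_id, ts in messages:
        if msg_id in seen:
            errors.append(f"duplicate message: {msg_id}")
        seen.add(msg_id)
        if prev_ts is not None and ts < prev_ts:
            errors.append(f"out-of-order message: {msg_id} ({ts} < {prev_ts})")
        prev_ts = ts
    return errors

trace = [("a1", 1.0), ("a2", 2.0), ("a2", 2.1), ("a3", 1.5)]
print(check_stream(trace))   # reports the duplicate a2 and out-of-order a3
```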

  20. Crater ejecta scaling laws: fundamental forms based on dimensional analysis

    International Nuclear Information System (INIS)

    Housen, K.R.; Schmidt, R.M.; Holsapple, K.A.

    1983-01-01

    A model of crater ejecta is constructed using dimensional analysis and a recently developed theory of energy and momentum coupling in cratering events. General relations are derived that provide a rationale for scaling laboratory measurements of ejecta to larger events. Specific expressions are presented for ejection velocities and ejecta blanket profiles in two limiting regimes of crater formation: the so-called gravity and strength regimes. In the gravity regime, ejecta velocities at geometrically similar launch points within craters vary as the square root of the product of crater radius and gravity. This relation implies geometric similarity of ejecta blankets. That is, the thickness of an ejecta blanket as a function of distance from the crater center is the same for all sizes of craters if the thickness and range are expressed in terms of crater radii. In the strength regime, ejecta velocities are independent of crater size. Consequently, ejecta blankets are not geometrically similar in this regime. For points away from the crater rim the expressions for ejecta velocities and thickness take the form of power laws. The exponents in these power laws are functions of an exponent, α, that appears in crater radius scaling relations. Thus experimental studies of the dependence of crater radius on impact conditions determine scaling relations for ejecta. Predicted ejection velocities and ejecta-blanket profiles, based on measured values of α, are compared to existing measurements of velocities and debris profiles.
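
    A minimal sketch of the gravity-regime velocity scaling stated above (ejection velocity proportional to the square root of gravity times crater radius) is shown below; the proportionality constant and crater sizes are arbitrary illustrative values.

```python
# Minimal sketch of gravity-regime ejecta scaling: v ~ k * sqrt(g * R) at
# geometrically similar launch points. Constant k and radii are illustrative.
import math

def ejection_velocity(g, radius, k=1.0):
    """Gravity-regime scaling; k is an assumed dimensionless constant."""
    return k * math.sqrt(g * radius)

# Scale a laboratory measurement up to a larger event
v_lab = ejection_velocity(g=9.81, radius=0.1)      # 10 cm lab crater
v_big = ejection_velocity(g=9.81, radius=1000.0)   # 1 km crater
print(v_big / v_lab)   # sqrt(1000 / 0.1) = 100x higher ejection speed
```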

  1. Analysis of trait-based models in marine ecosystems

    DEFF Research Database (Denmark)

    Heilmann, Irene Louise Torpe

    -temporal pattern formation in a predator–prey system where animals move towards higher fitness. Reaction-diffusion systems have been used extensively to describe spatio-temporal patterns in a variety of systems. However, animals rarely move completely at random, as expressed by diffusion. This has led to models...... with taxis terms, describing individuals moving in the direction of an attractant. An example is chemotaxis models, where bacteria are attracted to a chemical substance. From an evolutionary perspective, it is expected that animals act so as to optimize their fitness. Based on this principle, a predator......–prey system with fitness taxis and diffusion is proposed. Here, fitness taxis refers to animals moving towards higher values of fitness, and the specific growth rates of the populations are used as a measure of the fitness values. To determine the conditions for pattern formation, a linear stability analysis

  2. Energy consumption quota of public buildings based on statistical analysis

    International Nuclear Information System (INIS)

    Zhao Jing; Xin Yajuan; Tong Dingding

    2012-01-01

    The establishment of building energy consumption quota as a comprehensive indicator used to evaluate the actual energy consumption level is an important measure for promoting the development of building energy efficiency. This paper focused on the determination method of the quota, and firstly introduced the procedure of establishing energy consumption quota of public buildings including four important parts: collecting data, classifying and calculating EUIs, standardizing EUIs, determining the measure method of central tendency. The paper also illustrated the standardization process of EUI by actual calculation based on the samples of 10 commercial buildings and 19 hotel buildings. According to the analysis of the frequency distribution of standardized EUIs of sample buildings and combining the characteristics of each measure method of central tendency, comprehensive application of mode and percentage rank is selected to be the best method for determining the energy consumption quota of public buildings. Finally the paper gave some policy proposals on energy consumption quota to help achieve the goal of further energy conservation. - Highlights: ► We introduce the procedure of determining energy consumption quota (ECQ). ► We illustrate the standardization process of EUI by actual calculation of samples. ► Measures of central tendency are brought into determine the ECQ. ► Comprehensive application of mode and percentage rank is the best method for ECQ. ► Punitive or incentive measures for ECQ are proposed.

  3. Defining Smart City. A Conceptual Framework Based on Keyword Analysis

    Directory of Open Access Journals (Sweden)

    Farnaz Mosannenzadeh

    2014-05-01

    Full Text Available “Smart city” is a concept that has been the subject of increasing attention in urban planning and governance during recent years. The first step to creating Smart Cities is to understand the concept. However, a brief review of the literature shows that the concept of the Smart City is a subject of controversy. Thus, the main purpose of this paper is to provide a conceptual framework to define the Smart City. To this aim, an extensive literature review was done. Then, a keyword analysis of the literature was carried out against the main research questions (why, what, who, when, where, how) and based on the three main domains involved in the policy decision making process and Smart City plan development: Academic, Industrial and Governmental. This resulted in a conceptual framework for the Smart City. The result clarifies the definition of the Smart City, while providing a framework to define each of the Smart City's sub-systems. Moreover, urban authorities can apply this framework in Smart City initiatives in order to recognize their main goals, main components, and key stakeholders.

  4. Analysis of Human Mobility Based on Cellular Data

    Science.gov (United States)

    Arifiansyah, F.; Saptawati, G. A. P.

    2017-01-01

    Nowadays not only adults but even teenagers and children have their own mobile phones. This phenomenon indicates that the mobile phone has become an important part of everyday life. Accordingly, the amount of cellular data has also increased rapidly. Cellular data is defined as the data that records communication among mobile phone users. Cellular data is easy to obtain because the telecommunications company already records it for its billing system. Billing data keeps a log of the users' cellular data usage over time. We can obtain information from the data about communication between users. Through the data visualization process, an interesting pattern can be seen in the raw cellular data, so that users can obtain prior knowledge to perform data analysis. Cellular data processing can be done using data mining to find human mobility patterns in the existing data. In this paper, we use frequent pattern mining and association rule mining to observe the relations between attributes in cellular data and then visualize them. We used the Weka toolkit for finding the rules in the data mining stage. Generally, the utilization of cellular data can provide supporting information for the decision making process and serve as a data source providing solutions and information needed by decision makers.
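
    As a small illustration of frequent pattern mining and association rules on call-record-like transactions (the paper uses Weka; this self-contained pure-Python version only demonstrates the idea), consider the sketch below; the attributes are hypothetical.

```python
# Minimal sketch of frequent-itemset mining and association rules on
# transaction-like call records. Attributes and thresholds are assumptions.
from itertools import combinations

records = [
    {"daytime", "short_call", "urban"},
    {"daytime", "long_call", "urban"},
    {"night", "short_call", "urban"},
    {"daytime", "short_call", "rural"},
    {"daytime", "short_call", "urban"},
]

def support(itemset):
    return sum(itemset <= rec for rec in records) / len(records)

items = sorted(set().union(*records))
# Frequent itemsets of size 1 and 2 (minimum support 0.4)
frequent = [frozenset(c) for k in (1, 2) for c in combinations(items, k)
            if support(set(c)) >= 0.4]

# Rules A -> B with confidence = support(A u B) / support(A)
for itemset in (s for s in frequent if len(s) == 2):
    for a in itemset:
        antecedent, consequent = {a}, set(itemset - {a})
        conf = support(itemset) / support(antecedent)
        if conf >= 0.7:
            print(f"{antecedent} -> {consequent}: "
                  f"support={support(itemset):.2f}, confidence={conf:.2f}")
```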

  5. Residual Stress Analysis Based on Acoustic and Optical Methods

    Directory of Open Access Journals (Sweden)

    Sanichiro Yoshida

    2016-02-01

    Full Text Available Co-application of acoustoelasticity and optical interferometry to residual stress analysis is discussed. The underlying idea is to combine the advantages of both methods. Acoustoelasticity is capable of evaluating a residual stress absolutely but it is a single point measurement. Optical interferometry is able to measure deformation yielding two-dimensional, full-field data, but it is not suitable for absolute evaluation of residual stresses. By theoretically relating the deformation data to residual stresses, and calibrating it with absolute residual stress evaluated at a reference point, it is possible to measure residual stresses quantitatively, nondestructively and two-dimensionally. The feasibility of the idea has been tested with a butt-jointed dissimilar plate specimen. A steel plate 18.5 mm wide, 50 mm long and 3.37 mm thick is braze-jointed to a cemented carbide plate of the same dimension along the 18.5 mm-side. Acoustoelasticity evaluates the elastic modulus at reference points via acoustic velocity measurement. A tensile load is applied to the specimen at a constant pulling rate in a stress range substantially lower than the yield stress. Optical interferometry measures the resulting acceleration field. Based on the theory of harmonic oscillation, the acceleration field is correlated to compressive and tensile residual stresses qualitatively. The acoustic and optical results show reasonable agreement in the compressive and tensile residual stresses, indicating the feasibility of the idea.

  6. Pyrosequencing Based Microbial Community Analysis of Stabilized Mine Soils

    Science.gov (United States)

    Park, J. E.; Lee, B. T.; Son, A.

    2015-12-01

    Heavy metals leached from exhausted mines have been causing severe environmental problems in nearby soils and groundwater. Environmental mitigation was performed based on heavy metal stabilization using calcite and steel slag in Korea. Since soil stabilization only temporarily immobilizes the contaminants in the soil matrix, the potential risk of re-leaching heavy metals still exists. Therefore, follow-up management of stabilized soils and the corresponding evaluation methods are required to avoid consequent contamination from the stabilized soils. In this study, microbial community analysis using pyrosequencing was performed for assessing the potential leaching of the stabilized soils. Based on rarefaction curves and the Chao1 and Shannon indices, the stabilized soil showed lower richness and diversity compared to a non-contaminated negative control. At the phylum level, as the degree of contamination increased, most phyla decreased, with the only exception of Proteobacteria, which increased. Among Proteobacteria, Gammaproteobacteria increased with heavy metal contamination. At the species level, Methylobacter tundripaludum of the Gammaproteobacteria showed the highest relative portion of the microbial community, indicating that methanotrophs may play an important role in either solubilization or immobilization of heavy metals in stabilized soils.

  7. CAD-Based Shielding Analysis for ITER Port Diagnostics

    Science.gov (United States)

    Serikov, Arkady; Fischer, Ulrich; Anthoine, David; Bertalot, Luciano; De Bock, Maartin; O'Connor, Richard; Juarez, Rafael; Krasilnikov, Vitaly

    2017-09-01

    Radiation shielding analysis conducted in support of design development of the contemporary diagnostic systems integrated inside the ITER ports relies on the use of CAD models. This paper presents the CAD-based MCNP Monte Carlo radiation transport and activation analyses for the Diagnostic Upper and Equatorial Port Plugs (UPP #3 and EPP #8, #17). The creation process of the complicated 3D MCNP models of the diagnostic systems was substantially accelerated by application of the CAD-to-MCNP converter programs MCAM and McCad. High performance computing resources of the Helios supercomputer made it possible to speed up the MCNP parallel transport calculations with the MPI/OpenMP interface. The shielding solutions found could be universal, reducing ports R&D costs. The shield block behind the Tritium and Deposit Monitor (TDM) optical box was added to study its influence on the Shut-Down Dose Rate (SDDR) in the Port Interspace (PI) of EPP#17. The influence of neutron streaming along the Lost Alpha Monitor (LAM) on the neutron energy spectra calculated in the Tangential Neutron Spectrometer (TNS) of EPP#8 was also studied. For the UPP#3 with Charge eXchange Recombination Spectroscopy (CXRS-core), an excessive neutron streaming along the CXRS shutter was found, which should be prevented in a further design iteration.

  8. Robust linear discriminant analysis with distance based estimators

    Science.gov (United States)

    Lim, Yai-Fung; Yahaya, Sharipah Soaad Syed; Ali, Hazlina

    2017-11-01

    Linear discriminant analysis (LDA) is one of the supervised classification techniques concerning the relationship between a categorical variable and a set of continuous variables. The main objective of LDA is to create a function to distinguish between populations and to allocate future observations to previously defined populations. Under the assumptions of normality and homoscedasticity, LDA yields the optimal linear discriminant rule (LDR) between two or more groups. However, the optimality of LDA relies heavily on the sample mean and pooled sample covariance matrix, which are known to be sensitive to outliers. To alleviate these problems, a new robust LDA using distance-based estimators known as the minimum variance vector (MVV) has been proposed in this study. The MVV estimators were used to replace the classical sample mean and classical sample covariance to form a robust linear discriminant rule (RLDR). A simulation and a real data study were conducted to examine the performance of the proposed RLDR, measured in terms of misclassification error rates. The computational results showed that the proposed RLDR is better than the classical LDR and comparable with the existing robust LDR.
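
    The sketch below illustrates the plug-in idea: the same linear discriminant rule is formed either from classical estimates or from robust location/scatter estimates. The paper's MVV estimators are not publicly packaged, so scikit-learn's MCD estimator is used here purely as a robust stand-in.

```python
# Minimal sketch of a linear discriminant rule with plug-in estimators.
# MCD (scikit-learn) stands in for the paper's MVV estimators.
import numpy as np
from sklearn.covariance import MinCovDet

def fit_ldr(X0, X1, robust=True):
    """Return (weight vector w, threshold c) of the linear discriminant rule."""
    if robust:
        m0, m1 = MinCovDet().fit(X0), MinCovDet().fit(X1)
        mu0, mu1 = m0.location_, m1.location_
        S = ((len(X0) * m0.covariance_ + len(X1) * m1.covariance_)
             / (len(X0) + len(X1)))
    else:
        mu0, mu1 = X0.mean(0), X1.mean(0)
        S = ((len(X0) - 1) * np.cov(X0.T) + (len(X1) - 1) * np.cov(X1.T)) \
            / (len(X0) + len(X1) - 2)
    w = np.linalg.solve(S, mu1 - mu0)
    c = w @ (mu0 + mu1) / 2.0
    return w, c

def classify(X, w, c):
    return (X @ w > c).astype(int)          # 1 -> group 1, 0 -> group 0

rng = np.random.default_rng(3)
X0 = rng.normal(0, 1, size=(100, 3))
X1 = rng.normal(1, 1, size=(100, 3))
X0[:5] += 10                                # a few outliers in group 0
w, c = fit_ldr(X0, X1)
print("misclassification rate:",
      np.mean(np.r_[classify(X0, w, c) != 0, classify(X1, w, c) != 1]))
```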

  9. Life-cycle analysis of bio-based aviation fuels.

    Science.gov (United States)

    Han, Jeongwoo; Elgowainy, Amgad; Cai, Hao; Wang, Michael Q

    2013-12-01

    Well-to-wake (WTWa) analysis of bio-based aviation fuels, including hydroprocessed renewable jet (HRJ) from various oil seeds, Fischer-Tropsch jet (FTJ) from corn-stover and co-feeding of coal and corn-stover, and pyrolysis jet from corn stover, is conducted and compared with petroleum jet. WTWa GHG emission reductions relative to petroleum jet can be 41-63% for HRJ, 68-76% for pyrolysis jet and 89% for FTJ from corn stover. The HRJ production stage dominates WTWa GHG emissions from HRJ pathways. The differences in GHG emissions from HRJ production stage among considered feedstocks are much smaller than those from fertilizer use and N2O emissions related to feedstock collection stage. Sensitivity analyses on FTJ production from coal and corn-stover are also conducted, showing the importance of biomass share in the feedstock, carbon capture and sequestration options, and overall efficiency. For both HRJ and FTJ, co-product handling methods have significant impacts on WTWa results. Copyright © 2013 Elsevier Ltd. All rights reserved.

  10. Archaeological Ceramics: Provenience Based on Neutron Activation Analysis

    International Nuclear Information System (INIS)

    Yellin, J.

    2014-01-01

    Compositions of ancient artefacts are examined for a variety of reasons including chronology, provenience, ancient technology and conservation studies. In particular, the elemental composition of archaeological pottery is of prime importance in fixing the origin of the pottery and thereby throwing light on trade routes and inter-regional connections between settlements in antiquity. An intensive program of INAA was conducted at the Institute of Archaeology, The Hebrew University of Jerusalem, during the 1970-1990s. Although that laboratory was closed, as were other similar laboratories, new INAA laboratories have sprung up. Despite advances in other methods of analysis, INAA remains the favourite and best method for pottery provenience studies. The bulk of pottery data accumulated over the past five decades is based on INAA. These data, perhaps better than data obtained by other methods, can be integrated providing care is taken in accounting for the different standards employed by different laboratories and the different degrees of precision attained. As a consequence, projects that were suspended with the closure of laboratories can sometimes be concluded by combining resources from different laboratories. Examples will be given on how data from four laboratories, Hebrew University, Lawrence Berkeley National Laboratory, Missouri University Research Reactor and the University of Texas at Austin, are utilized

  11. Iris-based medical analysis by geometric deformation features.

    Science.gov (United States)

    Ma, Lin; Zhang, D; Li, Naimin; Cai, Yan; Zuo, Wangmeng; Wang, Kuanguan

    2013-01-01

    Iris analysis studies the relationship between human health and changes in the anatomy of the iris. Whereas iris recognition focuses on modeling the overall structure of the iris, iris diagnosis emphasizes detecting and analyzing local variations in the characteristics of irises. This paper focuses on studying the geometrical structure changes in irises that are caused by gastrointestinal diseases, and on measuring the observable deformations in the geometrical structures of irises that are related to roundness, diameter and other geometric forms of the pupil and the collarette. Pupil- and collarette-based features are defined and extracted. A series of experiments are implemented on our experimental pathological iris database, including manual clustering of both normal and pathological iris images, manual classification by non-specialists, manual classification by individuals with a medical background, classification ability verification for the proposed features, and disease recognition by applying the proposed features. The results prove the effectiveness and clinical diagnostic significance of the proposed features and a reliable recognition performance for automatic disease diagnosis. Our research results offer a novel systematic perspective for iridology studies and promote the progress of both theoretical and practical work in iris diagnosis.

  12. Corpus Based Authenticity Analysis of Language Teaching Course Books

    Directory of Open Access Journals (Sweden)

    Emrah PEKSOY

    2017-12-01

    Full Text Available In this study, the resemblance of the language learning course books used in Turkey to authentic language spoken by native speakers is explored by using a corpus-based approach. For this, the 10-million-word spoken part of the British National Corpus was selected as the reference corpus. After that, all language learning course books used in high schools in Turkey were scanned and transferred to SketchEngine, an online corpus query tool. Lastly, certain grammar points were extracted first from the British National Corpus and then from the course books; similarities and differences were compared. At the end of the study, it was found that the language learning course books have little similarity to authentic language in terms of certain grammatical items and the frequency of their collocations. In this way, the points to be revised and changed were explored. In addition, this study emphasized the role of the corpus approach as a material development and analysis tool, and tested the functionality of the course books for writers and for the Ministry of National Education.

  13. Theoretical analysis of noncanonical base pairing interactions in ...

    Indian Academy of Sciences (India)

    PRAKASH KUMAR

    Noncanonical base pairs in RNA have strong structural and functional implications but are currently not considered ..... Full optimizations of the systems were also carried out using ... of the individual bases in the base pair through the equation.

  14. Complete chromogen separation and analysis in double immunohistochemical stains using Photoshop-based image analysis.

    Science.gov (United States)

    Lehr, H A; van der Loos, C M; Teeling, P; Gown, A M

    1999-01-01

    Simultaneous detection of two different antigens on paraffin-embedded and frozen tissues can be accomplished by double immunohistochemistry. However, many double chromogen systems suffer from signal overlap, precluding definite signal quantification. To separate and quantitatively analyze the different chromogens, we imported images into a Macintosh computer using a CCD camera attached to a diagnostic microscope and used Photoshop software for the recognition, selection, and separation of colors. We show here that Photoshop-based image analysis allows complete separation of chromogens not only on the basis of their RGB spectral characteristics, but also on the basis of information concerning saturation, hue, and luminosity intrinsic to the digitized images. We demonstrate that Photoshop-based image analysis provides superior results compared to color separation using bandpass filters. Quantification of the individual chromogens is then provided by Photoshop using the Histogram command, which supplies information on the luminosity (corresponding to gray levels of black-and-white images) and on the number of pixels as a measure of spatial distribution. (J Histochem Cytochem 47:119-125, 1999)
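
    The sketch below illustrates the underlying principle of separating chromogens by hue and saturation rather than RGB band-pass thresholds (the paper does this interactively in Photoshop); the hue windows and the tiny synthetic image are assumptions for illustration.

```python
# Minimal sketch: separate two chromogens by hue/saturation windows using
# scikit-image; hue windows and the synthetic 2x2 image are illustrative.
import numpy as np
from skimage.color import rgb2hsv

def separate_chromogens(rgb, hue_lo, hue_hi, min_sat=0.2):
    """Return a boolean mask of pixels whose hue falls in [hue_lo, hue_hi]."""
    hsv = rgb2hsv(rgb)
    hue, sat = hsv[..., 0], hsv[..., 1]
    return (hue >= hue_lo) & (hue <= hue_hi) & (sat >= min_sat)

# Synthetic image: brown-ish (DAB-like), red-ish (AEC-like), and white pixels
img = np.array([[[0.55, 0.35, 0.15], [0.80, 0.20, 0.25]],
                [[0.55, 0.35, 0.15], [0.95, 0.95, 0.95]]])
brown = separate_chromogens(img, 0.05, 0.15)     # assumed hue window for brown
red = separate_chromogens(img, 0.95, 1.00) | separate_chromogens(img, 0.0, 0.04)
print("brown pixels:", brown.sum(), " red pixels:", red.sum())
```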

  15. A REGION-BASED MULTI-SCALE APPROACH FOR OBJECT-BASED IMAGE ANALYSIS

    Directory of Open Access Journals (Sweden)

    T. Kavzoglu

    2016-06-01

    Full Text Available Within the last two decades, object-based image analysis (OBIA), considering objects (i.e. groups of pixels) instead of pixels, has gained popularity and attracted increasing interest. The most important stage of OBIA is image segmentation, which groups spectrally similar adjacent pixels considering not only spectral features but also spatial and textural features. Although there are several parameters (scale, shape, compactness and band weights) to be set by the analyst, the scale parameter stands out as the most important parameter in the segmentation process. Estimating the optimal scale parameter is crucially important to increase the classification accuracy, which depends on image resolution, image object size and the characteristics of the study area. In this study, two scale-selection strategies were implemented in the image segmentation process using a pan-sharpened QuickBird-2 image. The first strategy estimates optimal scale parameters for eight sub-regions. For this purpose, the local variance/rate of change (LV-RoC) graphs produced by the ESP-2 tool were analysed to determine fine, moderate and coarse scales for each region. In the second strategy, the image was segmented using the three candidate scale values (fine, moderate, coarse) determined from the LV-RoC graph calculated for the whole image. The nearest neighbour classifier was applied in all segmentation experiments and an equal number of pixels was randomly selected to calculate accuracy metrics (overall accuracy and kappa coefficient). Comparison of region-based and image-based segmentation was carried out on the classified images, and it was found that region-based multi-scale OBIA produced significantly more accurate results than image-based single-scale OBIA. The difference in classification accuracy reached 10% in terms of overall accuracy.

  16. A diffusion model-free framework with echo time dependence for free-water elimination and brain tissue microstructure characterization.

    Science.gov (United States)

    Molina-Romero, Miguel; Gómez, Pedro A; Sperl, Jonathan I; Czisch, Michael; Sämann, Philipp G; Jones, Derek K; Menzel, Marion I; Menze, Bjoern H

    2018-03-23

    The compartmental nature of brain tissue microstructure is typically studied by diffusion MRI, MR relaxometry or their correlation. Diffusion MRI relies on signal representations or biophysical models, while MR relaxometry and correlation studies are based on regularized inverse Laplace transforms (ILTs). Here we introduce a general framework for characterizing microstructure that does not depend on diffusion modeling and replaces ill-posed ILTs with blind source separation (BSS). This framework yields proton density, relaxation times, volume fractions, and signal disentanglement, allowing for separation of the free-water component. Diffusion experiments repeated for several different echo times contain entangled diffusion and relaxation compartmental information. These can be disentangled by BSS using a physically constrained nonnegative matrix factorization. Computer simulations and phantom studies, together with repeatability and reproducibility experiments, demonstrated that BSS is capable of estimating proton density, compartmental volume fractions and transverse relaxation times. In vivo results proved its potential to correct for free-water contamination and to estimate tissue parameters. Formulating the diffusion-relaxation dependence as a BSS problem introduces a new framework for studying microstructure compartmentalization, and a novel tool for free-water elimination. © 2018 International Society for Magnetic Resonance in Medicine.
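
    The sketch below illustrates the core factorization idea: signals acquired at several echo times form a matrix that factorizes into TE-dependent (T2-weighted) volume fractions and per-compartment diffusion decays. A generic scikit-learn NMF is used, without the paper's physical constraints, and all tissue parameters are assumed illustrative values.

```python
# Minimal sketch of the blind-source-separation idea: X = A S, with A the
# TE-dependent (T2-weighted) fractions and S the compartment diffusion decays.
import numpy as np
from sklearn.decomposition import NMF

b = np.linspace(0, 3000, 30) * 1e-3          # b-values in ms/um^2
TEs = np.array([60.0, 90.0, 120.0])          # echo times in ms (assumed)

# Two compartments: tissue (D=0.8 um^2/ms, T2=70 ms), free water (D=3.0, T2=500)
S_true = np.vstack([np.exp(-b * 0.8), np.exp(-b * 3.0)])
fractions = np.array([0.7, 0.3])
A_true = fractions * np.exp(-TEs[:, None] / np.array([70.0, 500.0]))

X = A_true @ S_true                          # noiseless mixed signal (TE x b)
model = NMF(n_components=2, init="nndsvda", max_iter=2000)
A_est = model.fit_transform(X)               # mixing matrix (TE x compartments)
S_est = model.components_                    # compartment diffusion signals
print(np.round(S_est / S_est[:, :1], 3))     # normalized estimated decays
```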

  17. Forecasting the density of oil futures returns using model-free implied volatility and high-frequency data

    International Nuclear Information System (INIS)

    Ielpo, Florian; Sevi, Benoit

    2013-09-01

    Forecasting the density of returns is useful for many purposes in finance, such as risk management activities, portfolio choice or derivative security pricing. Existing methods to forecast the density of returns either use prices of the asset of interest or option prices on this same asset. The latter method needs to convert the risk-neutral estimate of the density into a physical measure, which is computationally cumbersome. In this paper, we take the view of a practitioner who observes the implied volatility under the form of an index, namely the recent OVX, to forecast the density of oil futures returns for horizons going from 1 to 60 days. Using the recent methodology in Maheu and McCurdy (2011) to compute density predictions, we compare the performance of time series models using implied volatility and either daily or intra-daily futures prices. Our results indicate that models based on implied volatility deliver significantly better density forecasts at all horizons, which is in line with numerous studies delivering the same evidence for volatility point forecasts. (authors)

  18. Economic analysis of an internet-based depression prevention intervention.

    Science.gov (United States)

    Ruby, Alexander; Marko-Holguin, Monika; Fogel, Joshua; Van Voorhees, Benjamin W

    2013-09-01

    interventions like CATCH-IT appear economically viable in the context of an Accountable Care Organization. Furthermore, while the cost of implementing an effective safety protocol is proportionally high for this intervention, CATCH-IT is still significantly cheaper to implement than current treatment options. Limitations of this research included diminished participation in the follow-up surveys assessing willingness-to-pay. IMPLICATIONS FOR HEALTH CARE PROVISION AND USE AND HEALTH POLICIES: This research emphasizes that preventive interventions have the potential to be cheaper to implement than treatment protocols, even before taking into account lost productivity due to illness. Business application analyses such as this one of the CATCH-IT program highlight the importance of supporting preventive medical interventions in the same way the healthcare system already supports treatment interventions. This research is the first to analyze the economic costs of an Internet-based intervention. Further research into the costs and outcomes of such interventions is certainly warranted before they are widely adopted. Moreover, more research regarding the safety of Internet-based programs will likely be needed before they are broadly accepted.

  19. Association test based on SNP set: logistic kernel machine based test vs. principal component analysis.

    Directory of Open Access Journals (Sweden)

    Yang Zhao

    Full Text Available GWAS has greatly facilitated the discovery of risk SNPs associated with complex diseases. Traditional methods analyze SNPs individually and are limited by low power and poor reproducibility, since correction for multiple comparisons is necessary. Several methods have been proposed that group SNPs into SNP sets using biological knowledge and/or genomic features. In this article, we compare the linear kernel machine based test (LKM) and the principal components analysis based approach (PCA) using simulated datasets under scenarios of 0 to 3 causal SNPs, as well as simple and complex linkage disequilibrium (LD) structures of the simulated regions. Our simulation study demonstrates that both LKM and PCA can control the type I error at the significance level of 0.05. If the causal SNP is in strong LD with the genotyped SNPs, both PCA with a small number of principal components (PCs) and the LKM with a linear or identity-by-state kernel are valid tests. However, if the LD structure is complex, such as several LD blocks in the SNP set, or if the causal SNP is not in the LD block in which most of the genotyped SNPs reside, more PCs should be included to capture the information of the causal SNP. Simulation studies also demonstrate the ability of LKM and PCA to combine information from multiple causal SNPs and to provide increased power over individual SNP analysis. We also apply LKM and PCA to two SNP sets extracted from an actual GWAS dataset on non-small cell lung cancer.
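
    To make the two strategies concrete, the toy sketch below runs both on simulated genotypes: a linear-kernel machine score statistic Q = r'Kr (with r the centered phenotype and K built from centered genotypes), whose p-value is obtained here by permutation rather than the analytic mixture-of-chi-square null used by SKAT-type tests, and a PCA approach that regresses the phenotype on the top principal components with a likelihood-ratio test. Genotypes, effect sizes, and the number of components are invented for the example.

```python
# Toy comparison of the two SNP-set strategies on simulated genotypes:
# (i) a linear-kernel machine score statistic Q = r' K r (r = centered phenotype,
#     K = Gc Gc') with a permutation p-value, instead of the analytic
#     mixture-of-chi-square null used by SKAT-type tests, and
# (ii) PCA: a likelihood-ratio test of the top principal components in a
#     logistic regression. Genotypes and effect sizes are invented.
import numpy as np
from scipy.stats import chi2
import statsmodels.api as sm

rng = np.random.default_rng(2)
n, p = 500, 20
G = rng.binomial(2, 0.3, size=(n, p)).astype(float)   # SNP genotypes coded 0/1/2
logit = -0.5 + 0.4 * G[:, 3]                           # one causal SNP
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))      # binary phenotype

# (i) Linear-kernel machine score statistic with a permutation p-value.
Gc = G - G.mean(axis=0)
K = Gc @ Gc.T
r = y - y.mean()
Q_obs = r @ K @ r
perm = np.empty(999)
for i in range(999):
    rp = rng.permutation(r)
    perm[i] = rp @ K @ rp
p_lkm = (1 + np.sum(perm >= Q_obs)) / 1000.0
print("LKM permutation p-value:", p_lkm)

# (ii) PCA: logistic regression of y on the top principal components (LRT).
n_pc = 3
U, s, _ = np.linalg.svd(Gc, full_matrices=False)
PCs = U[:, :n_pc] * s[:n_pc]
full = sm.Logit(y, sm.add_constant(PCs)).fit(disp=0)
null = sm.Logit(y, np.ones((n, 1))).fit(disp=0)
lrt = 2 * (full.llf - null.llf)
print("PCA LRT p-value:", chi2.sf(lrt, df=n_pc))
```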

  20. Traditional Mold Analysis Compared to a DNA-based Method of Mold Analysis with Applications in Asthmatics' Homes

    Science.gov (United States)

    Traditional environmental mold analysis is based on microscopic observation and counting of mold structures collected from the air on a sticky surface, or on culturing of molds on growth media for identification and quantification. A DNA-based method of mold analysis called mol...