WorldWideScience

Sample records for robust kernel-based object

  1. Gradient descent for robust kernel-based regression

    Science.gov (United States)

    Guo, Zheng-Chu; Hu, Ting; Shi, Lei

    2018-06-01

    In this paper, we study the gradient descent algorithm generated by a robust loss function over a reproducing kernel Hilbert space (RKHS). The loss function is defined by a windowing function G and a scale parameter σ, and covers a wide range of commonly used robust losses for regression. There is still a gap between the theoretical analysis and the optimization process of empirical risk minimization based on this loss: the estimator must be globally optimal in the theoretical analysis, while the optimization method cannot guarantee the global optimality of its solutions. We aim to fill this gap by developing a novel theoretical analysis of the performance of estimators generated by the gradient descent algorithm. We demonstrate that, with an appropriately chosen scale parameter σ, the gradient update with early stopping rules can approximate the regression function. Our error analysis yields convergence in both the standard L2 norm and the strong RKHS norm, both of which are optimal in the minimax sense. We show that the scale parameter σ plays an important role in providing robustness as well as fast convergence. Numerical experiments on synthetic examples and a real data set also support our theoretical results.
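
    The abstract does not spell out the update rule; the sketch below runs kernel gradient descent with a Welsch-type windowed loss and a fixed iteration budget standing in for the early-stopping rule. The choice of loss, Gaussian kernel, bandwidth, step size and stopping point are all illustrative assumptions, not the authors' exact construction.

```python
import numpy as np

def gaussian_kernel(X, Y, gamma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def robust_kernel_gd(X, y, sigma=1.0, gamma=1.0, step=0.5, n_iter=200):
    """Gradient descent in an RKHS with a Welsch-type robust loss
    (an illustrative choice of windowing function G)."""
    n = len(y)
    K = gaussian_kernel(X, X, gamma)
    alpha = np.zeros(n)                      # f = sum_i alpha_i k(x_i, .)
    for _ in range(n_iter):                  # early stopping = fixed iteration budget
        r = K @ alpha - y                    # residuals f(x_i) - y_i
        # Welsch loss sigma^2/2 * (1 - exp(-r^2/sigma^2)); derivative w.r.t. r:
        grad_loss = r * np.exp(-(r ** 2) / sigma ** 2)
        alpha -= step * grad_loss / n        # functional (RKHS) gradient step
    return alpha

# Toy regression with gross outliers.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(80, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=80)
y[::10] += 5.0                               # heavy-tailed contamination
alpha = robust_kernel_gd(X, y, sigma=1.0)
pred = gaussian_kernel(X, X) @ alpha
print("median abs error:", np.median(np.abs(pred - np.sin(X[:, 0]))))
```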

  2. Robust video object cosegmentation.

    Science.gov (United States)

    Wang, Wenguan; Shen, Jianbing; Li, Xuelong; Porikli, Fatih

    2015-10-01

    With ever-increasing volumes of video data, automatic extraction of salient object regions has become even more significant for visual analytic solutions. This surge has also opened up opportunities for taking advantage of collective cues encapsulated in multiple videos in a cooperative manner. However, it also brings up major challenges, such as handling drastic variations in the appearance, motion pattern, and pose of foreground objects, as well as indiscriminate backgrounds. Here, we present a cosegmentation framework to discover and segment out common object regions across multiple frames and multiple videos in a joint fashion. We incorporate three types of cues, i.e., intraframe saliency, interframe consistency, and across-video similarity, into an energy optimization framework that does not make restrictive assumptions on foreground appearance and motion model, and does not require objects to be visible in all frames. We also introduce a spatio-temporal scale-invariant feature transform (SIFT) flow descriptor to integrate across-video correspondence from the conventional SIFT flow into interframe motion flow from optical flow. This novel spatio-temporal SIFT flow generates reliable estimations of common foregrounds over the entire video data set. Experimental results show that our method outperforms the state of the art on a new extensive data set (ViCoSeg).

  3. Robust Object Tracking Using Valid Fragments Selection.

    Science.gov (United States)

    Zheng, Jin; Li, Bo; Tian, Peng; Luo, Gang

    Local features are widely used in visual tracking to improve robustness in cases of partial occlusion, deformation and rotation. This paper proposes a local fragment-based object tracking algorithm. Unlike many existing fragment-based algorithms that allocate weights to each fragment, this method first defines discrimination and uniqueness for each local fragment and builds an automatic pre-selection of useful fragments for tracking. Then, a Harris-SIFT filter is used to choose the current valid fragments, excluding occluded or highly deformed fragments. Based on those valid fragments, a fragment-based color histogram provides a structured and effective description of the object. Finally, the object is tracked using a valid fragment template that combines the displacement constraint and the similarity of each valid fragment. The object template is updated by fusing feature similarity and valid fragments, which makes it scale-adaptive and robust to partial occlusion. The experimental results show that the proposed algorithm is accurate and robust in challenging scenarios.

  4. Active Exploration for Robust Object Detection

    OpenAIRE

    Velez, Javier J.; Hemann, Garrett A.; Huang, Albert S.; Posner, Ingmar; Roy, Nicholas

    2011-01-01

    Today, mobile robots are increasingly expected to operate in ever more complex and dynamic environments. In order to carry out many of the higher-level tasks envisioned, a semantic understanding of a workspace is pivotal. Here our field has benefited significantly from successes in machine learning and vision: applications in robotics of off-the-shelf object detectors are plentiful. This paper outlines an online, any-time planning framework enabling the active exploration of such detections. O...

  5. Adaptive Shape Kernel-Based Mean Shift Tracker in Robot Vision System

    Directory of Open Access Journals (Sweden)

    Chunmei Liu

    2016-01-01

    Full Text Available This paper proposes an adaptive shape kernel-based mean shift tracker using a single static camera for the robot vision system. The question that we address in this paper is how to construct a kernel shape that is adaptive to the object shape. We apply a nonlinear manifold learning technique to obtain a low-dimensional shape space, trained on data with the same view as the tracking video. The proposed kernel searches for the shape in the low-dimensional shape space obtained by the nonlinear manifold learning technique and constructs the adaptive kernel shape in the high-dimensional shape space. It improves the mean shift tracker's ability to track the object position and contour and to avoid background clutter. In the experimental part, we take a walking human as an example to validate that our method accurately and robustly tracks the human position and describes the human contour.
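
    The paper's kernel shape is learned from a shape manifold, which is not reproduced here; the sketch below only shows the generic kernel-weighted mean shift step that such a tracker refines, assuming a per-pixel likelihood map (e.g. from color-histogram back-projection) is already available and using a plain box window instead of the adaptive shape kernel.

```python
import numpy as np

def mean_shift_step(weights, center, half_size):
    """One kernel-weighted mean shift step on a likelihood map.
    `weights` is an HxW map of per-pixel target likelihoods; the window here
    is a plain box, not the paper's adaptive shape kernel."""
    h, w = weights.shape
    cy, cx = center
    y0, y1 = max(0, cy - half_size), min(h, cy + half_size + 1)
    x0, x1 = max(0, cx - half_size), min(w, cx + half_size + 1)
    win = weights[y0:y1, x0:x1]
    if win.sum() == 0:
        return center
    ys, xs = np.mgrid[y0:y1, x0:x1]
    new_cy = int(round((ys * win).sum() / win.sum()))   # weighted centroid
    new_cx = int(round((xs * win).sum() / win.sum()))
    return (new_cy, new_cx)

def track(weights, start, half_size=15, n_iter=20):
    center = start
    for _ in range(n_iter):
        nxt = mean_shift_step(weights, center, half_size)
        if nxt == center:        # converged
            break
        center = nxt
    return center

# Toy example: a blob of likelihood away from the initial guess.
w = np.zeros((200, 200))
w[110:130, 120:140] = 1.0
print(track(w, start=(100, 100), half_size=40))   # -> roughly (120, 130)
```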

  6. Adaptive Shape Kernel-Based Mean Shift Tracker in Robot Vision System

    Science.gov (United States)

    2016-01-01

    This paper proposes an adaptive shape kernel-based mean shift tracker using a single static camera for the robot vision system. The question that we address in this paper is how to construct a kernel shape that is adaptive to the object shape. We apply a nonlinear manifold learning technique to obtain a low-dimensional shape space, trained on data with the same view as the tracking video. The proposed kernel searches for the shape in the low-dimensional shape space obtained by the nonlinear manifold learning technique and constructs the adaptive kernel shape in the high-dimensional shape space. It improves the mean shift tracker's ability to track the object position and contour and to avoid background clutter. In the experimental part, we take a walking human as an example to validate that our method accurately and robustly tracks the human position and describes the human contour. PMID:27379165

  7. Kernel based subspace projection of hyperspectral images

    DEFF Research Database (Denmark)

    Larsen, Rasmus; Nielsen, Allan Aasbjerg; Arngren, Morten

    In hyperspectral image analysis an exploratory approach to analyse the image data is to conduct subspace projections. As linear projections often fail to capture the underlying structure of the data, we present kernel based subspace projections of PCA and Maximum Autocorrelation Factors (MAF...
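
    The DEFF record is truncated, so as a generic illustration of kernel-based subspace projection, the sketch below contrasts linear PCA with scikit-learn's KernelPCA on synthetic stand-in spectra; the kernelized MAF projection named in the record is not shown, and the data and kernel parameters are illustrative.

```python
import numpy as np
from sklearn.decomposition import PCA, KernelPCA

# Synthetic stand-in for hyperspectral pixels: 500 pixels x 30 bands,
# generated from two nonlinearly mixed end-members.
rng = np.random.default_rng(1)
t = rng.uniform(0, 1, 500)
spectra = (np.outer(np.sin(np.pi * t), np.ones(30))
           + np.outer(t ** 2, np.linspace(0, 1, 30))
           + 0.01 * rng.normal(size=(500, 30)))

linear = PCA(n_components=3).fit_transform(spectra)                       # linear projection
kernel = KernelPCA(n_components=3, kernel="rbf", gamma=5.0).fit_transform(spectra)  # kernel-based projection

print("linear PCA scores:", linear.shape)   # (500, 3)
print("kernel PCA scores:", kernel.shape)   # (500, 3)
```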

  8. A Secure and Robust Object-Based Video Authentication System

    Directory of Open Access Journals (Sweden)

    He Dajun

    2004-01-01

    Full Text Available An object-based video authentication system, which combines watermarking, error correction coding (ECC), and digital signature techniques, is presented for protecting the authenticity between video objects and their associated backgrounds. In this system, a set of angular radial transformation (ART) coefficients is selected as the feature to represent the video object and the background, respectively. ECC and cryptographic hashing are applied to those selected coefficients to generate the robust authentication watermark. This content-based, semifragile watermark is then embedded into the objects frame by frame before MPEG4 coding. In watermark embedding and extraction, groups of discrete Fourier transform (DFT) coefficients are randomly selected, and their energy relationships are employed to hide and extract the watermark. The experimental results demonstrate that our system is robust to MPEG4 compression, object segmentation errors, and some common object-based video processing such as object translation, rotation, and scaling while securely preventing malicious object modifications. The proposed solution can be further incorporated into public key infrastructure (PKI).

  9. Scuba: scalable kernel-based gene prioritization.

    Science.gov (United States)

    Zampieri, Guido; Tran, Dinh Van; Donini, Michele; Navarin, Nicolò; Aiolli, Fabio; Sperduti, Alessandro; Valle, Giorgio

    2018-01-25

    The uncovering of genes linked to human diseases is a pressing challenge in molecular biology and precision medicine. This task is often hindered by the large number of candidate genes and by the heterogeneity of the available information. Computational methods for the prioritization of candidate genes can help to cope with these problems. In particular, kernel-based methods are a powerful resource for the integration of heterogeneous biological knowledge; however, their practical implementation is often precluded by limited scalability. We propose Scuba, a scalable kernel-based method for gene prioritization. It implements a novel multiple kernel learning approach, based on a semi-supervised perspective and on the optimization of the margin distribution. Scuba is optimized to cope with strongly unbalanced settings where known disease genes are few and large-scale predictions are required. Importantly, it is able to deal efficiently both with a large number of candidate genes and with an arbitrary number of data sources. As a direct consequence of scalability, Scuba also integrates a new efficient strategy to select optimal kernel parameters for each data source. We performed cross-validation experiments and simulated a realistic usage setting, showing that Scuba outperforms a wide range of state-of-the-art methods. Scuba achieves state-of-the-art performance and has enhanced scalability compared to existing kernel-based approaches for genomic data. This method can be useful to prioritize candidate genes, particularly when their number is large or when input data is highly heterogeneous. The code is freely available at https://github.com/gzampieri/Scuba.
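
    Scuba's actual margin-distribution MKL objective is not reproduced here; the sketch below only illustrates the generic first steps common to kernel-based prioritization, namely trace-normalizing kernels from several data sources, combining them with (assumed) weights, and ranking candidates by similarity to known disease genes.

```python
import numpy as np

def trace_normalize(K):
    # Scale a kernel matrix so its trace equals the number of samples.
    return K * (K.shape[0] / np.trace(K))

def combine_kernels(kernels, weights):
    # Convex combination of trace-normalized kernels from different data sources.
    weights = np.asarray(weights, dtype=float)
    weights /= weights.sum()
    return sum(w * trace_normalize(K) for w, K in zip(weights, kernels))

def prioritize(K, seed_idx):
    """Score every gene by its mean kernel similarity to the known (seed)
    disease genes -- a simple stand-in for Scuba's semi-supervised scoring."""
    scores = K[:, seed_idx].mean(axis=1)
    return np.argsort(-scores)          # ranking of genes, best first

# Toy setup: 6 genes, two "data sources" giving two linear kernels.
rng = np.random.default_rng(2)
X1 = rng.normal(size=(6, 10)); X2 = rng.normal(size=(6, 4))
K = combine_kernels([X1 @ X1.T, X2 @ X2.T], weights=[0.7, 0.3])
print(prioritize(K, seed_idx=[0, 1]))
```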

  10. Kernel based eigenvalue-decomposition methods for analysing ham

    DEFF Research Database (Denmark)

    Christiansen, Asger Nyman; Nielsen, Allan Aasbjerg; Møller, Flemming

    2010-01-01

    methods, such as PCA, MAF or MNF. We therefore investigated the applicability of kernel based versions of these transformations. This meant implementing the kernel based methods and developing new theory, since kernel based MAF and MNF are not yet described in the literature. The traditional methods only...... have two factors that are useful for segmentation and none of them can be used to segment the two types of meat. The kernel based methods have a lot of useful factors and they are able to capture the subtle differences in the images. This is illustrated in Figure 1. You can see a comparison of the most...... useful factor of PCA and kernel based PCA respectively in Figure 2. The factor of the kernel based PCA turned out to be able to segment the two types of meat and in general that factor is much more distinct, compared to the traditional factor. After the orthogonal transformation a simple thresholding

  11. Robust object tracking combining color and scale invariant features

    Science.gov (United States)

    Zhang, Shengping; Yao, Hongxun; Gao, Peipei

    2010-07-01

    Object tracking plays a very important role in many computer vision applications. However, its performance can deteriorate significantly under the challenges of complex scenes, such as pose and illumination changes and cluttered backgrounds. In this paper, we propose a robust object tracking algorithm which exploits both global color and local scale-invariant (SIFT) features in a particle filter framework. Due to the high computational cost of SIFT features, the proposed tracker adopts SURF, a sped-up variant of SIFT, to extract local features. Specifically, the proposed method first finds matching points between the target model and the target candidate; the weight of the corresponding particle based on scale-invariant features is then computed as the proportion of that particle's matching points to the matching points of all particles; finally, the weight of the particle is obtained by combining the color and SURF weights in a probabilistic way. The experimental results on a variety of challenging videos verify that the proposed method is robust to pose and illumination changes and is significantly superior to the standard particle filter tracker and the mean shift tracker.
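
    The abstract describes the SURF-based weight of a particle as its share of all matched keypoints and a probabilistic fusion with the color weight; the sketch below assumes both per-particle quantities are already computed and uses a simple product fusion, which is one plausible reading rather than the authors' exact formula.

```python
import numpy as np

def fuse_particle_weights(color_likelihood, surf_matches):
    """Combine per-particle color likelihoods with SURF match counts.
    The SURF weight of a particle is its share of all matched keypoints
    (as in the abstract); the product fusion is one plausible 'probabilistic'
    combination, not necessarily the authors' exact formula."""
    color_w = np.asarray(color_likelihood, dtype=float)
    surf_w = np.asarray(surf_matches, dtype=float)
    if surf_w.sum() > 0:
        surf_w = surf_w / surf_w.sum()                   # proportion of matches
    else:
        surf_w = np.full_like(surf_w, 1.0 / len(surf_w)) # no matches: uniform
    w = color_w * surf_w
    return w / w.sum()

# 5 particles: color similarity (e.g. Bhattacharyya coefficient) and SURF match counts.
color = [0.55, 0.80, 0.60, 0.20, 0.75]
matches = [2, 9, 4, 0, 7]
print(fuse_particle_weights(color, matches))
```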

  12. Scale-adaptive Local Patches for Robust Visual Object Tracking

    Directory of Open Access Journals (Sweden)

    Kang Sun

    2014-04-01

    Full Text Available This paper discusses the problem of robustly tracking objects which undergo rapid and dramatic scale changes. To remove the weakness of global appearance models, we present a novel scheme that combines the object's global and local appearance features. The local feature is a set of local patches that geometrically constrain the changes in the target's appearance. In order to adapt to the object's geometric deformation, the local patches can be removed and added online. The addition of these patches is constrained by global features such as color, texture and motion. The global visual features are updated via the stable local patches during tracking. To deal with scale changes, we adapt the scale of the patches in addition to adapting the object bounding box. We evaluate our method by comparing it to several state-of-the-art trackers on publicly available datasets. The experimental results on challenging sequences confirm that, by using these scale-adaptive local patches and global properties, our tracker outperforms the related trackers in many cases, with a lower failure rate as well as better accuracy.

  13. Robustness of Multiple Objective Decision Analysis Preference Functions

    National Research Council Canada - National Science Library

    Klimack, William

    2002-01-01

    .... The impact of these differences was examined to improve implementation efficiency. The robustness of the decision model was examined with respect to the preference functions to reduce the time burden imposed on the decision maker...

  14. Supervised linear dimensionality reduction with robust margins for object recognition

    Science.gov (United States)

    Dornaika, F.; Assoum, A.

    2013-01-01

    Linear Dimensionality Reduction (LDR) techniques have been increasingly important in computer vision and pattern recognition since they permit a relatively simple mapping of data onto a lower-dimensional subspace, leading to simple and computationally efficient classification strategies. Recently, many linear discriminant methods have been developed in order to reduce the dimensionality of visual data and to enhance the discrimination between different groups or classes. Many existing linear embedding techniques rely on the use of local margins in order to achieve good discrimination performance. However, dealing with outliers and within-class diversity has not been addressed by margin-based embedding methods. In this paper, we explore the use of different margin-based linear embedding methods. More precisely, we propose to use the concepts of Median miss and Median hit for building robust margin-based criteria. Based on such margins, we seek the projection directions (linear embedding) such that the sum of local margins is maximized. Our proposed approach has been applied to the problem of appearance-based face recognition. Experiments performed on four public face databases show that the proposed approach can give better generalization performance than the classic Average Neighborhood Margin Maximization (ANMM). Moreover, thanks to the use of robust margins, the proposed method degrades gracefully when label outliers contaminate the training data set. In particular, we show that the concept of Median hit is crucial for obtaining robust performance in the presence of outliers.

  15. Robust Optimization Using Supremum of the Objective Function for Nonlinear Programming Problems

    International Nuclear Information System (INIS)

    Lee, Se Jung; Park, Gyung Jin

    2014-01-01

    In the robust optimization field, the robustness of the objective function emphasizes an insensitive design. In general, the robustness of the objective function can be achieved by reducing the change of the objective function with respect to the variation of the design variables and parameters. However, in conventional methods, when an insensitive design is emphasized, the performance of the objective function can deteriorate. Moreover, as the number of design variables increases, the numerical cost of robust optimization for nonlinear programming problems becomes quite high. In this research, a robustness index for the objective function and a process of robust optimization are proposed. Moreover, a method using the supremum of linearized functions is proposed to reduce the computational cost. Mathematical examples are solved to verify the proposed method, and the results are compared with those from conventional methods. The proposed approach improves the performance of the objective function as well as its efficiency.
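
    The paper's robustness index is not given in the abstract; the sketch below only illustrates the underlying idea of bounding an objective over a tolerance box by the supremum of its linearization, f(x0) + |grad f(x0)| . delta, with an arbitrary example function.

```python
import numpy as np

def linearized_supremum(f, grad, x0, delta):
    """Worst-case (supremum) of the linearized objective over the box
    |x - x0| <= delta, i.e. f(x0) + sum_i |df/dx_i(x0)| * delta_i."""
    return f(x0) + np.abs(grad(x0)) @ np.asarray(delta)

# Example nonlinear objective and its gradient.
f = lambda x: x[0] ** 2 + 3 * np.sin(x[1])
grad = lambda x: np.array([2 * x[0], 3 * np.cos(x[1])])

x0 = np.array([1.0, 0.5])
delta = np.array([0.1, 0.2])          # tolerances on the design variables
print("nominal value         :", f(x0))
print("supremum of linearized:", linearized_supremum(f, grad, x0, delta))
```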

  16. Robust object tracking based on self-adaptive search area

    Science.gov (United States)

    Dong, Taihang; Zhong, Sheng

    2018-02-01

    Discriminative correlation filter (DCF) based trackers have recently achieved excellent performance with great computational efficiency. However, DCF-based trackers suffer from boundary effects, which result in unstable performance in challenging situations involving fast motion. In this paper, we propose a novel method to mitigate this side effect in DCF-based trackers. We change the search area according to the prediction of target motion. When the object moves fast, a broad search area alleviates boundary effects and preserves the chance of locating the object. When the object moves slowly, a narrow search area suppresses the influence of useless background information and improves computational efficiency to attain real-time performance. This strategy markedly mitigates boundary effects in situations involving fast motion and motion blur, and it can be used in almost all DCF-based trackers. Experiments on the OTB benchmark show that the proposed framework improves performance compared with the baseline trackers.
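
    The exact motion-prediction and adaptation rule is not given in the abstract; the sketch below shows one plausible velocity-based scheme for scaling the search window, with illustrative thresholds and scale factors.

```python
def adapt_search_area(base_size, velocity, slow_thresh=2.0, fast_thresh=10.0):
    """Scale the correlation-filter search window with the predicted target
    speed (pixels/frame). The thresholds and scale factors are illustrative;
    the paper's exact prediction/adaptation rule is not reproduced here."""
    speed = (velocity[0] ** 2 + velocity[1] ** 2) ** 0.5
    if speed < slow_thresh:          # slow target: small area, less background clutter
        scale = 1.5
    elif speed > fast_thresh:        # fast target: large area, fewer boundary effects
        scale = 3.0
    else:                            # interpolate in between
        scale = 1.5 + 1.5 * (speed - slow_thresh) / (fast_thresh - slow_thresh)
    return int(round(base_size * scale))

print(adapt_search_area(64, velocity=(1, 1)))    # slow motion  -> 96
print(adapt_search_area(64, velocity=(12, 5)))   # fast motion  -> 192
```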

  17. Hybrid Robust Multi-Objective Evolutionary Optimization Algorithm

    Science.gov (United States)

    2009-03-10

    xfar by xint. Else, generate a new individual, using the Sobol pseudo-random sequence generator within the upper and lower bounds of the variables... 12. Deb, K., Multi-Objective Optimization Using Evolutionary Algorithms, John Wiley & Sons, 2002. 13. Sobol, I. M., "Uniformly Distributed Sequences

  18. Robust Object Segmentation Using a Multi-Layer Laser Scanner

    Science.gov (United States)

    Kim, Beomseong; Choi, Baehoon; Yoo, Minkyun; Kim, Hyunju; Kim, Euntai

    2014-01-01

    The major problem in an advanced driver assistance system (ADAS) is the proper use of sensor measurements and recognition of the surrounding environment. To this end, there are several types of sensors to consider, one of which is the laser scanner. In this paper, we propose a method to segment the measurement of the surrounding environment as obtained by a multi-layer laser scanner. In the segmentation, a full set of measurements is decomposed into several segments, each representing a single object. Sometimes a ghost is detected due to the ground or fog, and the ghost has to be eliminated to ensure the stability of the system. The proposed method is implemented on a real vehicle, and its performance is tested in a real-world environment. The experiments show that the proposed method demonstrates good performance in many real-life situations. PMID:25356645

  19. Many-objective robust decision making for water allocation under climate change

    NARCIS (Netherlands)

    Yan, Dan; Ludwig, Fulco; Huang, He Qing; Werners, Saskia E.

    2017-01-01

    Water allocation is facing profound challenges due to climate change uncertainties. To identify adaptive water allocation strategies that are robust to climate change uncertainties, a model framework combining many-objective robust decision making and biophysical modeling is developed for large

  20. A robust multi-objective global supplier selection model under currency fluctuation and price discount

    Science.gov (United States)

    Zarindast, Atousa; Seyed Hosseini, Seyed Mohamad; Pishvaee, Mir Saman

    2017-06-01

    A robust supplier selection problem is proposed in a scenario-based approach, in which demand and exchange rates are subject to uncertainty. First, a deterministic multi-objective mixed integer linear program is developed; then, its robust counterpart is presented using recent extensions of robust optimization theory. The decision variables are determined, respectively, by a two-stage stochastic planning model, by a robust stochastic optimization planning model that integrates the worst-case scenario into the modeling approach, and finally by an equivalent deterministic planning model. An experimental study is carried out to compare the performance of the three models. The robust model resulted in remarkable cost savings, illustrating that to cope with such uncertainties we should account for them in advance in our planning. In our case study, different suppliers were selected because of these uncertainties, and since supplier selection is a strategic decision, it is crucial to consider them in the planning approach.
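
    The full multi-objective MILP is not reproduced here; the toy example below only illustrates why a scenario-based robust (worst-case) criterion can pick a different supplier than the expected-value criterion, using made-up cost data.

```python
import numpy as np

# Cost of meeting demand with each supplier under each scenario
# (rows: suppliers, columns: demand/exchange-rate scenarios).
cost = np.array([
    [100.0, 140.0, 180.0],   # supplier A: cheap nominally, volatile
    [120.0, 130.0, 150.0],   # supplier B: moderate
    [135.0, 138.0, 140.0],   # supplier C: expensive but stable
])

expected = cost.mean(axis=1)          # deterministic / expected-value criterion
worst    = cost.max(axis=1)           # robust (worst-case scenario) criterion

print("min expected cost -> supplier", "ABC"[int(expected.argmin())])   # B
print("min worst-case    -> supplier", "ABC"[int(worst.argmin())])      # C
```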

  1. ROBUSTNESS AND PREDICTION ACCURACY OF MACHINE LEARNING FOR OBJECTIVE VISUAL QUALITY ASSESSMENT

    OpenAIRE

    Hines, Andrew; Kendrick, Paul; Barri, Adriaan; Narwaria, Manish; Redi, Judith A.

    2014-01-01

    Machine Learning (ML) is a powerful tool to support the development of objective visual quality assessment metrics, serving as a substitute model for the perceptual mechanisms acting in visual quality appreciation. Nevertheless, the reliability of ML-based techniques within objective quality assessment metrics is often questioned. In this study, the robustness of ML in supporting objective quality assessment is investigated, specifically when the feature set adopted for prediction is suboptim...

  2. Nonlinear Knowledge in Kernel-Based Multiple Criteria Programming Classifier

    Science.gov (United States)

    Zhang, Dongling; Tian, Yingjie; Shi, Yong

    The Kernel-based Multiple Criteria Linear Programming (KMCLP) model is used as a classification method that can learn from training examples. By contrast, in the traditional machine learning area, data sets are classified only by prior knowledge. Some works combine these two classification principles to overcome the shortcomings of each approach. In this paper, we propose a model that incorporates nonlinear knowledge into KMCLP in order to solve the problem in which the input consists not only of training examples but also of nonlinear prior knowledge. On a real-world breast cancer diagnosis case, the model shows better performance than the model based solely on training data.

  3. Robust optimization of robotic pick and place operations for deformable objects through simulation

    DEFF Research Database (Denmark)

    Bo Jorgensen, Troels; Debrabant, Kristian; Kruger, Norbert

    2016-01-01

    for the task. The solutions are parameterized in terms of the robot motion and the gripper configuration, and after each simulation various objective scores are determined and combined. This enables the use of various optimization strategies. Based on visual inspection of the most robust solution found...

  4. Weighing Efficiency-Robustness in Supply Chain Disruption by Multi-Objective Firefly Algorithm

    Directory of Open Access Journals (Sweden)

    Tong Shu

    2016-03-01

    Full Text Available This paper investigates various supply chain disruptions in terms of scenario planning, including node disruption and chain disruption; namely, disruptions in distribution centers and disruptions between manufacturing centers and distribution centers. It also considers simultaneous disruptions of one or several nodes, simultaneous disruptions of one or several chains, and the corresponding mathematical models and examples involving numerous manufacturing centers and diverse products. Robustness of the supply chain network design is examined by weighing efficiency against robustness during supply chain disruptions. Efficiency is represented by operating cost; robustness is indicated by the expected disruption cost; and the trade-off is computed with the multi-objective firefly algorithm for consistency of results. It is shown that the total cost achieved by the optimal objective function is lower than that of the supply chain at its most efficient. In other words, the decrease in expected disruption cost from improving robustness is greater than the increase in operating cost from reduced efficiency, thus leading to a cost advantage. Consequently, by approximating the Pareto front of the trade-off between efficiency and robustness, enterprises can choose an appropriate balance of efficiency and robustness for their longer-term development.

  5. Robust multi-objective calibration strategies – possibilities for improving flood forecasting

    Directory of Open Access Journals (Sweden)

    G. H. Schmitz

    2012-10-01

    Full Text Available Process-oriented rainfall-runoff models are designed to approximate the complex hydrologic processes within a specific catchment and in particular to simulate the discharge at the catchment outlet. Most of these models exhibit a high degree of complexity and require the determination of various parameters by calibration. Recently, automatic calibration methods became popular in order to identify parameter vectors with high corresponding model performance. The model performance is often assessed by a purpose-oriented objective function. Practical experience suggests that in many situations one single objective function cannot adequately describe the model's ability to represent any aspect of the catchment's behaviour. This is regardless of whether the objective is aggregated from several criteria that measure different (possibly opposite) aspects of the system behaviour. One strategy to circumvent this problem is to define multiple objective functions and to apply a multi-objective optimisation algorithm to identify the set of Pareto optimal or non-dominated solutions. Nonetheless, there is a major disadvantage of automatic calibration procedures that treat the problem of model calibration simply as the solution of an optimisation problem: due to the complex-shaped response surface, the estimated solution of the optimisation problem can result in different near-optimum parameter vectors that can lead to a very different performance on the validation data. Bárdossy and Singh (2008) studied this problem for single-objective calibration problems using the example of hydrological models and proposed a geometrical sampling approach called Robust Parameter Estimation (ROPE). This approach applies the concept of data depth in order to overcome the shortcomings of automatic calibration procedures and find a set of robust parameter vectors. Recent studies confirmed the effectiveness of this method. However, all ROPE approaches published so far just identify

  6. Many-objective robust decision making for water allocation under climate change.

    Science.gov (United States)

    Yan, Dan; Ludwig, Fulco; Huang, He Qing; Werners, Saskia E

    2017-12-31

    Water allocation is facing profound challenges due to climate change uncertainties. To identify adaptive water allocation strategies that are robust to climate change uncertainties, a model framework combining many-objective robust decision making and biophysical modeling is developed for large rivers. The framework was applied to the Pearl River basin (PRB), China, where sufficient flow to the delta is required to reduce saltwater intrusion in the dry season. Before identifying and assessing robust water allocation plans for the future, the performance of ten state-of-the-art MOEAs (multi-objective evolutionary algorithms) is evaluated for the water allocation problem in the PRB. The Borg multi-objective evolutionary algorithm (Borg MOEA), which is a self-adaptive optimization algorithm, has the best performance during the historical periods. Therefore it is selected to generate new water allocation plans for the future (2079-2099). This study shows that robust decision making using carefully selected MOEAs can help limit saltwater intrusion in the Pearl River Delta. However, the framework could perform poorly under larger than expected climate change impacts on water availability. Results also show that subjective design choices by the researchers and/or water managers could affect the performance of the model framework and cause the most robust water allocation plans to fail under future climate change. Developing robust allocation plans in a river basin suffering from increasing water shortage requires researchers and water managers to carefully characterize future climate change in the study region and the vulnerabilities of their tools. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Designing Dynamic Adaptive Policy Pathways using Many-Objective Robust Decision Making

    Science.gov (United States)

    Kwakkel, Jan; Haasnoot, Marjolijn

    2017-04-01

    Dealing with climate risks in water management requires confronting a wide variety of deeply uncertain factors, while navigating a many-dimensional space of trade-offs amongst objectives. There is an emerging body of literature on supporting this type of decision problem, under the label of decision making under deep uncertainty. Two approaches within this literature are Many-Objective Robust Decision Making and Dynamic Adaptive Policy Pathways. In recent work, these approaches have been compared. One of the main conclusions of this comparison was that they are highly complementary. Many-Objective Robust Decision Making is a model-based decision support approach, while Dynamic Adaptive Policy Pathways is primarily a conceptual framework for the design of flexible strategies that can be adapted over time in response to how the future is actually unfolding. In this research we explore this complementarity in more detail. Specifically, we demonstrate how Many-Objective Robust Decision Making can be used to design adaptation pathways. We demonstrate this combined approach using a water management problem in the Netherlands. The water level of Lake IJsselmeer, the main fresh water resource of the Netherlands, is currently managed through discharge by gravity. Due to climate change, this will not be possible in the future unless water levels are changed. Changing the water level has undesirable flood risk and spatial planning consequences. The challenge is to find promising adaptation pathways that balance objectives related to fresh water supply, flood risk, and spatial issues, while accounting for uncertain climatic and land use change. We conclude that the combination of Many-Objective Robust Decision Making and Dynamic Adaptive Policy Pathways is particularly suited for dealing with deeply uncertain climate risks.

  8. Robust

    DEFF Research Database (Denmark)

    2017-01-01

    'Robust – Reflections on Resilient Architecture' is a scientific publication following the conference of the same name in November 2017. Researchers and PhD fellows associated with the Master's programme Cultural Heritage, Transformation and Restoration (Transformation) at The Royal Danish...

  9. Robust Multi-Objective PQ Scheduling for Electric Vehicles in Flexible Unbalanced Distribution Grids

    DEFF Research Database (Denmark)

    Knezovic, Katarina; Soroudi, Alireza; Marinelli, Mattia

    2017-01-01

    With increased penetration of distributed energy resources and electric vehicles (EVs), different EV management strategies can be used for mitigating adverse effects and supporting the distribution grid. This paper proposes a robust multi-objective methodology for determining the optimal day...... demand response programs. The method is tested on a real Danish unbalanced distribution grid with 35% EV penetration to demonstrate the effectiveness of the proposed approach. It is shown that the proposed formulation guarantees an optimal EV cost as long as the price uncertainties are lower than....... The robust formulation effectively considers the errors in the electricity price forecast and its influence on the EV schedule. Moreover, the impact of EV reactive power support on objective values and technical parameters is analysed both when EVs are the only flexible resources and when linked with other...

  10. A mean–variance objective for robust production optimization in uncertain geological scenarios

    DEFF Research Database (Denmark)

    Capolei, Andrea; Suwartadi, Eka; Foss, Bjarne

    2014-01-01

    directly. In the mean–variance bi-criterion objective function, risk appears directly; it also considers an ensemble of reservoir models and has robust optimization as a special extreme case. The mean–variance objective is common for portfolio optimization problems in finance. The Markowitz portfolio...... optimization problem is the original and simplest example of a mean–variance criterion for mitigating risk. Risk is mitigated in oil production by including both the expected NPV (mean of NPV) and the risk (variance of NPV) for the ensemble of possible reservoir models. With the inclusion of the risk...
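
    As a minimal numerical illustration of the bi-criterion idea, the sketch below evaluates a Markowitz-style mean-variance objective over an ensemble of NPV realizations; the NPV values and the risk-weighting form are assumptions for illustration, not taken from the paper.

```python
import numpy as np

def mean_variance_objective(npv_ensemble, risk_weight):
    """Markowitz-style bi-criterion objective over an ensemble of reservoir
    models: E[NPV] - lambda * Var[NPV]. risk_weight = 0 recovers the pure
    expected-NPV objective; large values approach worst-case (robust)
    optimization."""
    npv = np.asarray(npv_ensemble, dtype=float)
    return npv.mean() - risk_weight * npv.var()

# NPVs (arbitrary units) of one control strategy across 5 geological realizations.
npv = [12.0, 9.5, 14.0, 6.0, 11.5]
for lam in (0.0, 0.1, 0.5):
    print(f"lambda={lam}: objective = {mean_variance_objective(npv, lam):.2f}")
```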

  11. Kernel based pattern analysis methods using eigen-decompositions for reading Icelandic sagas

    DEFF Research Database (Denmark)

    Christiansen, Asger Nyman; Carstensen, Jens Michael

    We want to test the applicability of kernel based eigen-decomposition methods, compared to the traditional eigen-decomposition methods. We have implemented and tested three kernel based methods, namely PCA, MAF and MNF, all using a Gaussian kernel. We tested the methods on a multispectral...... image of a page in the book 'Hauksbók', which contains Icelandic sagas....

  12. Robust selectivity to two-object images in human visual cortex

    Science.gov (United States)

    Agam, Yigal; Liu, Hesheng; Papanastassiou, Alexander; Buia, Calin; Golby, Alexandra J.; Madsen, Joseph R.; Kreiman, Gabriel

    2010-01-01

    We can recognize objects in a fraction of a second in spite of the presence of other objects [1–3]. The responses in macaque areas V4 and inferior temporal cortex [4–15] to a neuron's preferred stimuli are typically suppressed by the addition of a second object within the receptive field (see, however, [16, 17]). How can this suppression be reconciled with rapid visual recognition in complex scenes? One option is that certain "special categories" are unaffected by other objects [18], but this leaves the problem unsolved for other categories. Another possibility is that serial attentional shifts help ameliorate the problem of distractor objects [19–21]. Yet psychophysical studies [1–3], scalp recordings [1] and neurophysiological recordings [14, 16, 22–24] suggest that the initial sweep of visual processing contains a significant amount of information. We recorded intracranial field potentials in human visual cortex during presentation of flashes of two-object images. Visual selectivity from temporal cortex during the initial ~200 ms was largely robust to the presence of other objects. We could train linear decoders on the responses to isolated objects and decode information in two-object images. These observations are compatible with parallel, hierarchical and feed-forward theories of rapid visual recognition [25] and may provide a neural substrate to begin to unravel rapid recognition in natural scenes. PMID:20417105

  13. Kernel-based discriminant feature extraction using a representative dataset

    Science.gov (United States)

    Li, Honglin; Sancho Gomez, Jose-Luis; Ahalt, Stanley C.

    2002-07-01

    Discriminant Feature Extraction (DFE) is widely recognized as an important pre-processing step in classification applications. Most DFE algorithms are linear and thus can only explore the linear discriminant information among the different classes. Recently, there have been several promising attempts to develop nonlinear DFE algorithms, among which is Kernel-based Feature Extraction (KFE). The efficacy of KFE has been experimentally verified on both synthetic data and real problems. However, KFE has some known limitations. First, KFE does not work well for strongly overlapped data. Second, KFE employs all of the training set samples during the feature extraction phase, which can result in significant computational cost when applied to very large datasets. Finally, KFE can result in overfitting. In this paper, we propose a substantial improvement to KFE that overcomes the above limitations by using a representative dataset, which consists of critical points that are generated from data-editing techniques and centroid points that are determined by using the Frequency Sensitive Competitive Learning (FSCL) algorithm. Experiments show that this new KFE algorithm performs well on significantly overlapped datasets, and it also reduces computational complexity. Further, by controlling the number of centroids, the overfitting problem can be effectively alleviated.
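
    As a rough sketch of the representative-dataset idea, the code below fits a kernel feature extractor on a small set of centroids instead of the full training set; KMeans stands in for the FSCL centroid step and KernelPCA for the kernel feature extractor, neither of which is the paper's exact component.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import KernelPCA

# Instead of feeding all training samples into a kernel feature extractor,
# fit it on a much smaller representative set of centroids.
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(loc=c, scale=0.5, size=(500, 2)) for c in ((0, 0), (3, 3))])

centroids = KMeans(n_clusters=20, n_init=10, random_state=0).fit(X).cluster_centers_
kfe = KernelPCA(n_components=2, kernel="rbf", gamma=0.5).fit(centroids)

features = kfe.transform(X)           # extract features for the full dataset
print(features.shape)                 # (1000, 2), at a fraction of the O(n^2) kernel cost
```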

  14. Fusion of an Ensemble of Augmented Image Detectors for Robust Object Detection.

    Science.gov (United States)

    Wei, Pan; Ball, John E; Anderson, Derek T

    2018-03-17

    A significant challenge in object detection is accurate identification of an object's position in image space, and one algorithm with one set of parameters is usually not enough; the fusion of multiple algorithms and/or parameters can lead to more robust results. Herein, a new computational intelligence fusion approach based on the dynamic analysis of agreement among object detection outputs is proposed. Furthermore, we propose an online, rather than training-only, image augmentation strategy. Experiments comparing the results both with and without fusion are presented. We demonstrate that the combination of augmentation and fusion yields the best results, with respect to higher accuracy rates and reduced outlier influence. The approach is demonstrated in the context of cone, pedestrian and box detection for Advanced Driver Assistance Systems (ADAS) applications.

  15. Salient Point Detection in Protrusion Parts of 3D Object Robust to Isometric Variations

    Science.gov (United States)

    Mirloo, Mahsa; Ebrahimnezhad, Hosein

    2018-03-01

    In this paper, a novel method is proposed to detect salient points of a 3D object that are robust to isometric variations and stable against scaling and noise. Salient points can be used as representative points from the object's protrusion parts in order to improve object matching and retrieval algorithms. The proposed algorithm starts by determining the first salient point of the model based on the average geodesic distance of several random points. Then, according to the previous salient point, a new point is added to this set in each iteration. With every added salient point, the decision function is updated. Hence, a condition is imposed on the selection of the next point so that it is not extracted from the same protrusion part, guaranteeing that a representative point is drawn from every protrusion part. This method is stable against model variations under isometric transformations, scaling, and noise of different strengths, owing to the use of a feature that is robust to isometric variations and to considering the relation between the salient points. In addition, the number of points used in the averaging process is decreased in this method, which leads to lower computational complexity compared with other salient point detection algorithms.

  16. Using Multi-Objective Optimization to Explore Robust Policies in the Colorado River Basin

    Science.gov (United States)

    Alexander, E.; Kasprzyk, J. R.; Zagona, E. A.; Prairie, J. R.; Jerla, C.; Butler, A.

    2017-12-01

    The long term reliability of water deliveries in the Colorado River Basin has degraded due to the imbalance of growing demand and dwindling supply. The Colorado River meanders 1,450 miles across a watershed that covers seven US states and Mexico and is an important cultural, economic, and natural resource for nearly 40 million people. Its complex operating policy is based on the "Law of the River," which has evolved since the Colorado River Compact in 1922. Recent (2007) refinements to address shortage reductions and coordinated operations of Lakes Powell and Mead were negotiated with stakeholders in which thousands of scenarios were explored to identify operating guidelines that could ultimately be agreed on. This study explores a different approach to searching for robust operating policies to inform the policy making process. The Colorado River Simulation System (CRSS), a long-term water management simulation model implemented in RiverWare, is combined with the Borg multi-objective evolutionary algorithm (MOEA) to solve an eight objective problem formulation. Basin-wide performance metrics are closely tied to system health through incorporating critical reservoir pool elevations, duration, frequency and quantity of shortage reductions in the objective set. For example, an objective to minimize the frequency that Lake Powell falls below the minimum power pool elevation of 3,490 feet for Glen Canyon Dam protects a vital economic and renewable energy source for the southwestern US. The decision variables correspond to operating tiers in Lakes Powell and Mead that drive the implementation of various shortage and release policies, thus affecting system performance. The result will be a set of non-dominated solutions that can be compared with respect to their trade-offs based on the various objectives. These could inform policy making processes by eliminating dominated solutions and revealing robust solutions that could remain hidden under conventional analysis.

  17. A hybrid multi-objective imperialist competitive algorithm and Monte Carlo method for robust safety design of a rail vehicle

    Science.gov (United States)

    Nejlaoui, Mohamed; Houidi, Ajmi; Affi, Zouhaier; Romdhane, Lotfi

    2017-10-01

    This paper deals with the robust safety design optimization of a rail vehicle system moving on short-radius curved tracks. A combined multi-objective imperialist competitive algorithm and Monte Carlo method is developed and used for the robust multi-objective optimization of the rail vehicle system. This robust optimization of rail vehicle safety simultaneously considers the derailment angle and its standard deviation, taking the uncertainties of the design parameters into account. The obtained results show that the robust design significantly reduces the sensitivity of rail vehicle safety to the design parameter uncertainties compared to the deterministic design and to results from the literature.
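
    The rail-vehicle model itself is not available from the abstract; the sketch below only illustrates the generic robust objective of the kind described (mean response plus a multiple of its standard deviation under Monte Carlo sampling of the design parameters), with a placeholder model and assumed uncertainty levels.

```python
import numpy as np

def robust_safety_objective(derailment_angle, design, n_samples=2000, k=3.0, rng=None):
    """Monte Carlo estimate of a robust safety objective: mean derailment
    angle plus k standard deviations, with the design parameters perturbed
    by (assumed) Gaussian manufacturing uncertainties. The rail-vehicle model
    is only a placeholder function here."""
    rng = rng or np.random.default_rng(0)
    samples = design + rng.normal(scale=0.05 * np.abs(design), size=(n_samples, len(design)))
    angles = np.array([derailment_angle(p) for p in samples])
    return angles.mean() + k * angles.std()

# Placeholder "model": some smooth function of two design parameters.
model = lambda p: 0.3 + 0.1 * p[0] ** 2 + 0.05 * np.sin(5 * p[1])

nominal = np.array([1.0, 0.8])
print("robust objective:", robust_safety_objective(model, nominal))
```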

  18. Fusion of an Ensemble of Augmented Image Detectors for Robust Object Detection

    Directory of Open Access Journals (Sweden)

    Pan Wei

    2018-03-01

    Full Text Available A significant challenge in object detection is accurate identification of an object's position in image space, and one algorithm with one set of parameters is usually not enough; the fusion of multiple algorithms and/or parameters can lead to more robust results. Herein, a new computational intelligence fusion approach based on the dynamic analysis of agreement among object detection outputs is proposed. Furthermore, we propose an online, rather than training-only, image augmentation strategy. Experiments comparing the results both with and without fusion are presented. We demonstrate that the combination of augmentation and fusion yields the best results, with respect to higher accuracy rates and reduced outlier influence. The approach is demonstrated in the context of cone, pedestrian and box detection for Advanced Driver Assistance Systems (ADAS) applications.

  19. Tuning rules for robust FOPID controllers based on multi-objective optimization with FOPDT models.

    Science.gov (United States)

    Sánchez, Helem Sabina; Padula, Fabrizio; Visioli, Antonio; Vilanova, Ramon

    2017-01-01

    In this paper, a set of optimally balanced tuning rules for fractional-order proportional-integral-derivative controllers is proposed. The control problem of simultaneously minimizing the integrated absolute error for both the set-point and the load-disturbance responses is addressed. The control problem is stated as a multi-objective optimization problem in which a first-order-plus-dead-time process model, subject to a robustness constraint based on the maximum sensitivity, is considered. A set of Pareto-optimal solutions is obtained for different normalized dead times, and the optimal balance between the competing objectives is then obtained by choosing the Nash solution among the Pareto-optimal ones. A curve-fitting procedure is then applied in order to generate suitable tuning rules. Several simulation results show the effectiveness of the proposed approach. Copyright © 2016. Published by Elsevier Ltd.
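
    The abstract states that the balance between the competing objectives is taken as the Nash solution among the Pareto-optimal ones; the sketch below picks the point maximizing the product of improvements over a disagreement point taken as the per-objective worst value, which is a common reading of the Nash bargaining solution, though the paper's exact normalization may differ.

```python
import numpy as np

def nash_solution(pareto_points):
    """Pick the Nash bargaining solution from a set of Pareto-optimal points
    (both objectives to be minimized). The disagreement point is taken as the
    worst value of each objective over the Pareto set; the Nash solution
    maximizes the product of improvements over that point."""
    P = np.asarray(pareto_points, dtype=float)
    disagreement = P.max(axis=0)
    gains = disagreement - P            # improvement in each objective
    return P[np.argmax(gains.prod(axis=1))]

# Toy Pareto front: (IAE for set-point response, IAE for load-disturbance response).
front = [(1.0, 4.0), (1.5, 2.5), (2.0, 2.0), (3.0, 1.2), (4.0, 1.0)]
print(nash_solution(front))             # -> [2. 2.]
```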

  20. Efficient Kernel-Based Ensemble Gaussian Mixture Filtering

    KAUST Repository

    Liu, Bo

    2015-11-11

    We consider the Bayesian filtering problem for data assimilation following the kernel-based ensemble Gaussian-mixture filtering (EnGMF) approach introduced by Anderson and Anderson (1999). In this approach, the posterior distribution of the system state is propagated with the model using the ensemble Monte Carlo method, providing a forecast ensemble that is then used to construct a prior Gaussian mixture (GM) based on the kernel density estimator. This results in two update steps: a Kalman filter (KF)-like update of the ensemble members and a particle filter (PF)-like update of the weights, followed by a resampling step to start a new forecast cycle. After formulating EnGMF for any observational operator, we analyze the influence of the bandwidth parameter of the kernel function on the covariance of the posterior distribution. We then focus on two aspects: i) the efficient implementation of EnGMF with (relatively) small ensembles, where we propose a new deterministic resampling strategy preserving the first two moments of the posterior GM to limit the sampling error; and ii) the analysis of the effect of the bandwidth parameter on the contributions of the KF and PF updates and on the variance of the weights. Numerical results using the Lorenz-96 model are presented to assess the behavior of EnGMF with deterministic resampling, study its sensitivity to different parameters and settings, and evaluate its performance against ensemble KFs. The proposed EnGMF approach with deterministic resampling provides improved estimates in all tested scenarios, and is shown to require less localization and to be less sensitive to the choice of filtering parameters.
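
    A full EnGMF with resampling is beyond a short example; the sketch below only illustrates one simplified analysis step for a linear observation, using a Silverman-type bandwidth factor to scale the sample covariance into the kernel covariance and performing the KF-like member update and PF-like weight update described in the abstract. The bandwidth rule, observation setup, and omission of resampling are assumptions of this sketch.

```python
import numpy as np

def engmf_update(ensemble, y, H, R):
    """One simplified EnGMF analysis step for a linear observation y = H x + v.
    The prior is a Gaussian mixture centred on the ensemble members with a
    common kernel covariance B = beta^2 * sample covariance (Silverman-type
    bandwidth). Members get a Kalman-like update, weights a particle-like
    update; resampling is omitted here."""
    X = np.asarray(ensemble, dtype=float)          # shape (N, d)
    N, d = X.shape
    beta2 = (4.0 / (N * (d + 2))) ** (2.0 / (d + 4))
    B = beta2 * np.cov(X, rowvar=False)            # kernel covariance
    S = H @ B @ H.T + R                            # innovation covariance
    K = B @ H.T @ np.linalg.inv(S)                 # gain shared by all members

    innov = y - X @ H.T                            # per-member innovations, (N, m)
    X_new = X + innov @ K.T                        # KF-like update of the means
    # PF-like weight update: likelihood of y under each member's predicted obs.
    maha = np.einsum('ij,jk,ik->i', innov, np.linalg.inv(S), innov)
    w = np.exp(-0.5 * maha)
    return X_new, w / w.sum()

# Toy example: 2-D state, observe the first component.
rng = np.random.default_rng(4)
ens = rng.normal(loc=[1.0, -1.0], scale=1.0, size=(50, 2))
H = np.array([[1.0, 0.0]]); R = np.array([[0.25]])
Xa, w = engmf_update(ens, y=np.array([2.0]), H=H, R=R)
print("posterior mean estimate:", w @ Xa)
```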

  1. A class of kernel based real-time elastography algorithms.

    Science.gov (United States)

    Kibria, Md Golam; Hasan, Md Kamrul

    2015-08-01

    In this paper, a novel real-time kernel-based and gradient-based Phase Root Seeking (PRS) algorithm for ultrasound elastography is proposed. The signal-to-noise ratio of the strain image resulting from this method is improved by minimizing the cross-correlation discrepancy between the pre- and post-compression radio frequency signals with an adaptive temporal stretching method and by employing built-in smoothing through an exponentially weighted neighborhood kernel in the displacement calculation. Unlike conventional PRS algorithms, the displacement due to tissue compression is estimated from the root of the weighted average of the zero-lag cross-correlation phases of the pair of corresponding analytic pre- and post-compression windows in the neighborhood kernel. In addition to the proposed one, the other time- and frequency-domain elastography algorithms (Ara et al., 2013; Hussain et al., 2012; Hasan et al., 2012) proposed by our group are also implemented in real time using Java, where the computations are executed serially or in parallel on multiple processors with efficient memory management. Simulation results using a finite element modeling simulation phantom show that the proposed method significantly improves the strain image quality in terms of elastographic signal-to-noise ratio (SNRe), elastographic contrast-to-noise ratio (CNRe) and mean structural similarity (MSSIM) for strains as high as 4% as compared to other techniques reported in the literature. Strain images obtained for the experimental phantom as well as in vivo breast data of malignant or benign masses also show the efficacy of our proposed method over the other techniques reported in the literature. Copyright © 2015 Elsevier B.V. All rights reserved.

  2. Robust object tracking techniques for vision-based 3D motion analysis applications

    Science.gov (United States)

    Knyaz, Vladimir A.; Zheltov, Sergey Y.; Vishnyakov, Boris V.

    2016-04-01

    Automated and accurate spatial motion capture of an object is necessary for a wide variety of applications, including industry and science, virtual reality and movies, medicine and sports. For most applications, the reliability and accuracy of the obtained data, as well as convenience for the user, are the main characteristics defining the quality of a motion capture system. Among the existing systems for 3D data acquisition, based on different physical principles (accelerometry, magnetometry, time-of-flight, vision-based), optical motion capture systems have a set of advantages such as high acquisition speed, potential for high accuracy, and automation based on advanced image processing algorithms. For vision-based motion capture, accurate and robust detection and tracking of object features through the video sequence are the key elements, along with the level of automation of the capture process. To provide high accuracy of the obtained spatial data, the developed vision-based motion capture system "Mosca" is based on photogrammetric principles of 3D measurement and supports high-speed image acquisition in synchronized mode. It includes from 2 to 4 machine vision cameras for capturing video sequences of object motion. The original camera calibration and external orientation procedures provide the basis for high accuracy of the 3D measurements. A set of algorithms, both for detecting, identifying and tracking similar targets and for marker-less object motion capture, has been developed and tested. The results of the algorithm evaluation show high robustness and reliability for various motion analysis tasks in technical and biomechanical applications.

  3. A Novel Kernel-Based Regularization Technique for PET Image Reconstruction

    Directory of Open Access Journals (Sweden)

    Abdelwahhab Boudjelal

    2017-06-01

    Full Text Available Positron emission tomography (PET) is an imaging technique that generates 3D detail of physiological processes at the cellular level. The technique requires a radioactive tracer, which decays and releases a positron that collides with an electron; consequently, annihilation photons are emitted, which can be measured. The purpose of PET is to use the measurement of photons to reconstruct the distribution of radioisotopes in the body. Currently, PET is undergoing a revamp, with advancements in data measurement instruments and in the computing methods used to create the images. These computational methods are required to solve the inverse problem of "image reconstruction from projections". This paper proposes a novel kernel-based regularization technique for maximum-likelihood expectation-maximization (κ-MLEM) to reconstruct the image. Compared to standard MLEM, the proposed algorithm is more robust and more effective in removing background noise, whilst preserving the edges; this suppresses image artifacts such as out-of-focus slice blur.
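
    The κ-MLEM regularizer itself is not detailed in the abstract; the sketch below only conveys the structure of a kernel-regularized MLEM by alternating the standard MLEM multiplicative update with a simple Gaussian-kernel smoothing of the estimate, on a toy 1-D problem. The smoothing step and toy system matrix are assumptions, not the paper's algorithm.

```python
import numpy as np

def mlem_kernel(A, y, n_iter=50, sigma=1.0):
    """MLEM reconstruction x <- (x / A^T 1) * A^T(y / A x), followed by a
    Gaussian-kernel smoothing of the estimate after each iteration. This is
    only a structural sketch of a kernel-regularized MLEM, not kappa-MLEM."""
    n_pix = A.shape[1]
    x = np.ones(n_pix)
    sens = A.T @ np.ones(A.shape[0])               # sensitivity image A^T 1
    # 1-D Gaussian smoothing kernel over neighbouring pixels.
    offsets = np.arange(-3, 4)
    g = np.exp(-offsets ** 2 / (2 * sigma ** 2)); g /= g.sum()
    for _ in range(n_iter):
        proj = A @ x
        x = (x / sens) * (A.T @ (y / np.maximum(proj, 1e-12)))   # MLEM update
        x = np.convolve(x, g, mode="same")                       # kernel regularization
    return x

# Toy 1-D "tomography": 40 random projections of a 20-pixel phantom.
rng = np.random.default_rng(5)
A = rng.uniform(size=(40, 20))
x_true = np.zeros(20); x_true[8:12] = 10.0
y = rng.poisson(A @ x_true).astype(float)
print(np.round(mlem_kernel(A, y), 1))
```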

  4. Robust Individual-Cell/Object Tracking via PCANet Deep Network in Biomedicine and Computer Vision

    Directory of Open Access Journals (Sweden)

    Bineng Zhong

    2016-01-01

    Full Text Available Tracking an individual cell/object over time is important in understanding drug treatment effects on cancer cells and in video surveillance. A fundamental problem of individual-cell/object tracking is to simultaneously address the cell/object appearance variations caused by intrinsic and extrinsic factors. In this paper, inspired by the architecture of deep learning, we propose a robust feature learning method for constructing discriminative appearance models without large-scale pretraining. Specifically, in the initial frames, an unsupervised method is first used to learn the abstract features of a target by combining classic principal component analysis (PCA) algorithms with recent deep learning representation architectures. We use the learned PCA eigenvectors as filters and develop a novel algorithm to represent a target by composing a PCA-based filter bank layer, a nonlinear layer, and a patch-based pooling layer. Then, based on the feature representation, a neural network with one hidden layer is trained in a supervised mode to construct a discriminative appearance model. Finally, to alleviate the tracker drifting problem, a sample update scheme is carefully designed to keep track of the most representative and diverse samples during tracking. We test the proposed tracking method on two standard individual cell/object tracking benchmarks to show our tracker's state-of-the-art performance.

  5. Robust multi-objective control of hybrid renewable microgeneration systems with energy storage

    International Nuclear Information System (INIS)

    Allison, John

    2017-01-01

    Highlights: • A hybrid energy system of micro-CHP, solar PV, and battery storage is presented. • It is possible to exploit the synergy of the systems to fulfil the thermal and electrical demands. • Control can minimise the interaction with the local electrical network. • Three different control approaches were compared. • The nonlinear inversion-based control strategy exhibits optimum performance. - Abstract: Microgeneration technologies are positioned to address future building energy efficiency requirements and facilitate the integration of renewables into buildings to ensure a sustainable, energy-secure future. This paper explores the development of a robust multi-input multi-output (MIMO) controller applicable to the control of hybrid renewable microgeneration systems with the objective of minimising the electrical grid utilisation of a building while fulfilling the thermal demands. The controller employs the inverse dynamics of the building, servicing systems, and energy storage together with a robust control methodology. These inverse dynamics provide the control system with knowledge of the complex cause-and-effect relationships between the system, the controlled inputs, and the external disturbances, while an outer-loop control ensures robust, stable control in the presence of modelling deficiencies/uncertainty and unknown disturbances. Variable structure control compensates for the physical limitations of the systems, whereby the control strategy employed switches depending on the current utilisation and availability of the energy supplies. Preliminary results presented for a system consisting of a micro-CHP unit, solar PV, and battery storage indicate that the control strategy is effective in minimising the interaction with the local electrical network and maximising the utilisation of the available renewable energy.

  6. Many-Objective Robust Decision Making: Managing Water in a Deeply Uncertain World of Change (Invited)

    Science.gov (United States)

    Reed, P. M.

    2013-12-01

    Water resources planning and management has always required the consideration of uncertainties and the associated system vulnerabilities that they may cause. Despite the long legacy of these issues, the decision support frameworks that have dominated the literature over the past 50 years have struggled with the strongly multiobjective and deeply uncertain nature of water resources systems. The term deep uncertainty (or Knightian uncertainty) refers to factors in planning that strongly shape system risks but that may be unknown and, even if known, lack consensus on their likelihoods over decadal planning horizons (population growth, financial stability, valuation of resources, ecosystem requirements, evolving water institutions, regulations, etc.). In this presentation, I will propose and demonstrate the many-objective robust decision making (MORDM) framework for water resources management under deep uncertainty. The MORDM framework will be demonstrated using an urban water portfolio management test case. In the test case, a city in the Lower Rio Grande Valley managing population and drought pressures must cost-effectively maintain the reliability of its water supply by blending permanent rights to reservoir inflows with alternative strategies for purchasing water within the region's water market. The case study illustrates the significant potential pitfalls in the classic Cost-Reliability conception of the problem. Moreover, the proposed MORDM framework exploits recent advances in multiobjective search, visualization, and sensitivity analysis to better expose these pitfalls en route to identifying highly robust water planning alternatives.

  7. Some objective measures indicative of perceived voice robustness in student teachers.

    Science.gov (United States)

    Orr, Rosemary; de Jong, Felix; Cranen, Bert

    2002-01-01

    One of the problems confronted in the teaching profession is the maintenance of a healthy voice. This basic pedagogical tool is subjected to extensive use, and frequently suffers from overload, with some teachers having to give up their profession altogether. In some teacher training schools, it is current practice to examine the student's voice and to refer any perceived susceptibility to strain to voice specialists. For this study, a group of vocally healthy students were examined first at the teacher training schools, and then at the ENT clinic at the University Hospital of Nijmegen. The aim was to predict whether the subject's voice might be at risk for occupational dysphonia as a result of the vocal load of the teaching profession. We tried to find objective measures of voice quality in student teachers, used in current clinical practice, which reflect the judgements of the therapists and phoniatricians. We tried to explain such measures physiologically in terms of robustness of, and control over, voicing. Objective measures used included video-laryngostroboscopy, phonetography and spectrography. Maximum phonation time, melodic range in conjunction with maximum intensity range, and the production of soft voice are suggested as possible predictive parameters for the risk of occupational voice strain.

  8. Segmentation of Brain Tissues from Magnetic Resonance Images Using Adaptively Regularized Kernel-Based Fuzzy C-Means Clustering

    Directory of Open Access Journals (Sweden)

    Ahmed Elazab

    2015-01-01

    Full Text Available An adaptively regularized kernel-based fuzzy C-means clustering framework is proposed for the segmentation of brain magnetic resonance images. The framework takes the form of three algorithms, in which the local average grayscale is replaced by the grayscale of average-filtered, median-filtered, and devised weighted images, respectively. The algorithms employ the heterogeneity of grayscales in the neighborhood, exploit this measure as local contextual information, and replace the standard Euclidean distance with Gaussian radial basis kernel functions. The main advantages are adaptiveness to local context, enhanced robustness in preserving image details, independence of clustering parameters, and decreased computational costs. The algorithms have been validated against both synthetic and clinical magnetic resonance images with different types and levels of noise and compared with six recent soft clustering algorithms. Experimental results show that the proposed algorithms are superior in preserving image details and segmentation accuracy while maintaining a low computational complexity.
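
    The adaptive regularization terms are specific to the paper, but the underlying kernel fuzzy C-means step, with the Euclidean distance replaced by a Gaussian radial basis kernel, can be sketched as follows; the fuzzifier m, bandwidth sigma, and the synthetic intensity data are illustrative assumptions.

```python
import numpy as np

def kernel_fcm(x, n_clusters=3, m=2.0, sigma=30.0, n_iter=50, seed=0):
    """Sketch of kernel fuzzy C-means on grayscale values with a Gaussian RBF kernel."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float).ravel()
    v = rng.choice(x, n_clusters, replace=False)                        # initial prototypes
    for _ in range(n_iter):
        k = np.exp(-(x[:, None] - v[None, :]) ** 2 / (2 * sigma ** 2))  # K(x_j, v_i)
        d = np.clip(1.0 - k, 1e-12, None)                               # kernel-induced distance
        u = d ** (-1.0 / (m - 1.0))
        u /= u.sum(axis=1, keepdims=True)                               # fuzzy memberships
        w = (u ** m) * k
        v = (w * x[:, None]).sum(axis=0) / w.sum(axis=0)                # prototype update
    return u.argmax(axis=1), v

# Example: three synthetic "tissue" intensity classes corrupted by noise.
rng = np.random.default_rng(1)
intensities = np.concatenate([rng.normal(mu, 8.0, 2000) for mu in (60.0, 120.0, 200.0)])
labels, centers = kernel_fcm(intensities)
print(np.sort(np.round(centers, 1)))
```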

  9. Image Acquisition of Robust Vision Systems to Monitor Blurred Objects in Hazy Smoking Environments

    International Nuclear Information System (INIS)

    Ahn, Yongjin; Park, Seungkyu; Baik, Sunghoon; Kim, Donglyul; Nam, Sungmo; Jeong, Kyungmin

    2014-01-01

    Image information from disaster areas or radiation areas of the nuclear industry is important data for safety inspection and for preparing appropriate damage control plans. A robust vision system for structures and facilities in blurred, smoky environments, such as the sites of fires and detonations, is therefore essential for remote monitoring. Vision systems cannot acquire an image when the illumination light is blocked by disturbing materials such as smoke, fog, or dust. A vision system based on wavefront correction can be applied to blurred imaging environments, and a range-gated imaging system can be applied to both blurred imaging and low-light environments. Wavefront control is a widely used technique to improve the performance of optical systems by actively correcting wavefront distortions, such as atmospheric turbulence, thermally induced distortions, and laser or laser-device aberrations, which can reduce the peak intensity and smear an acquired image. The principal applications of wavefront control are improving the image quality in optical imaging systems such as infrared astronomical telescopes, imaging and tracking rapidly moving space objects, and compensating for laser beam distortion through the atmosphere. A conventional wavefront correction system consists of a wavefront sensor, a deformable mirror and a control computer. The control computer measures the wavefront distortions using the wavefront sensor and corrects them using the deformable mirror in a closed loop. Range-gated imaging (RGI) is a direct active visualization technique using a highly sensitive image sensor and a high-intensity illuminant. Currently, the range-gated imaging technique, providing 2D and 3D images, is one of the emerging active vision technologies. The range-gated imaging system obtains vision information by summing time-sliced vision images. In the RGI system, a high-intensity illuminant illuminates for an ultra-short time and a highly sensitive image sensor is gated by ultra

  10. Image Acquisition of Robust Vision Systems to Monitor Blurred Objects in Hazy Smoking Environments

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Yongjin; Park, Seungkyu; Baik, Sunghoon; Kim, Donglyul; Nam, Sungmo; Jeong, Kyungmin [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-05-15

    Image information from disaster areas or radiation areas of the nuclear industry is important data for safety inspection and for preparing appropriate damage control plans. A robust vision system for structures and facilities in blurred, smoky environments, such as the sites of fires and detonations, is therefore essential for remote monitoring. Vision systems cannot acquire an image when the illumination light is blocked by disturbing materials such as smoke, fog, or dust. A vision system based on wavefront correction can be applied to blurred imaging environments, and a range-gated imaging system can be applied to both blurred imaging and low-light environments. Wavefront control is a widely used technique to improve the performance of optical systems by actively correcting wavefront distortions, such as atmospheric turbulence, thermally induced distortions, and laser or laser-device aberrations, which can reduce the peak intensity and smear an acquired image. The principal applications of wavefront control are improving the image quality in optical imaging systems such as infrared astronomical telescopes, imaging and tracking rapidly moving space objects, and compensating for laser beam distortion through the atmosphere. A conventional wavefront correction system consists of a wavefront sensor, a deformable mirror and a control computer. The control computer measures the wavefront distortions using the wavefront sensor and corrects them using the deformable mirror in a closed loop. Range-gated imaging (RGI) is a direct active visualization technique using a highly sensitive image sensor and a high-intensity illuminant. Currently, the range-gated imaging technique, providing 2D and 3D images, is one of the emerging active vision technologies. The range-gated imaging system obtains vision information by summing time-sliced vision images. In the RGI system, a high-intensity illuminant illuminates for an ultra-short time and a highly sensitive image sensor is gated by ultra

  11. Solving advanced multi-objective robust designs by means of multiple objective evolutionary algorithms (MOEA): A reliability application

    Energy Technology Data Exchange (ETDEWEB)

    Salazar A, Daniel E. [Division de Computacion Evolutiva (CEANI), Instituto de Sistemas Inteligentes y Aplicaciones Numericas en Ingenieria (IUSIANI), Universidad de Las Palmas de Gran Canaria. Canary Islands (Spain)]. E-mail: danielsalazaraponte@gmail.com; Rocco S, Claudio M. [Universidad Central de Venezuela, Facultad de Ingenieria, Caracas (Venezuela)]. E-mail: crocco@reacciun.ve

    2007-06-15

    This paper extends the approach proposed by the second author in [Rocco et al. Robust design using a hybrid-cellular-evolutionary and interval-arithmetic approach: a reliability application. In: Tarantola S, Saltelli A, editors. SAMO 2001: Methodological advances and useful applications of sensitivity analysis. Reliab Eng Syst Saf 2003;79(2):149-59 [special issue

  12. Low cost, robust and real time system for detecting and tracking moving objects to automate cargo handling in port terminals

    NARCIS (Netherlands)

    Vaquero, V.; Repiso, E.; Sanfeliu, A.; Vissers, J.; Kwakkernaat, M.

    2016-01-01

    The presented paper addresses the problem of detecting and tracking moving objects for autonomous cargo handling in port terminals using a perception system whose input data come from a single-layer laser scanner. A computationally low-cost and robust Detection and Tracking of Moving Objects (DATMO) algorithm

  13. A Wavelet Kernel-Based Primal Twin Support Vector Machine for Economic Development Prediction

    Directory of Open Access Journals (Sweden)

    Fang Su

    2013-01-01

    Full Text Available Economic development forecasting allows planners to choose the right strategies for the future. This study proposes an economic development prediction method based on the wavelet kernel-based primal twin support vector machine algorithm. As gross domestic product (GDP) is an important indicator of economic development, economic development prediction means GDP prediction in this study. The wavelet kernel-based primal twin support vector machine algorithm solves two smaller-sized quadratic programming problems instead of a single large one as in the traditional support vector machine algorithm. Economic development data of Anhui province from 1992 to 2009 are used to study the prediction performance of the wavelet kernel-based primal twin support vector machine algorithm. A comparison of the mean prediction error between the wavelet kernel-based primal twin support vector machine and traditional support vector machine models, trained on samples with 3–5 dimensional input vectors, respectively, is given in this paper. The testing results show that the economic development prediction accuracy of the wavelet kernel-based primal twin support vector machine model is better than that of the traditional support vector machine.
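
    The primal twin SVM solver itself is not shown here; as a hedged stand-in, the sketch below defines the commonly used wavelet kernel K(x, y) = Π_i cos(1.75(x_i − y_i)/a)·exp(−(x_i − y_i)²/(2a²)) and plugs it into an ordinary scikit-learn support vector regressor on a synthetic growth series in place of the Anhui GDP data.

```python
import numpy as np
from sklearn.svm import SVR

def wavelet_kernel(X, Y, a=3.0):
    """Translation-invariant wavelet kernel with a Morlet-type mother wavelet."""
    D = X[:, None, :] - Y[None, :, :]                          # pairwise differences
    return np.prod(np.cos(1.75 * D / a) * np.exp(-(D ** 2) / (2 * a ** 2)), axis=2)

# Synthetic stand-in for a short GDP-like series: predict next increment from the last 3.
rng = np.random.default_rng(0)
series = np.cumsum(rng.normal(5.0, 1.0, 40))                   # steadily growing series
incr = np.diff(series)                                         # year-on-year growth
X = np.array([incr[i:i + 3] for i in range(len(incr) - 3)])
y = incr[3:]

model = SVR(kernel=lambda A, B: wavelet_kernel(A, B, a=3.0), C=10.0, epsilon=0.1)
model.fit(X[:-5], y[:-5])
print(np.round(model.predict(X[-5:]), 2), np.round(y[-5:], 2))  # forecasts vs. held-out values
```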

  14. Multi-objective robust optimization method for the modified epoxy resin sheet molding compounds of the impeller

    Directory of Open Access Journals (Sweden)

    Xiaozhang Qu

    2016-07-01

    Full Text Available A modified epoxy resin sheet molding compound (SMC) impeller has been designed. Testing shows that the non-metal impeller has better environmental aging performance but requires a waterproof design. In order to improve the stability of the impeller vibration design, the influence of uncertainty factors is considered and a multi-objective robust optimization method is proposed to reduce the weight of the impeller. Firstly, an analysis model of the impeller vibration is constructed based on the fluid-structure interaction. Secondly, an optimal approximate model of the impeller is constructed using Latin hypercube sampling and radial basis functions, and the fitting and optimization accuracy of the approximate model is improved by increasing the number of sample points. Finally, a micro multi-objective genetic algorithm is applied to the robust optimization of the approximate model, and Monte Carlo simulation and Sobol sampling techniques are used for reliability analysis. By comparing the deterministic results with those at different sigma levels and for different materials, the multi-objective optimization of the SMC-molded impeller is shown to meet the engineering requirements for stability and light weight, and the effectiveness of the proposed multi-objective robust optimization method is verified by error analysis. After SMC molding and robust optimization of the impeller, the optimization rate reached 42.5%, which greatly improved the economic benefit and greatly reduced the vibration of the ventilation system.

  15. Reducing regional vulnerabilities and multi-city robustness conflicts using many-objective optimization under deep uncertainty

    Science.gov (United States)

    Reed, Patrick; Trindade, Bernardo; Jonathan, Herman; Harrison, Zeff; Gregory, Characklis

    2016-04-01

    Emerging water scarcity concerns in southeastern US are associated with several deeply uncertain factors, including rapid population growth, limited coordination across adjacent municipalities and the increasing risks for sustained regional droughts. Managing these uncertainties will require that regional water utilities identify regionally coordinated, scarcity-mitigating strategies that trigger the appropriate actions needed to avoid water shortages and financial instabilities. This research focuses on the Research Triangle area of North Carolina, seeking to engage the water utilities within Raleigh, Durham, Cary and Chapel Hill in cooperative and robust regional water portfolio planning. Prior analysis of this region through the year 2025 has identified significant regional vulnerabilities to volumetric shortfalls and financial losses. Moreover, efforts to maximize the individual robustness of any of the mentioned utilities also have the potential to strongly degrade the robustness of the others. This research advances a multi-stakeholder Many-Objective Robust Decision Making (MORDM) framework to better account for deeply uncertain factors when identifying cooperative management strategies. Results show that the sampling of deeply uncertain factors in the computational search phase of MORDM can aid in the discovery of management actions that substantially improve the robustness of individual utilities as well as the overall region to water scarcity. Cooperative water transfers, financial risk mitigation tools, and coordinated regional demand management must be explored jointly to decrease robustness conflicts between the utilities. The insights from this work have general merit for regions where adjacent municipalities can benefit from cooperative regional water portfolio planning.

  16. System identification via sparse multiple kernel-based regularization using sequential convex optimization techniques

    DEFF Research Database (Denmark)

    Chen, Tianshi; Andersen, Martin Skovgaard; Ljung, Lennart

    2014-01-01

    Model estimation and structure detection with short data records are two issues that receive increasing interests in System Identification. In this paper, a multiple kernel-based regularization method is proposed to handle those issues. Multiple kernels are conic combinations of fixed kernels...
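
    The record is truncated, but the object it describes, an estimator whose kernel is a conic (non-negative) combination of fixed kernels, can be illustrated with a toy kernel ridge regression in which the combination weights are picked by a simple validation grid rather than the paper's sequential convex optimization.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(0)
x = np.linspace(0, 4, 60)[:, None]
y = np.sin(2 * x[:, 0]) + 0.1 * rng.standard_normal(60)

def rbf(A, B, s):                                   # fixed Gaussian kernels
    return np.exp(-(A - B.T) ** 2 / (2 * s ** 2))

K1, K2 = rbf(x, x, 0.2), rbf(x, x, 1.0)             # two fixed kernels, different bandwidths
tr, va = np.arange(0, 60, 2), np.arange(1, 60, 2)   # train / validation split
lam = 1e-2

best = None
for a1, a2 in product([0.0, 0.5, 1.0, 2.0], repeat=2):   # conic (non-negative) weights
    if a1 == 0.0 and a2 == 0.0:
        continue
    K = a1 * K1 + a2 * K2                           # conic combination of the fixed kernels
    c = np.linalg.solve(K[np.ix_(tr, tr)] + lam * np.eye(len(tr)), y[tr])
    err = np.mean((K[np.ix_(va, tr)] @ c - y[va]) ** 2)
    if best is None or err < best[0]:
        best = (err, a1, a2)
print("validation MSE %.4f with weights (%.1f, %.1f)" % best)
```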

  17. Reducing regional drought vulnerabilities and multi-city robustness conflicts using many-objective optimization under deep uncertainty

    Science.gov (United States)

    Trindade, B. C.; Reed, P. M.; Herman, J. D.; Zeff, H. B.; Characklis, G. W.

    2017-06-01

    Emerging water scarcity concerns in many urban regions are associated with several deeply uncertain factors, including rapid population growth, limited coordination across adjacent municipalities and the increasing risks for sustained regional droughts. Managing these uncertainties will require that regional water utilities identify coordinated, scarcity-mitigating strategies that trigger the appropriate actions needed to avoid water shortages and financial instabilities. This research focuses on the Research Triangle area of North Carolina, seeking to engage the water utilities within Raleigh, Durham, Cary and Chapel Hill in cooperative and robust regional water portfolio planning. Prior analysis of this region through the year 2025 has identified significant regional vulnerabilities to volumetric shortfalls and financial losses. Moreover, efforts to maximize the individual robustness of any of the mentioned utilities also have the potential to strongly degrade the robustness of the others. This research advances a multi-stakeholder Many-Objective Robust Decision Making (MORDM) framework to better account for deeply uncertain factors when identifying cooperative drought management strategies. Our results show that appropriately designing adaptive risk-of-failure action triggers required stressing them with a comprehensive sample of deeply uncertain factors in the computational search phase of MORDM. Search under the new ensemble of states-of-the-world is shown to fundamentally change perceived performance tradeoffs and substantially improve the robustness of individual utilities as well as the overall region to water scarcity. Search under deep uncertainty enhanced the discovery of how cooperative water transfers, financial risk mitigation tools, and coordinated regional demand management must be employed jointly to improve regional robustness and decrease robustness conflicts between the utilities. Insights from this work have general merit for regions where

  18. Robust Optimization Approach for Design for a Dynamic Cell Formation Considering Labor Utilization: Bi-objective Mathematical Model

    Directory of Open Access Journals (Sweden)

    Hiwa Farughi

    2016-05-01

    Full Text Available In this paper, robust optimization of a bi-objective mathematical model for a dynamic cell formation problem considering labor utilization with uncertain data is carried out. The robust approach is used to reduce the effects of fluctuations of the uncertain parameters with regard to all possible future scenarios. In this research, the cost parameters of cell formation and demand fluctuations are subject to uncertainty, and a mixed-integer programming (MIP) model is developed to formulate the related robust dynamic cell formation problem. The problem is then transformed into a bi-objective linear one. The first objective function seeks to minimize the relevant costs of the problem, including machine procurement and relocation costs, machine variable cost, inter-cell and intra-cell movement costs, overtime cost, labor shifting cost between cells, machine maintenance cost, and part inventory holding cost. The second objective function seeks to minimize the total man-hour deviations between cells, i.e., the labor utilization of the modeled system.

  19. Robustness and prediction accuracy of machine learning for objective visual quality assessment

    OpenAIRE

    HINES, ANDREW

    2014-01-01

    PUBLISHED Lisbon, Portugal Machine Learning (ML) is a powerful tool to support the development of objective visual quality assessment metrics, serving as a substitute model for the perceptual mechanisms acting in visual quality appreciation. Nevertheless, the reliability of ML-based techniques within objective quality assessment metrics is often questioned. In this study, the robustness of ML in supporting objective quality assessment is investigated, specific...

  20. Bi-variate statistical attribute filtering : A tool for robust detection of faint objects

    NARCIS (Netherlands)

    Teeninga, Paul; Moschini, Ugo; Trager, Scott C.; Wilkinson, M.H.F.

    2013-01-01

    We present a new method for morphological connected attribute filtering for object detection in astronomical images. In this approach, a threshold is set on one attribute (power), based on its distribution due to noise, as a function of object area. The results show an order of magnitude higher

  1. Autonomous learning of robust visual object detection and identification on a humanoid

    NARCIS (Netherlands)

    Leitner, J.; Chandrashekhariah, P.; Harding, S.; Frank, M.; Spina, G.; Förster, A.; Triesch, J.; Schmidhuber, J.

    2012-01-01

    In this work we introduce a technique for a humanoid robot to autonomously learn the representations of objects within its visual environment. Our approach involves an attention mechanism in association with feature based segmentation that explores the environment and provides object samples for

  2. Critical object recognition in millimeter-wave images with robustness to rotation and scale.

    Science.gov (United States)

    Mohammadzade, Hoda; Ghojogh, Benyamin; Faezi, Sina; Shabany, Mahdi

    2017-06-01

    Locating critical objects is crucial in various security applications and industries. For example, in security applications such as airports, these objects might be hidden or covered under shields or secret sheaths. Millimeter-wave images can be utilized to discover and recognize critical objects in such hidden cases without any health risk, thanks to their non-ionizing nature. However, millimeter-wave images usually have waves in and around the detected objects, making object recognition difficult. Thus, regular image processing and classification methods cannot be used for these images, and additional pre-processing and classification methods should be introduced. This paper proposes a novel pre-processing method for canceling rotation and scale using principal component analysis. In addition, a two-layer classification method is introduced and utilized for recognition. Moreover, a large dataset of millimeter-wave images is collected and created for experiments. Experimental results show that a typical classification method such as support vector machines can recognize 45.5% of one type of critical object at a 34.2% false alarm rate (FAR), which is drastically poor recognition. The same method within the proposed recognition framework achieves a 92.9% recognition rate at 0.43% FAR, which indicates a highly significant improvement. The significant contribution of this work is to introduce a new method for analyzing millimeter-wave images based on machine vision and learning approaches, which is not yet widely noted in the field of millimeter-wave image analysis.
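
    The two-layer classifier is not reproduced here; the sketch below only illustrates the general idea of canceling rotation and scale with principal component analysis, by rotating a 2-D object point cloud onto its principal axes and rescaling it to a fixed spread (the target spread is an arbitrary choice).

```python
import numpy as np

def pca_normalize(points, target_std=10.0):
    """Rotate a 2-D point cloud onto its principal axes and rescale to a fixed spread."""
    centered = points - points.mean(axis=0)
    cov = np.cov(centered.T)
    vals, vecs = np.linalg.eigh(cov)               # eigenvectors = principal axes
    order = np.argsort(vals)[::-1]
    rotated = centered @ vecs[:, order]            # rotation cancelled
    scale = target_std / np.sqrt(vals[order[0]])   # scale cancelled (up to the target spread)
    return rotated * scale

# Example: an elongated blob, and the same blob rotated, shrunk, and shifted.
rng = np.random.default_rng(0)
blob = rng.normal(size=(500, 2)) * [8.0, 2.0]
theta = np.deg2rad(35)
R = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
transformed = 0.5 * blob @ R.T + [40.0, -7.0]
print(np.round(np.std(pca_normalize(blob), axis=0), 1),
      np.round(np.std(pca_normalize(transformed), axis=0), 1))   # both spreads now match
```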

  3. Epileptic Seizure Detection with Log-Euclidean Gaussian Kernel-Based Sparse Representation.

    Science.gov (United States)

    Yuan, Shasha; Zhou, Weidong; Wu, Qi; Zhang, Yanli

    2016-05-01

    Epileptic seizure detection plays an important role in the diagnosis of epilepsy and in reducing the massive workload of reviewing electroencephalography (EEG) recordings. In this work, a novel algorithm is developed to detect seizures employing log-Euclidean Gaussian kernel-based sparse representation (SR) in long-term EEG recordings. Unlike the traditional SR for vector data in Euclidean space, the log-Euclidean Gaussian kernel-based SR framework is proposed for seizure detection in the space of symmetric positive definite (SPD) matrices, which form a Riemannian manifold. Since the Riemannian manifold is nonlinear, the log-Euclidean Gaussian kernel function is applied to embed it into a reproducing kernel Hilbert space (RKHS) for performing SR. The EEG signals of all channels are divided into epochs and the SPD matrices representing EEG epochs are generated by covariance descriptors. Then, the testing samples are sparsely coded over the dictionary composed of training samples utilizing log-Euclidean Gaussian kernel-based SR. The classification of testing samples is achieved by computing the minimal reconstruction residuals. The proposed method is evaluated on the Freiburg EEG dataset of 21 patients and shows notable performance on both epoch-based and event-based assessments. Moreover, this method handles multiple channels of EEG recordings synchronously, which is faster and more efficient than traditional seizure detection methods.
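
    The sparse-representation classifier is beyond a short sketch, but the kernel at the core of the method, the log-Euclidean Gaussian kernel between SPD covariance descriptors, follows directly from the description above; the synthetic 4-channel epochs and the bandwidth are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import logm

def log_euclidean_gaussian_kernel(S1, S2, sigma=1.0):
    """k(X, Y) = exp(-||logm(X) - logm(Y)||_F^2 / (2 sigma^2)) for SPD matrices."""
    diff = logm(S1) - logm(S2)
    return float(np.exp(-np.linalg.norm(diff, 'fro') ** 2 / (2 * sigma ** 2)))

def covariance_descriptor(epoch, eps=1e-6):
    """SPD covariance descriptor of a (channels x samples) EEG epoch."""
    c = np.cov(epoch)
    return c + eps * np.eye(c.shape[0])            # regularize to keep it positive definite

# Example with two synthetic 4-channel epochs of different amplitude.
rng = np.random.default_rng(0)
e1, e2 = rng.standard_normal((4, 256)), rng.standard_normal((4, 256)) * 2.0
S1, S2 = covariance_descriptor(e1), covariance_descriptor(e2)
print(round(log_euclidean_gaussian_kernel(S1, S1), 3),
      round(log_euclidean_gaussian_kernel(S1, S2, sigma=2.0), 3))
```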

  4. A multi-objective robust optimization model for logistics planning in the earthquake response phase

    NARCIS (Netherlands)

    Najafi, M.; Eshghi, K.; Dullaert, W.E.H.

    2013-01-01

    Usually, resources are short in supply when earthquakes occur. In such emergency situations, disaster relief organizations must use these scarce resources efficiently to achieve the best possible emergency relief. This paper therefore proposes a multi-objective, multi-mode, multi-commodity, and

  5. Estimation of the applicability domain of kernel-based machine learning models for virtual screening

    Directory of Open Access Journals (Sweden)

    Fechner Nikolas

    2010-03-01

    Full Text Available Abstract Background The virtual screening of large compound databases is an important application of structural-activity relationship models. Due to the high structural diversity of these data sets, it is impossible for machine learning based QSAR models, which rely on a specific training set, to give reliable results for all compounds. Thus, it is important to consider the subset of the chemical space in which the model is applicable. The approaches to this problem that have been published so far mostly use vectorial descriptor representations to define this domain of applicability of the model. Unfortunately, these cannot be extended easily to structured kernel-based machine learning models. For this reason, we propose three approaches to estimate the domain of applicability of a kernel-based QSAR model. Results We evaluated three kernel-based applicability domain estimations using three different structured kernels on three virtual screening tasks. Each experiment consisted of the training of a kernel-based QSAR model using support vector regression and the ranking of a disjoint screening data set according to the predicted activity. For each prediction, the applicability of the model for the respective compound is quantitatively described using a score obtained by an applicability domain formulation. The suitability of the applicability domain estimation is evaluated by comparing the model performance on the subsets of the screening data sets obtained by different thresholds for the applicability scores. This comparison indicates that it is possible to separate the part of the chemspace, in which the model gives reliable predictions, from the part consisting of structures too dissimilar to the training set to apply the model successfully. A closer inspection reveals that the virtual screening performance of the model is considerably improved if half of the molecules, those with the lowest applicability scores, are omitted from the screening

  6. Estimation of the applicability domain of kernel-based machine learning models for virtual screening.

    Science.gov (United States)

    Fechner, Nikolas; Jahn, Andreas; Hinselmann, Georg; Zell, Andreas

    2010-03-11

    The virtual screening of large compound databases is an important application of structural-activity relationship models. Due to the high structural diversity of these data sets, it is impossible for machine learning based QSAR models, which rely on a specific training set, to give reliable results for all compounds. Thus, it is important to consider the subset of the chemical space in which the model is applicable. The approaches to this problem that have been published so far mostly use vectorial descriptor representations to define this domain of applicability of the model. Unfortunately, these cannot be extended easily to structured kernel-based machine learning models. For this reason, we propose three approaches to estimate the domain of applicability of a kernel-based QSAR model. We evaluated three kernel-based applicability domain estimations using three different structured kernels on three virtual screening tasks. Each experiment consisted of the training of a kernel-based QSAR model using support vector regression and the ranking of a disjoint screening data set according to the predicted activity. For each prediction, the applicability of the model for the respective compound is quantitatively described using a score obtained by an applicability domain formulation. The suitability of the applicability domain estimation is evaluated by comparing the model performance on the subsets of the screening data sets obtained by different thresholds for the applicability scores. This comparison indicates that it is possible to separate the part of the chemspace, in which the model gives reliable predictions, from the part consisting of structures too dissimilar to the training set to apply the model successfully. A closer inspection reveals that the virtual screening performance of the model is considerably improved if half of the molecules, those with the lowest applicability scores, are omitted from the screening. The proposed applicability domain formulations
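
    The paper's three applicability-domain formulations are not reproduced here; the sketch below shows one generic kernel-based score of this kind, the distance of a query compound to the training-set centroid in the kernel-induced feature space, using a plain RBF kernel on made-up descriptor vectors.

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.05):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def feature_space_distance(x_query, X_train, gamma=0.05):
    """Distance of a query to the training centroid in the kernel-induced feature space."""
    Kqq = rbf_kernel(x_query[None, :], x_query[None, :], gamma)[0, 0]
    Kqt = rbf_kernel(x_query[None, :], X_train, gamma)[0]
    Ktt = rbf_kernel(X_train, X_train, gamma)
    d2 = Kqq - 2.0 * Kqt.mean() + Ktt.mean()
    return np.sqrt(max(d2, 0.0))

# Example: a query similar to the training data vs. a clearly dissimilar one.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(100, 5))                # hypothetical descriptor vectors
print(round(feature_space_distance(rng.normal(size=5), X_train), 3),
      round(feature_space_distance(rng.normal(size=5) + 6.0, X_train), 3))
```

    Compounds with a large distance would be flagged as outside the applicability domain and excluded from, or down-weighted in, the ranked screening output.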

  7. Two-Phase Iteration for Value Function Approximation and Hyperparameter Optimization in Gaussian-Kernel-Based Adaptive Critic Design

    OpenAIRE

    Chen, Xin; Xie, Penghuan; Xiong, Yonghua; He, Yong; Wu, Min

    2015-01-01

    Adaptive Dynamic Programming (ADP) with critic-actor architecture is an effective way to perform online learning control. To avoid the subjectivity in the design of a neural network that serves as a critic network, kernel-based adaptive critic design (ACD) was developed recently. There are two essential issues for a static kernel-based model: how to determine proper hyperparameters in advance and how to select right samples to describe the value function. They all rely on the assessment of sa...

  8. Confidence-Based Data Association and Discriminative Deep Appearance Learning for Robust Online Multi-Object Tracking.

    Science.gov (United States)

    Bae, Seung-Hwan; Yoon, Kuk-Jin

    2018-03-01

    Online multi-object tracking aims at estimating the tracks of multiple objects instantly with each incoming frame and the information provided up to that moment. It still remains a difficult problem in complex scenes because of the large ambiguity in associating multiple objects in consecutive frames and the low discriminability between object appearances. In this paper, we propose a robust online multi-object tracking method that can handle these difficulties effectively. We first define the tracklet confidence using the detectability and continuity of a tracklet, and decompose a multi-object tracking problem into small subproblems based on the tracklet confidence. We then solve the online multi-object tracking problem by associating tracklets and detections in different ways according to their confidence values. Based on this strategy, tracklets sequentially grow with online-provided detections, and fragmented tracklets are linked up with others without any iterative and expensive association steps. For more reliable association between tracklets and detections, we also propose a deep appearance learning method to learn a discriminative appearance model from large training datasets, since conventional appearance learning methods do not provide a rich representation that can distinguish multiple objects with large appearance variations. In addition, we combine online transfer learning for improving appearance discriminability by adapting the pre-trained deep model during online tracking. Experiments with challenging public datasets show distinct performance improvements over other state-of-the-art batch and online tracking methods, and demonstrate the effectiveness and usefulness of the proposed methods for online multi-object tracking.
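
    The confidence-based association scheme is described only at a high level above; the sketch below is a loose, simplified reading of it (high-confidence tracklets are matched to detections first, the rest compete for what remains), using Euclidean distances, a hand-picked gate, and made-up positions rather than the paper's affinities and learned appearance model.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

tracklets = {                        # id -> (predicted position, confidence in [0, 1])
    1: (np.array([10.0, 20.0]), 0.9),
    2: (np.array([48.0, 15.0]), 0.8),
    3: (np.array([30.0, 42.0]), 0.3),
}
detections = [np.array([11.0, 21.0]), np.array([47.0, 16.0]), np.array([33.0, 40.0])]
conf_thresh, gate = 0.5, 10.0        # illustrative confidence threshold and distance gate

def associate(track_ids, det_ids):
    """Hungarian assignment on Euclidean distance, rejecting pairs beyond the gate."""
    if not track_ids or not det_ids:
        return []
    cost = np.array([[np.linalg.norm(tracklets[t][0] - detections[d]) for d in det_ids]
                     for t in track_ids])
    rows, cols = linear_sum_assignment(cost)
    return [(track_ids[r], det_ids[c]) for r, c in zip(rows, cols) if cost[r, c] < gate]

high = [t for t, (_, c) in tracklets.items() if c >= conf_thresh]
low = [t for t, (_, c) in tracklets.items() if c < conf_thresh]
matches = associate(high, list(range(len(detections))))          # confident tracklets first
remaining = [d for d in range(len(detections)) if d not in {d for _, d in matches}]
matches += associate(low, remaining)                             # the rest compete afterwards
print(matches)                        # e.g. [(1, 0), (2, 1), (3, 2)]
```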

  9. Reactionless robust finite-time control for manipulation of passive objects by free-floating space robots

    International Nuclear Information System (INIS)

    Guo Sheng-Peng; Li Dong-Xu; Meng Yun-He; Fan Cai-Zhi

    2014-01-01

    On-orbit servicing requires efficient techniques for manipulating passive objects. The paper aims at developing a reactionless control method that drives the manipulator to manipulate passive objects with high precision, while inducing no disturbances to its base attitude. To this end, decomposition of the target dynamics from the base dynamics is discussed, so that they can be considered as two independent subsystems. A reactionless nonlinear controller is presented, which ensures high-precision manipulation of the targets and that the base orientation is unchanged. This is achieved by combining the robust finite-time control with the reaction null space. Finally, the performance of the proposed method is examined by comparing it with that of a reactionless PD controller and a pure finite-time controller.

  10. Construction of phylogenetic trees by kernel-based comparative analysis of metabolic networks.

    Science.gov (United States)

    Oh, S June; Joung, Je-Gun; Chang, Jeong-Ho; Zhang, Byoung-Tak

    2006-06-06

    To infer the tree of life requires knowledge of the common characteristics of each species descended from a common ancestor as the measuring criteria and a method to calculate the distance between the resulting values of each measure. Conventional phylogenetic analysis based on genomic sequences provides information about the genetic relationships between different organisms. In contrast, comparative analysis of metabolic pathways in different organisms can yield insights into their functional relationships under different physiological conditions. However, evaluating the similarities or differences between metabolic networks is a computationally challenging problem, and systematic methods of doing this are desirable. Here we introduce a graph-kernel method for computing the similarity between metabolic networks in polynomial time, and use it to profile metabolic pathways and to construct phylogenetic trees. To compare the structures of metabolic networks in organisms, we adopted the exponential graph kernel, which is a kernel-based approach with a labeled graph that includes a label matrix and an adjacency matrix. To construct the phylogenetic trees, we used an unweighted pair-group method with arithmetic mean, i.e., a hierarchical clustering algorithm. We applied the kernel-based network profiling method in a comparative analysis of nine carbohydrate metabolic networks from 81 biological species encompassing Archaea, Eukaryota, and Eubacteria. The resulting phylogenetic hierarchies generally support the tripartite scheme of three domains rather than the two domains of prokaryotes and eukaryotes. By combining the kernel machines with metabolic information, the method infers the context of biosphere development that covers physiological events required for adaptation by genetic reconstruction. The results show that one may obtain a global view of the tree of life by comparing the metabolic pathway structures using meta-level information rather than sequence
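
    The abstract describes an exponential graph kernel over labeled graphs with an adjacency matrix and a label matrix; the sketch below is one plausible reading of that idea (diffusion via a matrix exponential, summarized per label class and compared with a Frobenius inner product) and is not necessarily the authors' exact formulation.

```python
import numpy as np
from scipy.linalg import expm

def exponential_graph_kernel(A1, L1, A2, L2, beta=0.5):
    """Hedged sketch: diffuse over each graph with expm(beta*A), summarize the result
    per pair of node-label classes, and compare the summaries of the two graphs."""
    M1 = L1.T @ expm(beta * A1) @ L1               # label-to-label diffusion summary, graph 1
    M2 = L2.T @ expm(beta * A2) @ L2               # same for graph 2 (shared label alphabet)
    return float(np.sum(M1 * M2))                  # Frobenius inner product of the summaries

# Toy example: two small "metabolic" graphs over a shared alphabet of 3 node labels.
A1 = np.array([[0, 1, 1, 0], [1, 0, 0, 1], [1, 0, 0, 1], [0, 1, 1, 0]], float)
L1 = np.array([[1, 0, 0], [0, 1, 0], [0, 1, 0], [0, 0, 1]], float)   # one-hot node labels
A2 = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], float)
L2 = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
print(round(exponential_graph_kernel(A1, L1, A2, L2), 3))
```

    Pairwise kernel values over all species could then be turned into a distance matrix and fed to a hierarchical (UPGMA-style) clustering to produce the trees the record describes.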

  11. Construction of phylogenetic trees by kernel-based comparative analysis of metabolic networks

    Directory of Open Access Journals (Sweden)

    Chang Jeong-Ho

    2006-06-01

    Full Text Available Abstract Background To infer the tree of life requires knowledge of the common characteristics of each species descended from a common ancestor as the measuring criteria and a method to calculate the distance between the resulting values of each measure. Conventional phylogenetic analysis based on genomic sequences provides information about the genetic relationships between different organisms. In contrast, comparative analysis of metabolic pathways in different organisms can yield insights into their functional relationships under different physiological conditions. However, evaluating the similarities or differences between metabolic networks is a computationally challenging problem, and systematic methods of doing this are desirable. Here we introduce a graph-kernel method for computing the similarity between metabolic networks in polynomial time, and use it to profile metabolic pathways and to construct phylogenetic trees. Results To compare the structures of metabolic networks in organisms, we adopted the exponential graph kernel, which is a kernel-based approach with a labeled graph that includes a label matrix and an adjacency matrix. To construct the phylogenetic trees, we used an unweighted pair-group method with arithmetic mean, i.e., a hierarchical clustering algorithm. We applied the kernel-based network profiling method in a comparative analysis of nine carbohydrate metabolic networks from 81 biological species encompassing Archaea, Eukaryota, and Eubacteria. The resulting phylogenetic hierarchies generally support the tripartite scheme of three domains rather than the two domains of prokaryotes and eukaryotes. Conclusion By combining the kernel machines with metabolic information, the method infers the context of biosphere development that covers physiological events required for adaptation by genetic reconstruction. The results show that one may obtain a global view of the tree of life by comparing the metabolic pathway

  12. Robust multiple cue fusion-based high-speed and nonrigid object tracking algorithm for short track speed skating

    Science.gov (United States)

    Liu, Chenguang; Cheng, Heng-Da; Zhang, Yingtao; Wang, Yuxuan; Xian, Min

    2016-01-01

    This paper presents a methodology for tracking multiple skaters in short track speed skating competitions. Nonrigid skaters move at high speed with severe occlusions happening frequently among them. The camera is panned quickly in order to capture the skaters in a large and dynamic scene. Automatically tracking the skaters and precisely outputting their trajectories therefore becomes a challenging object tracking task. We employ the global rink information to compensate for camera motion and obtain the global spatial information of the skaters, utilize a random forest to fuse multiple cues and predict the blob of each skater, and finally apply a silhouette- and edge-based template-matching and blob-evolving method to label pixels as belonging to a skater. The effectiveness and robustness of the proposed method are verified through thorough experiments.

  13. Monitoring the change in colour of meat: A comparison of traditional and kernel-based orthogonal transformations

    Directory of Open Access Journals (Sweden)

    Asger Nyman Christiansen

    2012-11-01

    Full Text Available Currently, no objective method exists for estimating the rate of change in the colour of meat. Consequently, the purpose of this work is to develop a procedure capable of monitoring the change in colour of meat over time, environment and ingredients. This provides a useful tool to determine which storage environments to use and which ingredients to add to meat to reduce the rate of change in colour. The procedure consists of taking multi-spectral images of a piece of meat as a function of time, clustering the pixels of these images into categories, including several types of meat, and extracting colour information from each category. The focus has primarily been on achieving an accurate categorisation, since this is crucial for developing a useful method. The categorisation is done by applying an orthogonal transformation followed by k-means clustering. The purpose of the orthogonal transformation is to reduce the noise and the amount of data while enhancing the difference between the categories. The orthogonal transformations principal component analysis (PCA), minimum noise fraction (MNF) analysis, and kernel-based versions of these have been applied to test which produces the most accurate categorisation.
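
    A minimal end-to-end version of the described pipeline, a kernel-based orthogonal transformation followed by k-means clustering, can be sketched with scikit-learn; the synthetic "pixels", band count, and kernel bandwidth are stand-ins for the multi-spectral meat images.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.cluster import KMeans

# Synthetic stand-in for multi-spectral pixels (19 bands) from two "meat" categories.
rng = np.random.default_rng(0)
pixels = np.vstack([rng.normal(0.3, 0.05, (500, 19)),
                    rng.normal(0.6, 0.05, (500, 19))])

# Kernel-based orthogonal transformation (kernel PCA with an RBF kernel), then clustering.
embedding = KernelPCA(n_components=3, kernel="rbf", gamma=0.1).fit_transform(pixels)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(embedding)

# "Colour information" per category: the mean spectrum of the assigned pixels.
for c in range(2):
    print(c, np.round(pixels[labels == c].mean(axis=0)[:3], 2))
```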

  14. Sparse approximation of multilinear problems with applications to kernel-based methods in UQ

    KAUST Repository

    Nobile, Fabio; Tempone, Raul; Wolfers, Sö ren

    2017-01-01

    We provide a framework for the sparse approximation of multilinear problems and show that several problems in uncertainty quantification fit within this framework. In these problems, the value of a multilinear map has to be approximated using approximations of different accuracy and computational work of the arguments of this map. We propose and analyze a generalized version of Smolyak’s algorithm, which provides sparse approximation formulas with convergence rates that mitigate the curse of dimension that appears in multilinear approximation problems with a large number of arguments. We apply the general framework to response surface approximation and optimization under uncertainty for parametric partial differential equations using kernel-based approximation. The theoretical results are supplemented by numerical experiments.

  15. Sparse approximation of multilinear problems with applications to kernel-based methods in UQ

    KAUST Repository

    Nobile, Fabio

    2017-11-16

    We provide a framework for the sparse approximation of multilinear problems and show that several problems in uncertainty quantification fit within this framework. In these problems, the value of a multilinear map has to be approximated using approximations of different accuracy and computational work of the arguments of this map. We propose and analyze a generalized version of Smolyak’s algorithm, which provides sparse approximation formulas with convergence rates that mitigate the curse of dimension that appears in multilinear approximation problems with a large number of arguments. We apply the general framework to response surface approximation and optimization under uncertainty for parametric partial differential equations using kernel-based approximation. The theoretical results are supplemented by numerical experiments.

  16. A framework for optimal kernel-based manifold embedding of medical image data.

    Science.gov (United States)

    Zimmer, Veronika A; Lekadir, Karim; Hoogendoorn, Corné; Frangi, Alejandro F; Piella, Gemma

    2015-04-01

    Kernel-based dimensionality reduction is a widely used technique in medical image analysis. To fully unravel the underlying nonlinear manifold the selection of an adequate kernel function and of its free parameters is critical. In practice, however, the kernel function is generally chosen as Gaussian or polynomial and such standard kernels might not always be optimal for a given image dataset or application. In this paper, we present a study on the effect of the kernel functions in nonlinear manifold embedding of medical image data. To this end, we first carry out a literature review on existing advanced kernels developed in the statistics, machine learning, and signal processing communities. In addition, we implement kernel-based formulations of well-known nonlinear dimensional reduction techniques such as Isomap and Locally Linear Embedding, thus obtaining a unified framework for manifold embedding using kernels. Subsequently, we present a method to automatically choose a kernel function and its associated parameters from a pool of kernel candidates, with the aim to generate the most optimal manifold embeddings. Furthermore, we show how the calculated selection measures can be extended to take into account the spatial relationships in images, or used to combine several kernels to further improve the embedding results. Experiments are then carried out on various synthetic and phantom datasets for numerical assessment of the methods. Furthermore, the workflow is applied to real data that include brain manifolds and multispectral images to demonstrate the importance of the kernel selection in the analysis of high-dimensional medical images. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. Mapping Fire Severity Using Imaging Spectroscopy and Kernel Based Image Analysis

    Science.gov (United States)

    Prasad, S.; Cui, M.; Zhang, Y.; Veraverbeke, S.

    2014-12-01

    Improved spatial representation of within-burn heterogeneity after wildfires is paramount to effective land management decisions and more accurate fire emissions estimates. In this work, we demonstrate the feasibility and efficacy of airborne imaging spectroscopy (hyperspectral imagery) for quantifying wildfire burn severity, using kernel-based image analysis techniques. Two different airborne hyperspectral datasets, acquired over the 2011 Canyon and 2013 Rim fires in California using the Airborne Visible InfraRed Imaging Spectrometer (AVIRIS) sensor, were used in this study. The Rim Fire, covering parts of Yosemite National Park, started on August 17, 2013, and was the third largest fire in California's history. The Canyon Fire occurred in the Tehachapi Mountains and started on September 4, 2011. In addition to post-fire data for both fires, half of the Rim Fire was also covered with pre-fire images. Fire severity was measured in the field using the Geo Composite Burn Index (GeoCBI). The field data were utilized to train and validate our models, wherein the trained models, in conjunction with imaging spectroscopy data, were used for GeoCBI estimation over wide geographical regions. This work presents an approach for using remotely sensed imagery combined with GeoCBI field data to map fire scars based on a non-linear (kernel based) epsilon-Support Vector Regression (e-SVR), which was used to learn the relationship between spectra and GeoCBI in a kernel-induced feature space. Classification of healthy vegetation versus fire-affected areas based on morphological multi-attribute profiles was also studied. The availability of pre- and post-fire imaging spectroscopy data over the Rim Fire provided a unique opportunity to evaluate the performance of bi-temporal imaging spectroscopy for assessing post-fire effects. This type of data is currently constrained because of limited airborne acquisitions before a fire, but will become widespread with future spaceborne sensors such as those on
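
    The abstract's core regression step, an epsilon-SVR that learns the relationship between spectra and GeoCBI in a kernel-induced feature space, can be sketched as follows; the synthetic spectra and GeoCBI values are stand-ins for the AVIRIS imagery and field plots.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

# Synthetic stand-in: 200 "field plots" with 50-band reflectance and a GeoCBI value in [0, 3].
rng = np.random.default_rng(0)
spectra = rng.uniform(0.0, 0.6, (200, 50))
geocbi = np.clip(3.0 * spectra[:, 40] - 2.0 * spectra[:, 10] + rng.normal(0, 0.1, 200), 0, 3)

X_tr, X_te, y_tr, y_te = train_test_split(spectra, geocbi, test_size=0.3, random_state=0)
model = make_pipeline(StandardScaler(),
                      SVR(kernel="rbf", C=10.0, epsilon=0.05, gamma="scale"))
model.fit(X_tr, y_tr)                       # learn spectra -> GeoCBI in the RBF feature space
rmse = np.sqrt(np.mean((model.predict(X_te) - y_te) ** 2))
print(f"hold-out RMSE: {rmse:.3f} GeoCBI units")
```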

  18. A comparison of graph- and kernel-based -omics data integration algorithms for classifying complex traits.

    Science.gov (United States)

    Yan, Kang K; Zhao, Hongyu; Pang, Herbert

    2017-12-06

    High-throughput sequencing data are widely collected and analyzed in the study of complex diseases in the quest to improve human health. Well-studied algorithms mostly deal with a single data source and cannot fully utilize the potential of these multi-omics data sources. In order to provide a holistic understanding of human health and diseases, it is necessary to integrate multiple data sources. Several algorithms have been proposed so far; however, a comprehensive comparison of data integration algorithms for classification of binary traits is currently lacking. In this paper, we focus on two common classes of integration algorithms: graph-based, which depict relationships with subjects denoted by nodes and relationships denoted by edges, and kernel-based, which can generate a classifier in feature space. Our paper provides a comprehensive comparison of their performance in terms of various measurements of classification accuracy and computation time. Seven different integration algorithms, including graph-based semi-supervised learning, graph sharpening integration, composite association network, Bayesian network, semi-definite programming-support vector machine (SDP-SVM), relevance vector machine (RVM) and Ada-boost relevance vector machine, are compared and evaluated with hypertension and two cancer data sets in our study. In general, kernel-based algorithms create more complex models and require longer computation time, but they tend to perform better than graph-based algorithms. Graph-based algorithms have the advantage of being computationally faster. The empirical results demonstrate that composite association network, relevance vector machine, and Ada-boost RVM are the better performers. We provide recommendations on how to choose an appropriate algorithm for integrating data from multiple sources.

  19. A kernel-based approach to MIMO LPV state-space identification and application to a nonlinear process system

    NARCIS (Netherlands)

    Rizvi, S.Z.; Mohammadpour, J.; Toth, R.; Meskin, N.

    2015-01-01

    This paper first describes the development of a nonparametric identification method for linear parameter-varying (LPV) state-space models and then applies it to a nonlinear process system. The proposed method uses kernel-based least-squares support vector machines (LS-SVM). While parametric

  20. Two-Phase Iteration for Value Function Approximation and Hyperparameter Optimization in Gaussian-Kernel-Based Adaptive Critic Design

    Directory of Open Access Journals (Sweden)

    Xin Chen

    2015-01-01

    Full Text Available Adaptive Dynamic Programming (ADP) with critic-actor architecture is an effective way to perform online learning control. To avoid the subjectivity in the design of a neural network that serves as a critic network, kernel-based adaptive critic design (ACD) was developed recently. There are two essential issues for a static kernel-based model: how to determine proper hyperparameters in advance and how to select the right samples to describe the value function. Both rely on the assessment of sample values. Based on theoretical analysis, this paper presents a two-phase simultaneous learning method for a Gaussian-kernel-based critic network. It is able to estimate the values of samples without infinitely revisiting them, and the hyperparameters of the kernel model are optimized simultaneously. Based on the estimated sample values, the sample set can be refined by adding alternatives or deleting redundancies. Combining this critic design with an actor network, we present a Gaussian-kernel-based Adaptive Dynamic Programming (GK-ADP) approach. Simulations are used to verify its feasibility, particularly the necessity of two-phase learning, the convergence characteristics, and the improvement of system performance by using a varying sample set.

  1. Impact of mechanism vibration characteristics by joint clearance and optimization design of its multi-objective robustness

    Science.gov (United States)

    Zeng, Baoping; Wang, Chao; Zhang, Yu; Gong, Yajun; Hu, Sanbao

    2017-12-01

    Joint clearances and friction characteristics significantly influence mechanism vibration characteristics. In a clearance joint, the shaft and bearing collide while the mechanism works, producing a dynamic normal contact force and a tangential Coulomb friction force, so the whole system may vibrate. Moreover, under these dynamic forces the mechanism passes from free movement to contact-impact with an impact force constraint, and the mechanism topology also changes. The constraint relationship between joints may be established by a repeated, complex nonlinear dynamic process (idle stroke - contact-impact - elastic compression - rebound - impact relief - idle stroke movement - contact-impact). Analysis of the vibration characteristics of joint parts therefore remains a challenging open task. The dynamic equations of a mechanism with clearance form a set of strongly coupled, high-dimensional, time-varying nonlinear differential equations that are very difficult to solve. In addition, the complicated chaotic motions in impact and vibration due to clearance are very sensitive to initial values, making high-precision simulation and prediction of the dynamic behaviour even more difficult, and the subsequent wear necessarily leads to some fluctuation of the structural clearance parameters, which is a primary factor in the vibration of the mechanical system. In this study, a dynamic model of the device for opening the deepwater robot cabin door with joint clearance was established using the finite element method, and its vibration characteristics were analyzed. A response model was then built using the DOE method, and robust optimization design was performed on the joint clearance sizes and the friction coefficient variation range, so that the optimization design results may be regarded as reference data for selecting bearings

  2. Research on a Novel Kernel Based Grey Prediction Model and Its Applications

    Directory of Open Access Journals (Sweden)

    Xin Ma

    2016-01-01

    Full Text Available Discrete grey prediction models have attracted considerable research interest due to their effectiveness in improving the modelling accuracy of the traditional grey prediction models. The autoregressive GM(1,1) model, abbreviated as ARGM(1,1), is a novel discrete grey model which is easy to use and accurate in the prediction of approximately nonhomogeneous exponential time series. However, the ARGM(1,1) is essentially a linear model; thus, its applicability is still limited. In this paper a novel kernel-based ARGM(1,1) model is proposed, abbreviated as KARGM(1,1). The KARGM(1,1) has a nonlinear function which can be expressed by a kernel function using the kernel method, and its modelling procedures are presented in detail. Two case studies of predicting monthly gas well production are carried out with real-world production data. The results of the KARGM(1,1) model are compared to the existing discrete univariate grey prediction models, including ARGM(1,1), NDGM(1,1,k), DGM(1,1), and NGBMOP, and it is shown that the KARGM(1,1) outperforms the other four models.
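
    The kernelized KARGM(1,1) itself is not reproduced here; as background, the classic GM(1,1) baseline that this family of models extends can be fit in a few lines (the monthly production figures below are made up).

```python
import numpy as np

def gm11(x0, horizon=3):
    """Classic GM(1,1) grey model: fit on the series x0 and forecast `horizon` further steps."""
    n = len(x0)
    x1 = np.cumsum(x0)                                    # accumulated generating operation (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])                         # background values
    B = np.column_stack([-z1, np.ones(n - 1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]      # grey development / control coefficients
    k = np.arange(n + horizon)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a     # time response of the AGO series
    return np.diff(x1_hat, prepend=0.0)[n:]               # restore to the original scale

# Made-up monthly gas-well production figures, then a three-month forecast.
prod = np.array([52.0, 50.1, 48.5, 46.7, 45.2, 43.6, 42.3])
print(np.round(gm11(prod), 2))
```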

  3. Kernel-based whole-genome prediction of complex traits: a review.

    Science.gov (United States)

    Morota, Gota; Gianola, Daniel

    2014-01-01

    Prediction of genetic values has been a focus of applied quantitative genetics since the beginning of the 20th century, with renewed interest following the advent of the era of whole genome-enabled prediction. Opportunities offered by the emergence of high-dimensional genomic data fueled by post-Sanger sequencing technologies, especially molecular markers, have driven researchers to extend Ronald Fisher and Sewall Wright's models to confront new challenges. In particular, kernel methods are gaining consideration as a regression method of choice for genome-enabled prediction. Complex traits are presumably influenced by many genomic regions working in concert with others (clearly so when considering pathways), thus generating interactions. Motivated by this view, a growing number of statistical approaches based on kernels attempt to capture non-additive effects, either parametrically or non-parametrically. This review centers on whole-genome regression using kernel methods applied to a wide range of quantitative traits of agricultural importance in animals and plants. We discuss various kernel-based approaches tailored to capturing total genetic variation, with the aim of arriving at an enhanced predictive performance in the light of available genome annotation information. Connections between prediction machines born in animal breeding, statistics, and machine learning are revisited, and their empirical prediction performance is discussed. Overall, while some encouraging results have been obtained with non-parametric kernels, recovering non-additive genetic variation in a validation dataset remains a challenge in quantitative genetics.

  4. Kernel-based whole-genome prediction of complex traits: a review

    Directory of Open Access Journals (Sweden)

    Gota eMorota

    2014-10-01

    Full Text Available Prediction of genetic values has been a focus of applied quantitative genetics since the beginning of the 20th century, with renewed interest following the advent of the era of whole genome-enabled prediction. Opportunities offered by the emergence of high-dimensional genomic data fueled by post-Sanger sequencing technologies, especially molecular markers, have driven researchers to extend Ronald Fisher and Sewall Wright's models to confront new challenges. In particular, kernel methods are gaining consideration as a regression method of choice for genome-enabled prediction. Complex traits are presumably influenced by many genomic regions working in concert with others (clearly so when considering pathways), thus generating interactions. Motivated by this view, a growing number of statistical approaches based on kernels attempt to capture non-additive effects, either parametrically or non-parametrically. This review centers on whole-genome regression using kernel methods applied to a wide range of quantitative traits of agricultural importance in animals and plants. We discuss various kernel-based approaches tailored to capturing total genetic variation, with the aim of arriving at an enhanced predictive performance in the light of available genome annotation information. Connections between prediction machines born in animal breeding, statistics, and machine learning are revisited, and their empirical prediction performance is discussed. Overall, while some encouraging results have been obtained with non-parametric kernels, recovering non-additive genetic variation in a validation dataset remains a challenge in quantitative genetics.

  5. A Novel Mittag-Leffler Kernel Based Hybrid Fault Diagnosis Method for Wheeled Robot Driving System

    Directory of Open Access Journals (Sweden)

    Xianfeng Yuan

    2015-01-01

    This paper presents a novel hybrid fault diagnosis framework based on Mittag-Leffler kernel (ML-kernel) support vector machine (SVM) and Dempster-Shafer (D-S) fusion. Using sensor data sampled under different running conditions, the proposed approach initially establishes multiple principal component analysis (PCA) models for fault feature extraction. The fault feature vectors are then applied to train the probabilistic SVM (PSVM) classifiers that arrive at a preliminary fault diagnosis. To improve the accuracy of the preliminary results, a novel ML-kernel based PSVM classifier is proposed in this paper, and the positive definiteness of the ML-kernel is proved as well. The basic probability assignments (BPAs) are defined based on the preliminary fault diagnosis results and their confidence values. Eventually, the final fault diagnosis result is achieved by the fusion of the BPAs. Experimental results show that the proposed framework not only is capable of detecting and identifying the faults in the robot driving system, but also has better performance in stability and diagnosis accuracy compared with the traditional methods.
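
    The plumbing for a probabilistic SVM with a user-defined kernel can be sketched as below; the radial kernel is only a placeholder for the Mittag-Leffler kernel (whose special-function evaluation is not shown), the features and labels are synthetic, and the D-S fusion step is omitted:

```python
# Sketch of a probabilistic SVM with a user-defined kernel in scikit-learn.
# The radial kernel below is only a placeholder for the Mittag-Leffler kernel.
import numpy as np
from sklearn.svm import SVC

def custom_kernel(X, Y, sigma=1.0):
    # Placeholder: the ML-kernel would use a Mittag-Leffler function of the
    # squared distance instead of the exponential used here.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

rng = np.random.default_rng(0)
X_train = rng.normal(size=(60, 5))        # stand-in for PCA-extracted fault features
y_train = rng.integers(0, 3, size=60)     # three hypothetical fault classes

clf = SVC(kernel=custom_kernel, probability=True).fit(X_train, y_train)
proba = clf.predict_proba(rng.normal(size=(4, 5)))
print(proba.round(3))                     # per-class probabilities (candidate BPAs)
```

    The per-class probabilities returned by `predict_proba` are the kind of quantities such a framework could map to basic probability assignments before evidence fusion.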

  6. Kernel-Based Relevance Analysis with Enhanced Interpretability for Detection of Brain Activity Patterns

    Directory of Open Access Journals (Sweden)

    Andres M. Alvarez-Meza

    2017-10-01

    Full Text Available We introduce Enhanced Kernel-based Relevance Analysis (EKRA), which aims to support the automatic identification of brain activity patterns using electroencephalographic recordings. EKRA is a data-driven strategy that incorporates two kernel functions to take advantage of the available joint information, associating neural responses with a given stimulus condition. In this regard, a Centered Kernel Alignment functional is adjusted to learn the linear projection that best discriminates the input feature set, optimizing the required free parameters automatically. Our approach is carried out in two scenarios: (i) feature selection, by computing a relevance vector from extracted neural features to facilitate the physiological interpretation of a given brain activity task, and (ii) enhanced feature selection, which performs an additional transformation of relevant features aiming to improve the overall identification accuracy. Accordingly, we provide an alternative feature relevance analysis strategy that improves the system performance while favoring data interpretability. For validation purposes, EKRA is tested on two well-known brain activity tasks: motor imagery discrimination and epileptic seizure detection. The obtained results show that the EKRA approach estimates a relevant representation space extracted from the provided supervised information, emphasizing the salient input features. As a result, our proposal outperforms state-of-the-art methods regarding brain activity discrimination accuracy, with the benefit of enhanced physiological interpretation of the task at hand.
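
    The centered kernel alignment functional referred to above can be computed directly from two kernel matrices; the sketch below, with toy features and a label-derived target kernel, shows only this building block and not the full EKRA projection learning:

```python
# Empirical centered kernel alignment between a feature kernel and a label kernel.
import numpy as np

def centered_kernel_alignment(K, L):
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n           # centering matrix
    Kc, Lc = H @ K @ H, H @ L @ H
    return np.sum(Kc * Lc) / (np.linalg.norm(Kc) * np.linalg.norm(Lc))

# Toy example: RBF kernel on EEG-like features vs. an ideal label-derived kernel.
rng = np.random.default_rng(1)
X = rng.normal(size=(40, 8))
y = rng.integers(0, 2, size=40)
d2 = ((X[:, None] - X[None, :]) ** 2).sum(-1)
K = np.exp(-d2 / d2.mean())
L = (y[:, None] == y[None, :]).astype(float)
print(centered_kernel_alignment(K, L))
```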

  7. A trace ratio maximization approach to multiple kernel-based dimensionality reduction.

    Science.gov (United States)

    Jiang, Wenhao; Chung, Fu-lai

    2014-01-01

    Most dimensionality reduction techniques are based on one metric or one kernel, hence it is necessary to select an appropriate kernel for kernel-based dimensionality reduction. Multiple kernel learning for dimensionality reduction (MKL-DR) has been recently proposed to learn a kernel from a set of base kernels which are seen as different descriptions of data. As MKL-DR does not involve regularization, it might be ill-posed under some conditions and consequently its applications are hindered. This paper proposes a multiple kernel learning framework for dimensionality reduction based on regularized trace ratio, termed as MKL-TR. Our method aims at learning a transformation into a space of lower dimension and a corresponding kernel from the given base kernels among which some may not be suitable for the given data. The solutions for the proposed framework can be found based on trace ratio maximization. The experimental results demonstrate its effectiveness in benchmark datasets, which include text, image and sound datasets, for supervised, unsupervised as well as semi-supervised settings. Copyright © 2013 Elsevier Ltd. All rights reserved.
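
    The classical iterative solver for the trace ratio problem can be sketched as follows; this is a generic trace-ratio maximization over orthonormal projections (with toy scatter-like matrices), not the full MKL-TR method, which additionally learns kernel weights:

```python
# Iterative solver for max tr(V'AV)/tr(V'BV) over orthonormal V (generic sketch).
import numpy as np
from scipy.linalg import eigh

def trace_ratio(A, B, dim, iters=50, tol=1e-8):
    lam = 0.0
    V = None
    for _ in range(iters):
        w, U = eigh(A - lam * B)                  # eigenvectors of A - lam*B
        V = U[:, np.argsort(w)[::-1][:dim]]       # keep the top `dim` of them
        new_lam = np.trace(V.T @ A @ V) / np.trace(V.T @ B @ V)
        converged = abs(new_lam - lam) < tol
        lam = new_lam
        if converged:
            break
    return V, lam

# Toy symmetric matrices standing in for between-/within-class scatter.
rng = np.random.default_rng(2)
M, N = rng.normal(size=(10, 10)), rng.normal(size=(10, 10))
A, B = M @ M.T, N @ N.T + 10 * np.eye(10)
V, ratio = trace_ratio(A, B, dim=3)
print(ratio)
```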

  8. Towards a Holistic Cortical Thickness Descriptor: Heat Kernel-Based Grey Matter Morphology Signatures.

    Science.gov (United States)

    Wang, Gang; Wang, Yalin

    2017-02-15

    In this paper, we propose a heat kernel based regional shape descriptor that may be capable of better exploiting volumetric morphological information than other available methods, thereby improving statistical power on brain magnetic resonance imaging (MRI) analysis. The mechanism of our analysis is driven by the graph spectrum and the heat kernel theory, to capture the volumetric geometry information in the constructed tetrahedral meshes. In order to capture profound brain grey matter shape changes, we first use the volumetric Laplace-Beltrami operator to determine the point pair correspondence between white-grey matter and CSF-grey matter boundary surfaces by computing the streamlines in a tetrahedral mesh. Secondly, we propose multi-scale grey matter morphology signatures to describe the transition probability by random walk between the point pairs, which reflects the inherent geometric characteristics. Thirdly, a point distribution model is applied to reduce the dimensionality of the grey matter morphology signatures and generate the internal structure features. With the sparse linear discriminant analysis, we select a concise morphology feature set with improved classification accuracies. In our experiments, the proposed work outperformed the cortical thickness features computed by FreeSurfer software in the classification of Alzheimer's disease and its prodromal stage, i.e., mild cognitive impairment, on publicly available data from the Alzheimer's Disease Neuroimaging Initiative. The multi-scale and physics based volumetric structure feature may bring stronger statistical power than some traditional methods for MRI-based grey matter morphology analysis. Copyright © 2016 Elsevier Inc. All rights reserved.
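
    As an illustration of the underlying heat kernel machinery, the sketch below computes a multi-scale heat kernel signature from the spectrum of a small graph Laplacian; it stands in for, and greatly simplifies, the volumetric tetrahedral-mesh pipeline described in the abstract:

```python
# Heat kernel signature HKS(x, t) = sum_i exp(-t*lam_i) * phi_i(x)^2 on a graph.
import numpy as np

def heat_kernel_signature(L, t_values, k=None):
    lam, phi = np.linalg.eigh(L)                  # Laplacian spectrum
    if k is not None:                             # optional spectral truncation
        lam, phi = lam[:k], phi[:, :k]
    return np.stack([(np.exp(-t * lam) * phi ** 2).sum(axis=1) for t in t_values],
                    axis=1)

# Toy graph: a 12-node ring, Laplacian L = D - W.
n = 12
W = np.zeros((n, n))
for i in range(n):
    W[i, (i + 1) % n] = W[(i + 1) % n, i] = 1.0
L = np.diag(W.sum(1)) - W
print(heat_kernel_signature(L, t_values=[0.1, 1.0, 10.0]).shape)   # (12, 3)
```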

  9. Kernel based methods for accelerated failure time model with ultra-high dimensional data

    Directory of Open Access Journals (Sweden)

    Jiang Feng

    2010-12-01

    Full Text Available Abstract Background Most genomic data have ultra-high dimensions with more than 10,000 genes (probes). Regularization methods with L1 and Lp penalty have been extensively studied in survival analysis with high-dimensional genomic data. However, when the sample size n ≪ m (the number of genes), directly identifying a small subset of genes from ultra-high (m > 10,000) dimensional data is time-consuming and not computationally efficient. In current microarray analysis, what people really do is select a couple of thousands (or hundreds) of genes using univariate analysis or statistical tests, and then apply the LASSO-type penalty to further reduce the number of disease associated genes. This two-step procedure may introduce bias and inaccuracy and lead us to miss biologically important genes. Results The accelerated failure time (AFT) model is a linear regression model and a useful alternative to the Cox model for survival analysis. In this paper, we propose a nonlinear kernel based AFT model and an efficient variable selection method with adaptive kernel ridge regression. Our proposed variable selection method is based on the kernel matrix and dual problem with a much smaller n × n matrix. It is very efficient when the number of unknown variables (genes) is much larger than the number of samples. Moreover, the primal variables are explicitly updated and the sparsity in the solution is exploited. Conclusions Our proposed methods can simultaneously identify survival associated prognostic factors and predict survival outcomes with ultra-high dimensional genomic data. We have demonstrated the performance of our methods with both simulation and real data. The proposed method performs superbly with limited computational studies.
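
    The computational advantage of the dual formulation mentioned above can be illustrated with plain kernel ridge regression, where only an n × n system is solved even though m ≫ n; the toy log-survival response and hyperparameters are illustrative and the paper's adaptive variable selection is not reproduced:

```python
# Dual kernel ridge regression: only an n x n system is solved although m >> n.
import numpy as np

rng = np.random.default_rng(3)
n, m = 100, 10_000                       # n samples, m genes
X = rng.normal(size=(n, m))
log_t = rng.normal(size=n)               # toy log survival times (AFT-style response)

gamma, lam = 1.0 / m, 1e-1               # illustrative hyperparameters
sq = (X ** 2).sum(1)
K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))   # RBF Gram matrix
alpha = np.linalg.solve(K + lam * np.eye(n), log_t)              # dual coefficients
fitted = K @ alpha                                               # in-sample predictions
print(fitted[:5])
```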

  10. A comparison study for dose calculation in radiation therapy: pencil beam Kernel based vs. Monte Carlo simulation vs. measurements

    Energy Technology Data Exchange (ETDEWEB)

    Cheong, Kwang-Ho; Suh, Tae-Suk; Lee, Hyoung-Koo; Choe, Bo-Young [The Catholic Univ. of Korea, Seoul (Korea, Republic of); Kim, Hoi-Nam; Yoon, Sei-Chul [Kangnam St. Mary's Hospital, Seoul (Korea, Republic of)

    2002-07-01

    Accurate dose calculation in radiation treatment planning is most important for successful treatment. Since the human body is composed of various materials and does not have an ideal shape, it is not easy to calculate the accurate effective dose in patients. Many methods have been proposed to solve inhomogeneity and surface contour problems. Monte Carlo simulations are regarded as the most accurate method, but they are not appropriate for routine planning because they take so much time. Pencil beam kernel based convolution/superposition methods were also proposed to correct those effects. Nowadays, many commercial treatment planning systems have adopted this algorithm as a dose calculation engine. The purpose of this study is to verify the accuracy of the dose calculated by a pencil beam kernel based treatment planning system by comparing it to Monte Carlo simulations and measurements, especially in inhomogeneous regions. A home-made inhomogeneous phantom, Helax-TMS ver. 6.0, and the Monte Carlo codes BEAMnrc and DOSXYZnrc were used in this study. In homogeneous media the accuracy was acceptable, but in inhomogeneous media the errors were more significant. However, in general clinical situations, the pencil beam kernel based convolution algorithm is thought to be a valuable tool for calculating the dose.
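
    The core convolution/superposition step can be illustrated with a toy two-dimensional example, a fluence map convolved with a radially decaying pencil kernel; the kernel shape and grid are invented for illustration and this is in no way a clinical dose engine:

```python
# Toy 2-D convolution of a fluence map with a radially decaying pencil kernel.
import numpy as np
from scipy.signal import fftconvolve

fluence = np.zeros((101, 101))           # 1 mm grid, flat 5 x 5 cm field
fluence[25:75, 25:75] = 1.0

yy, xx = np.mgrid[-20:21, -20:21]        # invented kernel, not a measured/MC one
r = np.hypot(xx, yy) + 0.5
kernel = np.exp(-r / 3.0) / r
kernel /= kernel.sum()

dose = fftconvolve(fluence, kernel, mode="same")   # convolution/superposition step
print(dose.max(), dose[50, 50])
```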

  11. Bi-objective integrating sustainable order allocation and sustainable supply chain network strategic design with stochastic demand using a novel robust hybrid multi-objective metaheuristic

    DEFF Research Database (Denmark)

    Govindan, Kannan; Jafarian, Ahmad; Nourbakhsh, Vahid

    2015-01-01

    simultaneously considering the sustainable OAP in the sustainable SCND as a strategic decision. The proposed supply chain network is composed of five echelons including suppliers classified in different classes, plants, distribution centers that dispatch products via two different ways, direct shipment......, a novel multi-objective hybrid approach called MOHEV with two strategies for its best particle selection procedure (BPSP), minimum distance, and crowding distance is proposed. MOHEV is constructed through hybridization of two multi-objective algorithms, namely the adapted multi-objective electromagnetism...

  12. Objectivity

    CERN Document Server

    Daston, Lorraine

    2010-01-01

    Objectivity has a history, and it is full of surprises. In Objectivity, Lorraine Daston and Peter Galison chart the emergence of objectivity in the mid-nineteenth-century sciences--and show how the concept differs from its alternatives, truth-to-nature and trained judgment. This is a story of lofty epistemic ideals fused with workaday practices in the making of scientific images. From the eighteenth through the early twenty-first centuries, the images that reveal the deepest commitments of the empirical sciences--from anatomy to crystallography--are those featured in scientific atlases, the compendia that teach practitioners what is worth looking at and how to look at it. Galison and Daston use atlas images to uncover a hidden history of scientific objectivity and its rivals. Whether an atlas maker idealizes an image to capture the essentials in the name of truth-to-nature or refuses to erase even the most incidental detail in the name of objectivity or highlights patterns in the name of trained judgment is a...

  13. Modelling lateral beam quality variations in pencil kernel based photon dose calculations

    International Nuclear Information System (INIS)

    Nyholm, T; Olofsson, J; Ahnesjoe, A; Karlsson, M

    2006-01-01

    Standard treatment machines for external radiotherapy are designed to yield flat dose distributions at a representative treatment depth. The common method to reach this goal is to use a flattening filter to decrease the fluence in the centre of the beam. A side effect of this filtering is that the average energy of the beam is generally lower at a distance from the central axis, a phenomenon commonly referred to as off-axis softening. The off-axis softening results in a relative change in beam quality that is almost independent of machine brand and model. Central axis dose calculations using pencil beam kernels show no drastic loss in accuracy when the off-axis beam quality variations are neglected. However, for dose calculated at off-axis positions the effect should be considered, otherwise errors of several per cent can be introduced. This work proposes a method to explicitly include the effect of off-axis softening in pencil kernel based photon dose calculations for arbitrary positions in a radiation field. Variations of pencil kernel values are modelled through a generic relation between half value layer (HVL) thickness and off-axis position for standard treatment machines. The pencil kernel integration for dose calculation is performed through sampling of energy fluence and beam quality in sectors of concentric circles around the calculation point. The method is fully based on generic data and therefore does not require any specific measurements for characterization of the off-axis softening effect, provided that the machine performance is in agreement with the assumed HVL variations. The model is verified versus profile measurements at different depths and through a model self-consistency check, using the dose calculation model to estimate HVL values at off-axis positions. A comparison between calculated and measured profiles at different depths showed a maximum relative error of 4% without explicit modelling of off-axis softening. The maximum relative error

  14. MO-FG-CAMPUS-TeP3-04: Deliverable Robust Optimization in IMPT Using Quadratic Objective Function

    Energy Technology Data Exchange (ETDEWEB)

    Shan, J; Liu, W; Bues, M; Schild, S [Mayo Clinic Arizona, Phoenix, AZ (United States)

    2016-06-15

    Purpose: To find and evaluate a way of applying deliverable MU constraints in robust spot intensity optimization in Intensity-Modulated Proton Therapy (IMPT) to prevent plan quality and robustness from degrading due to machine deliverable MU-constraints. Methods: Currently, the influence of the deliverable MU-constraints is retrospectively evaluated by post-processing immediately following optimization. In this study, we propose a new method based on the quasi-Newton-like L-BFGS-B algorithm with which we turn deliverable MU-constraints on and off alternately during optimization. Seven patients with two different machine settings (small and large spot size) were planned with both the conventional and the new methods. For each patient, three kinds of plans were generated — a conventional non-deliverable plan (plan A), a conventional deliverable plan with post-processing (plan B), and a new deliverable plan (plan C). We performed this study with both realistic (small) and artificial (large) deliverable MU-constraints. Results: With small minimum MU-constraints considered, the new method achieved a slightly better plan quality than the conventional method (D95% CTV normalized to the prescription dose: 0.994[0.992∼0.996] (Plan C) vs 0.992[0.986∼0.996] (Plan B)). With large minimum MU constraints considered, the results show that the new method maintains plan quality while the plan quality from the conventional method is degraded greatly (D95% CTV normalized to the prescription dose: 0.987[0.978∼0.994] (Plan C) vs 0.797[0.641∼1.000] (Plan B)). Meanwhile, the plan robustness of the two methods' results is comparable. (For all 7 patients, CTV DVH band gap at D95% normalized to the prescription dose: 0.015[0.005∼0.043] (Plan C) vs 0.012[0.006∼0.038] (Plan B) with small MU-constraints and 0.019[0.009∼0.039] (Plan C) vs 0.030[0.015∼0.041] (Plan B) with large MU-constraints) Conclusion: Positive correlation has been found between plan quality degeneration and magnitude of

  15. MO-FG-CAMPUS-TeP3-04: Deliverable Robust Optimization in IMPT Using Quadratic Objective Function

    International Nuclear Information System (INIS)

    Shan, J; Liu, W; Bues, M; Schild, S

    2016-01-01

    Purpose: To find and evaluate a way of applying deliverable MU constraints in robust spot intensity optimization in Intensity-Modulated Proton Therapy (IMPT) to prevent plan quality and robustness from degrading due to machine deliverable MU-constraints. Methods: Currently, the influence of the deliverable MU-constraints is retrospectively evaluated by post-processing immediately following optimization. In this study, we propose a new method based on the quasi-Newton-like L-BFGS-B algorithm with which we turn deliverable MU-constraints on and off alternately during optimization. Seven patients with two different machine settings (small and large spot size) were planned with both the conventional and the new methods. For each patient, three kinds of plans were generated — a conventional non-deliverable plan (plan A), a conventional deliverable plan with post-processing (plan B), and a new deliverable plan (plan C). We performed this study with both realistic (small) and artificial (large) deliverable MU-constraints. Results: With small minimum MU-constraints considered, the new method achieved a slightly better plan quality than the conventional method (D95% CTV normalized to the prescription dose: 0.994[0.992∼0.996] (Plan C) vs 0.992[0.986∼0.996] (Plan B)). With large minimum MU constraints considered, the results show that the new method maintains plan quality while the plan quality from the conventional method is degraded greatly (D95% CTV normalized to the prescription dose: 0.987[0.978∼0.994] (Plan C) vs 0.797[0.641∼1.000] (Plan B)). Meanwhile, the plan robustness of the two methods' results is comparable. (For all 7 patients, CTV DVH band gap at D95% normalized to the prescription dose: 0.015[0.005∼0.043] (Plan C) vs 0.012[0.006∼0.038] (Plan B) with small MU-constraints and 0.019[0.009∼0.039] (Plan C) vs 0.030[0.015∼0.041] (Plan B) with large MU-constraints) Conclusion: Positive correlation has been found between plan quality degeneration and magnitude of
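
    A rough sketch of the alternating idea described above, assuming a toy dose-influence matrix and a simple rule that spot weights are either zero or at least the minimum MU; the L-BFGS-B call with a quadratic objective follows the abstract, while everything else is illustrative:

```python
# Alternating scheme (sketch): quadratic dose objective solved with L-BFGS-B,
# then a deliverability pass forcing spots to be zero or above the minimum MU.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)
n_vox, n_spots, mu_min = 200, 50, 2.0
D = np.abs(rng.normal(size=(n_vox, n_spots)))   # toy dose-influence matrix
d_presc = np.ones(n_vox)                        # toy prescription

def objective(w):
    r = D @ w - d_presc
    return r @ r, 2 * D.T @ r                   # value and gradient

w = np.full(n_spots, 0.1)
for _ in range(3):                              # alternate optimization and MU rule
    w = minimize(objective, w, jac=True, method="L-BFGS-B",
                 bounds=[(0.0, None)] * n_spots).x
    w[w < mu_min / 2] = 0.0                     # drop tiny spots...
    w[(w > 0) & (w < mu_min)] = mu_min          # ...and raise the rest to mu_min
print(objective(w)[0])
```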

  16. Methods for robustness programming

    NARCIS (Netherlands)

    Olieman, N.J.

    2008-01-01

    Robustness of an object is defined as the probability that an object will have properties as required. Robustness Programming (RP) is a mathematical approach for Robustness estimation and Robustness optimisation. An example in the context of designing a food product, is finding the best composition

  17. The Female Advantage in Object Location Memory is Robust to Verbalizability and Mode of Presentation of Test Stimuli

    Science.gov (United States)

    Lejbak, Lisa; Vrbancic, Mirna; Crossley, Margaret

    2009-01-01

    This study extends Duff and Hampson's [Duff, S., & Hampson, E. (2001). A sex difference on a novel spatial working memory task in humans. "Brain and Cognition, 47," 470-493] finding of a sex-related difference in favor of females for an object location memory task. Twenty female and 20 male undergraduate students performed both manual and…

  18. A Robust Programming Approach to Bi-objective Optimization Model in the Disaster Relief Logistics Response Phase

    Directory of Open Access Journals (Sweden)

    Mohsen Saffarian

    2015-05-01

    Full Text Available Accidents and natural disasters, and the crises arising from them, indicate the importance of integrated planning to reduce their effects. Disaster relief logistics is therefore one of the main activities in disaster management. In this paper, we study the response phase of the disaster management cycle, and a bi-objective model is developed for the relief chain logistics under uncertainty, including uncertainty in traveling time and in the amount of demand in damaged areas. The proposed mathematical model has two objective functions. The first is to minimize the sum of arrival times to the damaged areas multiplied by the amount of demand, and the second is to maximize the minimum ratio of satisfied demand over the total period, in order to ensure fairness in the distribution of goods. In the proposed model, the problem is considered periodically, and in order to solve the mathematical model the global criterion method is used; a case study has been carried out in South Khorasan.
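
    The global criterion scalarization mentioned above can be sketched with two toy objective functions standing in for the weighted arrival-time and fairness objectives; the ideal values are computed first and the L_p distance to the ideal point is then minimized:

```python
# Global criterion scalarization of two objectives (toy stand-ins).
import numpy as np
from scipy.optimize import minimize

f1 = lambda x: (x ** 2).sum()            # e.g. weighted arrival-time proxy
f2 = lambda x: ((x - 3.0) ** 2).sum()    # e.g. unfairness proxy

x0 = np.array([1.0, 2.0, 3.0])
f1_star = minimize(f1, x0).fun           # ideal value of each objective
f2_star = minimize(f2, x0).fun

def global_criterion(x, p=2):
    # L_p distance to the ideal point in objective space.
    return (abs(f1(x) - f1_star) ** p + abs(f2(x) - f2_star) ** p) ** (1.0 / p)

print(minimize(global_criterion, x0).x)  # a compromise solution
```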

  19. Toward a Robust Security Paradigm for Bluetooth Low Energy-Based Smart Objects in the Internet-of-Things

    Science.gov (United States)

    Cha, Shi-Cho; Chen, Jyun-Fu

    2017-01-01

    Bluetooth Low Energy (BLE) has emerged as one of the most promising technologies to enable the Internet-of-Things (IoT) paradigm. In BLE-based IoT applications, e.g., wearables-oriented service applications, the Bluetooth MAC addresses of devices will be swapped for device pairings. The random address technique is adopted to prevent malicious users from tracking the victim’s devices with stationary Bluetooth MAC addresses and accordingly the device privacy can be preserved. However, there exists a tradeoff between privacy and security in the random address technique. That is, when device pairing is launched and one device cannot actually identify another one with addresses, it provides an opportunity for malicious users to break the system security via impersonation attacks. Hence, using random addresses may lead to higher security risks. In this study, we point out the potential risk of using random address technique and then present critical security requirements for BLE-based IoT applications. To fulfill the claimed requirements, we present a privacy-aware mechanism, which is based on elliptic curve cryptography, for secure communication and access-control among BLE-based IoT objects. Moreover, to ensure the security of smartphone application associated with BLE-based IoT objects, we construct a Smart Contract-based Investigation Report Management framework (SCIRM) which enables smartphone application users to obtain security inspection reports of BLE-based applications of interest with smart contracts. PMID:29036900

  20. Toward a Robust Security Paradigm for Bluetooth Low Energy-Based Smart Objects in the Internet-of-Things

    Directory of Open Access Journals (Sweden)

    Shi-Cho Cha

    2017-10-01

    Full Text Available Bluetooth Low Energy (BLE) has emerged as one of the most promising technologies to enable the Internet-of-Things (IoT) paradigm. In BLE-based IoT applications, e.g., wearables-oriented service applications, the Bluetooth MAC addresses of devices will be swapped for device pairings. The random address technique is adopted to prevent malicious users from tracking the victim’s devices with stationary Bluetooth MAC addresses and accordingly the device privacy can be preserved. However, there exists a tradeoff between privacy and security in the random address technique. That is, when device pairing is launched and one device cannot actually identify another one with addresses, it provides an opportunity for malicious users to break the system security via impersonation attacks. Hence, using random addresses may lead to higher security risks. In this study, we point out the potential risk of using random address technique and then present critical security requirements for BLE-based IoT applications. To fulfill the claimed requirements, we present a privacy-aware mechanism, which is based on elliptic curve cryptography, for secure communication and access-control among BLE-based IoT objects. Moreover, to ensure the security of smartphone application associated with BLE-based IoT objects, we construct a Smart Contract-based Investigation Report Management framework (SCIRM) which enables smartphone application users to obtain security inspection reports of BLE-based applications of interest with smart contracts.

  1. Toward a Robust Security Paradigm for Bluetooth Low Energy-Based Smart Objects in the Internet-of-Things.

    Science.gov (United States)

    Cha, Shi-Cho; Yeh, Kuo-Hui; Chen, Jyun-Fu

    2017-10-14

    Bluetooth Low Energy (BLE) has emerged as one of the most promising technologies to enable the Internet-of-Things (IoT) paradigm. In BLE-based IoT applications, e.g., wearables-oriented service applications, the Bluetooth MAC addresses of devices will be swapped for device pairings. The random address technique is adopted to prevent malicious users from tracking the victim's devices with stationary Bluetooth MAC addresses and accordingly the device privacy can be preserved. However, there exists a tradeoff between privacy and security in the random address technique. That is, when device pairing is launched and one device cannot actually identify another one with addresses, it provides an opportunity for malicious users to break the system security via impersonation attacks. Hence, using random addresses may lead to higher security risks. In this study, we point out the potential risk of using random address technique and then present critical security requirements for BLE-based IoT applications. To fulfill the claimed requirements, we present a privacy-aware mechanism, which is based on elliptic curve cryptography, for secure communication and access-control among BLE-based IoT objects. Moreover, to ensure the security of smartphone application associated with BLE-based IoT objects, we construct a Smart Contract-based Investigation Report Management framework (SCIRM) which enables smartphone application users to obtain security inspection reports of BLE-based applications of interest with smart contracts.
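
    A generic elliptic-curve Diffie-Hellman key agreement with the `cryptography` package, shown below, illustrates the kind of ECC primitive such a privacy-aware BLE mechanism can build on; the curve, the HKDF parameters and the peer names are illustrative, and this is not the authors' protocol or the BLE pairing procedure itself:

```python
# Generic ECDH key agreement with HKDF-derived session keys (illustrative only).
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

device_key = ec.generate_private_key(ec.SECP256R1())    # ephemeral key pair per peer
gateway_key = ec.generate_private_key(ec.SECP256R1())

shared_dev = device_key.exchange(ec.ECDH(), gateway_key.public_key())
shared_gw = gateway_key.exchange(ec.ECDH(), device_key.public_key())

def session_key(shared):
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"ble-iot-session").derive(shared)

assert session_key(shared_dev) == session_key(shared_gw)
print(session_key(shared_dev).hex())
```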

  2. A robust object-based shadow detection method for cloud-free high resolution satellite images over urban areas and water bodies

    Science.gov (United States)

    Tatar, Nurollah; Saadatseresht, Mohammad; Arefi, Hossein; Hadavand, Ahmad

    2018-06-01

    Unwanted contrast in high resolution satellite images, such as shadow areas, directly affects the result of further processing of urban remote sensing images. Detecting and finding the precise position of shadows is critical in different remote sensing processing chains such as change detection, image classification and digital elevation model generation from stereo images. The spectral similarity between shadow areas, water bodies, and some dark asphalt roads makes the development of robust shadow detection algorithms challenging. In addition, most of the existing methods work at the pixel level and neglect the contextual information contained in neighboring pixels. In this paper, a new object-based shadow detection framework is introduced. In the proposed method, a pixel-level shadow mask is built by extending established thresholding methods with a new C4 index which makes it possible to resolve the ambiguity between shadows and water bodies. The pixel-based results are then further processed in an object-based majority analysis to detect the final shadow objects. Four different high resolution satellite images are used to validate this new approach. The results show the superiority of the proposed method over some state-of-the-art shadow detection methods, with an average F-measure of 96%.
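
    The pixel-then-object pattern described above can be sketched as follows, assuming an arbitrary darkness index in place of the C4 index and a ready-made segmentation in place of the paper's own; Otsu thresholding gives the pixel-level mask and a per-segment majority vote gives the object-level result:

```python
# Pixel-level thresholding followed by object-based majority analysis (sketch).
import numpy as np
from scipy import ndimage
from skimage.filters import threshold_otsu

def object_based_shadow_mask(image, segments):
    darkness = 1.0 - image.mean(axis=2)                # placeholder index (not C4)
    pixel_mask = darkness > threshold_otsu(darkness)   # pixel-level shadow mask
    labels = np.unique(segments)
    frac = np.asarray(ndimage.mean(pixel_mask.astype(float),
                                   labels=segments, index=labels))
    shadow_segments = labels[frac > 0.5]               # majority vote per segment
    return np.isin(segments, shadow_segments)

rng = np.random.default_rng(5)
img = rng.random((60, 80, 3))
img[20:40, 30:60] *= 0.2                               # synthetic dark "shadow" block
segs = (np.arange(60)[:, None] // 10) * 8 + np.arange(80)[None, :] // 10
print(object_based_shadow_mask(img, segs).sum())
```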

  3. Detection of Stress Levels from Biosignals Measured in Virtual Reality Environments Using a Kernel-Based Extreme Learning Machine.

    Science.gov (United States)

    Cho, Dongrae; Ham, Jinsil; Oh, Jooyoung; Park, Jeanho; Kim, Sayup; Lee, Nak-Kyu; Lee, Boreom

    2017-10-24

    Virtual reality (VR) is a computer technique that creates an artificial environment composed of realistic images, sounds, and other sensations. Many researchers have used VR devices to generate various stimuli, and have utilized them to perform experiments or to provide treatment. In this study, the participants performed mental tasks using a VR device while physiological signals were measured: a photoplethysmogram (PPG), electrodermal activity (EDA), and skin temperature (SKT). In general, stress is an important factor that can influence the autonomic nervous system (ANS). Heart-rate variability (HRV) is known to be related to ANS activity, so we used an HRV derived from the PPG peak interval. In addition, the peak characteristics of the skin conductance (SC) from EDA and SKT variation can also reflect ANS activity; we utilized them as well. Then, we applied a kernel-based extreme learning machine (K-ELM) to correctly classify the stress levels induced by the VR task, reflecting five different stress situations: baseline, mild stress, moderate stress, severe stress, and recovery. Twelve healthy subjects voluntarily participated in the study. Three physiological signals were measured in the stress environment generated by the VR device. As a result, the average classification accuracy was over 95% using K-ELM and the integrated feature (IT = HRV + SC + SKT). In addition, the proposed algorithm can be embedded in a microcontroller chip, since the K-ELM algorithm has a very short computation time. Therefore, a compact wearable device classifying stress levels using physiological signals can be developed.
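
    A kernel extreme learning machine in its standard closed form (output weights β = (I/C + Ω)⁻¹T) can be sketched as below; the random features stand in for the HRV/SC/SKT feature vector and the hyperparameters are illustrative, not the values used in the study:

```python
# Kernel ELM in closed form: beta = (I/C + Omega)^-1 T, f(x) = K(x, X) beta.
import numpy as np

def rbf(A, B, gamma=0.5):
    d2 = ((A[:, None] - B[None, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

class KernelELM:
    def __init__(self, C=10.0, gamma=0.5):
        self.C, self.gamma = C, gamma
    def fit(self, X, y):
        self.X = X
        T = np.eye(int(y.max()) + 1)[y]                 # one-hot targets
        omega = rbf(X, X, self.gamma)
        self.beta = np.linalg.solve(np.eye(len(X)) / self.C + omega, T)
        return self
    def decision(self, Xnew):
        return rbf(Xnew, self.X, self.gamma) @ self.beta

rng = np.random.default_rng(6)
X = rng.normal(size=(100, 6))                           # toy HRV/SC/SKT-style features
y = rng.integers(0, 5, size=100)                        # five stress levels
model = KernelELM().fit(X, y)
print(model.decision(X[:3]).argmax(axis=1))             # predicted stress levels
```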

  4. Kernel-based Joint Feature Selection and Max-Margin Classification for Early Diagnosis of Parkinson’s Disease

    Science.gov (United States)

    Adeli, Ehsan; Wu, Guorong; Saghafi, Behrouz; An, Le; Shi, Feng; Shen, Dinggang

    2017-01-01

    Feature selection methods usually select the most compact and relevant set of features based on their contribution to a linear regression model. Thus, these features might not be the best for a non-linear classifier. This is especially crucial for tasks in which the performance is heavily dependent on the feature selection techniques, such as the diagnosis of neurodegenerative diseases. Parkinson’s disease (PD) is one of the most common neurodegenerative disorders; it progresses slowly while affecting the quality of life dramatically. In this paper, we use data acquired from multi-modal neuroimaging to diagnose PD by investigating the brain regions known to be affected at the early stages. We propose a joint kernel-based feature selection and classification framework. Unlike conventional feature selection techniques that select features based on their performance in the original input feature space, we select features that best benefit the classification scheme in the kernel space. We further propose kernel functions, specifically designed for our non-negative feature types. We use MRI and SPECT data of 538 subjects from the PPMI database, and obtain a diagnosis accuracy of 97.5%, which outperforms all baseline and state-of-the-art methods.

  5. Kernel-based Joint Feature Selection and Max-Margin Classification for Early Diagnosis of Parkinson’s Disease

    Science.gov (United States)

    Adeli, Ehsan; Wu, Guorong; Saghafi, Behrouz; An, Le; Shi, Feng; Shen, Dinggang

    2017-01-01

    Feature selection methods usually select the most compact and relevant set of features based on their contribution to a linear regression model. Thus, these features might not be the best for a non-linear classifier. This is especially crucial for tasks in which the performance is heavily dependent on the feature selection techniques, such as the diagnosis of neurodegenerative diseases. Parkinson’s disease (PD) is one of the most common neurodegenerative disorders; it progresses slowly while affecting the quality of life dramatically. In this paper, we use data acquired from multi-modal neuroimaging to diagnose PD by investigating the brain regions known to be affected at the early stages. We propose a joint kernel-based feature selection and classification framework. Unlike conventional feature selection techniques that select features based on their performance in the original input feature space, we select features that best benefit the classification scheme in the kernel space. We further propose kernel functions, specifically designed for our non-negative feature types. We use MRI and SPECT data of 538 subjects from the PPMI database, and obtain a diagnosis accuracy of 97.5%, which outperforms all baseline and state-of-the-art methods. PMID:28120883

  6. A Spectral-Texture Kernel-Based Classification Method for Hyperspectral Images

    Directory of Open Access Journals (Sweden)

    Yi Wang

    2016-11-01

    Full Text Available Classification of hyperspectral images always suffers from high dimensionality and very limited labeled samples. Recently, spectral-spatial classification has attracted considerable attention and can achieve higher classification accuracy and smoother classification maps. In this paper, a novel spectral-spatial classification method for hyperspectral images using kernel methods is investigated. For a given hyperspectral image, the principal component analysis (PCA) transform is first performed. Then, the first principal component of the input image is segmented into non-overlapping homogeneous regions by using the entropy rate superpixel (ERS) algorithm. Next, the local spectral histogram model is applied to each homogeneous region to obtain the corresponding texture features. Because this step is performed within each homogeneous region, instead of within a fixed-size image window, the obtained local texture features in the image are more accurate, which can effectively benefit the improvement of classification accuracy. In the following step, a contextual spectral-texture kernel is constructed by combining the spectral information in the image and the extracted texture information using the linearity property of kernel methods. Finally, the classification map is obtained by the support vector machine (SVM) classifier using the proposed spectral-texture kernel. Experiments on two benchmark airborne hyperspectral datasets demonstrate that our method can effectively improve classification accuracies, even though only a very limited training sample is available. Specifically, our method achieves an overall accuracy that is 8.26% to 15.1% higher than that of the traditional SVM classifier. The performance of our method was further compared to several state-of-the-art classification methods for hyperspectral images using objective quantitative measures and a visual qualitative evaluation.
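
    The composite-kernel construction exploited above (a convex combination of valid kernels is again a valid kernel) can be sketched with a precomputed Gram matrix fed to an SVM; the feature matrices, the RBF kernels and the weight μ are placeholders for the paper's spectral and texture descriptors:

```python
# Weighted sum of a spectral and a texture kernel fed to an SVM as precomputed Gram.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

rng = np.random.default_rng(7)
spec_tr, tex_tr = rng.normal(size=(80, 30)), rng.normal(size=(80, 12))  # toy features
spec_te, tex_te = rng.normal(size=(20, 30)), rng.normal(size=(20, 12))
y_tr = rng.integers(0, 3, size=80)

mu = 0.6                                              # spectral/texture trade-off
K_train = mu * rbf_kernel(spec_tr) + (1 - mu) * rbf_kernel(tex_tr)
K_test = mu * rbf_kernel(spec_te, spec_tr) + (1 - mu) * rbf_kernel(tex_te, tex_tr)

clf = SVC(kernel="precomputed").fit(K_train, y_tr)
print(clf.predict(K_test))
```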

  7. Quantification and recognition of parkinsonian gait from monocular video imaging using kernel-based principal component analysis

    Directory of Open Access Journals (Sweden)

    Chen Shih-Wei

    2011-11-01

    Full Text Available Abstract Background The computer-aided identification of specific gait patterns is an important issue in the assessment of Parkinson's disease (PD). In this study, a computer vision-based gait analysis approach is developed to assist the clinical assessment of PD with kernel-based principal component analysis (KPCA). Method Twelve PD patients and twelve healthy adults with no neurological history or motor disorders within the past six months were recruited and separated according to their "Non-PD", "Drug-On", and "Drug-Off" states. The participants were asked to wear light-colored clothing and perform three walking trials through a corridor decorated with a navy curtain at their natural pace. The participants' gait performance during the steady-state walking period was captured by a digital camera for gait analysis. The collected walking image frames were then transformed into binary silhouettes for noise reduction and compression. Using the developed KPCA-based method, the features within the binary silhouettes can be extracted to quantitatively determine the gait cycle time, stride length, walking velocity, and cadence. Results and Discussion The KPCA-based method uses a feature-extraction approach, which was verified to be more effective than traditional image area and principal component analysis (PCA) approaches in classifying "Non-PD" controls and "Drug-Off/On" PD patients. Encouragingly, this method has a high accuracy rate, 80.51%, for recognizing different gaits. Quantitative gait parameters are obtained, and the power spectra of the patients' gaits are analyzed. We show that the slow and irregular actions of PD patients during walking tend to transfer some of the power from the main lobe frequency to a lower frequency band. Our results indicate the feasibility of using gait performance to evaluate the motor function of patients with PD. Conclusion This KPCA-based method requires only a digital camera and a decorated corridor setup
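
    Extracting low-dimensional gait features from flattened binary silhouettes with kernel PCA can be sketched in a few lines; the random silhouettes, kernel and parameter values are purely illustrative:

```python
# Kernel PCA features from flattened binary gait silhouettes (toy data).
import numpy as np
from sklearn.decomposition import KernelPCA

rng = np.random.default_rng(8)
silhouettes = (rng.random((200, 32 * 24)) > 0.7).astype(float)   # toy binary frames

kpca = KernelPCA(n_components=8, kernel="rbf", gamma=1e-3)
gait_features = kpca.fit_transform(silhouettes)                  # per-frame features
print(gait_features.shape)                                       # (200, 8)
```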

  8. SU-F-T-672: A Novel Kernel-Based Dose Engine for KeV Photon Beams

    Energy Technology Data Exchange (ETDEWEB)

    Reinhart, M; Fast, M F; Nill, S; Oelfke, U [The Institute of Cancer Research and The Royal Marsden NHS Foundation Trust, London (United Kingdom)

    2016-06-15

    Purpose: Mimicking state-of-the-art patient radiotherapy with high precision irradiators for small animals allows advanced dose-effect studies and radiobiological investigations. One example is the implementation of pre-clinical IMRT-like irradiations, which requires the development of inverse planning for keV photon beams. As a first step, we present a novel kernel-based dose calculation engine for keV x-rays with explicit consideration of energy and material dependencies. Methods: We follow a superposition-convolution approach adapted to keV x-rays, based on previously published work on micro-beam therapy. In small animal radiotherapy, we assume local energy deposition at the photon interaction point, since the electron ranges in tissue are of the same order of magnitude as the voxel size. This allows us to use photon-only kernel sets generated by MC simulations, which are pre-calculated for six energy windows and ten base materials. We validate our stand-alone dose engine against Geant4 MC simulations for various beam configurations in water, slab phantoms with bone and lung inserts, and on a mouse CT with (0.275 mm)³ voxels. Results: We observe good agreement for all cases. For field sizes of 1 mm² to 1 cm² in water, the depth dose curves agree within 1% (mean), with the largest deviations in the first voxel (4%) and at depths >5 cm (<2.5%). The out-of-field doses at 1 cm depth agree within 8% (mean) for all but the smallest field size. In slab geometries, the mean agreement was within 3%, with maximum deviations of 8% at water-bone interfaces. The γ-index (1mm/1%) passing rate for a single-field mouse irradiation is 71%. Conclusion: The presented dose engine yields an accurate representation of keV-photon doses suitable for inverse treatment planning for IMRT. It has the potential to become a significantly faster yet sufficiently accurate alternative to full MC simulations. Further investigations will focus on energy sampling as well as calculation

  9. A Low-Power Wireless Image Sensor Node with Noise-Robust Moving Object Detection and a Region-of-Interest Based Rate Controller

    Science.gov (United States)

    2017-03-01

    … from both environment and hardware further reduces the transmission energy with negligible computation and memory overhead. The rate controller … Keywords: moving object detection, region-of-interest, rate control. Introduction: In wireless image sensor nodes for moving object surveillance, energy efficiency can be …; reliable moving object detection is required to avoid unnecessary transmission of background scenes [1]. Transmission energy can be further …

  10. Robustness Beamforming Algorithms

    Directory of Open Access Journals (Sweden)

    Sajad Dehghani

    2014-04-01

    Full Text Available Adaptive beamforming methods are known to degrade in the presence of steering vector and covariance matrix uncertainty. In this paper, a new approach to robust adaptive minimum variance distortionless response (MVDR) beamforming is presented that is robust against uncertainties in both the steering vector and the covariance matrix. This method minimizes an optimization problem that contains a quadratic objective function and a quadratic constraint. The optimization problem is nonconvex, but it is converted into a convex optimization problem in this paper. It is solved by the interior-point method, and the optimum weight vector for robust beamforming is obtained.
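
    One of the simplest robustified MVDR variants, diagonal loading of the sample covariance, is sketched below for a toy uniform linear array; it illustrates the distortionless-response idea but is not the convex reformulation proposed in the paper:

```python
# Diagonally loaded MVDR beamformer for a toy 8-element uniform linear array.
import numpy as np

def mvdr_weights(R, a, loading=1e-2):
    # w = (R + eps*I)^-1 a / (a^H (R + eps*I)^-1 a), eps scaled to the covariance level
    Rl = R + loading * np.trace(R).real / len(R) * np.eye(len(R))
    Ri_a = np.linalg.solve(Rl, a)
    return Ri_a / (a.conj() @ Ri_a)

n = 8
steer = lambda deg: np.exp(1j * np.pi * np.arange(n) * np.sin(np.deg2rad(deg)))
rng = np.random.default_rng(9)
snap = 10 * np.outer(steer(40), rng.normal(size=200) + 1j * rng.normal(size=200))
snap = snap + rng.normal(size=(n, 200)) + 1j * rng.normal(size=(n, 200))
R = snap @ snap.conj().T / 200                      # sample covariance matrix

w = mvdr_weights(R, steer(0))                       # look direction: 0 degrees
print(abs(w.conj() @ steer(0)), abs(w.conj() @ steer(40)))   # ~1 at look dir, small at 40
```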

  11. Application of a kernel-based online learning algorithm to the classification of nodule candidates in computer-aided detection of CT lung nodules

    International Nuclear Information System (INIS)

    Matsumoto, S.; Ohno, Y.; Takenaka, D.; Sugimura, K.; Yamagata, H.

    2007-01-01

    Classification of the nodule candidates in computer-aided detection (CAD) of lung nodules in CT images was addressed by constructing a nonlinear discriminant function using a kernel-based learning algorithm called the kernel recursive least-squares (KRLS) algorithm. Using the nodule candidates derived from the processing by a CAD scheme of 100 CT datasets containing 253 non-calcified nodules of 3 mm or larger as determined by the consensus of two thoracic radiologists, the following trial was carried out 100 times: by randomly selecting 50 datasets for training, a nonlinear discriminant function was obtained using the nodule candidates in the training datasets and tested with the remaining candidates; for comparison, a rule-based classification was tested in a similar manner. At about five false positives per case, the nonlinear classification method showed an improved sensitivity of 80% (mean over the 100 trials) compared with 74% for the rule-based method. (orig.)

  12. Robust Scientists

    DEFF Research Database (Denmark)

    Gorm Hansen, Birgitte

    their core interests, 2) developing a self-supply of industry interests by becoming entrepreneurs and thus creating their own compliant industry partner and 3) balancing resources within a larger collective of researchers, thus countering changes in the influx of funding caused by shifts in political...... knowledge", Danish research policy seems to have helped develop politically and economically "robust scientists". Scientific robustness is acquired by way of three strategies: 1) tasting and discriminating between resources so as to avoid funding that erodes academic profiles and pushes scientists away from...

  13. Robust surgery loading

    NARCIS (Netherlands)

    Hans, Elias W.; Wullink, Gerhard; van Houdenhoven, Mark; Kazemier, Geert

    2008-01-01

    We consider the robust surgery loading problem for a hospital’s operating theatre department, which concerns assigning surgeries and sufficient planned slack to operating room days. The objective is to maximize capacity utilization and minimize the risk of overtime, and thus cancelled patients. This

  14. Monitoring the change in colour of meat: A comparison of traditional and kernel-based orthogonal transformations

    DEFF Research Database (Denmark)

    Christiansen, Asger Nyman; Carstensen, Jens Michael; Møller, Flemming

    2012-01-01

    Currently, no objective method exists for estimating the rate of change in the colour of meat. Consequently, the purpose of this work is to develop a procedure capable of monitoring the change in colour of meat over time, environment and ingredients. This provides a useful tool to determine which...... storage environments and ingredients a manufacturer should add to meat to reduce the rate of change in colour. The procedure consists of taking multi-spectral images of a piece of meat as a function of time, clustering the pixels of these images into categories, including several types of meat...

  15. Polytomous diagnosis of ovarian tumors as benign, borderline, primary invasive or metastatic: development and validation of standard and kernel-based risk prediction models

    Directory of Open Access Journals (Sweden)

    Testa Antonia C

    2010-10-01

    Full Text Available Abstract Background Hitherto, risk prediction models for preoperative ultrasound-based diagnosis of ovarian tumors were dichotomous (benign versus malignant). We develop and validate polytomous models (models that predict more than two events) to diagnose ovarian tumors as benign, borderline, primary invasive or metastatic invasive. The main focus is on how different types of models perform and compare. Methods A multi-center dataset containing 1066 women was used for model development and internal validation, whilst another multi-center dataset of 1938 women was used for temporal and external validation. Models were based on standard logistic regression and on penalized kernel-based algorithms (least squares support vector machines and kernel logistic regression). We used true polytomous models as well as combinations of dichotomous models based on the 'pairwise coupling' technique to produce polytomous risk estimates. Careful variable selection was performed, based largely on cross-validated c-index estimates. Model performance was assessed with the dichotomous c-index (i.e. the area under the ROC curve) and a polytomous extension, and with calibration graphs. Results For all models, between 9 and 11 predictors were selected. Internal validation was successful with polytomous c-indexes between 0.64 and 0.69. For the best model, dichotomous c-indexes were between 0.73 (primary invasive vs metastatic) and 0.96 (borderline vs metastatic). On temporal and external validation, overall discrimination performance was good with polytomous c-indexes between 0.57 and 0.64. However, discrimination between primary and metastatic invasive tumors decreased to near random levels. Standard logistic regression performed well in comparison with advanced algorithms, and combining dichotomous models performed well in comparison with true polytomous models. The best model was a combination of dichotomous logistic regression models. This model is available online

  16. Non-linear modeling of 1H NMR metabonomic data using kernel-based orthogonal projections to latent structures optimized by simulated annealing

    International Nuclear Information System (INIS)

    Fonville, Judith M.; Bylesjoe, Max; Coen, Muireann; Nicholson, Jeremy K.; Holmes, Elaine; Lindon, John C.; Rantalainen, Mattias

    2011-01-01

    Highlights: → Non-linear modeling of metabonomic data using K-OPLS. → automated optimization of the kernel parameter by simulated annealing. → K-OPLS provides improved prediction performance for exemplar spectral data sets. → software implementation available for R and Matlab under GPL v2 license. - Abstract: Linear multivariate projection methods are frequently applied for predictive modeling of spectroscopic data in metabonomic studies. The OPLS method is a commonly used computational procedure for characterizing spectral metabonomic data, largely due to its favorable model interpretation properties providing separate descriptions of predictive variation and response-orthogonal structured noise. However, when the relationship between descriptor variables and the response is non-linear, conventional linear models will perform sub-optimally. In this study we have evaluated to what extent a non-linear model, kernel-based orthogonal projections to latent structures (K-OPLS), can provide enhanced predictive performance compared to the linear OPLS model. Just like its linear counterpart, K-OPLS provides separate model components for predictive variation and response-orthogonal structured noise. The improved model interpretation by this separate modeling is a property unique to K-OPLS in comparison to other kernel-based models. Simulated annealing (SA) was used for effective and automated optimization of the kernel-function parameter in K-OPLS (SA-K-OPLS). Our results reveal that the non-linear K-OPLS model provides improved prediction performance in three separate metabonomic data sets compared to the linear OPLS model. We also demonstrate how response-orthogonal K-OPLS components provide valuable biological interpretation of model and data. The metabonomic data sets were acquired using proton Nuclear Magnetic Resonance (NMR) spectroscopy, and include a study of the liver toxin galactosamine, a study of the nephrotoxin mercuric chloride and a study of

  17. Analysis of the performance, emission and combustion characteristics of a turbocharged diesel engine fuelled with Jatropha curcas biodiesel-diesel blends using kernel-based extreme learning machine.

    Science.gov (United States)

    Silitonga, Arridina Susan; Hassan, Masjuki Haji; Ong, Hwai Chyuan; Kusumo, Fitranto

    2017-11-01

    The purpose of this study is to investigate the performance, emission and combustion characteristics of a four-cylinder common-rail turbocharged diesel engine fuelled with Jatropha curcas biodiesel-diesel blends. A kernel-based extreme learning machine (KELM) model is developed in this study using MATLAB software in order to predict the performance, combustion and emission characteristics of the engine. To acquire the data for training and testing the KELM model, the engine speed was selected as the input parameter, whereas the performance, exhaust emissions and combustion characteristics were chosen as the output parameters of the KELM model. The performance, emissions and combustion characteristics predicted by the KELM model were validated by comparing the predicted data with the experimental data. The results show that the coefficient of determination of the parameters is within a range of 0.9805-0.9991 for both the KELM model and the experimental data. The mean absolute percentage error is within a range of 0.1259-2.3838. This study shows that KELM modelling is a useful technique in biodiesel production since it enables scientists and researchers to predict the performance, exhaust emissions and combustion characteristics of internal combustion engines with high accuracy.

  18. An extensive analysis of disease-gene associations using network integration and fast kernel-based gene prioritization methods

    Science.gov (United States)

    Valentini, Giorgio; Paccanaro, Alberto; Caniza, Horacio; Romero, Alfonso E.; Re, Matteo

    2014-01-01

    Objective In the context of “network medicine”, gene prioritization methods represent one of the main tools to discover candidate disease genes by exploiting the large amount of data covering different types of functional relationships between genes. Several works proposed to integrate multiple sources of data to improve disease gene prioritization, but to our knowledge no systematic studies focused on the quantitative evaluation of the impact of network integration on gene prioritization. In this paper, we aim at providing an extensive analysis of gene-disease associations not limited to genetic disorders, and a systematic comparison of different network integration methods for gene prioritization. Materials and methods We collected nine different functional networks representing different functional relationships between genes, and we combined them through both unweighted and weighted network integration methods. We then prioritized genes with respect to each of the considered 708 medical subject headings (MeSH) diseases by applying classical guilt-by-association, random walk and random walk with restart algorithms, and the recently proposed kernelized score functions. Results The results obtained with classical random walk algorithms and the best single network achieved an average area under the curve (AUC) across the 708 MeSH diseases of about 0.82, while kernelized score functions and network integration boosted the average AUC to about 0.89. Weighted integration, by exploiting the different “informativeness” embedded in different functional networks, outperforms unweighted integration at 0.01 significance level, according to the Wilcoxon signed rank sum test. For each MeSH disease we provide the top-ranked unannotated candidate genes, available for further bio-medical investigation. Conclusions Network integration is necessary to boost the performances of gene prioritization methods. Moreover the methods based on kernelized score functions can further
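
    The random walk with restart used as one of the baselines above can be sketched on a toy functional network; the restart probability, the seed genes and the adjacency matrix are illustrative:

```python
# Random walk with restart on a functional gene network (toy adjacency matrix).
import numpy as np

def random_walk_with_restart(W, seeds, restart=0.7, tol=1e-9, max_iter=1000):
    P = W / W.sum(axis=0, keepdims=True)       # column-normalized transitions
    p0 = np.zeros(W.shape[0]); p0[seeds] = 1.0 / len(seeds)
    p = p0.copy()
    for _ in range(max_iter):
        p_next = (1 - restart) * P @ p + restart * p0
        if np.abs(p_next - p).sum() < tol:
            break
        p = p_next
    return p                                    # scores used to rank candidate genes

rng = np.random.default_rng(10)
A = rng.random((50, 50)); W = (A + A.T) / 2     # toy weighted functional network
scores = random_walk_with_restart(W, seeds=[0, 3, 7])
print(np.argsort(scores)[::-1][:5])             # top-ranked candidates
```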

  19. A particle swarm optimized kernel-based clustering method for crop mapping from multi-temporal polarimetric L-band SAR observations

    Science.gov (United States)

    Tamiminia, Haifa; Homayouni, Saeid; McNairn, Heather; Safari, Abdoreza

    2017-06-01

    Polarimetric Synthetic Aperture Radar (PolSAR) data, thanks to their specific characteristics such as high resolution and weather and daylight independence, have become a valuable source of information for environment monitoring and management. The discrimination capability of observations acquired by these sensors can be used for land cover classification and mapping. The aim of this paper is to propose an optimized kernel-based C-means clustering algorithm for agricultural crop mapping from multi-temporal PolSAR data. Firstly, several polarimetric features are extracted from preprocessed data. These features are the linear polarization intensities and several statistical and physically based decompositions such as the Cloude-Pottier, Freeman-Durden and Yamaguchi techniques. Then, the kernelized versions of the hard and fuzzy C-means clustering algorithms are applied to these polarimetric features in order to identify crop types. Unlike conventional partitioning clustering algorithms, the kernel function simplifies non-spherical and non-linearly separable data structures so that they can be clustered easily. In addition, in order to enhance the results, the Particle Swarm Optimization (PSO) algorithm is used to tune the kernel parameters and cluster centers and to optimize feature selection. The efficiency of this method was evaluated by using multi-temporal UAVSAR L-band images acquired over an agricultural area near Winnipeg, Manitoba, Canada, during June and July 2012. The results demonstrate more accurate crop maps using the proposed method when compared to the classical approaches (e.g. a 12% improvement in general). In addition, when the optimization technique is used, greater improvement is observed in crop classification, e.g. 5% overall. Furthermore, a strong relationship is observed between the Freeman-Durden volume scattering component, which is related to canopy structure, and the phenological growth stages.
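
    The kernel trick behind such clustering can be sketched with a hard kernel C-means step, where feature-space distances to cluster means are computed purely from the kernel matrix; the fuzzy memberships and the PSO tuning of the paper are omitted and the toy features are synthetic:

```python
# Hard kernel C-means: feature-space distances computed only from the kernel matrix.
import numpy as np

def kernel_cmeans(K, n_clusters, n_iter=30, seed=0):
    n = K.shape[0]
    labels = np.random.default_rng(seed).integers(0, n_clusters, size=n)
    diag = np.diag(K)
    for _ in range(n_iter):
        dist = np.full((n, n_clusters), np.inf)
        for c in range(n_clusters):
            idx = np.flatnonzero(labels == c)
            if idx.size == 0:
                continue
            # ||phi(x_i) - m_c||^2 = K_ii - 2*mean_j K_ij + mean_jl K_jl over cluster c
            dist[:, c] = diag - 2 * K[:, idx].mean(1) + K[np.ix_(idx, idx)].mean()
        new_labels = dist.argmin(1)
        if np.array_equal(new_labels, labels):
            break
        labels = new_labels
    return labels

rng = np.random.default_rng(11)
X = np.vstack([rng.normal(0, 1, (60, 6)), rng.normal(3, 1, (60, 6))])  # toy pixels
d2 = ((X[:, None] - X[None, :]) ** 2).sum(-1)
K = np.exp(-d2 / d2.mean())
print(np.bincount(kernel_cmeans(K, n_clusters=2)))
```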

  20. Robust snapshot interferometric spectropolarimetry.

    Science.gov (United States)

    Kim, Daesuk; Seo, Yoonho; Yoon, Yonghee; Dembele, Vamara; Yoon, Jae Woong; Lee, Kyu Jin; Magnusson, Robert

    2016-05-15

    This Letter describes a Stokes vector measurement method based on a snapshot interferometric common-path spectropolarimeter. The proposed scheme, which employs an interferometric polarization-modulation module, can extract the spectral polarimetric parameters Ψ(k) and Δ(k) of a transmissive anisotropic object, from which an accurate Stokes vector can be calculated in the spectral domain. It is inherently robust to 3D pose variation of the object, since it is designed so that the measured object can be placed outside of the interferometric module. Experiments are conducted to verify the feasibility of the proposed system. The proposed snapshot scheme enables us to extract the spectral Stokes vector of a transmissive anisotropic object within tens of milliseconds with high accuracy.

  1. Robust portfolio selection under norm uncertainty

    Directory of Open Access Journals (Sweden)

    Lei Wang

    2016-06-01

    Full Text Available Abstract In this paper, we consider the robust portfolio selection problem which has a data uncertainty described by the (p,w)-norm in the objective function. We show that the robust formulation of this problem is equivalent to a linear optimization problem. Moreover, we present some numerical results concerning our robust portfolio selection problem.

  2. Intra-individual gait patterns across different time-scales as revealed by means of a supervised learning model using kernel-based discriminant regression.

    Directory of Open Access Journals (Sweden)

    Fabian Horst

    Full Text Available Traditionally, gait analysis has been centered on the idea of average behavior and normality. On one hand, clinical diagnoses and therapeutic interventions typically assume that average gait patterns remain constant over time. On the other hand, it is well known that all our movements are accompanied by a certain amount of variability, which does not allow us to make two identical steps. The purpose of this study was to examine changes in the intra-individual gait patterns across different time-scales (i.e., tens-of-mins, tens-of-hours). Nine healthy subjects performed 15 gait trials at a self-selected speed on 6 sessions within one day (duration between two subsequent sessions from 10 to 90 mins). For each trial, time-continuous ground reaction forces and lower body joint angles were measured. A supervised learning model using a kernel-based discriminant regression was applied for classifying sessions within individual gait patterns. Discernable characteristics of intra-individual gait patterns could be distinguished between repeated sessions by classification rates of 67.8 ± 8.8% and 86.3 ± 7.9% for the six-session-classification of ground reaction forces and lower body joint angles, respectively. Furthermore, the one-on-one-classification showed that increasing classification rates go along with increasing time durations between two sessions and indicate that changes of gait patterns appear at different time-scales. Discernable characteristics between repeated sessions indicate continuous intrinsic changes in intra-individual gait patterns and suggest a predominant role of deterministic processes in human motor control and learning. Natural changes of gait patterns without any externally induced injury or intervention may reflect continuous adaptations of the motor system over several time-scales. Accordingly, the modelling of walking by means of average gait patterns that are assumed to be near constant over time needs to be reconsidered in the

  3. Intra-individual gait patterns across different time-scales as revealed by means of a supervised learning model using kernel-based discriminant regression.

    Science.gov (United States)

    Horst, Fabian; Eekhoff, Alexander; Newell, Karl M; Schöllhorn, Wolfgang I

    2017-01-01

    Traditionally, gait analysis has been centered on the idea of average behavior and normality. On one hand, clinical diagnoses and therapeutic interventions typically assume that average gait patterns remain constant over time. On the other hand, it is well known that all our movements are accompanied by a certain amount of variability, which does not allow us to make two identical steps. The purpose of this study was to examine changes in the intra-individual gait patterns across different time-scales (i.e., tens-of-mins, tens-of-hours). Nine healthy subjects performed 15 gait trials at a self-selected speed on 6 sessions within one day (duration between two subsequent sessions from 10 to 90 mins). For each trial, time-continuous ground reaction forces and lower body joint angles were measured. A supervised learning model using a kernel-based discriminant regression was applied for classifying sessions within individual gait patterns. Discernable characteristics of intra-individual gait patterns could be distinguished between repeated sessions by classification rates of 67.8 ± 8.8% and 86.3 ± 7.9% for the six-session-classification of ground reaction forces and lower body joint angles, respectively. Furthermore, the one-on-one-classification showed that increasing classification rates go along with increasing time durations between two sessions and indicate that changes of gait patterns appear at different time-scales. Discernable characteristics between repeated sessions indicate continuous intrinsic changes in intra-individual gait patterns and suggest a predominant role of deterministic processes in human motor control and learning. Natural changes of gait patterns without any externally induced injury or intervention may reflect continuous adaptations of the motor system over several time-scales. Accordingly, the modelling of walking by means of average gait patterns that are assumed to be near constant over time needs to be reconsidered in the context of
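
    The two records above name a kernel-based discriminant regression but give no algorithmic detail. A minimal stand-in in the same spirit is kernel ridge regression onto one-hot session labels, sketched below on made-up data; the RBF kernel, its width, the regularization strength and the train/test split are assumptions, not the authors' settings.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1e-3):
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * d2)

def fit_kernel_ridge_classifier(X_train, y_train, n_classes, lam=1e-2, gamma=1e-3):
    """Kernel ridge regression onto one-hot labels; prediction = argmax output."""
    Y = np.eye(n_classes)[y_train]                       # one-hot targets
    K = rbf_kernel(X_train, X_train, gamma)
    alpha = np.linalg.solve(K + lam * np.eye(len(K)), Y)
    def predict(X_test):
        return (rbf_kernel(X_test, X_train, gamma) @ alpha).argmax(axis=1)
    return predict

# Toy usage: 6 sessions x 15 trials, each trial a flattened time-continuous signal.
rng = np.random.default_rng(0)
X = rng.normal(size=(90, 300)) + np.repeat(np.arange(6), 15)[:, None] * 0.1
y = np.repeat(np.arange(6), 15)                          # session label per trial
test = rng.choice(90, 18, replace=False)
train = np.setdiff1d(np.arange(90), test)
predict = fit_kernel_ridge_classifier(X[train], y[train], n_classes=6)
accuracy = (predict(X[test]) == y[test]).mean()
```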

  4. Robustness in laying hens

    NARCIS (Netherlands)

    Star, L.

    2008-01-01

    The aim of the project ‘The genetics of robustness in laying hens’ was to investigate nature and regulation of robustness in laying hens under sub-optimal conditions and the possibility to increase robustness by using animal breeding without loss of production. At the start of the project, a robust

  5. Robust continuous clustering.

    Science.gov (United States)

    Shah, Sohil Atul; Koltun, Vladlen

    2017-09-12

    Clustering is a fundamental procedure in the analysis of scientific data. It is used ubiquitously across the sciences. Despite decades of research, existing clustering algorithms have limited effectiveness in high dimensions and often require tuning parameters for different domains and datasets. We present a clustering algorithm that achieves high accuracy across multiple domains and scales efficiently to high dimensions and large datasets. The presented algorithm optimizes a smooth continuous objective, which is based on robust statistics and allows heavily mixed clusters to be untangled. The continuous nature of the objective also allows clustering to be integrated as a module in end-to-end feature learning pipelines. We demonstrate this by extending the algorithm to perform joint clustering and dimensionality reduction by efficiently optimizing a continuous global objective. The presented approach is evaluated on large datasets of faces, hand-written digits, objects, newswire articles, sensor readings from the Space Shuttle, and protein expression levels. Our method achieves high accuracy across all datasets, outperforming the best prior algorithm by a factor of 3 in average rank.
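
    To make the "smooth continuous objective based on robust statistics" concrete, the toy below runs plain gradient descent on an objective of the general form the record describes, with a Geman-McClure penalty on pairwise terms. It is only a sketch: the actual algorithm, graph construction and parameter schedule in the paper differ, and the k-nearest-neighbour graph, λ, μ, step size and iteration count here are assumptions.

```python
import numpy as np

def robust_continuous_clustering_sketch(X, n_neighbors=5, lam=1.0, mu=1.0,
                                        lr=0.1, n_iter=300):
    """Toy gradient descent on a robust pairwise objective:
        E(U) = 0.5 * sum_i ||x_i - u_i||^2
             + 0.5 * lam * sum_{(p,q) in E} rho(||u_p - u_q||),
    with the Geman-McClure penalty rho(y) = mu * y^2 / (mu + y^2).
    Representatives u_i that collapse onto each other form one cluster."""
    n = len(X)
    # Edge set E: k-nearest-neighbour pairs (a made-up choice of graph).
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    np.fill_diagonal(d2, np.inf)
    knn = np.argsort(d2, axis=1)[:, :n_neighbors]
    edges = {(min(i, int(j)), max(i, int(j))) for i in range(n) for j in knn[i]}

    U = X.astype(float).copy()
    for _ in range(n_iter):
        grad = U - X                                     # gradient of the data term
        for p, q in edges:
            diff = U[p] - U[q]
            y2 = diff @ diff
            w = mu ** 2 / (mu + y2) ** 2                 # = 0.5 * rho'(y) / y
            grad[p] += lam * w * diff
            grad[q] -= lam * w * diff
        U -= lr * grad
    return U

# Toy usage: two well-separated Gaussian blobs in the plane.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, size=(30, 2)), rng.normal(3.0, 0.1, size=(30, 2))])
U = robust_continuous_clustering_sketch(X)               # rows of U collapse into two groups
```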

  6. Robust automated knowledge capture.

    Energy Technology Data Exchange (ETDEWEB)

    Stevens-Adams, Susan Marie; Abbott, Robert G.; Forsythe, James Chris; Trumbo, Michael Christopher Stefan; Haass, Michael Joseph; Hendrickson, Stacey M. Langfitt

    2011-10-01

    This report summarizes research conducted through the Sandia National Laboratories Robust Automated Knowledge Capture Laboratory Directed Research and Development project. The objective of this project was to advance scientific understanding of the influence of individual cognitive attributes on decision making. The project has developed a quantitative model known as RumRunner that has proven effective in predicting the propensity of an individual to shift strategies on the basis of task- and experience-related parameters. Three separate studies are described which have validated the basic RumRunner model. This work provides a basis for better understanding human decision making in high-consequence national security applications, and in particular, the individual characteristics that underlie adaptive thinking.

  7. Perceptual Robust Design

    DEFF Research Database (Denmark)

    Pedersen, Søren Nygaard

    The research presented in this PhD thesis has focused on a perceptual approach to robust design. The results of the research and the original contribution to knowledge is a preliminary framework for understanding, positioning, and applying perceptual robust design. Product quality is a topic...... been presented. Therefore, this study set out to contribute to the understanding and application of perceptual robust design. To achieve this, a state-of-the-art and current practice review was performed. From the review two main research problems were identified. Firstly, a lack of tools...... for perceptual robustness was found to overlap with the optimum for functional robustness and at most approximately 2.2% out of the 14.74% could be ascribed solely to the perceptual robustness optimisation. In conclusion, the thesis have offered a new perspective on robust design by merging robust design...

  8. Robustness of Structural Systems

    DEFF Research Database (Denmark)

    Canisius, T.D.G.; Sørensen, John Dalsgaard; Baker, J.W.

    2007-01-01

    The importance of robustness as a property of structural systems has been recognised following several structural failures, such as that at Ronan Point in 1968, where the consequences were deemed unacceptable relative to the initiating damage. A variety of research efforts in the past decades have...... attempted to quantify aspects of robustness such as redundancy and identify design principles that can improve robustness. This paper outlines the progress of recent work by the Joint Committee on Structural Safety (JCSS) to develop comprehensive guidance on assessing and providing robustness in structural...... systems. Guidance is provided regarding the assessment of robustness in a framework that considers potential hazards to the system, vulnerability of system components, and failure consequences. Several proposed methods for quantifying robustness are reviewed, and guidelines for robust design...

  9. Robust multivariate analysis

    CERN Document Server

    J Olive, David

    2017-01-01

    This text presents methods that are robust to the assumption of a multivariate normal distribution or methods that are robust to certain types of outliers. Instead of using exact theory based on the multivariate normal distribution, the simpler and more applicable large sample theory is given.  The text develops among the first practical robust regression and robust multivariate location and dispersion estimators backed by theory.   The robust techniques  are illustrated for methods such as principal component analysis, canonical correlation analysis, and factor analysis.  A simple way to bootstrap confidence regions is also provided. Much of the research on robust multivariate analysis in this book is being published for the first time. The text is suitable for a first course in Multivariate Statistical Analysis or a first course in Robust Statistics. This graduate text is also useful for people who are familiar with the traditional multivariate topics, but want to know more about handling data sets with...

  10. Robust and Efficient Parametric Face Alignment

    NARCIS (Netherlands)

    Tzimiropoulos, Georgios; Zafeiriou, Stefanos; Pantic, Maja

    2011-01-01

    We propose a correlation-based approach to parametric object alignment particularly suitable for face analysis applications which require efficiency and robustness against occlusions and illumination changes. Our algorithm registers two images by iteratively maximizing their correlation coefficient

  11. Robustness studies on coal gasification process variables

    African Journals Online (AJOL)

    coal before feeding to the gasification process [1]. .... to-control variables will make up the terms in the response surface model for the ... Montgomery (1999) explained that all the Taguchi engineering objectives for a robust ..... software [3].

  12. Transportation Infrastructure Robustness : Joint Engineering and Economic Analysis

    Science.gov (United States)

    2017-11-01

    The objectives of this study are to develop a methodology for assessing the robustness of transportation infrastructure facilities and assess the effect of damage to such facilities on travel demand and the facility users' welfare. The robustness...

  13. Robustness of Structures

    DEFF Research Database (Denmark)

    Faber, Michael Havbro; Vrouwenvelder, A.C.W.M.; Sørensen, John Dalsgaard

    2011-01-01

    In 2005, the Joint Committee on Structural Safety (JCSS) together with Working Commission (WC) 1 of the International Association of Bridge and Structural Engineering (IABSE) organized a workshop on robustness of structures. Two important decisions resulted from this workshop, namely...... ‘COST TU0601: Robustness of Structures’ was initiated in February 2007, aiming to provide a platform for exchanging and promoting research in the area of structural robustness and to provide a basic framework, together with methods, strategies and guidelines enhancing robustness of structures...... the development of a joint European project on structural robustness under the COST (European Cooperation in Science and Technology) programme and the decision to develop a more elaborate document on structural robustness in collaboration between experts from the JCSS and the IABSE. Accordingly, a project titled...

  14. Robust Growth Determinants

    OpenAIRE

    Doppelhofer, Gernot; Weeks, Melvyn

    2011-01-01

    This paper investigates the robustness of determinants of economic growth in the presence of model uncertainty, parameter heterogeneity and outliers. The robust model averaging approach introduced in the paper uses a flexible and parsimonious mixture modeling that allows for fat-tailed errors compared to the normal benchmark case. Applying robust model averaging to growth determinants, the paper finds that eight out of eighteen variables found to be significantly related to economic growth ...

  15. Robust Programming by Example

    OpenAIRE

    Bishop , Matt; Elliott , Chip

    2011-01-01

    Part 2: WISE 7; International audience; Robust programming lies at the heart of the type of coding called “secure programming”. Yet it is rarely taught in academia. More commonly, the focus is on how to avoid creating well-known vulnerabilities. While important, that misses the point: a well-structured, robust program should anticipate where problems might arise and compensate for them. This paper discusses one view of robust programming and gives an example of how it may be taught.

  16. Robust procedures in chemometrics

    DEFF Research Database (Denmark)

    Kotwa, Ewelina

    properties of the analysed data. The broad theoretical background of robust procedures was given as a very useful supplement to the classical methods, and a new tool, based on robust PCA, aiming at identifying Rayleigh and Raman scatters in excitation-emission (EEM) data was developed. The results show...

  17. Object and Objective Lost?

    DEFF Research Database (Denmark)

    Lopdrup-Hjorth, Thomas

    2015-01-01

    This paper explores the erosion and problematization of ‘the organization’ as a demarcated entity. Utilizing Foucault's reflections on ‘state-phobia’ as a source of inspiration, I show how an organization-phobia has gained a hold within Organization Theory (OT). By attending to the history...... of this organization-phobia, the paper argues that OT has become increasingly incapable of speaking about its core object. I show how organizations went from being conceptualized as entities of major importance to becoming theoretically deconstructed and associated with all kinds of ills. Through this history......, organizations as distinct entities have been rendered so problematic that they have gradually come to be removed from the center of OT. The costs of this have been rather significant. Besides undermining the grounds that gave OT intellectual credibility and legitimacy to begin with, the organization-phobia...

  18. On the robustness of Herlihy's hierarchy

    Science.gov (United States)

    Jayanti, Prasad

    1993-01-01

    A wait-free hierarchy maps object types to levels in Z+ ∪ {∞} and has the following property: if a type T is at level N, and T' is an arbitrary type, then there is a wait-free implementation of an object of type T', for N processes, using only registers and objects of type T. The infinite hierarchy defined by Herlihy is an example of a wait-free hierarchy. A wait-free hierarchy is robust if it has the following property: if T is at level N, and S is a finite set of types belonging to levels N - 1 or lower, then there is no wait-free implementation of an object of type T, for N processes, using any number and any combination of objects belonging to the types in S. Robustness implies that there are no clever ways of combining weak shared objects to obtain stronger ones. Contrary to what many researchers believe, we prove that Herlihy's hierarchy is not robust. We then define some natural variants of Herlihy's hierarchy, which are also infinite wait-free hierarchies. With the exception of one, which is still open, these are not robust either. We conclude with the open question of whether non-trivial robust wait-free hierarchies exist.

  19. A Survey on Robustness in Railway Planning

    DEFF Research Database (Denmark)

    Lusby, Richard Martin; Larsen, Jesper; Bull, Simon Henry

    2018-01-01

    Planning problems in passenger railway range from long term strategic decision making to the detailed planning of operations.Operations research methods have played an increasing role in this planning process. However, recently more attention has been given to considerations of robustness...... in the quality of solutions to individual planning problems, and of operations in general. Robustness in general is the capacity for some system to absorb or resist changes. In the context of railway robustness it is often taken to be the capacity for operations to continue at some level when faced...... with a disruption such as delay or failure. This has resulted in more attention given to the inclusion of robustness measures and objectives in individual planning problems, and to the providing of tools to ensure operations continue under disrupted situations. In this paper we survey the literature on robustness...

  20. Robustness Metrics: Consolidating the multiple approaches to quantify Robustness

    DEFF Research Database (Denmark)

    Göhler, Simon Moritz; Eifler, Tobias; Howard, Thomas J.

    2016-01-01

    robustness metrics; 3) Functional expectancy and dispersion robustness metrics; and 4) Probability of conformance robustness metrics. The goal was to give a comprehensive overview of robustness metrics and guidance to scholars and practitioners to understand the different types of robustness metrics...

  1. Robustness of Structures

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    2008-01-01

    This paper describes the background of the robustness requirements implemented in the Danish Code of Practice for Safety of Structures and in the Danish National Annex to the Eurocode 0, see (DS-INF 146, 2003), (DS 409, 2006), (EN 1990 DK NA, 2007) and (Sørensen and Christensen, 2006). More...... frequent use of advanced types of structures with limited redundancy and serious consequences in case of failure combined with increased requirements to efficiency in design and execution followed by increased risk of human errors has made the need of requirements to robustness of new structures essential....... According to Danish design rules robustness shall be documented for all structures in high consequence class. The design procedure to document sufficient robustness consists of: 1) Review of loads and possible failure modes / scenarios and determination of acceptable collapse extent; 2) Review...

  2. Robustness of structures

    DEFF Research Database (Denmark)

    Vrouwenvelder, T.; Sørensen, John Dalsgaard

    2009-01-01

    After the collapse of the World Trade Centre towers in 2001 and a number of collapses of structural systems in the beginning of the century, robustness of structural systems has gained renewed interest. Despite many significant theoretical, methodical and technological advances, structural...... of robustness for structural design such requirements are not substantiated in more detail, nor has the engineering profession been able to agree on an interpretation of robustness which facilitates its quantification. A European COST action TU 601 on ‘Robustness of structures' has started in 2007...... by a group of members of the CSS. This paper describes the ongoing work in this action, with emphasis on the development of a theoretical and risk based quantification and optimization procedure on the one side and a practical pre-normative guideline on the other....

  3. Robust Approaches to Forecasting

    OpenAIRE

    Jennifer Castle; David Hendry; Michael P. Clements

    2014-01-01

    We investigate alternative robust approaches to forecasting, using a new class of robust devices, contrasted with equilibrium correction models. Their forecasting properties are derived facing a range of likely empirical problems at the forecast origin, including measurement errors, impulses, omitted variables, unanticipated location shifts and incorrectly included variables that experience a shift. We derive the resulting forecast biases and error variances, and indicate when the methods ar...

  4. Robustness - theoretical framework

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Rizzuto, Enrico; Faber, Michael H.

    2010-01-01

    More frequent use of advanced types of structures with limited redundancy and serious consequences in case of failure combined with increased requirements to efficiency in design and execution followed by increased risk of human errors has made the need of requirements to robustness of new struct...... of this fact sheet is to describe a theoretical and risk based framework to form the basis for quantification of robustness and for pre-normative guidelines....

  5. Qualitative Robustness in Estimation

    Directory of Open Access Journals (Sweden)

    Mohammed Nasser

    2012-07-01

    Full Text Available Qualitative robustness, influence function, and breakdown point are three main concepts to judge an estimator from the viewpoint of robust estimation. It is important as well as interesting to study relation among them. This article attempts to present the concept of qualitative robustness as forwarded by first proponents and its later development. It illustrates intricacies of qualitative robustness and its relation with consistency, and also tries to remove commonly believed misunderstandings about relation between influence function and qualitative robustness citing some examples from literature and providing a new counter-example. At the end it places a useful finite and a simulated version of qualitative robustness index (QRI). In order to assess the performance of the proposed measures, we have compared fifteen estimators of correlation coefficient using simulated as well as real data sets.

  6. Robust recognition via information theoretic learning

    CERN Document Server

    He, Ran; Yuan, Xiaotong; Wang, Liang

    2014-01-01

    This Springer Brief represents a comprehensive review of information theoretic methods for robust recognition. A variety of information theoretic methods have been proffered in the past decade, in a large variety of computer vision applications; this work brings them together, attempts to impart the theory, optimization and usage of information entropy. The authors resort to a new information theoretic concept, correntropy, as a robust measure and apply it to solve robust face recognition and object recognition problems. For computational efficiency, the brief introduces the additive and multip

  7. Collision risk in white-tailed eagles. Modelling kernel-based collision risk using satellite telemetry data in Smoela wind-power plant

    Energy Technology Data Exchange (ETDEWEB)

    May, Roel; Nygaard, Torgeir; Dahl, Espen Lie; Reitan, Ole; Bevanger, Kjetil

    2011-05-15

    Large soaring birds of prey, such as the white-tailed eagle, are recognized to be perhaps the most vulnerable bird group regarding risk of collisions with turbines in wind-power plants. Their mortalities have called for methods capable of modelling collision risks in connection with the planning of new wind-power developments. The so-called 'Band model' estimates collision risk based on the number of birds flying through the rotor swept zone and the probability of being hit by the passing rotor blades. In the calculations for the expected collision mortality a correction factor for avoidance behaviour is included. The overarching objective of this study was to use satellite telemetry data and recorded mortality to back-calculate the correction factor for white-tailed eagles. The Smoela wind-power plant consists of 68 turbines, over an area of approximately 18 km2. Since autumn 2006 the number of collisions has been recorded on a weekly basis. The analyses were based on satellite telemetry data from 28 white-tailed eagles equipped with backpack transmitters since 2005. The correction factor (i.e. 'avoidance rate') including uncertainty levels used within the Band collision risk model for white-tailed eagles was 99% (94-100%) for spring and 100% for the other seasons. The year-round estimate, irrespective of season, was 98% (95-99%). Although the year-round estimate was similar, the correction factor for spring was higher than the correction factor of 95% derived earlier from vantage point data. The satellite telemetry data may provide an alternative way to provide insight into relative risk among seasons, and help identify periods or areas with increased risk either in a pre- or post construction situation. (Author)
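
    The full Band model involves flight-activity, rotor-geometry and passage-rate terms that the record only alludes to. The toy function below illustrates nothing more than the back-calculation logic described, i.e. a correction ("avoidance") factor inferred from recorded versus model-predicted collisions; the numbers in the example are invented, not Smøla data.

```python
def back_calculated_avoidance(collisions_observed, collisions_predicted_no_avoidance):
    """Correction factor implied by recorded mortality, given the number of
    collisions a Band-type model would predict if birds never avoided the
    turbines. Purely illustrative inputs and output."""
    return 1.0 - collisions_observed / collisions_predicted_no_avoidance

# e.g. 4 recorded collisions vs. 200 model-predicted "no-avoidance" collisions
print(back_calculated_avoidance(4, 200))   # -> 0.98, i.e. a 98% correction factor
```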

  8. Robustness Evaluation of Timber Structures

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Sørensen, John Dalsgaard; čizmar, D.

    2010-01-01

    The present paper outlines results from working group 3 (WG3) in the EU COST Action E55 – ‘Modelling of the performance of timber structures’. The objectives of the project are related to the three main research activities: the identification and modelling of relevant load and environmental...... exposure scenarios, the improvement of knowledge concerning the behaviour of timber structural elements and the development of a generic framework for the assessment of the life-cycle vulnerability and robustness of timber structures....

  9. Robustness in econometrics

    CERN Document Server

    Sriboonchitta, Songsak; Huynh, Van-Nam

    2017-01-01

    This book presents recent research on robustness in econometrics. Robust data processing techniques – i.e., techniques that yield results minimally affected by outliers – and their applications to real-life economic and financial situations are the main focus of this book. The book also discusses applications of more traditional statistical techniques to econometric problems. Econometrics is a branch of economics that uses mathematical (especially statistical) methods to analyze economic systems, to forecast economic and financial dynamics, and to develop strategies for achieving desirable economic performance. In day-by-day data, we often encounter outliers that do not reflect the long-term economic trends, e.g., unexpected and abrupt fluctuations. As such, it is important to develop robust data processing techniques that can accommodate these fluctuations.

  10. Robust Manufacturing Control

    CERN Document Server

    2013-01-01

    This contributed volume collects research papers, presented at the CIRP Sponsored Conference Robust Manufacturing Control: Innovative and Interdisciplinary Approaches for Global Networks (RoMaC 2012, Jacobs University, Bremen, Germany, June 18th-20th 2012). These research papers present the latest developments and new ideas focusing on robust manufacturing control for global networks. Today, Global Production Networks (i.e. the nexus of interconnected material and information flows through which products and services are manufactured, assembled and distributed) are confronted with and expected to adapt to: sudden and unpredictable large-scale changes of important parameters which are occurring more and more frequently, event propagation in networks with high degree of interconnectivity which leads to unforeseen fluctuations, and non-equilibrium states which increasingly characterize daily business. These multi-scale changes deeply influence logistic target achievement and call for robust planning and control ...

  11. Robust plasmonic substrates

    DEFF Research Database (Denmark)

    Kostiučenko, Oksana; Fiutowski, Jacek; Tamulevicius, Tomas

    2014-01-01

    Robustness is a key issue for the applications of plasmonic substrates such as tip-enhanced Raman spectroscopy, surface-enhanced spectroscopies, enhanced optical biosensing, optical and optoelectronic plasmonic nanosensors and others. A novel approach for the fabrication of robust plasmonic...... substrates is presented, which relies on the coverage of gold nanostructures with diamond-like carbon (DLC) thin films of thicknesses 25, 55 and 105 nm. DLC thin films were grown by direct hydrocarbon ion beam deposition. In order to find the optimum balance between optical and mechanical properties...

  12. Robust Self Tuning Controllers

    DEFF Research Database (Denmark)

    Poulsen, Niels Kjølstad

    1985-01-01

    The present thesis concerns robustness properties of adaptive controllers. It is addressed to methods for robustifying self tuning controllers with respect to abrupt changes in the plant parameters. In the thesis an algorithm for estimating abruptly changing parameters is presented. The estimator has several operation modes and a detector for controlling the mode. A special self tuning controller has been developed to regulate a plant with changing time delay.

  13. Robust B+ -Tree-Based Indexing of Moving Objects

    DEFF Research Database (Denmark)

    Jensen, Christian Søndergaard; Tiesyte, Dalia; Tradisauskas, Nerius

    2006-01-01

    Bx-tree is based on the B+-tree and is relatively easy to integrate into an existing DBMS. However, the Bx-tree is sensitive to data skew. This paper proposes a new query processing algorithm for the Bx-tree that fully exploits the available data statistics to reduce the query enlargement...

  14. Occlusion detection via structured sparse learning for robust object tracking

    KAUST Repository

    Zhang, Tianzhu

    2014-01-01

    Sparse representation based methods have recently drawn much attention in visual tracking due to good performance against illumination variation and occlusion. They assume the errors caused by image variations can be modeled as pixel-wise sparse. However, in many practical scenarios, these errors are not truly pixel-wise sparse but rather sparsely distributed in a structured way. In fact, pixels in error constitute contiguous regions within the object’s track. This is the case when significant occlusion occurs. To accommodate for nonsparse occlusion in a given frame, we assume that occlusion detected in previous frames can be propagated to the current one. This propagated information determines which pixels will contribute to the sparse representation of the current track. In other words, pixels that were detected as part of an occlusion in the previous frame will be removed from the target representation process. As such, this paper proposes a novel tracking algorithm that models and detects occlusion through structured sparse learning. We test our tracker on challenging benchmark sequences, such as sports videos, which involve heavy occlusion, drastic illumination changes, and large pose variations. Extensive experimental results show that our proposed tracker consistently outperforms the state-of-the-art trackers.

  15. Robust Optimal Fragmentation and Dispersion of Near-Earth Objects

    Data.gov (United States)

    National Aeronautics and Space Administration — During the past 2 decades, various concepts for mitigating the impact threats from NEOs have been proposed, but many of these concepts were impractical and not...

  16. Robustness of Multiple Objective Decision Analysis Preference Functions

    Science.gov (United States)

    2002-06-01

    Bayesian Decision Theory and Utilitarian Ethics,” American Economic Review Papers and Proceedings, 68: 223-228 (May 1978). Hartsough, Bruce R. “A... (1983). Morrell, Darryl and Eric Driver. “Bayesian Network Implementation of Levi’s Epistemic Utility Decision Theory,” International Journal Of... elicitation efficiency for the decision maker. Subject Terms: Decision Analysis, Utility Theory, Elicitation Error, Operations Research, Decision

  17. Robust Object Tracking with a Hierarchical Ensemble Framework

    Science.gov (United States)

    2016-10-09

    consistency in the target bounding box level while we take this into consideration by employing an adaptive Kalman filter. Therefore our method is more... human videos with occlusions (OCC), deformation (DEF), background clutter (BC), scale variations (SV), fast motion (FM) and illumination variation (IV)... convolutional features for visual tracking,” in Proceedings of the IEEE International Conference on Computer Vision, pp. 3074–3082, 2015.

  18. Occlusion detection via structured sparse learning for robust object tracking

    KAUST Repository

    Zhang, Tianzhu; Ghanem, Bernard; Xu, Changsheng; Ahuja, Narendra

    2014-01-01

    occlusion through structured sparse learning. We test our tracker on challenging benchmark sequences, such as sports videos, which involve heavy occlusion, drastic illumination changes, and large pose variations. Extensive experimental results show that our

  19. Robustness Envelopes of Networks

    NARCIS (Netherlands)

    Trajanovski, S.; Martín-Hernández, J.; Winterbach, W.; Van Mieghem, P.

    2013-01-01

    We study the robustness of networks under node removal, considering random node failure, as well as targeted node attacks based on network centrality measures. Whilst both of these have been studied in the literature, existing approaches tend to study random failure in terms of average-case

  20. Bootstrapping Kernel-Based Semiparametric Estimators

    DEFF Research Database (Denmark)

    Cattaneo, Matias D.; Jansson, Michael

    by accommodating a non-negligible bias. A noteworthy feature of the assumptions under which the result is obtained is that reliance on a commonly employed stochastic equicontinuity condition is avoided. The second main result shows that the bootstrap provides an automatic method of correcting for the bias even...... when it is non-negligible....

  1. Kernel-based tests for joint independence

    DEFF Research Database (Denmark)

    Pfister, Niklas; Bühlmann, Peter; Schölkopf, Bernhard

    2018-01-01

    if the $d$ variables are jointly independent, as long as the kernel is characteristic. Based on an empirical estimate of dHSIC, we define three different non-parametric hypothesis tests: a permutation test, a bootstrap test and a test based on a Gamma approximation. We prove that the permutation test......We investigate the problem of testing whether $d$ random variables, which may or may not be continuous, are jointly (or mutually) independent. Our method builds on ideas of the two variable Hilbert-Schmidt independence criterion (HSIC) but allows for an arbitrary number of variables. We embed...... the $d$-dimensional joint distribution and the product of the marginals into a reproducing kernel Hilbert space and define the $d$-variable Hilbert-Schmidt independence criterion (dHSIC) as the squared distance between the embeddings. In the population case, the value of dHSIC is zero if and only...
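
    As a companion to the record's description of the empirical dHSIC, the sketch below computes a biased V-statistic estimate for a list of variables using Gaussian kernels. The exact estimator form, kernel choice and bandwidth here are recalled/assumed rather than taken from the paper, and none of the permutation, bootstrap or Gamma-approximation tests are implemented.

```python
import numpy as np

def gaussian_gram(X, bandwidth=1.0):
    """Gram matrix of a Gaussian kernel for one variable, samples in rows."""
    d2 = np.sum(X**2, 1)[:, None] + np.sum(X**2, 1)[None, :] - 2 * X @ X.T
    return np.exp(-d2 / (2 * bandwidth**2))

def dhsic(variables, bandwidth=1.0):
    """Biased V-statistic estimate of the d-variable Hilbert-Schmidt
    independence criterion for a list of (n, dim_k) sample arrays."""
    grams = [gaussian_gram(V, bandwidth) for V in variables]
    term1 = np.mean(np.prod(grams, axis=0))                # (1/n^2) sum_ij prod_k K_ij
    term2 = np.prod([K.mean() for K in grams])             # prod_k (1/n^2) sum_ij K_ij
    term3 = np.mean(np.prod([K.mean(axis=1) for K in grams], axis=0))  # (1/n) sum_i prod_k row means
    return term1 + term2 - 2 * term3

# Toy usage: three variables, the third depends on the first two.
rng = np.random.default_rng(0)
x, y = rng.normal(size=(100, 1)), rng.normal(size=(100, 1))
z = x + y + 0.1 * rng.normal(size=(100, 1))
print(dhsic([x, y, z]))        # noticeably larger than for three independent variables
```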

  2. A robust classic.

    Science.gov (United States)

    Kutzner, Florian; Vogel, Tobias; Freytag, Peter; Fiedler, Klaus

    2011-01-01

    In the present research, we argue for the robustness of illusory correlations (ICs, Hamilton & Gifford, 1976) regarding two boundary conditions suggested in previous research. First, we argue that ICs are maintained under extended experience. Using simulations, we derive conflicting predictions. Whereas noise-based accounts predict ICs to be maintained (Fiedler, 2000; Smith, 1991), a prominent account based on discrepancy-reducing feedback learning predicts ICs to disappear (Van Rooy et al., 2003). An experiment involving 320 observations with majority and minority members supports the claim that ICs are maintained. Second, we show that actively using the stereotype to make predictions that are met with reward and punishment does not eliminate the bias. In addition, participants' operant reactions afford a novel online measure of ICs. In sum, our findings highlight the robustness of ICs that can be explained as a result of unbiased but noisy learning.

  3. Robust Airline Schedules

    OpenAIRE

    Eggenberg, Niklaus; Salani, Matteo; Bierlaire, Michel

    2010-01-01

    Due to economic pressure industries, when planning, tend to focus on optimizing the expected profit or the yield. The consequence of highly optimized solutions is an increased sensitivity to uncertainty. This generates additional "operational" costs, incurred by possible modifications of the original plan to be performed when reality does not reflect what was expected in the planning phase. The modern research trend focuses on "robustness" of solutions instead of yield or profit. Although ro...

  4. Robust Geometric Control of a Distillation Column

    DEFF Research Database (Denmark)

    Kymmel, Mogens; Andersen, Henrik Weisberg

    1987-01-01

    A frequency domain method, which makes it possible to adjust multivariable controllers with respect to both nominal performance and robustness, is presented. The basic idea in the approach is that the designer assigns objectives such as steady-state tracking, maximum resonance peaks, bandwidth, m...... is used to examine and improve geometric control of a binary distillation column....

  5. The Crane Robust Control

    Directory of Open Access Journals (Sweden)

    Marek Hicar

    2004-01-01

    Full Text Available The article is about a control design for the complete structure of the crane: crab, bridge and crane uplift. The most important unknown parameters for the simulations are the burden weight and the length of the hanging rope. Robust control is used for the crab and bridge to ensure adaptivity to burden weight and rope length. Robust control is designed for current control of the crab and bridge, for which it is necessary to know the range of the unknown parameters. The whole robust range is split into subintervals, and after correct identification of the unknown parameters the most suitable robust controllers are chosen. The most important condition for the crab and bridge motion is to avoid swinging of the burden in the final position. The crab and bridge drives are realised by asynchronous motors fed from frequency converters. The crane uplift uses a burden weight observer in combination with the uplift, crab and bridge drives, with cooperation of their parameters: burden weight, rope length, and crab and bridge position. The controllers are designed by the state control method, preferably with a disturbance observer which identifies the burden weight as a disturbance. The system works in both modes, at empty hook as well as at maximum load: burden uplifting and dropping down.

  6. Robust motion estimation using connected operators

    OpenAIRE

    Salembier Clairon, Philippe Jean; Sanson, H

    1997-01-01

    This paper discusses the use of connected operators for robust motion estimation. The proposed strategy involves a motion estimation step extracting the dominant motion and a filtering step relying on connected operators that remove objects that do not follow the dominant motion. These two steps are iterated in order to obtain an accurate motion estimation and a precise definition of the objects following this motion. This strategy can be applied on the entire frame or on individual connected c...

  7. MULTIPLE OBJECTS

    Directory of Open Access Journals (Sweden)

    A. A. Bosov

    2015-04-01

    Full Text Available Purpose. The development of complicated techniques for production and management processes, information systems, computer science, applied objects of systems theory and others requires the improvement of mathematical methods and new approaches for the study of application systems. The variety and diversity of subject systems make it necessary to develop a model that generalizes classical sets and their development – sets of sets. Multiple objects, unlike sets, are constructed from multiple structures and are represented by structure and content. The aim of the work is the analysis of multiple structures generating multiple objects and the further development of operations on these objects in application systems. Methodology. To achieve the objectives of the research, the structure of multiple objects is represented as a constructive trio consisting of media, signatures and axiomatics. A multiple object is determined by its structure and content, and is represented by a hybrid superposition composed of sets, multi-sets, ordered sets (lists) and heterogeneous sets (sequences, tuples). Findings. In this paper we study the properties and characteristics of the components of hybrid multiple objects of complex systems, propose assessments of their complexity, and show the rules of internal and external operations on objects of implementation. We introduce relations of arbitrary order over multiple objects, and we define descriptions of functions and mappings on objects of multiple structures. Originality. In this paper we consider the development of multiple structures generating multiple objects. Practical value. The transition from the abstract to the subject domain of multiple structures requires the transformation of the system and multiple objects. Transformation involves three successive stages: specification (binding to the domain), interpretation (multiple sites) and particularization (goals). The proposed systems approach based on hybrid sets

  8. A Comparative Theoretical and Computational Study on Robust Counterpart Optimization: I. Robust Linear Optimization and Robust Mixed Integer Linear Optimization

    Science.gov (United States)

    Li, Zukui; Ding, Ran; Floudas, Christodoulos A.

    2011-01-01

    Robust counterpart optimization techniques for linear optimization and mixed integer linear optimization problems are studied in this paper. Different uncertainty sets, including those studied in literature (i.e., interval set; combined interval and ellipsoidal set; combined interval and polyhedral set) and new ones (i.e., adjustable box; pure ellipsoidal; pure polyhedral; combined interval, ellipsoidal, and polyhedral set) are studied in this work and their geometric relationship is discussed. For uncertainty in the left hand side, right hand side, and objective function of the optimization problems, robust counterpart optimization formulations induced by those different uncertainty sets are derived. Numerical studies are performed to compare the solutions of the robust counterpart optimization models and applications in refinery production planning and batch process scheduling problem are presented. PMID:21935263
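
    To make the idea of a robust counterpart concrete, the sketch below works out only the simplest of the uncertainty sets compared in the record: a pure interval (box) set on a single constraint row with nonnegative variables, for which the counterpart remains a linear program. The cost vector, nominal coefficients, half-widths and the use of scipy are illustrative assumptions; the ellipsoidal and polyhedral counterparts from the paper are not shown.

```python
import numpy as np
from scipy.optimize import linprog

# Nominal constraint data a_bar and interval half-widths a_hat for one
# uncertain row  a^T x <= b  with a_j in [a_bar_j - a_hat_j, a_bar_j + a_hat_j].
c     = np.array([-3.0, -2.0])        # maximize 3*x1 + 2*x2 by minimizing c @ x
a_bar = np.array([1.0, 1.0])
a_hat = np.array([0.2, 0.1])
b     = 4.0

# For x >= 0, the worst case over the box is (a_bar + a_hat)^T x <= b,
# so the robust counterpart is again a plain LP.
res_nominal = linprog(c, A_ub=[a_bar], b_ub=[b], bounds=[(0, None)] * 2)
res_robust  = linprog(c, A_ub=[a_bar + a_hat], b_ub=[b], bounds=[(0, None)] * 2)
print(res_nominal.x, res_robust.x)    # the robust solution is more conservative
```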

  9. Robust efficient video fingerprinting

    Science.gov (United States)

    Puri, Manika; Lubin, Jeffrey

    2009-02-01

    We have developed a video fingerprinting system with robustness and efficiency as the primary and secondary design criteria. In extensive testing, the system has shown robustness to cropping, letter-boxing, sub-titling, blur, drastic compression, frame rate changes, size changes and color changes, as well as to the geometric distortions often associated with camcorder capture in cinema settings. Efficiency is afforded by a novel two-stage detection process in which a fast matching process first computes a number of likely candidates, which are then passed to a second slower process that computes the overall best match with minimal false alarm probability. One key component of the algorithm is a maximally stable volume computation - a three-dimensional generalization of maximally stable extremal regions - that provides a content-centric coordinate system for subsequent hash function computation, independent of any affine transformation or extensive cropping. Other key features include an efficient bin-based polling strategy for initial candidate selection, and a final SIFT feature-based computation for final verification. We describe the algorithm and its performance, and then discuss additional modifications that can provide further improvement to efficiency and accuracy.

  10. Robust Instrumentation[Water treatment for power plant]; Robust Instrumentering

    Energy Technology Data Exchange (ETDEWEB)

    Wik, Anders [Vattenfall Utveckling AB, Stockholm (Sweden)

    2003-08-01

    Cementa Slite Power Station is a heat recovery steam generator (HRSG) with moderate steam data: 3.0 MPa and 420 deg C. The heat is recovered from Cementa, a cement plant, without any use of auxiliary fuel. The power station commenced operation in 2001. The layout of the plant is unusual; there is no similar plant in Sweden and very few worldwide, so the operational experience is limited. In connection with the commissioning of the power plant, an R and D project was identified with the objective of minimising the manpower needed for chemistry management of the plant. The lean chemistry management is based on robust instrumentation and a chemical-free water treatment plant. The concept of robust instrumentation consists of the following components: a choice of on-line instrumentation with a minimum of O and M, and a chemical-free water treatment. The parameters are specific conductivity, cation conductivity, oxygen and pH. In addition, two fairly new on-line instruments were included: corrosion monitors and differential pH calculated from specific and cation conductivity. The chemical-free water treatment plant consists of softening, reverse osmosis and electro-deionisation. The operational experience shows that the cycle chemistry is not within the guidelines due to major problems with the operation of the power plant. These problems have made it impossible to reach steady state and thereby not viable to fully verify and validate the concept of robust instrumentation. From readings on the panel of the on-line analysers some conclusions may be drawn, e.g. the differential pH measurements have fulfilled the expectations. The other on-line analysers have been working satisfactorily apart from contamination with turbine oil, which has been noticed at least twice. The corrosion monitors seem to be working, but the lack of trend curves from the mainframe computer system makes it hard to draw any clear conclusions. The chemical-free water treatment has met all

  11. Handling Occlusions for Robust Augmented Reality Systems

    Directory of Open Access Journals (Sweden)

    Maidi Madjid

    2010-01-01

    Full Text Available Abstract In Augmented Reality applications, the human perception is enhanced with computer-generated graphics. These graphics must be exactly registered to real objects in the scene and this requires an effective Augmented Reality system to track the user's viewpoint. In this paper, a robust tracking algorithm based on coded fiducials is presented. Square targets are identified and pose parameters are computed using a hybrid approach based on a direct method combined with the Kalman filter. An important factor for providing a robust Augmented Reality system is the correct handling of target occlusions by real scene elements. To overcome tracking failure due to occlusions, we extend our method using an optical flow approach to track visible points and maintain the virtual graphics overlay when targets are not identified. Our proposed real-time algorithm is tested with different camera viewpoints under various image conditions and is shown to be accurate and robust.

  12. Elegant objects

    CERN Document Server

    Bugayenko, Yegor

    2017-01-01

    There are 23 practical recommendations for object-oriented programmers. Most of them are completely against everything you've read in other books. For example, static methods, NULL references, getters, setters, and mutable classes are called evil. Compound variable names, validators, private static literals, configurable objects, inheritance, annotations, MVC, dependency injection containers, reflection, ORM and even algorithms are our enemies.

  13. Objective lens

    Science.gov (United States)

    Olczak, Eugene G. (Inventor)

    2011-01-01

    An objective lens and a method for using same. The objective lens has a first end, a second end, and a plurality of optical elements. The optical elements are positioned between the first end and the second end and are at least substantially symmetric about a plane centered between the first end and the second end.

  14. Passion, Robustness and Perseverance

    DEFF Research Database (Denmark)

    Lim, Miguel Antonio; Lund, Rebecca

    2016-01-01

    Evaluation and merit in the measured university are increasingly based on taken-for-granted assumptions about the “ideal academic”. We suggest that the scholar now needs to show that she is passionate about her work and that she gains pleasure from pursuing her craft. We suggest that passion...... and pleasure achieve an exalted status as something compulsory. The scholar ought to feel passionate about her work and signal that she takes pleasure also in the difficult moments. Passion has become a signal of robustness and perseverance in a job market characterised by funding shortages, increased pressure...... way to demonstrate their potential and, crucially, their passion for their work. Drawing on the literature on technologies of governance, we reflect on what is captured and what is left out by these two evaluation instruments. We suggest that bibliometric analysis at the individual level is deeply...

  15. Robust Optical Flow Estimation

    Directory of Open Access Journals (Sweden)

    Javier Sánchez Pérez

    2013-10-01

    Full Text Available In this work, we describe an implementation of the variational method proposed by Brox et al. in 2004, which yields accurate optical flows with low running times. It has several benefits with respect to the method of Horn and Schunck: it is more robust to the presence of outliers, produces piecewise-smooth flow fields and can cope with constant brightness changes. This method relies on the brightness and gradient constancy assumptions, using the information of the image intensities and the image gradients to find correspondences. It also generalizes the use of continuous L1 functionals, which help mitigate the effect of outliers and create a Total Variation (TV) regularization. Additionally, it introduces a simple temporal regularization scheme that enforces a continuous temporal coherence of the flow fields.
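
    For orientation, a spatial energy of the Brox et al. style referred to in this record is sketched below, written from memory rather than taken from the paper or the implementation it describes; the weights γ and α, the small constant ε, and the robust function Ψ are assumptions, and the temporal regularization term mentioned in the record is omitted.

```latex
% Data term with brightness and gradient constancy, plus a TV-like smoothness term.
E(u,v) = \int_{\Omega} \Psi\!\Big( |I(\mathbf{x}+\mathbf{w}) - I(\mathbf{x})|^2
         + \gamma\, |\nabla I(\mathbf{x}+\mathbf{w}) - \nabla I(\mathbf{x})|^2 \Big)\, d\mathbf{x}
       + \alpha \int_{\Omega} \Psi\!\big( |\nabla u|^2 + |\nabla v|^2 \big)\, d\mathbf{x},
\qquad \Psi(s^2) = \sqrt{s^2 + \varepsilon^2},
\qquad \mathbf{w} = (u, v)^{\top}.
```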

  16. Robust Multimodal Dictionary Learning

    Science.gov (United States)

    Cao, Tian; Jojic, Vladimir; Modla, Shannon; Powell, Debbie; Czymmek, Kirk; Niethammer, Marc

    2014-01-01

    We propose a robust multimodal dictionary learning method for multimodal images. Joint dictionary learning for both modalities may be impaired by lack of correspondence between image modalities in training data, for example due to areas of low quality in one of the modalities. Dictionaries learned with such non-corresponding data will induce uncertainty about image representation. In this paper, we propose a probabilistic model that accounts for image areas that are poorly corresponding between the image modalities. We cast the problem of learning a dictionary in the presence of problematic image patches as a likelihood maximization problem and solve it with a variant of the EM algorithm. Our algorithm iterates identification of poorly corresponding patches and refinements of the dictionary. We tested our method on synthetic and real data. We show improvements in image prediction quality and alignment accuracy when using the method for multimodal image registration. PMID:24505674

  17. International Conference on Robust Statistics

    CERN Document Server

    Filzmoser, Peter; Gather, Ursula; Rousseeuw, Peter

    2003-01-01

    Aspects of Robust Statistics are important in many areas. Based on the International Conference on Robust Statistics 2001 (ICORS 2001) in Vorau, Austria, this volume discusses future directions of the discipline, bringing together leading scientists, experienced researchers and practitioners, as well as younger researchers. The papers cover a multitude of different aspects of Robust Statistics. For instance, the fundamental problem of data summary (weights of evidence) is considered and its robustness properties are studied. Further theoretical subjects include e.g.: robust methods for skewness, time series, longitudinal data, multivariate methods, and tests. Some papers deal with computational aspects and algorithms. Finally, the aspects of application and programming tools complete the volume.

  18. Robustness Analyses of Timber Structures

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Sørensen, John Dalsgaard; Hald, Frederik

    2013-01-01

    The robustness of structural systems has obtained a renewed interest arising from a much more frequent use of advanced types of structures with limited redundancy and serious consequences in case of failure. In order to minimise the likelihood of such disproportionate structural failures, many mo...... with respect to robustness of timber structures and will discuss the consequences of such robustness issues related to the future development of timber structures.......The robustness of structural systems has obtained a renewed interest arising from a much more frequent use of advanced types of structures with limited redundancy and serious consequences in case of failure. In order to minimise the likelihood of such disproportionate structural failures, many...... modern building codes consider the need for the robustness of structures and provide strategies and methods to obtain robustness. Therefore, a structural engineer may take necessary steps to design robust structures that are insensitive to accidental circumstances. The present paper summaries issues...

  19. Extended objects

    International Nuclear Information System (INIS)

    Creutz, M.

    1976-01-01

    After some disconnected comments on the MIT bag and string models for extended hadrons, I review current understanding of extended objects in classical conventional relativistic field theories and their quantum mechanical interpretation

  20. Trusted Objects

    International Nuclear Information System (INIS)

    CAMPBELL, PHILIP L.; PIERSON, LYNDON G.; WITZKE, EDWARD L.

    1999-01-01

    In the world of computers a trusted object is a collection of possibly-sensitive data and programs that can be allowed to reside and execute on a computer, even on an adversary's machine. Beyond the scope of one computer we believe that network-based agents in high-consequence and highly reliable applications will depend on this approach, and that the basis for such objects is what we call ''faithful execution.''

  1. Dynamics robustness of cascading systems.

    Directory of Open Access Journals (Sweden)

    Jonathan T Young

    2017-03-01

    Full Text Available A most important property of biochemical systems is robustness. Static robustness, e.g., homeostasis, is the insensitivity of a state against perturbations, whereas dynamics robustness, e.g., homeorhesis, is the insensitivity of a dynamic process. In contrast to the extensively studied static robustness, dynamics robustness, i.e., how a system creates an invariant temporal profile against perturbations, is little explored despite transient dynamics being crucial for cellular fates and are reported to be robust experimentally. For example, the duration of a stimulus elicits different phenotypic responses, and signaling networks process and encode temporal information. Hence, robustness in time courses will be necessary for functional biochemical networks. Based on dynamical systems theory, we uncovered a general mechanism to achieve dynamics robustness. Using a three-stage linear signaling cascade as an example, we found that the temporal profiles and response duration post-stimulus is robust to perturbations against certain parameters. Then analyzing the linearized model, we elucidated the criteria of when signaling cascades will display dynamics robustness. We found that changes in the upstream modules are masked in the cascade, and that the response duration is mainly controlled by the rate-limiting module and organization of the cascade's kinetics. Specifically, we found two necessary conditions for dynamics robustness in signaling cascades: 1 Constraint on the rate-limiting process: The phosphatase activity in the perturbed module is not the slowest. 2 Constraints on the initial conditions: The kinase activity needs to be fast enough such that each module is saturated even with fast phosphatase activity and upstream changes are attenuated. We discussed the relevance of such robustness to several biological examples and the validity of the above conditions therein. Given the applicability of dynamics robustness to a variety of systems, it

  2. Robust Trust in Expert Testimony

    Directory of Open Access Journals (Sweden)

    Christian Dahlman

    2015-05-01

    The standard of proof in criminal trials should require that the evidence presented by the prosecution is robust. This requirement of robustness says that it must be unlikely that additional information would change the probability that the defendant is guilty. Robustness is difficult for a judge to estimate, as it requires the judge to assess the possible effect of information that he or she does not have. This article is concerned with expert witnesses and proposes a method for reviewing the robustness of expert testimony. According to the proposed method, the robustness of expert testimony is estimated with regard to competence, motivation, external strength, internal strength and relevance. The danger of trusting non-robust expert testimony is illustrated with an analysis of the Thomas Quick Case, a Swedish legal scandal where a patient at a mental institution was wrongfully convicted of eight murders.

  3. Efficient robust conditional random fields.

    Science.gov (United States)

    Song, Dongjin; Liu, Wei; Zhou, Tianyi; Tao, Dacheng; Meyer, David A

    2015-10-01

    Conditional random fields (CRFs) are a flexible yet powerful probabilistic approach and have shown advantages for popular applications in various areas, including text analysis, bioinformatics, and computer vision. Traditional CRF models, however, are incapable of selecting relevant features as well as suppressing noise from noisy original features. Moreover, conventional optimization methods often converge slowly in solving the training procedure of CRFs, and will degrade significantly for tasks with a large number of samples and features. In this paper, we propose robust CRFs (RCRFs) to simultaneously select relevant features and suppress noise. An optimal gradient method (OGM) is further designed to train RCRFs efficiently. Specifically, the proposed RCRFs employ the l1 norm of the model parameters to regularize the objective used by traditional CRFs, therefore enabling discovery of the relevant unary features and pairwise features of CRFs. In each iteration of OGM, the gradient direction is determined jointly by the current gradient together with the historical gradients, and the Lipschitz constant is leveraged to specify the proper step size. We show that OGM can tackle the RCRF model training very efficiently, achieving the optimal convergence rate O(1/k²) (where k is the number of iterations). This convergence rate is theoretically superior to the convergence rate O(1/k) of previous first-order optimization methods. Extensive experiments performed on three practical image segmentation tasks demonstrate the efficacy of OGM in training our proposed RCRFs.
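    For orientation only: the optimal O(1/k²) rate cited above is characteristic of Nesterov-type accelerated first-order schemes, which combine the current gradient with momentum from previous iterates and use the Lipschitz constant to set the step size. The sketch below is a generic such scheme on a smooth quadratic test problem, not the authors' OGM, and it omits the l1 term (which would require a proximal step); all names and data are illustrative.

```python
import numpy as np

def accelerated_gradient(grad, x0, lipschitz, iters=200):
    # Generic Nesterov/FISTA-style iteration: gradient step with 1/L step size,
    # then a momentum extrapolation using the previous iterate.
    x = np.asarray(x0, dtype=float)
    y = x.copy()
    t = 1.0
    for _ in range(iters):
        x_next = y - grad(y) / lipschitz
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)
        x, t = x_next, t_next
    return x

A = np.diag([1.0, 10.0, 100.0])
b = np.array([1.0, 2.0, 3.0])
grad = lambda x: A @ x - b                         # gradient of 0.5*x'Ax - b'x
L = np.linalg.eigvalsh(A).max()                    # Lipschitz constant of the gradient
print(accelerated_gradient(grad, np.zeros(3), L))  # approaches A^{-1} b = [1, 0.2, 0.03]
```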

  4. Fashion Objects

    DEFF Research Database (Denmark)

    Andersen, Bjørn Schiermer

    2009-01-01

    This article attempts to create a framework for understanding modern fashion phenomena on the basis of Durkheim's sociology of religion. It focuses on Durkheim's conception of the relation between the cult and the sacred object, on his notion of 'exteriorisation', and on his theory of the social symbol in an attempt to describe the peculiar attraction of the fashion object and its social constitution. However, Durkheim's notions of cult and ritual must undergo profound changes if they are to be used in an analysis of fashion. The article tries to expand the Durkheimian cult, radically enlarging... -- an outline which at the same time indicates the need for transformations of the Durkheimian model on decisive points. Thus, thirdly, it returns to Durkheim and undertakes to develop his concepts in a direction suitable for a sociological theory of fashion. Finally, it discusses the theoretical implications...

  5. Utilities objectives

    International Nuclear Information System (INIS)

    Cousin, Y.; Fabian, H.U.

    1996-01-01

    The policy of French and German utilities is to make use of nuclear energy as a long term, competitive and environmentally friendly power supply. The world electricity generation is due to double within the next 30 years. In the next 20 to 30 years the necessity of nuclear energy will be broadly recognized. More than for most industries, to deal properly with nuclear energy requires the combination of a consistent political will, of a proper institutional framework, of strong and legitimate control authorities, of a sophisticated industry and of operators with skilled management and human resources. One of the major risks facing nuclear energy is the loss of competitiveness. Competitiveness can be maintained only through the combination of an optimized design, a consistent standardization, a proper industrial partnership and a stable long term strategy. Although the existing plants in Western Europe are already very safe, the policy is clearly to enhance the safety of the next generation of nuclear plants which are being designed today. The French and German utilities have chosen an evolutionary approach based on experience and proven technologies, with an enhanced defense in depth and an objective of easier operation and maintenance. The cost objective is to maintain and improve what has been achieved in the best existing power plants in both countries. This calls for rational choices and optimized design to meet the safety objectives, a strong standardization policy, short construction times, high availability and enough flexibility to enable optimization of the fuel cycle throughout the lifetime of the plants. The conceptual design phase has proven that the French and German teams from industry and from the utilities are able to pursue both the safety and the cost objectives, basing their decisions on a rational approach which could be accepted by the safety authorities. (J.S.)

  6. Robustness of dynamic systems with parameter uncertainties

    CERN Document Server

    Balemi, S; Truöl, W

    1992-01-01

    Robust Control is one of the fastest growing and promising areas of research today. In many practical systems there exist uncertainties which have to be considered in the analysis and design of control systems. In the last decade methods were developed for dealing with dynamic systems with unstructured uncertainties such as H∞- and ℓ1-optimal control. For systems with parameter uncertainties, the seminal paper of V. L. Kharitonov has triggered a large amount of very promising research. An international workshop dealing with all aspects of robust control was successfully organized by S. P. Bhattacharyya and L. H. Keel in San Antonio, Texas, USA in March 1991. We organized the second international workshop in this area in Ascona, Switzerland in April 1992. However, this second workshop was restricted to robust control of dynamic systems with parameter uncertainties with the objective to concentrate on some aspects of robust control. This book contains a collection of papers presented at the International W...

  7. Neuromorphic Configurable Architecture for Robust Motion Estimation

    Directory of Open Access Journals (Sweden)

    Guillermo Botella

    2008-01-01

    The robustness of the human visual system in recovering motion estimation in almost any visual situation is enviable, performing enormous calculation tasks continuously, robustly, efficiently, and effortlessly. There is obviously a great deal we can learn from our own visual system. Currently, there are several optical flow algorithms, although none of them deals efficiently with noise, illumination changes, second-order motion, occlusions, and so on. The main contribution of this work is the efficient implementation of a biologically inspired motion algorithm that borrows templates from nature as inspiration in the design of architectures and makes use of a specific model of human visual motion perception: the Multichannel Gradient Model (McGM). This novel customizable architecture for neuromorphic robust optical flow can be constructed with an FPGA or ASIC device using properties of the cortical motion pathway, constituting a useful framework for building future complex bioinspired systems running in real time with high computational complexity. This work includes resource usage and performance data, and a comparison with existing systems. This hardware has many application fields, such as object recognition, navigation, or tracking in difficult environments, due to its bioinspired nature and robustness properties.

  8. Robust and distributed hypothesis testing

    CERN Document Server

    Gül, Gökhan

    2017-01-01

    This book generalizes and extends the available theory in robust and decentralized hypothesis testing. In particular, it presents a robust test for modeling errors which is independent of the assumptions that a sufficiently large number of samples is available and that the distance is the KL-divergence. Here, the distance can be chosen from a much more general model, which includes the KL-divergence as a very special case. This is then extended by various means. A minimax robust test that is robust against both outliers as well as modeling errors is presented. Minimax robustness properties of the given tests are also explicitly proven for fixed sample size and sequential probability ratio tests. The theory of robust detection is extended to robust estimation and the theory of robust distributed detection is extended to classes of distributions, which are not necessarily stochastically bounded. It is shown that the quantization functions for the decision rules can also be chosen as non-monotone. Finally, the boo...

  9. Robustness of IPTV business models

    NARCIS (Netherlands)

    Bouwman, H.; Zhengjia, M.; Duin, P. van der; Limonard, S.

    2008-01-01

    The final stage in the STOF method is an evaluation of the robustness of the design, for which the method provides some guidelines. For many innovative services, the future holds numerous uncertainties, which makes evaluating the robustness of a business model a difficult task. In this chapter, we

  10. Robustness Evaluation of Timber Structures

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Sørensen, John Dalsgaard

    2009-01-01

    Robustness of structural systems has obtained a renewed interest due to a much more frequent use of advanced types of structures with limited redundancy and serious consequences in case of failure.

  11. Robust statistical methods with R

    CERN Document Server

    Jureckova, Jana

    2005-01-01

    Robust statistical methods were developed to supplement the classical procedures when the data violate classical assumptions. They are ideally suited to applied research across a broad spectrum of study, yet most books on the subject are narrowly focused, overly theoretical, or simply outdated. Robust Statistical Methods with R provides a systematic treatment of robust procedures with an emphasis on practical application.The authors work from underlying mathematical tools to implementation, paying special attention to the computational aspects. They cover the whole range of robust methods, including differentiable statistical functions, distance of measures, influence functions, and asymptotic distributions, in a rigorous yet approachable manner. Highlighting hands-on problem solving, many examples and computational algorithms using the R software supplement the discussion. The book examines the characteristics of robustness, estimators of real parameter, large sample properties, and goodness-of-fit tests. It...
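    The book works in R; purely as an illustration of the same idea in Python, the sketch below fits a Huber M-estimator next to ordinary least squares using statsmodels (the synthetic data and the choice of the Huber norm are assumptions for the example, not taken from the book):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 100)
y = 2.0 * x + 1.0 + rng.normal(0, 1, 100)
y[:5] += 30.0                       # a few gross outliers

X = sm.add_constant(x)
ols = sm.OLS(y, X).fit()                                # classical fit, pulled toward the outliers
rlm = sm.RLM(y, X, M=sm.robust.norms.HuberT()).fit()    # Huber M-estimator, far less affected
print(ols.params)
print(rlm.params)                                       # close to the true intercept 1 and slope 2
```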

  12. Search Path Evaluation Incorporating Object Placement Structure

    National Research Council Canada - National Science Library

    Baylog, John G; Wettergren, Thomas A

    2007-01-01

    This report describes a computationally robust approach to search path performance evaluation where the objects of search interest exhibit structure in the way in which they occur within the search space...

  13. Robust Decision Making

    Energy Technology Data Exchange (ETDEWEB)

    Christopher A. Dieckmann, PE, CSEP-Acq

    2010-07-01

    The Idaho National Laboratory (INL) is funded through the Department of Energy (DOE) Office of Nuclear Energy and other customers who have direct contracts with the Laboratory. The people, equipment, facilities and other infrastructure at the laboratory require continual investment to maintain and improve the laboratory’s capabilities. With ever tightening federal and customer budgets, the ability to direct investments into the people, equipment, facilities and other infrastructure which are most closely aligned with the laboratory’s mission and customers’ goals grows increasingly more important. The ability to justify those investment decisions based on objective criteria that can withstand political, managerial and technical criticism also becomes increasingly more important. The Systems Engineering tools of decision analysis, risk management and roadmapping, when properly applied to such problems, can provide defensible decisions.

  14. Model predictive control classical, robust and stochastic

    CERN Document Server

    Kouvaritakis, Basil

    2016-01-01

    For the first time, a textbook that brings together classical predictive control with treatment of up-to-date robust and stochastic techniques. Model Predictive Control describes the development of tractable algorithms for uncertain, stochastic, constrained systems. The starting point is classical predictive control and the appropriate formulation of performance objectives and constraints to provide guarantees of closed-loop stability and performance. Moving on to robust predictive control, the text explains how similar guarantees may be obtained for cases in which the model describing the system dynamics is subject to additive disturbances and parametric uncertainties. Open- and closed-loop optimization are considered and the state of the art in computationally tractable methods based on uncertainty tubes presented for systems with additive model uncertainty. Finally, the tube framework is also applied to model predictive control problems involving hard or probabilistic constraints for the cases of multiplic...

  15. Robust facility location: Hedging against failures

    International Nuclear Information System (INIS)

    Hernandez, Ivan; Emmanuel Ramirez-Marquez, Jose; Rainwater, Chase; Pohl, Edward; Medal, Hugh

    2014-01-01

    While few companies would be willing to sacrifice day-to-day operations to hedge against disruptions, designing for robustness can yield solutions that perform well before and after failures have occurred. Through a multi-objective optimization approach, this paper provides decision makers the option to trade off total weighted distance before and after disruptions in the Facility Location Problem. Additionally, this approach allows decision makers to understand the impact of the opening of facilities on total distance and on system robustness (considering the system as the set of located facilities). This approach differs from previous studies in that hedging against failures is done without having to elicit facility failure probabilities and without requiring the allocation of additional hardening/protection resources. The approach is applied to two datasets from the literature

  16. Robust loss functions for boosting.

    Science.gov (United States)

    Kanamori, Takafumi; Takenouchi, Takashi; Eguchi, Shinto; Murata, Noboru

    2007-08-01

    Boosting is known as a gradient descent algorithm over loss functions. It is often pointed out that the typical boosting algorithm, Adaboost, is highly affected by outliers. In this letter, loss functions for robust boosting are studied. Based on the concept of robust statistics, we propose a transformation of loss functions that makes boosting algorithms robust against extreme outliers. Next, the truncation of loss functions is applied to contamination models that describe the occurrence of mislabels near decision boundaries. Numerical experiments illustrate that the proposed loss functions derived from the contamination models are useful for handling highly noisy data in comparison with other loss functions.
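    To make the truncation idea tangible, the toy sketch below contrasts the unbounded exponential loss used by AdaBoost with a simple capped variant; this cap is only an illustrative robustification, not the specific transformation proposed in the letter.

```python
import numpy as np

# Losses as functions of the margin z = y * f(x).
def exp_loss(z):
    # AdaBoost's exponential loss: grows without bound for very negative margins,
    # so mislabeled outliers dominate the weight updates.
    return np.exp(-z)

def capped_exp_loss(z, c=2.0):
    # Illustrative truncation: the loss is capped at its value for margin -c,
    # so points beyond the cap stop driving the boosting updates.
    return np.minimum(np.exp(-z), np.exp(c))

margins = np.array([-10.0, -2.0, 0.0, 1.0, 3.0])
print(exp_loss(margins))        # the -10 margin contributes about 22026
print(capped_exp_loss(margins)) # its contribution is capped at e**2, roughly 7.4
```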

  17. Theoretical Framework for Robustness Evaluation

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    2011-01-01

    This paper presents a theoretical framework for evaluation of robustness of structural systems, incl. bridges and buildings. Typically, modern structural design codes require that 'the consequence of damages to structures should not be disproportional to the causes of the damages'. However, although the importance of robustness for structural design is widely recognized, the code requirements are not specified in detail, which makes their practical use difficult. This paper describes a theoretical and risk-based framework to form the basis for quantification of robustness and for pre-normative guidelines...

  18. Robustness of airline route networks

    Science.gov (United States)

    Lordan, Oriol; Sallan, Jose M.; Escorihuela, Nuria; Gonzalez-Prieto, David

    2016-03-01

    Airlines shape their route network by defining their routes through supply and demand considerations, paying little attention to network performance indicators, such as network robustness. However, the collapse of an airline network can produce high financial costs for the airline and all of its geographical area of influence. The aim of this study is to analyze the topology and robustness of the route networks of airlines following the Low Cost Carrier (LCC) and Full Service Carrier (FSC) business models. Results show that FSC hubs are more central than LCC bases in their route network. As a result, LCC route networks are more robust than FSC networks.
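    A rough sketch of this kind of analysis, assuming networkx and toy graphs as stand-ins for real route networks (a scale-free graph for a hub-dominated FSC-like network, and a homogeneous random graph with the same size and edge count for an LCC-like one): remove the busiest airports and track the surviving giant component.

```python
import networkx as nx

def giant_component_fraction(G):
    # Fraction of airports still connected in the largest component.
    return max(len(c) for c in nx.connected_components(G)) / G.number_of_nodes()

def targeted_attack(G, n_remove):
    # Repeatedly remove the current highest-degree node (targeted attack on hubs).
    H = G.copy()
    for _ in range(n_remove):
        hub = max(H.degree, key=lambda kv: kv[1])[0]
        H.remove_node(hub)
    return giant_component_fraction(H)

fsc_like = nx.barabasi_albert_graph(100, 2, seed=1)                      # hub-and-spoke stand-in
lcc_like = nx.gnm_random_graph(100, fsc_like.number_of_edges(), seed=1)  # flatter stand-in

print(targeted_attack(fsc_like, 10))   # typically fragments quickly once hubs are removed
print(targeted_attack(lcc_like, 10))   # typically degrades more gracefully
```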

  19. A robust anti-windup design procedure for SISO systems

    Science.gov (United States)

    Kerr, Murray; Turner, Matthew C.; Villota, Elizabeth; Jayasuriya, Suhada; Postlethwaite, Ian

    2011-02-01

    A model-based anti-windup (AW) controller design approach for constrained uncertain linear single-input-single-output (SISO) systems is proposed based on quantitative feedback theory (QFT) loopshaping. The design approach explicitly incorporates uncertainty, is suitable for the solution of both the magnitude and rate saturation problems, and provides for the design of low-order AW controllers satisfying robust stability and robust performance objectives. Robust stability is enforced using absolute stability theory and generic multipliers (i.e. circle, Popov, Zames-Falb), and robust performance is enforced using linear lower-bounds on the input-output maps capturing the effects of saturation as a metric. Two detailed design examples are presented. These show that even for simple systems, certain popular AW techniques lead to compensators that may fail to ensure robust stability and performance when saturation is encountered, but that the proposed QFT design approach is able to handle both saturation and uncertainty effectively.

  20. Robust Portfolio Optimization Using Pseudodistances.

    Science.gov (United States)

    Toma, Aida; Leoni-Aubin, Samuela

    2015-01-01

    The presence of outliers in financial asset returns is a frequently occurring phenomenon which may lead to unreliable mean-variance optimized portfolios. This fact is due to the unbounded influence that outliers can have on the mean returns and covariance estimators that are inputs in the optimization procedure. In this paper we present robust estimators of mean and covariance matrix obtained by minimizing an empirical version of a pseudodistance between the assumed model and the true model underlying the data. We prove and discuss theoretical properties of these estimators, such as affine equivariance, B-robustness, asymptotic normality and asymptotic relative efficiency. These estimators can be easily used in place of the classical estimators, thereby providing robust optimized portfolios. A Monte Carlo simulation study and applications to real data show the advantages of the proposed approach. We study both in-sample and out-of-sample performance of the proposed robust portfolios comparing them with some other portfolios known in literature.
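    As a rough illustration of plugging robust estimators into the optimization, the sketch below builds minimum-variance weights from a classical covariance estimate and from scikit-learn's minimum covariance determinant estimator on contaminated returns; the MCD is only a stand-in here, since the paper's own pseudodistance-based estimators are not reproduced.

```python
import numpy as np
from sklearn.covariance import EmpiricalCovariance, MinCovDet

def min_variance_weights(cov):
    # Global minimum-variance portfolio: w proportional to inv(Sigma) * 1.
    ones = np.ones(cov.shape[0])
    w = np.linalg.solve(cov, ones)
    return w / w.sum()

rng = np.random.default_rng(0)
returns = rng.multivariate_normal(mean=[0.001] * 4, cov=0.0001 * np.eye(4), size=500)
returns[:10] *= 25.0     # a handful of extreme observations

classical = min_variance_weights(EmpiricalCovariance().fit(returns).covariance_)
robust = min_variance_weights(MinCovDet(random_state=0).fit(returns).covariance_)
print(classical)         # typically pulled well away from the equal weights of the clean model
print(robust)            # close to equal weights, since the outliers are down-weighted
```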

  1. Robust methods for data reduction

    CERN Document Server

    Farcomeni, Alessio

    2015-01-01

    Robust Methods for Data Reduction gives a non-technical overview of robust data reduction techniques, encouraging the use of these important and useful methods in practical applications. The main areas covered include principal components analysis, sparse principal component analysis, canonical correlation analysis, factor analysis, clustering, double clustering, and discriminant analysis.The first part of the book illustrates how dimension reduction techniques synthesize available information by reducing the dimensionality of the data. The second part focuses on cluster and discriminant analy

  2. Trading Robustness Requirements in Mars Entry Trajectory Design

    Science.gov (United States)

    Lafleur, Jarret M.

    2009-01-01

    One of the most important metrics characterizing an atmospheric entry trajectory in preliminary design is the size of its predicted landing ellipse. Often, requirements for this ellipse are set early in design and significantly influence both the expected scientific return from a particular mission and the cost of development. Requirements typically specify a certain probability level (6-level) for the prescribed ellipse, and frequently this latter requirement is taken at 36. However, searches for the justification of 36 as a robustness requirement suggest it is an empirical rule of thumb borrowed from non-aerospace fields. This paper presents an investigation into the sensitivity of trajectory performance to varying robustness (6-level) requirements. The treatment of robustness as a distinct objective is discussed, and an analysis framework is presented involving the manipulation of design variables to effect trades between performance and robustness objectives. The scenario for which this method is illustrated is the ballistic entry of an MSL-class Mars entry vehicle. Here, the design variable is entry flight path angle, and objectives are parachute deploy altitude performance and error ellipse robustness. Resulting plots show the sensitivities between these objectives and trends in the entry flight path angles required to design to these objectives. Relevance to the trajectory designer is discussed, as are potential steps for further development and use of this type of analysis.

  3. Robust MST-Based Clustering Algorithm.

    Science.gov (United States)

    Liu, Qidong; Zhang, Ruisheng; Zhao, Zhili; Wang, Zhenghai; Jiao, Mengyao; Wang, Guangjing

    2018-06-01

    Minimax similarity stresses the connectedness of points via mediating elements rather than favoring high mutual similarity. The grouping principle yields superior clustering results when mining arbitrarily-shaped clusters in data. However, it is not robust against noise and outliers in the data. There are two main problems with the grouping principle: first, a single object that is far away from all other objects defines a separate cluster, and second, two connected clusters would be regarded as two parts of one cluster. In order to solve such problems, we propose a robust minimum spanning tree (MST)-based clustering algorithm in this letter. First, we separate the connected objects by applying a density-based coarsening phase, resulting in a low-rank matrix in which each element denotes a supernode formed by combining a set of nodes. Then a greedy method is presented to partition those supernodes by working on the low-rank matrix. Instead of removing the longest edges from the MST, our algorithm groups the data set based on the minimax similarity. Finally, the assignment of all data points can be achieved through their corresponding supernodes. Experimental results on many synthetic and real-world data sets show that our algorithm consistently outperforms the compared clustering algorithms.
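    For reference, the classical MST-based clustering that this letter improves on (build the minimum spanning tree, cut the k-1 longest edges, read off connected components) can be sketched with SciPy as below; this is the noise-sensitive baseline, and the minimax-similarity grouping itself is not reproduced here.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree, connected_components
from scipy.spatial.distance import pdist, squareform

def classical_mst_clusters(points, k):
    # Build the MST of the pairwise-distance graph, drop the k-1 heaviest edges,
    # and label each remaining connected component as one cluster.
    dist = squareform(pdist(points))
    mst = minimum_spanning_tree(dist).toarray()
    edges = np.argwhere(mst > 0)
    weights = mst[mst > 0]
    keep = np.argsort(weights)[: len(weights) - (k - 1)]
    pruned = np.zeros_like(mst)
    for idx in keep:
        r, c = edges[idx]
        pruned[r, c] = mst[r, c]
    _, labels = connected_components(pruned, directed=False)
    return labels

rng = np.random.default_rng(0)
pts = np.vstack([rng.normal(0, 1, (30, 2)), rng.normal(6, 1, (30, 2))])
print(classical_mst_clusters(pts, 2))   # two well-separated blobs; a single far outlier would break this
```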

  4. Recurrent processing during object recognition

    Directory of Open Access Journals (Sweden)

    Randall C. O'Reilly

    2013-04-01

    How does the brain learn to recognize objects visually, and perform this difficult feat robustly in the face of many sources of ambiguity and variability? We present a computational model based on the biology of the relevant visual pathways that learns to reliably recognize 100 different object categories in the face of naturally-occurring variability in location, rotation, size, and lighting. The model exhibits robustness to highly ambiguous, partially occluded inputs. Both the unified, biologically plausible learning mechanism and the robustness to occlusion derive from the role that recurrent connectivity and recurrent processing mechanisms play in the model. Furthermore, this interaction of recurrent connectivity and learning predicts that high-level visual representations should be shaped by error signals from nearby, associated brain areas over the course of visual learning. Consistent with this prediction, we show how semantic knowledge about object categories changes the nature of their learned visual representations, as well as how this representational shift supports the mapping between perceptual and conceptual knowledge. Altogether, these findings support the potential importance of ongoing recurrent processing throughout the brain's visual system and suggest ways in which object recognition can be understood in terms of interactions within and between processes over time.

  5. Sparse alignment for robust tensor learning.

    Science.gov (United States)

    Lai, Zhihui; Wong, Wai Keung; Xu, Yong; Zhao, Cairong; Sun, Mingming

    2014-10-01

    Multilinear/tensor extensions of manifold learning based algorithms have been widely used in computer vision and pattern recognition. This paper first provides a systematic analysis of the multilinear extensions for the most popular methods by using alignment techniques, thereby obtaining a general tensor alignment framework. From this framework, it is easy to show that the manifold learning based tensor learning methods are intrinsically different from the alignment techniques. Based on the alignment framework, a robust tensor learning method called sparse tensor alignment (STA) is then proposed for unsupervised tensor feature extraction. Different from the existing tensor learning methods, L1- and L2-norms are introduced to enhance the robustness in the alignment step of the STA. The advantage of the proposed technique is that the difficulty in selecting the size of the local neighborhood can be avoided in the manifold learning based tensor feature extraction algorithms. Although STA is an unsupervised learning method, the sparsity encodes the discriminative information in the alignment step and provides the robustness of STA. Extensive experiments on the well-known image databases as well as action and hand gesture databases by encoding object images as tensors demonstrate that the proposed STA algorithm gives the most competitive performance when compared with the tensor-based unsupervised learning methods.

  6. Robust design optimization using the price of robustness, robust least squares and regularization methods

    Science.gov (United States)

    Bukhari, Hassan J.

    2017-12-01

    In this paper, a framework for robust optimization of mechanical design problems and process systems with parametric uncertainty is presented using three different approaches. Robust optimization problems are formulated so that the optimal solution is robust, which means it is minimally sensitive to any perturbations in the parameters. The first method uses the price-of-robustness approach, which assumes the uncertain parameters to be symmetric and bounded; the robustness of the design can be controlled by limiting how many parameters are allowed to perturb. The second method uses the robust least squares method to determine the optimal parameters when the data itself, rather than the parameters, is subject to perturbations. The last method manages uncertainty by restricting the perturbations on the parameters to improve sensitivity, similarly to Tikhonov regularization. The methods are implemented on two sets of problems, one linear and the other non-linear. The methodology is compared with a prior method using multiple Monte Carlo simulation runs, which shows that the approach presented in this paper yields better performance.
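    The third approach parallels Tikhonov regularization; as a small sketch under made-up data (this is not the paper's exact formulation, and the lambda value is arbitrary), adding a quadratic penalty damps the solution of an ill-conditioned least-squares problem and so reduces its sensitivity to perturbations:

```python
import numpy as np

def tikhonov_ls(A, b, lam):
    # Solves min_x ||A x - b||^2 + lam * ||x||^2 via the regularized normal equations.
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

rng = np.random.default_rng(1)
A = rng.normal(size=(50, 5))
A[:, 4] = A[:, 3] + 1e-3 * rng.normal(size=50)   # nearly collinear columns -> ill-conditioned
x_true = np.array([1.0, -2.0, 0.5, 3.0, 0.0])
b = A @ x_true + 0.1 * rng.normal(size=50)

print(np.linalg.lstsq(A, b, rcond=None)[0])   # the two collinear coefficients are poorly determined
print(tikhonov_ls(A, b, lam=1.0))             # damped, less perturbation-sensitive solution
```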

  7. Robustness of mission plans for unmanned aircraft

    Science.gov (United States)

    Niendorf, Moritz

    This thesis studies the robustness of optimal mission plans for unmanned aircraft. Mission planning typically involves tactical planning and path planning. Tactical planning refers to task scheduling and in multi-aircraft scenarios also includes establishing a communication topology. Path planning refers to computing a feasible and collision-free trajectory. For a prototypical mission planning problem, the traveling salesman problem on a weighted graph, the robustness of an optimal tour is analyzed with respect to changes to the edge costs. Specifically, the stability region of an optimal tour is obtained, i.e., the set of all edge cost perturbations for which that tour is optimal. The exact stability region of solutions to variants of the traveling salesman problems is obtained from a linear programming relaxation of an auxiliary problem. Edge cost tolerances and edge criticalities are derived from the stability region. For Euclidean traveling salesman problems, robustness with respect to perturbations to vertex locations is considered and safe radii and vertex criticalities are introduced. For weighted-sum multi-objective problems, stability regions with respect to changes in the objectives, weights, and simultaneous changes are given. Most critical weight perturbations are derived. Computing exact stability regions is intractable for large instances. Therefore, tractable approximations are desirable. The stability region of solutions to relaxations of the traveling salesman problem gives under-approximations, and sets of tours give over-approximations. The application of these results to the two-neighborhood and the minimum 1-tree relaxation is discussed. Bounds on edge cost tolerances and approximate criticalities are obtainable likewise. A minimum spanning tree is an optimal communication topology for minimizing the cumulative transmission power in multi-aircraft missions. The stability region of a minimum spanning tree is given and tolerances, stability balls

  8. Advances in robust fractional control

    CERN Document Server

    Padula, Fabrizio

    2015-01-01

    This monograph presents design methodologies for (robust) fractional control systems. It shows the reader how to take advantage of the superior flexibility of fractional control systems compared with integer-order systems in achieving more challenging control requirements. There is a high degree of current interest in fractional systems and fractional control arising from both academia and industry and readers from both milieux are catered to in the text. Different design approaches having in common a trade-off between robustness and performance of the control system are considered explicitly. The text generalizes methodologies, techniques and theoretical results that have been successfully applied in classical (integer) control to the fractional case. The first part of Advances in Robust Fractional Control is the more industrially-oriented. It focuses on the design of fractional controllers for integer processes. In particular, it considers fractional-order proportional-integral-derivative controllers, becau...

  9. Robustness of digital artist authentication

    DEFF Research Database (Denmark)

    Jacobsen, Robert; Nielsen, Morten

    In many cases it is possible to determine the authenticity of a painting from digital reproductions of the paintings; this has been demonstrated for a variety of artists and with different approaches. Common to all these methods in digital artist authentication is that the potential of the method is in focus, while the robustness has not been considered, i.e. the degree to which the data collection process influences the decision of the method. However, in order for an authentication method to be successful in practice, it needs to be robust to plausible error sources from the data collection. In this paper we investigate the robustness of the newly proposed authenticity method introduced by the authors based on second generation multiresolution analysis. This is done by modelling a number of realistic factors that can occur in the data collection...

  10. Attractive ellipsoids in robust control

    CERN Document Server

    Poznyak, Alexander; Azhmyakov, Vadim

    2014-01-01

    This monograph introduces a newly developed robust-control design technique for a wide class of continuous-time dynamical systems called the “attractive ellipsoid method.” Along with a coherent introduction to the proposed control design and related topics, the monograph studies nonlinear affine control systems in the presence of uncertainty and presents a constructive and easily implementable control strategy that guarantees certain stability properties. The authors discuss linear-style feedback control synthesis in the context of the above-mentioned systems. The development and physical implementation of high-performance robust-feedback controllers that work in the absence of complete information is addressed, with numerous examples to illustrate how to apply the attractive ellipsoid method to mechanical and electromechanical systems. While theorems are proved systematically, the emphasis is on understanding and applying the theory to real-world situations. Attractive Ellipsoids in Robust Control will a...

  11. Robustness of holonomic quantum gates

    International Nuclear Information System (INIS)

    Solinas, P.; Zanardi, P.; Zanghi, N.

    2005-01-01

    If the driving field fluctuates during the quantum evolution, this produces errors in the applied operator. The holonomic (and geometrical) quantum gates are believed to be robust against some kinds of noise. Because of their geometrical dependence, the holonomic operators can be robust against this kind of noise; in fact, if the fluctuations are fast enough, they cancel out, leaving the final operator unchanged. I present numerical studies of holonomic quantum gates subject to this parametric noise; the fidelity between the noisy and the ideal evolution is calculated for different noise correlation times. The holonomic quantum gates seem robust not only for fast fluctuating fields but also for slow fluctuating fields. These results can be explained as due to the geometrical feature of the holonomic operator: for fast fluctuating fields the fluctuations are canceled out, while for slow fluctuating fields the fluctuations do not perturb the loop in the parameter space. (author)

  12. Robust estimation and hypothesis testing

    CERN Document Server

    Tiku, Moti L

    2004-01-01

    In statistical theory and practice, a certain distribution is usually assumed and then optimal solutions sought. Since deviations from an assumed distribution are very common, one cannot feel comfortable with assuming a particular distribution and believing it to be exactly correct. That brings the robustness issue into focus. In this book, we have given statistical procedures which are robust to plausible deviations from an assumed model. The method of modified maximum likelihood estimation is used in formulating these procedures. The modified maximum likelihood estimators are explicit functions of sample observations and are easy to compute. They are asymptotically fully efficient and are as efficient as the maximum likelihood estimators for small sample sizes. The maximum likelihood estimators have computational problems and are, therefore, elusive. A broad range of topics are covered in this book. Solutions are given which are easy to implement and are efficient. The solutions are also robust to data anomali...

  13. Robustness in Railway Operations (RobustRailS)

    DEFF Research Database (Denmark)

    Jensen, Jens Parbo; Nielsen, Otto Anker

    This study considers the problem of enhancing railway timetable robustness without adding slack time, and hence without increasing the travel time. The approach integrates a transit assignment model to assess how passengers adapt their behaviour whenever operations are changed. First, the approach considers...

  14. The Reviewing of Object Files: Object-Specific Integration of Information.

    Science.gov (United States)

    Kahneman, Daniel; And Others

    1992-01-01

    Seven experiments involving a total of 203 college students explored a form of object-specific priming and established a robust object-specific benefit that indicates that a new stimulus will be named faster if it physically matches a previous stimulus seen as part of the same perceptual object. (SLD)

  15. Robust Robot Grasp Detection in Multimodal Fusion

    Directory of Open Access Journals (Sweden)

    Zhang Qiang

    2017-01-01

    Accurate robot grasp detection for model-free objects plays an important role in robotics. With the development of RGB-D sensors, object perception technology has made great progress. Obtaining a rich feature expression from the colour and the depth data is a critical problem that needs to be addressed in order to accomplish the grasping task. To solve the problem of data fusion, this paper proposes a convolutional neural network (CNN) based approach combining regression and classification. In the CNN model, the colour and the depth modal data are deeply fused together to achieve an accurate feature expression. Additionally, the Welsch function is introduced into the approach to enhance the robustness of the training process. Experimental results demonstrate the superiority of the proposed method.
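    A minimal PyTorch sketch of the general idea of fusing colour and depth streams in one CNN with a regression head and a classification head is given below; the architecture, layer sizes and the five-parameter grasp output are illustrative assumptions, not the network or training setup from the paper.

```python
import torch
import torch.nn as nn

class RGBDGraspNet(nn.Module):
    # Two convolutional branches (colour, depth) whose features are concatenated,
    # followed by a grasp-rectangle regressor and a graspable/not-graspable classifier.
    def __init__(self):
        super().__init__()
        self.rgb_branch = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2, padding=2), nn.ReLU())
        self.depth_branch = nn.Sequential(
            nn.Conv2d(1, 16, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2, padding=2), nn.ReLU())
        self.head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                                  nn.Linear(64, 128), nn.ReLU())
        self.grasp_regressor = nn.Linear(128, 5)      # (x, y, w, h, angle), illustrative
        self.grasp_classifier = nn.Linear(128, 2)

    def forward(self, rgb, depth):
        fused = torch.cat([self.rgb_branch(rgb), self.depth_branch(depth)], dim=1)
        feats = self.head(fused)
        return self.grasp_regressor(feats), self.grasp_classifier(feats)

# A Welsch-type penalty on a regression residual r (robust alternative to squared error):
#   welsch(r) = (c**2 / 2) * (1 - torch.exp(-(r / c) ** 2))
model = RGBDGraspNet()
rect, logits = model(torch.randn(1, 3, 224, 224), torch.randn(1, 1, 224, 224))
```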

  16. A Robust Design Applicability Model

    DEFF Research Database (Denmark)

    Ebro, Martin; Lars, Krogstie; Howard, Thomas J.

    2015-01-01

    This paper introduces a model for assessing the applicability of Robust Design (RD) in a project or organisation. The intention of the Robust Design Applicability Model (RDAM) is to provide support for decisions by engineering management considering the relevant level of RD activities... to be applicable in organisations assigning a high importance to one or more factors that are known to be impacted by RD, while also experiencing a high level of occurrence of this factor. The RDAM supplements existing maturity models and metrics to provide a comprehensive set of data to support management...

  17. Ins-Robust Primitive Words

    OpenAIRE

    Srivastava, Amit Kumar; Kapoor, Kalpesh

    2017-01-01

    Let Q be the set of primitive words over a finite alphabet with at least two symbols. We characterize a class of primitive words, Q_I, referred to as ins-robust primitive words, which remain primitive on insertion of any letter from the alphabet, and present some properties that characterize words in the set Q_I. It is shown that the language Q_I is dense. We prove that the language of primitive words that are not ins-robust is not context-free. We also present a linear time algorithm to reco...

  18. Robust Fringe Projection Profilometry via Sparse Representation.

    Science.gov (United States)

    Budianto; Lun, Daniel P K

    2016-04-01

    In this paper, a robust fringe projection profilometry (FPP) algorithm using sparse dictionary learning and sparse coding techniques is proposed. When reconstructing the 3D model of objects, traditional FPP systems often fail to perform if the captured fringe images contain a complex scene, such as multiple or occluded objects. This introduces great difficulty into the phase unwrapping process of an FPP system and can result in serious distortion in the final reconstructed 3D model. The proposed algorithm encodes the period order information, which is essential to phase unwrapping, into texture patterns and embeds them into the projected fringe patterns. When the encoded fringe image is captured, a modified morphological component analysis and a sparse classification procedure are performed to decode and identify the embedded period order information. This information is then used to assist the phase unwrapping process in dealing with the different artifacts in the fringe images. Experimental results show that the proposed algorithm can significantly improve the robustness of an FPP system. It performs equally well whether the fringe images contain a simple or a complex scene, or are affected by the ambient lighting of the working environment.

  19. Robust environmental closed-loop supply chain design under uncertainty

    International Nuclear Information System (INIS)

    MA, Ruimin; YAO, Lifei; JIN, Maozhu; REN, Peiyu; LV, Zhihan

    2016-01-01

    With the fast developments in product remanufacturing to improve economic and environmental performance, an environmental closed-loop supply chain (ECLSC) is important for enterprises' competitiveness. In this paper, a robust ECLSC network is investigated which includes multiple plants, collection centers, demand zones, and products, and consists of both forward and reverse supply chains. First, a robust multi-objective mixed integer nonlinear programming model is proposed to deal with ECLSC considering two conflicting objectives simultaneously, as well as the uncertain nature of the supply chain. Cost parameters of the supply chain and demand fluctuations are subject to uncertainty. The first objective function aims to minimize the economic cost and the second objective function is to minimize the environmental influence. Then, the proposed model is solved as a single-objective mixed integer programming model applying the LP-metrics method. Finally, a numerical example is presented to test the model. The results indicate that the proposed model is applicable in practice.

  20. Design optimization for cost and quality: The robust design approach

    Science.gov (United States)

    Unal, Resit

    1990-01-01

    Designing reliable, low cost, and operable space systems has become the key to future space operations. Designing high quality space systems at low cost is an economic and technological challenge to the designer. A systematic and efficient way to meet this challenge is a new method of design optimization for performance, quality, and cost, called Robust Design. Robust Design is an approach for design optimization. It consists of: making system performance insensitive to material and subsystem variation, thus allowing the use of less costly materials and components; making designs less sensitive to the variations in the operating environment, thus improving reliability and reducing operating costs; and using a new structured development process so that engineering time is used most productively. The objective in Robust Design is to select the best combination of controllable design parameters so that the system is most robust to uncontrollable noise factors. The robust design methodology uses a mathematical tool called an orthogonal array, from design of experiments theory, to study a large number of decision variables with a significantly small number of experiments. Robust design also uses a statistical measure of performance, called a signal-to-noise ratio, from electrical control theory, to evaluate the level of performance and the effect of noise factors. The purpose is to investigate the Robust Design methodology for improving quality and cost, demonstrate its application by the use of an example, and suggest its use as an integral part of space system design process.
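    The signal-to-noise ratios mentioned above are the standard Taguchi quantities; a small sketch with invented readings shows how the nominal-is-best S/N ratio rewards the design setting whose response varies less under noise conditions:

```python
import numpy as np

# Standard Taguchi signal-to-noise ratios (in dB); the readings below are invented.
def sn_larger_is_better(y):
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

def sn_smaller_is_better(y):
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y**2))

def sn_nominal_is_best(y):
    y = np.asarray(y, dtype=float)
    return 10.0 * np.log10(y.mean()**2 / y.var(ddof=1))

setting_a = [98.0, 102.0, 99.0, 101.0]   # same target region, low spread under noise
setting_b = [90.0, 115.0, 85.0, 110.0]   # same target region, high spread under noise
print(sn_nominal_is_best(setting_a))     # larger S/N ratio -> more robust setting
print(sn_nominal_is_best(setting_b))
```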

  1. Feedforward/feedback control synthesis for performance and robustness

    Science.gov (United States)

    Wie, Bong; Liu, Qiang

    1990-01-01

    Both feedforward and feedback control approaches for uncertain dynamical systems are investigated. The control design objective is to achieve a fast settling time (high performance) and robustness (insensitivity) to plant modeling uncertainty. Preshaping of an ideal, time-optimal control input using a 'tapped-delay' filter is shown to provide a rapid maneuver with robust performance. A robust, non-minimum-phase feedback controller is synthesized with particular emphasis on its proper implementation for a non-zero set-point control problem. The proposed feedforward/feedback control approach is robust for a certain class of uncertain dynamical systems, since the control input command computed for a given desired output does not depend on the plant parameters.

  2. Essays on robust asset pricing

    NARCIS (Netherlands)

    Horváth, Ferenc

    2017-01-01

    The central concept of this doctoral dissertation is robustness. I analyze how model and parameter uncertainty affect financial decisions of investors and fund managers, and what their equilibrium consequences are. Chapter 1 gives an overview of the most important concepts and methodologies used in

  3. Robust visual hashing via ICA

    International Nuclear Information System (INIS)

    Fournel, Thierry; Coltuc, Daniela

    2010-01-01

    Designed to maximize information transmission in the presence of noise, independent component analysis (ICA) could appear in certain circumstances as a statistics-based tool for robust visual hashing. Several ICA-based scenarios can attempt to reach this goal. A first such scenario is considered here.

  4. Robustness of raw quantum tomography

    Science.gov (United States)

    Asorey, M.; Facchi, P.; Florio, G.; Man'ko, V. I.; Marmo, G.; Pascazio, S.; Sudarshan, E. C. G.

    2011-01-01

    We scrutinize the effects of non-ideal data acquisition on the tomograms of quantum states. The presence of a weight function, schematizing the effects of a finite window or equivalently noise, only affects the state reconstruction procedure by a normalization constant. The results are extended to a discrete mesh and show that quantum tomography is robust under incomplete and approximate knowledge of tomograms.

  5. Robustness of raw quantum tomography

    Energy Technology Data Exchange (ETDEWEB)

    Asorey, M. [Departamento de Fisica Teorica, Facultad de Ciencias, Universidad de Zaragoza, 50009 Zaragoza (Spain); Facchi, P. [Dipartimento di Matematica, Universita di Bari, I-70125 Bari (Italy); INFN, Sezione di Bari, I-70126 Bari (Italy); MECENAS, Universita Federico II di Napoli and Universita di Bari (Italy); Florio, G. [Dipartimento di Fisica, Universita di Bari, I-70126 Bari (Italy); INFN, Sezione di Bari, I-70126 Bari (Italy); MECENAS, Universita Federico II di Napoli and Universita di Bari (Italy); Man'ko, V.I., E-mail: manko@lebedev.r [P.N. Lebedev Physical Institute, Leninskii Prospect 53, Moscow 119991 (Russian Federation); Marmo, G. [Dipartimento di Scienze Fisiche, Universita di Napoli 'Federico II', I-80126 Napoli (Italy); INFN, Sezione di Napoli, I-80126 Napoli (Italy); MECENAS, Universita Federico II di Napoli and Universita di Bari (Italy); Pascazio, S. [Dipartimento di Fisica, Universita di Bari, I-70126 Bari (Italy); INFN, Sezione di Bari, I-70126 Bari (Italy); MECENAS, Universita Federico II di Napoli and Universita di Bari (Italy); Sudarshan, E.C.G. [Department of Physics, University of Texas, Austin, TX 78712 (United States)

    2011-01-31

    We scrutinize the effects of non-ideal data acquisition on the tomograms of quantum states. The presence of a weight function, schematizing the effects of a finite window or equivalently noise, only affects the state reconstruction procedure by a normalization constant. The results are extended to a discrete mesh and show that quantum tomography is robust under incomplete and approximate knowledge of tomograms.

  6. Aspects of robust linear regression

    NARCIS (Netherlands)

    Davies, P.L.

    1993-01-01

    Section 1 of the paper contains a general discussion of robustness. In Section 2 the influence function of the Hampel-Rousseeuw least median of squares estimator is derived. Linearly invariant weak metrics are constructed in Section 3. It is shown in Section 4 that $S$-estimators satisfy an exact

  7. Manipulation Robustness of Collaborative Filtering

    OpenAIRE

    Benjamin Van Roy; Xiang Yan

    2010-01-01

    A collaborative filtering system recommends to users products that similar users like. Collaborative filtering systems influence purchase decisions and hence have become targets of manipulation by unscrupulous vendors. We demonstrate that nearest neighbors algorithms, which are widely used in commercial systems, are highly susceptible to manipulation and introduce new collaborative filtering algorithms that are relatively robust.

  8. Robustness Regions for Dichotomous Decisions.

    Science.gov (United States)

    Vijn, Pieter; Molenaar, Ivo W.

    1981-01-01

    In the case of dichotomous decisions, the total set of all assumptions/specifications for which the decision would have been the same is the robustness region. Inspection of this (data-dependent) region is a form of sensitivity analysis which may lead to improved decision making. (Author/BW)

  9. Theoretical Framework for Robustness Evaluation

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    2011-01-01

    This paper presents a theoretical framework for evaluation of robustness of structural systems, incl. bridges and buildings. Typically modern structural design codes require that ‘the consequence of damages to structures should not be disproportional to the causes of the damages’. However, althou...

  10. Robust control design with MATLAB

    CERN Document Server

    Gu, Da-Wei; Konstantinov, Mihail M

    2013-01-01

    Robust Control Design with MATLAB® (second edition) helps the student to learn how to use well-developed advanced robust control design methods in practical cases. To this end, several realistic control design examples, ranging from teaching-laboratory experiments, such as a two-wheeled, self-balancing robot, to complex systems like a flexible-link manipulator, are given detailed presentation. All of these exercises are conducted using MATLAB® Robust Control Toolbox 3, Control System Toolbox and Simulink®. By sharing their experiences in industrial cases with minimum recourse to complicated theories and formulae, the authors convey essential ideas and useful insights into robust industrial control systems design using major H-infinity optimization and related methods, allowing readers to quickly move on with their own challenges. The hands-on tutorial style of this text rests on an abundance of examples and features for the second edition: · rewritten and simplified presentation of theoretical and meth...

  11. Robust Portfolio Optimization Using Pseudodistances

    Science.gov (United States)

    2015-01-01

    The presence of outliers in financial asset returns is a frequently occurring phenomenon which may lead to unreliable mean-variance optimized portfolios. This fact is due to the unbounded influence that outliers can have on the mean returns and covariance estimators that are inputs in the optimization procedure. In this paper we present robust estimators of mean and covariance matrix obtained by minimizing an empirical version of a pseudodistance between the assumed model and the true model underlying the data. We prove and discuss theoretical properties of these estimators, such as affine equivariance, B-robustness, asymptotic normality and asymptotic relative efficiency. These estimators can be easily used in place of the classical estimators, thereby providing robust optimized portfolios. A Monte Carlo simulation study and applications to real data show the advantages of the proposed approach. We study both in-sample and out-of-sample performance of the proposed robust portfolios comparing them with some other portfolios known in literature. PMID:26468948

  12. Facial Symmetry in Robust Anthropometrics

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2012-01-01

    Vol. 57, No. 3 (2012), pp. 691-698. ISSN 0022-1198. R&D Projects: GA MŠk(CZ) 1M06014. Institutional research plan: CEZ:AV0Z10300504. Keywords: forensic science * anthropology * robust image analysis * correlation analysis * multivariate data * classification. Subject RIV: BB - Applied Statistics, Operational Research. Impact factor: 1.244, year: 2012

  13. Sparse and Robust Factor Modelling

    NARCIS (Netherlands)

    C. Croux (Christophe); P. Exterkate (Peter)

    2011-01-01

    Factor construction methods are widely used to summarize a large panel of variables by means of a relatively small number of representative factors. We propose a novel factor construction procedure that enjoys the properties of robustness to outliers and of sparsity; that is, having

  14. Robust distributed cognitive relay beamforming

    KAUST Repository

    Pandarakkottilil, Ubaidulla; Aissa, Sonia

    2012-01-01

    design takes into account a parameter of the error in the channel state information (CSI) to render the performance of the beamformer robust in the presence of imperfect CSI. Though the original problem is non-convex, we show that the proposed design can

  15. Approximability of Robust Network Design

    NARCIS (Netherlands)

    Olver, N.K.; Shepherd, F.B.

    2014-01-01

    We consider robust (undirected) network design (RND) problems where the set of feasible demands may be given by an arbitrary convex body. This model, introduced by Ben-Ameur and Kerivin [Ben-Ameur W, Kerivin H (2003) New economical virtual private networks. Comm. ACM 46(6):69-73], generalizes the

  16. Precise object tracking under deformation

    International Nuclear Information System (INIS)

    Saad, M.H

    2010-01-01

    Precise object tracking is an essential issue in several serious applications such as robot vision, automated surveillance (civil and military), inspection, biomedical image analysis, video coding, motion segmentation, human-machine interfaces, visualization, medical imaging, traffic systems, satellite imaging, etc. This framework focuses on precise object tracking under deformations such as scaling, rotation, noise, blurring and changes of illumination. This research is an attempt to solve these serious problems in visual object tracking, by which the quality of the overall system will be improved. A three-dimensional (3D) geometrical model is developed to determine the current pose of an object and predict its future location based on an FIR model learned by OLS. The framework presents a robust ranging technique to track a visual target instead of relying on the traditional expensive ranging sensors. The presented research work is applied to a real video stream and achieves high-precision results.

  17. Precise Object Tracking under Deformation

    International Nuclear Information System (INIS)

    Saad, M.H.

    2010-01-01

    Precise object tracking is an essential issue in several serious applications such as robot vision, automated surveillance (civil and military), inspection, biomedical image analysis, video coding, motion segmentation, human-machine interfaces, visualization, medical imaging, traffic systems, satellite imaging, etc. This framework focuses on precise object tracking under deformations such as scaling, rotation, noise, blurring and changes of illumination. This research is an attempt to solve these serious problems in visual object tracking, by which the quality of the overall system will be improved. A three-dimensional (3D) geometrical model is developed to determine the current pose of an object and predict its future location based on an FIR model learned by OLS. The framework presents a robust ranging technique to track a visual target instead of relying on the traditional expensive ranging sensors. The presented research work is applied to a real video stream and achieves high-precision results.

  18. Robust Reliability or reliable robustness? - Integrated consideration of robustness and reliability aspects

    DEFF Research Database (Denmark)

    Kemmler, S.; Eifler, Tobias; Bertsche, B.

    2015-01-01

    products are and vice versa. For a comprehensive understanding and to use existing synergies between both domains, this paper discusses the basic principles of Reliability- and Robust Design theory. The development of a comprehensive model will enable an integrated consideration of both domains...

  19. Robust Watermarking of Video Streams

    Directory of Open Access Journals (Sweden)

    T. Polyák

    2006-01-01

    Full Text Available In the past few years there has been an explosion in the use of digital video data. Many people have personal computers at home, and with the help of the Internet users can easily share video files on their computer. This makes possible the unauthorized use of digital media, and without adequate protection systems the authors and distributors have no means to prevent it. Digital watermarking techniques can help these systems to be more effective by embedding secret data right into the video stream. This makes minor changes in the frames of the video, but these changes are almost imperceptible to the human visual system. The embedded information can involve copyright data, access control, etc. A robust watermark is resistant to various distortions of the video, so it cannot be removed without affecting the quality of the host medium. In this paper I propose a video watermarking scheme that fulfills the requirements of a robust watermark.
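
    The abstract does not spell out an embedding scheme, so the sketch below uses a classic additive spread-spectrum watermark with correlation detection, one standard way to obtain robustness against moderate distortions such as additive noise. The frame size, the strength alpha, the key and the detection threshold are illustrative assumptions.

```python
import numpy as np

def embed(frame, key, alpha=2.0):
    """Additive spread-spectrum watermark: add a keyed pseudo-random +/-1
    pattern, scaled by `alpha`, to the luminance of one frame."""
    rng = np.random.default_rng(key)
    pattern = rng.choice([-1.0, 1.0], size=frame.shape)
    return np.clip(frame + alpha * pattern, 0, 255), pattern

def detect(frame, pattern, threshold=0.5):
    """Correlate the (possibly distorted) frame with the keyed pattern."""
    f = frame - frame.mean()
    score = float(np.mean(f * pattern))
    return score, score > threshold

rng = np.random.default_rng(1)
frame = rng.integers(0, 256, (480, 640)).astype(float)
marked, pattern = embed(frame, key=42)
noisy = marked + rng.normal(0, 5, marked.shape)   # simulated distortion
print(detect(noisy, pattern))                     # high score -> mark present
print(detect(frame, pattern))                     # low score  -> no mark
```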

  20. Robust Decentralized Formation Flight Control

    Directory of Open Access Journals (Sweden)

    Zhao Weihua

    2011-01-01

    Full Text Available Motivated by the idea of multiplexed model predictive control (MMPC), this paper introduces a new framework for unmanned aerial vehicles (UAVs) formation flight and coordination. Formulated using the MMPC approach, the whole centralized formation flight system is considered as a linear periodic system with the control inputs of each UAV subsystem as its periodic inputs. Divided into decentralized subsystems, the whole formation flight system is guaranteed stable if proper terminal cost and terminal constraints are added to each decentralized MPC formulation of the UAV subsystem. The decentralized robust MPC formulation for each UAV subsystem with bounded input disturbances and model uncertainties is also presented. Furthermore, an obstacle avoidance control scheme for obstacles of any shape and size, including those not known a priori, is integrated under the unified MPC framework. The results from simulations demonstrate that the proposed framework can successfully achieve robust collision-free formation flights.

  1. Inefficient but robust public leadership.

    OpenAIRE

    Matsumura, Toshihiro; Ogawa, Akira

    2014-01-01

    We investigate endogenous timing in a mixed duopoly in a differentiated product market. We find that private leadership is better than public leadership from a social welfare perspective if the private firm is domestic, regardless of the degree of product differentiation. Nevertheless, the public leadership equilibrium is risk-dominant, and it is thus robust if the degree of product differentiation is high. We also find that regardless of the degree of product differentiation, the public lead...

  2. Testing Heteroscedasticity in Robust Regression

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2011-01-01

    Roč. 1, č. 4 (2011), s. 25-28 ISSN 2045-3345 Grant - others:GA ČR(CZ) GA402/09/0557 Institutional research plan: CEZ:AV0Z10300504 Keywords: robust regression * heteroscedasticity * regression quantiles * diagnostics Subject RIV: BB - Applied Statistics, Operational Research http://www.researchjournals.co.uk/documents/Vol4/06%20Kalina.pdf

  3. Robust power system frequency control

    CERN Document Server

    Bevrani, Hassan

    2014-01-01

    This updated edition of the industry standard reference on power system frequency control provides practical, systematic and flexible algorithms for regulating load frequency, offering new solutions to the technical challenges introduced by the escalating role of distributed generation and renewable energy sources in smart electric grids. The author emphasizes the physical constraints and practical engineering issues related to frequency in a deregulated environment, while fostering a conceptual understanding of frequency regulation and robust control techniques. The resulting control strategi

  4. Biometric feature embedding using robust steganography technique

    Science.gov (United States)

    Rashid, Rasber D.; Sellahewa, Harin; Jassim, Sabah A.

    2013-05-01

    This paper is concerned with robust steganographic techniques to hide and communicate biometric data in mobile media objects like images, over open networks. More specifically, the aim is to embed binarised features extracted using discrete wavelet transforms and local binary patterns of face images as a secret message in an image. The need for such techniques can arise in law enforcement, forensics, counter terrorism, internet/mobile banking and border control. What differentiates this problem from normal information hiding techniques is the added requirement that there should be minimal effect on face recognition accuracy. We propose an LSB-Witness embedding technique in which the secret message is already present in the LSB plane but, instead of changing the cover image LSB values, the second LSB plane is changed to stand as a witness/informer to the receiver during message recovery. Although this approach may affect the stego quality, it eliminates the weakness of traditional LSB schemes that is exploited by steganalysis techniques for LSB, such as PoV and RS steganalysis, to detect the existence of a secret message. Experimental results show that the proposed method is robust against PoV and RS attacks compared to other variants of LSB. We also discuss variants of this approach and determine the capacity requirements for embedding face biometric feature vectors while maintaining the accuracy of face recognition.
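
    One plausible reading of the LSB-Witness idea described above is sketched below: the cover's least-significant bits are left untouched, and the second LSB of each used pixel is set to flag whether the cover LSB already equals the message bit. This is only an interpretation of the abstract, not the authors' published algorithm; the bit convention and helper names are assumptions.

```python
import numpy as np

def embed_lsb_witness(cover, bits):
    """Set the 2nd LSB of the first len(bits) pixels to 1 where the cover's LSB
    already equals the message bit, and to 0 otherwise. Cover LSBs are untouched."""
    stego = cover.flatten().copy()
    for i, b in enumerate(bits):
        match = (stego[i] & 1) == b
        stego[i] = (stego[i] & 0b11111101) | (0b10 if match else 0)
    return stego.reshape(cover.shape)

def recover_lsb_witness(stego, n_bits):
    px = stego.flatten()[:n_bits]
    lsb = px & 1
    witness = (px >> 1) & 1
    return np.where(witness == 1, lsb, 1 - lsb)   # witness says whether LSB equals the bit

rng = np.random.default_rng(0)
cover = rng.integers(0, 256, (8, 8), dtype=np.uint8)
message = rng.integers(0, 2, 16)
stego = embed_lsb_witness(cover, message)
assert np.array_equal(recover_lsb_witness(stego, 16), message)
```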

  5. Robust Optimal Design of Quantum Electronic Devices

    Directory of Open Access Journals (Sweden)

    Ociel Morales

    2018-01-01

    Full Text Available We consider the optimal design of a sequence of quantum barriers, in order to manufacture an electronic device at the nanoscale such that the dependence of its transmission coefficient on the bias voltage is linear. The technique presented here is easily adaptable to other response characteristics. There are two distinguishing features of our approach. First, the transmission coefficient is determined using a semiclassical approximation, so we can explicitly compute the gradient of the objective function. Second, in contrast with earlier treatments, manufacturing uncertainties are incorporated in the model through random variables; the optimal design problem is formulated in a probabilistic setting and then solved using a stochastic collocation method. As a measure of robustness, a weighted sum of the expectation and the variance of a least-squares performance metric is considered. Several simulations illustrate the proposed technique, which shows an improvement in accuracy of over 69% with respect to brute-force, Monte-Carlo-based methods.
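
    The robustness measure mentioned above, a weighted sum of the expectation and the variance of a least-squares performance metric over the uncertain parameters, can be written compactly. The toy response model, the weights and the plain sampling estimator below (standing in for the paper's semiclassical transmission model and stochastic collocation rule) are assumptions for illustration.

```python
import numpy as np

def least_squares_metric(design, xi, v):
    """Toy stand-in for the performance metric: squared deviation of a 'response'
    from the ideal linear response over bias voltages v. `xi` perturbs the design
    to mimic manufacturing uncertainty."""
    response = np.tanh(np.outer(v, design + xi).sum(axis=1))
    ideal = v / v.max()
    return np.sum((response - ideal) ** 2)

def robust_objective(design, w_mean=1.0, w_var=1.0, n_samples=200, sigma=0.05, seed=0):
    """Weighted sum of expectation and variance of the metric under random
    perturbations xi ~ N(0, sigma^2) of the design parameters."""
    rng = np.random.default_rng(seed)
    v = np.linspace(0.1, 1.0, 50)
    vals = np.array([least_squares_metric(design, rng.normal(0, sigma, design.shape), v)
                     for _ in range(n_samples)])
    return w_mean * vals.mean() + w_var * vals.var()

design = np.array([0.4, -0.2, 0.1])
print("robust objective:", robust_objective(design))
```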

  6. Robust Circle Detection Using Harmony Search

    Directory of Open Access Journals (Sweden)

    Jaco Fourie

    2017-01-01

    Full Text Available Automatic circle detection is an important element of many image processing algorithms. Traditionally the Hough transform has been used to find circular objects in images but more modern approaches that make use of heuristic optimisation techniques have been developed. These are often used in large complex images where the presence of noise or limited computational resources make the Hough transform impractical. Previous research on the use of the Harmony Search (HS) in circle detection showed that HS is an attractive alternative to many of the modern circle detectors based on heuristic optimisers like genetic algorithms and simulated annealing. We propose improvements to this work that enable our algorithm to robustly find multiple circles in larger data sets and still work on realistic images that are heavily corrupted by noisy edges.
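
    A compact harmony-search loop for fitting a single circle to edge points is sketched below. It is not the authors' implementation; the objective (the number of edge points lying close to the candidate circle), the HS parameters and the synthetic data are illustrative.

```python
import numpy as np

def circle_score(params, pts, tol=2.0):
    """Number of edge points lying within `tol` pixels of the candidate circle."""
    cx, cy, r = params
    d = np.hypot(pts[:, 0] - cx, pts[:, 1] - cy)
    return np.sum(np.abs(d - r) < tol)

def harmony_search_circle(pts, iters=3000, hms=20, hmcr=0.9, par=0.3, bw=2.0, seed=0):
    rng = np.random.default_rng(seed)
    lo = np.array([pts[:, 0].min(), pts[:, 1].min(), 5.0])
    hi = np.array([pts[:, 0].max(), pts[:, 1].max(), 0.5 * np.ptp(pts)])
    memory = rng.uniform(lo, hi, (hms, 3))               # harmony memory of (cx, cy, r)
    scores = np.array([circle_score(h, pts) for h in memory])
    for _ in range(iters):
        new = np.empty(3)
        for j in range(3):
            if rng.random() < hmcr:                      # memory consideration
                new[j] = memory[rng.integers(hms), j]
                if rng.random() < par:                   # pitch adjustment
                    new[j] += rng.uniform(-bw, bw)
            else:                                        # random selection
                new[j] = rng.uniform(lo[j], hi[j])
        new = np.clip(new, lo, hi)
        s = circle_score(new, pts)
        worst = scores.argmin()
        if s > scores[worst]:                            # replace the worst harmony
            memory[worst], scores[worst] = new, s
    return memory[scores.argmax()]

# Synthetic edge map: a noisy circle centred at (60, 50) with radius 25, plus clutter.
rng = np.random.default_rng(1)
theta = rng.uniform(0, 2 * np.pi, 300)
circle = np.c_[60 + 25 * np.cos(theta), 50 + 25 * np.sin(theta)] + rng.normal(0, 0.5, (300, 2))
clutter = rng.uniform(0, 120, (200, 2))
print(harmony_search_circle(np.vstack([circle, clutter])))   # approximately (60, 50, 25)
```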

  7. Approximate truncation robust computed tomography—ATRACT

    International Nuclear Information System (INIS)

    Dennerlein, Frank; Maier, Andreas

    2013-01-01

    We present an approximate truncation robust algorithm to compute tomographic images (ATRACT). This algorithm targets the reconstruction of volumetric images from cone-beam projections in scenarios where these projections are highly truncated in each dimension. It thus facilitates reconstructions of small subvolumes of interest, without involving prior knowledge about the object. Our method is readily applicable to medical C-arm imaging, where it may contribute to new clinical workflows together with a considerable reduction of x-ray dose. We give a detailed derivation of ATRACT that starts from the conventional Feldkamp filtered-backprojection algorithm and that involves, as one component, a novel formula for the inversion of the two-dimensional Radon transform. Discretization and numerical implementation are discussed and reconstruction results from both simulated projections and first clinical data sets are presented. (paper)

  8. Robust high temperature oxygen sensor electrodes

    DEFF Research Database (Denmark)

    Lund, Anders

    Platinum is the most widely used material in high temperature oxygen sensor electrodes. However, platinum is expensive and the platinum electrode may, under certain conditions, suffer from poisoning, which is detrimental for an oxygen sensor. The objective of this thesis is to evaluate electrode materials as candidates for robust oxygen sensor electrodes. The present work focuses on characterising the electrochemical properties of a few electrode materials to understand which oxygen electrode processes are limiting for the response time of the sensor electrode. Three types of porous platinum......-Dansensor. The electrochemical properties of the electrodes were characterised by electrochemical impedance spectroscopy (EIS), and the structures were characterised by x-ray diffraction and electron microscopy. At an oxygen partial pressure of 0.2 bar, the response time of the sensor electrode was determined by oxygen...

  9. Distributed Robust Optimization in Networked System.

    Science.gov (United States)

    Wang, Shengnan; Li, Chunguang

    2016-10-11

    In this paper, we consider a distributed robust optimization (DRO) problem, where multiple agents in a networked system cooperatively minimize a global convex objective function with respect to a global variable under global constraints. The objective function can be represented by a sum of local objective functions. The global constraints contain some uncertain parameters which are partially known, and can be characterized by some inequality constraints. After problem transformation, we adopt the Lagrangian primal-dual method to solve this problem. We prove that the primal and dual optimal solutions of the problem are restricted in some specific sets, and we give a method to construct these sets. Then, we propose a DRO algorithm to find the primal-dual optimal solutions of the Lagrangian function, which consists of a subgradient step, a projection step, and a diffusion step; in the projection step of the algorithm, the optimized variables are projected onto the specific sets to guarantee the boundedness of the subgradients. Convergence analysis and numerical simulations verifying the performance of the proposed algorithm are then provided. Further, for the nonconvex DRO problem, the corresponding approach and algorithmic framework are also provided.
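
    The three-step structure described above (a subgradient step, a projection step and a diffusion step) can be sketched on a toy consensus problem. The network, step size, local objectives and box constraint below are illustrative assumptions; the paper's Lagrangian primal-dual handling of uncertain constraints is not reproduced.

```python
import numpy as np

# Toy problem: n agents cooperatively minimize sum_i ||x - a_i||^2 over a box
# constraint, mixing their estimates with a doubly-stochastic diffusion matrix W.
n, dim = 5, 2
rng = np.random.default_rng(0)
targets = rng.normal(0, 1, (n, dim))          # local data a_i
W = np.zeros((n, n))                          # ring network, symmetric weights
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25
lo_box, hi_box = -0.5, 0.5                    # global box constraint

x = np.zeros((n, dim))                        # each agent's local copy of the variable
for t in range(1, 501):
    step = 1.0 / np.sqrt(t)
    grad = 2 * (x - targets)                  # subgradient of each local objective
    x = x - step * grad                       # 1) subgradient step
    x = np.clip(x, lo_box, hi_box)            # 2) projection onto the constraint set
    x = W @ x                                 # 3) diffusion (mixing with neighbours)

print("local estimates:\n", x)
print("centralized solution:", np.clip(targets.mean(axis=0), lo_box, hi_box))
```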

  10. Robustness Analysis of Timber Truss Structure

    DEFF Research Database (Denmark)

    Rajčić, Vlatka; Čizmar, Dean; Kirkegaard, Poul Henning

    2010-01-01

    The present paper discusses robustness of structures in general and the robustness requirements given in the codes. Robustness of timber structures is also an issue, as it is closely related to Working Group 3 (Robustness of systems) of the COST E55 project. Finally, an example of a robustness evaluation of a wide-span timber truss structure is presented. This structure was built a few years ago near Zagreb and has a span of 45 m. Reliability analysis of the main members and the system is conducted and, based on this, a robustness analysis is performed.

  11. Soliton robustness in optical fibers

    International Nuclear Information System (INIS)

    Menyuk, C.R.

    1993-01-01

    Simulations and experiments indicate that solitons in optical fibers are robust in the presence of Hamiltonian deformations such as higher-order dispersion and birefringence but are destroyed in the presence of non-Hamiltonian deformations such as attenuation and the Raman effect. Two hypotheses are introduced that generalize these observations and give a recipe for when deformations will be Hamiltonian. Concepts from nonlinear dynamics are used to make these two hypotheses plausible. Soliton stabilization with frequency filtering is also briefly discussed from this point of view

  12. Robust and Sparse Factor Modelling

    DEFF Research Database (Denmark)

    Croux, Christophe; Exterkate, Peter

    Factor construction methods are widely used to summarize a large panel of variables by means of a relatively small number of representative factors. We propose a novel factor construction procedure that enjoys the properties of robustness to outliers and of sparsity; that is, having relatively few nonzero factor loadings. Compared to the traditional factor construction method, we find that this procedure leads to a favorable forecasting performance in the presence of outliers and to better interpretable factors. We investigate the performance of the method in a Monte Carlo experiment...

  13. Sustainable Resilient, Robust & Resplendent Enterprises

    DEFF Research Database (Denmark)

    Edgeman, Rick

    to their impact. Resplendent enterprises are introduced with resplendence referring not to some sort of public or private façade, but instead refers to organizations marked by dual brilliance and nobility of strategy, governance and comportment that yields superior and sustainable triple bottom line performance....... Herein resilience, robustness, and resplendence (R3) are integrated with sustainable enterprise excellence (Edgeman and Eskildsen, 2013) or SEE and social-ecological innovation (Eskildsen and Edgeman, 2012) to aid progress of a firm toward producing continuously relevant performance that proceed from...

  14. Robustness indicators and capacity models for railway networks

    DEFF Research Database (Denmark)

    Jensen, Lars Wittrup

    In a world continuously striving for higher mobility and the use of more sustainable modes of transport, there is a constant pressure on utilising railway capacity better and, at the same time, obtaining a high robustness against delays. During the planning of railway operations and infrastructure ..... This has motivated the research conducted and described in this thesis, where the objective has been to develop and improve existing methods to achieve timetable and infrastructure plans with robust capacity utilisation aimed at the strategic and early tactical planning phases.

  15. An Evolutionary Approach for Robust Layout Synthesis of MEMS

    DEFF Research Database (Denmark)

    Fan, Zhun; Wang, Jiachuan; Goodman, Erik

    2005-01-01

    The paper introduces a robust design method for layout synthesis of MEM resonators subject to inherent geometric uncertainties such as the fabrication error on the sidewall of the structure. The robust design problem is formulated as a multi-objective constrained optimisation problem after certain assumptions and treated with a multi-objective genetic algorithm (MOGA), a special type of evolutionary computing approach. A case study based on layout synthesis of a comb-driven MEM resonator shows that the approach proposed in this paper can lead to design results that meet the target performance and are less...

  16. Perspective: Evolution and detection of genetic robustness

    NARCIS (Netherlands)

    Visser, de J.A.G.M.; Hermisson, J.; Wagner, G.P.; Ancel Meyers, L.; Bagheri-Chaichian, H.; Blanchard, J.L.; Chao, L.; Cheverud, J.M.; Elena, S.F.; Fontana, W.; Gibson, G.; Hansen, T.F.; Krakauer, D.; Lewontin, R.C.; Ofria, C.; Rice, S.H.; Dassow, von G.; Wagner, A.; Whitlock, M.C.

    2003-01-01

    Robustness is the invariance of phenotypes in the face of perturbation. The robustness of phenotypes appears at various levels of biological organization, including gene expression, protein folding, metabolic flux, physiological homeostasis, development, and even organismal fitness. The mechanisms

  17. Robust Lyapunov controller for uncertain systems

    KAUST Repository

    Laleg-Kirati, Taous-Meriem; Elmetennani, Shahrazed

    2017-01-01

    Various examples of systems and methods are provided for Lyapunov control for uncertain systems. In one example, a system includes a process plant and a robust Lyapunov controller configured to control an input of the process plant. The robust

  18. Robust distributed cognitive relay beamforming

    KAUST Repository

    Pandarakkottilil, Ubaidulla

    2012-05-01

    In this paper, we present a distributed relay beamformer design for a cognitive radio network in which a cognitive (or secondary) transmit node communicates with a secondary receive node assisted by a set of cognitive non-regenerative relays. The secondary nodes share the spectrum with a licensed primary user (PU) node, and each node is assumed to be equipped with a single transmit/receive antenna. The interference to the PU resulting from the transmission from the cognitive nodes is kept below a specified limit. The proposed robust cognitive relay beamformer design seeks to minimize the total relay transmit power while ensuring that the transceiver signal-to-interference- plus-noise ratio and PU interference constraints are satisfied. The proposed design takes into account a parameter of the error in the channel state information (CSI) to render the performance of the beamformer robust in the presence of imperfect CSI. Though the original problem is non-convex, we show that the proposed design can be reformulated as a tractable convex optimization problem that can be solved efficiently. Numerical results are provided and illustrate the performance of the proposed designs for different network operating conditions and parameters. © 2012 IEEE.

  19. Robust canonical correlations: A comparative study

    OpenAIRE

    Branco, JA; Croux, Christophe; Filzmoser, P; Oliveira, MR

    2005-01-01

    Several approaches for robust canonical correlation analysis will be presented and discussed. A first method is based on the definition of canonical correlation analysis as looking for linear combinations of two sets of variables having maximal (robust) correlation. A second method is based on alternating robust regressions. These methods are discussed in detail and compared with the more traditional approach to robust canonical correlation via covariance matrix estimates. A simulation study ...
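
    The "alternating robust regressions" approach mentioned above can be sketched with an off-the-shelf Huber regression: fix one canonical vector, robustly regress its scores on the other block of variables, and alternate. The use of scikit-learn's HuberRegressor, the normalisation and the fixed iteration count are illustrative choices, not those of the paper.

```python
import numpy as np
from sklearn.linear_model import HuberRegressor

def alternating_robust_cca(X, Y, n_iter=20, seed=0):
    """First canonical pair via alternating robust (Huber) regressions."""
    rng = np.random.default_rng(seed)
    b = rng.normal(size=Y.shape[1])
    b /= np.linalg.norm(b)
    for _ in range(n_iter):
        t = Y @ b
        a = HuberRegressor().fit(X, t).coef_      # robustly regress Y-scores on X
        a /= np.std(X @ a)
        s = X @ a
        b = HuberRegressor().fit(Y, s).coef_      # robustly regress X-scores on Y
        b /= np.std(Y @ b)
    # Plain Pearson correlation of the final scores, reported for simplicity.
    return a, b, abs(np.corrcoef(X @ a, Y @ b)[0, 1])

rng = np.random.default_rng(1)
z = rng.normal(size=(300, 1))                     # shared latent factor
X = z @ rng.normal(size=(1, 4)) + 0.3 * rng.normal(size=(300, 4))
Y = z @ rng.normal(size=(1, 3)) + 0.3 * rng.normal(size=(300, 3))
X[:10] += 10 * rng.normal(size=(10, 4))           # a few gross outliers
a, b, rho = alternating_robust_cca(X, Y)
print("robust canonical correlation estimate:", round(rho, 3))
```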

  20. Robust adaptive synchronization of general dynamical networks ...

    Indian Academy of Sciences (India)

    Pramana – Journal of Physics, Volume 86, Issue 6. A robust adaptive synchronization scheme for these general complex networks with multiple delays and uncertainties is established by employing the robust adaptive control principle and the Lyapunov stability theory. We choose ...

  1. A robust standard deviation control chart

    NARCIS (Netherlands)

    Schoonhoven, M.; Does, R.J.M.M.

    2012-01-01

    This article studies the robustness of Phase I estimators for the standard deviation control chart. A Phase I estimator should be efficient in the absence of contaminations and resistant to disturbances. Most of the robust estimators proposed in the literature are robust against either diffuse

  2. Methodology in robust and nonparametric statistics

    CERN Document Server

    Jurecková, Jana; Picek, Jan

    2012-01-01

    Contents: Introduction and Synopsis (Introduction; Synopsis); Preliminaries (Introduction; Inference in Linear Models; Robustness Concepts; Robust and Minimax Estimation of Location; Clippings from Probability and Asymptotic Theory; Problems); Robust Estimation of Location and Regression (Introduction; M-Estimators; L-Estimators; R-Estimators; Minimum Distance and Pitman Estimators; Differentiable Statistical Functions; Problems); Asymptotic Representations for L-Estimators ...

  3. Robust Parameter and Signal Estimation in Induction Motors

    DEFF Research Database (Denmark)

    Børsting, H.

    This thesis deals with theories and methods for robust parameter and signal estimation in induction motors. The project originates in industrial interests concerning sensor-less control of electrical drives. During the work, some general problems concerning estimation of signals and parameters in nonlinear systems have been exposed. The main objectives of this project are: - analysis and application of theories and methods for robust estimation of parameters in a model structure, obtained from knowledge of the physics of the induction motor; - analysis and application of theories and methods for robust estimation of the rotor speed and driving torque of the induction motor based only on measurements of stator voltages and currents. Only continuous-time models have been used, which means that physically related signals and parameters are estimated directly and not indirectly by some discrete...

  4. A Robust H∞ Controller for an UAV Flight Control System

    Directory of Open Access Journals (Sweden)

    J. López

    2015-01-01

    Full Text Available The objective of this paper is the implementation and validation of a robust H∞ controller for a UAV to track all types of manoeuvres in the presence of a noisy environment. A robust inner-outer loop strategy is implemented. To design the robust H∞ controller in the inner loop, the H∞ control methodology is used. The two controllers that form the outer loop are designed using the H∞ Loop Shaping technique. The reference vector used in the control architecture, formed by vertical velocity, true airspeed, and heading angle, suggests a nontraditional way to pilot the aircraft. The simulation results show that the proposed control scheme works well despite the presence of noise and uncertainties, so the control system satisfies the requirements.

  5. Evaluation of Robust Estimators Applied to Fluorescence Assays

    Directory of Open Access Journals (Sweden)

    U. Ruotsalainen

    2007-12-01

    Full Text Available We evaluated standard robust methods in the estimation of fluorescence signal in novel assays used for determining biomolecule concentrations. The objective was to obtain an accurate and reliable estimate using as few observations as possible by decreasing the influence of outliers. We assumed the true signals to have a Gaussian distribution, while no assumptions about the outliers were made. The experimental results showed that the arithmetic mean performs poorly even with modest deviations. Further, the robust methods, especially the M-estimators, performed extremely well. The results proved that the use of robust methods is advantageous in estimation problems where noise and deviations are significant, such as in biological and medical applications.
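
    A minimal comparison in the spirit of the study above: the sample mean, the median and a Huber M-estimator (computed by iteratively reweighted averaging) applied to Gaussian measurements contaminated by a few outliers. The contamination level and the tuning constant are illustrative assumptions.

```python
import numpy as np

def huber_m_estimate(x, c=1.345, n_iter=50):
    """Location M-estimator with Huber weights, via iteratively reweighted means."""
    mu = np.median(x)
    scale = 1.4826 * np.median(np.abs(x - np.median(x)))   # MAD as robust scale
    for _ in range(n_iter):
        r = (x - mu) / scale
        w = np.minimum(1.0, c / np.maximum(np.abs(r), 1e-12))   # Huber weights
        mu = np.sum(w * x) / np.sum(w)
    return mu

rng = np.random.default_rng(0)
signal = rng.normal(100.0, 2.0, 50)          # true fluorescence level: 100
signal[:5] += 60.0                           # a few outlying observations
print("mean  :", signal.mean())              # pulled towards the outliers
print("median:", np.median(signal))
print("Huber :", huber_m_estimate(signal))
```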

  6. Employing Sensitivity Derivatives for Robust Optimization under Uncertainty in CFD

    Science.gov (United States)

    Newman, Perry A.; Putko, Michele M.; Taylor, Arthur C., III

    2004-01-01

    A robust optimization is demonstrated on a two-dimensional inviscid airfoil problem in subsonic flow. Given uncertainties in statistically independent, random, normally distributed flow parameters (input variables), an approximate first-order statistical moment method is employed to represent the Computational Fluid Dynamics (CFD) code outputs as expected values with variances. These output quantities are used to form the objective function and constraints. The constraints are cast in probabilistic terms; that is, the probability that a constraint is satisfied is greater than or equal to some desired target probability. Gradient-based robust optimization of this stochastic problem is accomplished through use of both first and second-order sensitivity derivatives. For each robust optimization, the effect of increasing both input standard deviations and target probability of constraint satisfaction are demonstrated. This method provides a means for incorporating uncertainty when considering small deviations from input mean values.
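
    The first-order statistical moment method mentioned above propagates input means and variances through the code using sensitivity derivatives: E[f] ≈ f(mu) and Var[f] ≈ sum_i (df/dx_i)^2 sigma_i^2, after which a probabilistic constraint can be treated as mean plus k standard deviations. The toy "CFD output" function, the finite-difference sensitivities and the value of k below are assumptions for illustration.

```python
import numpy as np

def output(x):
    """Toy stand-in for a CFD output (e.g. a drag-like quantity)."""
    mach, alpha = x
    return 0.02 + 0.1 * alpha**2 + 0.05 * np.exp(3 * (mach - 0.8))

def first_order_moments(f, mu, sigma, h=1e-5):
    """Approximate mean and variance of f(x) for independent normal inputs."""
    mu = np.asarray(mu, dtype=float)
    grad = np.array([(f(mu + h * e) - f(mu - h * e)) / (2 * h) for e in np.eye(len(mu))])
    mean_f = f(mu)
    var_f = np.sum((grad * np.asarray(sigma)) ** 2)
    return mean_f, var_f

mu = [0.75, 2.0]            # mean Mach number and angle of attack (degrees)
sigma = [0.01, 0.1]         # input standard deviations
m, v = first_order_moments(output, mu, sigma)
k = 2.0                     # roughly a 97.7% one-sided target probability
print(f"E[f] ~ {m:.4f}, sd[f] ~ {np.sqrt(v):.4f}, constraint value ~ {m + k*np.sqrt(v):.4f}")
```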

  7. Robust Layout Synthesis of a MEM Crab-Leg Resonator Using a Constrained Genetic Algorithm

    DEFF Research Database (Denmark)

    Fan, Zhun; Achiche, Sofiane

    2007-01-01

    The research work carried out in this paper introduces a robust design method for layout synthesis of MEM resonator subject to inherent geometric uncertainties such as the fabrication error on the sidewall of the structure. The robust design problem is formulated as a multi-objective constrained...

  8. Container Materials, Fabrication And Robustness

    International Nuclear Information System (INIS)

    Dunn, K.; Louthan, M.; Rawls, G.; Sindelar, R.; Zapp, P.; Mcclard, J.

    2009-01-01

    The multi-barrier 3013 container used to package plutonium-bearing materials is robust and thereby highly resistant to identified degradation modes that might cause failure. The only viable degradation mechanisms identified by a panel of technical experts were pressurization within and corrosion of the containers. Evaluations of the container materials and the fabrication processes and resulting residual stresses suggest that the multi-layered containers will mitigate the potential for degradation of the outer container and prevent the release of the container contents to the environment. Additionally, the ongoing surveillance programs and laboratory studies should detect any incipient degradation of containers in the 3013 storage inventory before an outer container is compromised.

  9. Robust matching for voice recognition

    Science.gov (United States)

    Higgins, Alan; Bahler, L.; Porter, J.; Blais, P.

    1994-10-01

    This paper describes an automated method of comparing a voice sample of an unknown individual with samples from known speakers in order to establish or verify the individual's identity. The method is based on a statistical pattern matching approach that employs a simple training procedure, requires no human intervention (transcription, word or phonetic marking, etc.), and makes no assumptions regarding the expected form of the statistical distributions of the observations. The content of the speech material (vocabulary, grammar, etc.) is not assumed to be constrained in any way. An algorithm is described which incorporates frame pruning and channel equalization processes designed to achieve robust performance with reasonable computational resources. An experimental implementation demonstrating the feasibility of the concept is described.

  10. Random clustering ferns for multimodal object recognition

    OpenAIRE

    Villamizar Vergel, Michael Alejandro; Garrell Zulueta, Anais; Sanfeliu Cortés, Alberto; Moreno-Noguer, Francesc

    2017-01-01

    The final publication is available at link.springer.com We propose an efficient and robust method for the recognition of objects exhibiting multiple intra-class modes, where each one is associated with a particular object appearance. The proposed method, called random clustering ferns, combines synergically a single and real-time classifier, based on the boosted assembling of extremely randomized trees (ferns), with an unsupervised and probabilistic approach in order to recognize efficient...

  11. Robust Airborne Networking Extensions (RANGE)

    National Research Council Canada - National Science Library

    Henderson, Thomas R

    2008-01-01

    .... The secondary objective is to investigate the application of these protocols to hybrid Navy/USMC/Joint/Coalition networks, including the integration of shore and ground-based (littoral) components...

  12. Robustness Assessment of Spatial Timber Structures

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning

    2012-01-01

    Robustness of structural systems has obtained a renewed interest due to a much more frequent use of advanced types of structures with limited redundancy and serious consequences in case of failure. In order to minimise the likelihood of such disproportionate structural failures many modern building codes consider the need for robustness of structures and provide strategies and methods to obtain robustness. Therefore a structural engineer may take necessary steps to design robust structures that are insensitive to accidental circumstances. The present paper summarises issues with respect to robustness of spatial timber structures and will discuss the consequences of such robustness issues related to the future development of timber structures.

  13. Quasi-objects, Cult Objects and Fashion Objects

    DEFF Research Database (Denmark)

    Andersen, Bjørn Schiermer

    2011-01-01

    This article attempts to rehabilitate the concept of fetishism and to contribute to the debate on the social role of objects as well as to fashion theory. Extrapolating from Michel Serres’ theory of the quasi-objects, I distinguish two phenomenologies possessing almost opposite characteristics. These two phenomenologies are, so I argue, essential to quasi-object theory, yet largely ignored by Serres’ sociological interpreters. They correspond with the two different theories of fetishism found in Marx and Durkheim, respectively. In the second half of the article, I introduce the fashion object as a unique opportunity for studying the interchange between these two forms of fetishism and their respective phenomenologies. Finally, returning to Serres, I briefly consider the theoretical consequences of introducing the fashion object as a quasi-object.

  14. Robust Model Predictive Control of Networked Control Systems under Input Constraints and Packet Dropouts

    Directory of Open Access Journals (Sweden)

    Deyin Yao

    2014-01-01

    Full Text Available This paper deals with the problem of robust model predictive control (RMPC) for a class of linear time-varying systems with constraints and data losses. We take the polytopic uncertainties into account to describe the uncertain systems. First, we design a robust state observer by using linear matrix inequality (LMI) constraints so that the original system state can be tracked. Second, the MPC gain is calculated by minimizing the upper bound of the infinite horizon robust performance objective in terms of linear matrix inequality conditions. The method of robust MPC and state observer design is illustrated by a numerical example.

  15. Systematic and robust design of photonic crystal waveguides by topology optimization

    DEFF Research Database (Denmark)

    Wang, Fengwen; Jensen, Jakob Søndergaard; Sigmund, Ole

    2010-01-01

    A robust topology optimization method is presented to consider manufacturing uncertainties in tailoring the dispersion properties of photonic crystal waveguides. The under-, normal and over-etching scenarios in the manufacturing process are represented by dilated, intermediate and eroded designs based on a threshold projection. The objective is formulated to minimize the maximum error between actual group indices and a prescribed group index among these three designs. A novel photonic crystal waveguide facilitating slow light with a group index of n_g = 40 is achieved by the robust optimization approach. The numerical result illustrates that the robust topology optimization provides a systematic and robust design methodology for photonic crystal waveguide design.
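
    The eroded/intermediate/dilated representation used in the robust formulation above is commonly implemented with a smoothed Heaviside (tanh) threshold projection evaluated at three thresholds. The projection formula below is the standard one from the topology-optimization literature; the threshold values and sharpness are illustrative, and the group-index objective itself is not reproduced.

```python
import numpy as np

def threshold_projection(rho, eta, beta):
    """Smoothed Heaviside projection of a density field `rho` at threshold `eta`."""
    return (np.tanh(beta * eta) + np.tanh(beta * (rho - eta))) / \
           (np.tanh(beta * eta) + np.tanh(beta * (1.0 - eta)))

rng = np.random.default_rng(0)
rho = rng.uniform(0, 1, (64, 64))        # filtered design variables in [0, 1]
beta = 8.0                               # projection sharpness

eroded       = threshold_projection(rho, eta=0.7, beta=beta)   # over-etching
intermediate = threshold_projection(rho, eta=0.5, beta=beta)   # nominal manufacturing
dilated      = threshold_projection(rho, eta=0.3, beta=beta)   # under-etching

# In a worst-case robust formulation, the objective is evaluated on all three
# realizations and the worst of the three values is minimized over the design.
print(eroded.mean(), intermediate.mean(), dilated.mean())
```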

  16. Learning Object Repositories

    Science.gov (United States)

    Lehman, Rosemary

    2007-01-01

    This chapter looks at the development and nature of learning objects, meta-tagging standards and taxonomies, learning object repositories, learning object repository characteristics, and types of learning object repositories, with type examples. (Contains 1 table.)

  17. Approximating the Pareto Set of Multiobjective Linear Programs via Robust Optimization

    NARCIS (Netherlands)

    Gorissen, B.L.; den Hertog, D.

    2012-01-01

    Abstract: The Pareto set of a multiobjective optimization problem consists of the solutions for which one or more objectives cannot be improved without deteriorating one or more other objectives. We consider problems with linear objectives and linear constraints and use Adjustable Robust

  18. Operations of a non-stellar object tracker in space

    DEFF Research Database (Denmark)

    Riis, Troels; Jørgensen, John Leif; Betto, Maurizio

    1999-01-01

    The ability to detect and track non-stellar objects by utilizing a star tracker may seem rather straightforward, as any bright object not recognized as a star by the system is a non-stellar object. However, several pitfalls and errors exist, if a reliable and robust detection is required. To te...

  19. Advances in Modal Analysis Using a Robust and Multiscale Method

    Science.gov (United States)

    Picard, Cécile; Frisson, Christian; Faure, François; Drettakis, George; Kry, Paul G.

    2010-12-01

    This paper presents a new approach to modal synthesis for rendering sounds of virtual objects. We propose a generic method that preserves sound variety across the surface of an object at different scales of resolution and for a variety of complex geometries. The technique performs automatic voxelization of a surface model and automatic tuning of the parameters of hexahedral finite elements, based on the distribution of material in each cell. The voxelization is performed using a sparse regular grid embedding of the object, which permits the construction of plausible lower resolution approximations of the modal model. We can compute the audible impulse response of a variety of objects. Our solution is robust and can handle nonmanifold geometries that include both volumetric and surface parts. We present a system which allows us to manipulate and tune sounding objects in an appropriate way for games, training simulations, and other interactive virtual environments.

  20. Advances in Modal Analysis Using a Robust and Multiscale Method

    Directory of Open Access Journals (Sweden)

    Frisson Christian

    2010-01-01

    Full Text Available Abstract This paper presents a new approach to modal synthesis for rendering sounds of virtual objects. We propose a generic method that preserves sound variety across the surface of an object at different scales of resolution and for a variety of complex geometries. The technique performs automatic voxelization of a surface model and automatic tuning of the parameters of hexahedral finite elements, based on the distribution of material in each cell. The voxelization is performed using a sparse regular grid embedding of the object, which permits the construction of plausible lower resolution approximations of the modal model. We can compute the audible impulse response of a variety of objects. Our solution is robust and can handle nonmanifold geometries that include both volumetric and surface parts. We present a system which allows us to manipulate and tune sounding objects in an appropriate way for games, training simulations, and other interactive virtual environments.

  1. Robust holographic storage system design.

    Science.gov (United States)

    Watanabe, Takahiro; Watanabe, Minoru

    2011-11-21

    Demand is increasing daily for large data storage systems that are useful for applications in spacecraft, space satellites, and space robots, which are all exposed to the radiation-rich space environment. As candidates for use in space embedded systems, holographic storage systems are promising because they can easily provide the demanded large storage capability. Particularly, holographic storage systems which have no rotation mechanism are demanded because they are virtually maintenance-free. Although a holographic memory itself is an extremely robust device even in a space radiation environment, its associated lasers and drive circuit devices are vulnerable. Such vulnerabilities sometimes engender severe problems that prevent reading of all contents of the holographic memory, which is a turn-off failure mode of a laser array. This paper therefore presents a proposal for a recovery method for the turn-off failure mode of a laser array on a holographic storage system, and describes results of an experimental demonstration. © 2011 Optical Society of America

  2. Robust boosting via convex optimization

    Science.gov (United States)

    Rätsch, Gunnar

    2001-12-01

    In this work we consider statistical learning problems. A learning machine aims to extract information from a set of training examples such that it is able to predict the associated label on unseen examples. We consider the case where the resulting classification or regression rule is a combination of simple rules - also called base hypotheses. The so-called boosting algorithms iteratively find a weighted linear combination of base hypotheses that predict well on unseen data. We address the following issues: o The statistical learning theory framework for analyzing boosting methods. We study learning theoretic guarantees on the prediction performance on unseen examples. Recently, large margin classification techniques emerged as a practical result of the theory of generalization, in particular Boosting and Support Vector Machines. A large margin implies a good generalization performance. Hence, we analyze how large the margins in boosting are and find an improved algorithm that is able to generate the maximum margin solution. o How can boosting methods be related to mathematical optimization techniques? To analyze the properties of the resulting classification or regression rule, it is of high importance to understand whether and under which conditions boosting converges. We show that boosting can be used to solve large scale constrained optimization problems, whose solutions are well characterizable. To show this, we relate boosting methods to methods known from mathematical optimization, and derive convergence guarantees for a quite general family of boosting algorithms. o How to make Boosting noise robust? One of the problems of current boosting techniques is that they are sensitive to noise in the training sample. In order to make boosting robust, we transfer the soft margin idea from support vector learning to boosting. We develop theoretically motivated regularized algorithms that exhibit a high noise robustness. o How to adapt boosting to regression problems
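
    As a rough illustration of making boosting less sensitive to noisy labels, the sketch below caps the sample weights in a plain AdaBoost loop so that no single (possibly mislabelled) example can dominate. This is in the spirit of, but not identical to, the soft-margin/regularized boosting developed in the thesis; the base learner, the cap value and the data are assumptions.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_capped(X, y, n_rounds=50, cap=5.0):
    """AdaBoost with sample weights capped at cap/n after each update, limiting
    the influence of noisy examples. Labels y must be in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)
    learners, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = np.clip(np.sum(w[pred != y]) / np.sum(w), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        w = w * np.exp(-alpha * y * pred)
        w = np.minimum(w / w.sum(), cap / n)      # normalize, then cap the weights
        w = w / w.sum()
        learners.append(stump)
        alphas.append(alpha)
    return learners, alphas

def predict(learners, alphas, X):
    agg = sum(a * clf.predict(X) for clf, a in zip(learners, alphas))
    return np.sign(agg)

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
y[rng.choice(400, 40, replace=False)] *= -1       # 10% label noise
learners, alphas = adaboost_capped(X, y)
print("training accuracy:", np.mean(predict(learners, alphas, X) == y))
```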

  3. Robust Meter Network for Water Distribution Pipe Burst Detection

    OpenAIRE

    Donghwi Jung; Joong Hoon Kim

    2017-01-01

    A meter network is a set of meters installed throughout a water distribution system to measure system variables, such as the pipe flow rate and pressure. In the current hyper-connected world, meter networks are being exposed to meter failure conditions, such as malfunction of the meter’s physical system and communication system failure. Therefore, a meter network’s robustness should be secured for reliable provision of informative meter data. This paper introduces a multi-objective optimal me...

  4. Robust Image Analysis of Faces for Genetic Applications

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2010-01-01

    Roč. 6, č. 2 (2010), s. 95-102 ISSN 1801-5603 R&D Projects: GA MŠk(CZ) 1M06014 Institutional research plan: CEZ:AV0Z10300504 Keywords : object localization * template matching * eye or mouth detection * robust correlation analysis * image denoising Subject RIV: BB - Applied Statistics, Operational Research http://www.ejbi.cz/articles/201012/47/1.html

  5. Robust AIC with High Breakdown Scale Estimate

    Directory of Open Access Journals (Sweden)

    Shokrya Saleh

    2014-01-01

    Full Text Available The Akaike Information Criterion (AIC) based on least squares (LS) regression minimizes the sum of the squared residuals; LS is sensitive to outlier observations. Alternative criteria, which are less sensitive to outlying observations, have been proposed; examples are the robust AIC (RAIC), robust Mallows Cp (RCp), and robust Bayesian information criterion (RBIC). In this paper, we propose a robust AIC by replacing the scale estimate with a high breakdown point estimate of scale. The robustness of the proposed method is studied through its influence function. We show that the proposed robust AIC is effective in selecting accurate models in the presence of outliers and high leverage points, through simulated and real data examples.

  6. Multimodel Robust Control for Hydraulic Turbine

    OpenAIRE

    Osuský, Jakub; Števo, Stanislav

    2014-01-01

    The paper deals with the multimodel and robust control system design and their combination based on M-Δ structure. Controller design will be done in the frequency domain with nominal performance specified by phase margin. Hydraulic turbine model is analyzed as system with unstructured uncertainty, and robust stability condition is included in controller design. Multimodel and robust control approaches are presented in detail on hydraulic turbine model. Control design approaches are compared a...

  7. Forecasting exchange rates: a robust regression approach

    OpenAIRE

    Preminger, Arie; Franck, Raphael

    2005-01-01

    The least squares estimation method as well as other ordinary estimation method for regression models can be severely affected by a small number of outliers, thus providing poor out-of-sample forecasts. This paper suggests a robust regression approach, based on the S-estimation method, to construct forecasting models that are less sensitive to data contamination by outliers. A robust linear autoregressive (RAR) and a robust neural network (RNN) models are estimated to study the predictabil...

  8. Kernel Based Nonlinear Dimensionality Reduction and Classification for Genomic Microarray

    Directory of Open Access Journals (Sweden)

    Lan Shu

    2008-07-01

    Full Text Available Genomic microarrays are powerful research tools in bioinformatics and modern medicinal research because they enable massively-parallel assays and simultaneous monitoring of thousands of gene expressions of biological samples. However, a simple microarray experiment often leads to very high-dimensional data and a huge amount of information; this vast amount of data challenges researchers to extract the important features and reduce the high dimensionality. In this paper, a kernel-based nonlinear dimensionality reduction method built on locally linear embedding (LLE) is proposed, and a fuzzy K-nearest neighbours algorithm, which denoises the data sets, is introduced as a replacement for the classical LLE KNN algorithm. In addition, a kernel-based support vector machine (SVM) is used to classify the genomic microarray data sets. We demonstrate the application of the techniques to two published DNA microarray data sets. The experimental results confirm the superiority and high success rates of the presented method.
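
    The pipeline sketched in the abstract (nonlinear dimensionality reduction followed by a kernel SVM) can be assembled from standard scikit-learn components. The plain LocallyLinearEmbedding below stands in for the paper's kernel-based LLE with fuzzy K-nearest-neighbour denoising, and the synthetic data and parameter values are assumptions.

```python
import numpy as np
from sklearn.manifold import LocallyLinearEmbedding
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for a microarray matrix: 200 samples x 2000 genes,
# with a handful of informative genes separating two classes.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, 200)
X = rng.normal(0, 1, (200, 2000))
X[:, :20] += y[:, None] * 1.5

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

model = make_pipeline(
    StandardScaler(),
    LocallyLinearEmbedding(n_neighbors=12, n_components=10),  # nonlinear reduction
    SVC(kernel="rbf", C=1.0, gamma="scale"),                  # kernel-based classifier
)
model.fit(X_tr, y_tr)
print("test accuracy:", model.score(X_te, y_te))
```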

  9. Kernel-Based Approximate Dynamic Programming Using Bellman Residual Elimination

    Science.gov (United States)

    2010-02-01


  10. Kernel based orthogonalization for change detection in hyperspectral images

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg

    function and all quantities needed in the analysis are expressed in terms of this kernel function. This means that we need not know the nonlinear mappings explicitly. Kernel PCA and MNF analyses handle nonlinearities by implicitly transforming data into high (even infinite) dimensional feature space via...... analysis all 126 spectral bands of the HyMap are included. Changes on the ground are most likely due to harvest having taken place between the two acquisitions and solar effects (both solar elevation and azimuth have changed). Both types of kernel analysis emphasize change and unlike kernel PCA, kernel MNF...

  11. Kernel-Based Learning for Domain-Specific Relation Extraction

    Science.gov (United States)

    Basili, Roberto; Giannone, Cristina; Del Vescovo, Chiara; Moschitti, Alessandro; Naggar, Paolo

    In a specific process of business intelligence, i.e. investigation on organized crime, empirical language processing technologies can play a crucial role. The analysis of transcriptions on investigative activities, such as police interrogatories, for the recognition and storage of complex relations among people and locations is a very difficult and time consuming task, ultimately based on pools of experts. We discuss here an inductive relation extraction platform that opens the way to much cheaper and consistent workflows. The presented empirical investigation shows that accurate results, comparable to the expert teams, can be achieved, and parametrization allows to fine tune the system behavior for fitting domain-specific requirements.

  12. Efficient Kernel-Based Ensemble Gaussian Mixture Filtering

    KAUST Repository

    Liu, Bo; Ait-El-Fquih, Boujemaa; Hoteit, Ibrahim

    2015-01-01

    (KF)-like update of the ensemble members and a particle filter (PF)-like update of the weights, followed by a resampling step to start a new forecast cycle. After formulating EnGMF for any observational operator, we analyze the influence

  13. Kernel based collaborative recommender system for e-purchasing

    Indian Academy of Sciences (India)

    proposed in order to overcome the traditional problems of CRS. ... known method for matrix factorization that provides the lowest rank ..... Adomavicius G, Tuzhilin A 2005 Toward the next generation of recommender systems: A survey of.

  14. Investigation into Text Classification With Kernel Based Schemes

    Science.gov (United States)

    2010-03-01

    ... are represented as a term-document matrix, common evaluation metrics, and the software package Text to Matrix Generator (TMG). The classifier ... This chapter introduces the indexing capabilities of the Text to Matrix Generator (TMG) Toolbox. Specific attention is placed on the

  15. Kernel-based noise filtering of neutron detector signals

    International Nuclear Information System (INIS)

    Park, Moon Ghu; Shin, Ho Cheol; Lee, Eun Ki

    2007-01-01

    This paper describes recently developed techniques for effective filtering of neutron detector signal noise. In this paper, three kinds of noise filters are proposed and their performance is demonstrated for the estimation of reactivity. The tested filters are based on the unilateral kernel filter, unilateral kernel filter with adaptive bandwidth and bilateral filter to show their effectiveness in edge preservation. Filtering performance is compared with conventional low-pass and wavelet filters. The bilateral filter shows a remarkable improvement compared with unilateral kernel and wavelet filters. The effectiveness and simplicity of the unilateral kernel filter with adaptive bandwidth is also demonstrated by applying it to the reactivity measurement performed during reactor start-up physics tests
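
    The bilateral filter highlighted above can be written for a one-dimensional detector signal in a few lines: each sample is replaced by an average of its neighbours, weighted both by temporal distance (the domain kernel) and by amplitude difference (the range kernel), which is what preserves edges such as step changes in the signal. The window size and the two bandwidths below are illustrative assumptions.

```python
import numpy as np

def bilateral_filter_1d(x, half_window=10, sigma_t=4.0, sigma_r=0.5):
    """Edge-preserving smoothing of a 1-D signal with a bilateral kernel."""
    y = np.empty_like(x, dtype=float)
    idx = np.arange(-half_window, half_window + 1)
    domain_w = np.exp(-0.5 * (idx / sigma_t) ** 2)            # weight by temporal distance
    for i in range(len(x)):
        lo, hi = max(0, i - half_window), min(len(x), i + half_window + 1)
        window = x[lo:hi]
        dw = domain_w[(lo - i) + half_window:(hi - i) + half_window]
        range_w = np.exp(-0.5 * ((window - x[i]) / sigma_r) ** 2)  # weight by amplitude difference
        w = dw * range_w
        y[i] = np.sum(w * window) / np.sum(w)
    return y

# Noisy step signal, loosely mimicking a detector response to a reactivity change.
rng = np.random.default_rng(0)
signal = np.concatenate([np.ones(200), 2.0 * np.ones(200)]) + rng.normal(0, 0.2, 400)
smoothed = bilateral_filter_1d(signal)
print("around the edge:", smoothed[195], "->", smoothed[205])
```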

  16. Robustness: confronting lessons from physics and biology.

    Science.gov (United States)

    Lesne, Annick

    2008-11-01

    The term robustness is encountered in very different scientific fields, from engineering and control theory to dynamical systems to biology. The main question addressed herein is whether the notion of robustness and its correlates (stability, resilience, self-organisation) developed in physics are relevant to biology, or whether specific extensions and novel frameworks are required to account for the robustness properties of living systems. To clarify this issue, the different meanings covered by this unique term are discussed; it is argued that they crucially depend on the kind of perturbations that a robust system should by definition withstand. Possible mechanisms underlying robust behaviours are examined, either encountered in all natural systems (symmetries, conservation laws, dynamic stability) or specific to biological systems (feedbacks and regulatory networks). Special attention is devoted to the (sometimes counterintuitive) interrelations between robustness and noise. A distinction between dynamic selection and natural selection in the establishment of a robust behaviour is underlined. It is finally argued that nested notions of robustness, relevant to different time scales and different levels of organisation, allow one to reconcile the seemingly contradictory requirements for robustness and adaptability in living systems.

  17. Robustness of Long Span Reciprocal Timber Structures

    DEFF Research Database (Denmark)

    Balfroid, Nathalie; Kirkegaard, Poul Henning

    2011-01-01

    Robustness of structural systems has obtained a renewed interest due to a much more frequent use of advanced types of structures with limited redundancy and serious consequences in case of failure. The interest has also been facilitated by recent severe structural failures. ... A structural engineer may take necessary steps to design robust structures that are insensitive to accidental circumstances. The present paper discusses such robustness issues related to the future development of reciprocal timber structures. The paper concludes that these kinds of structures can have a potential as long-span timber structures in real projects if they are carefully designed with respect to the overall robustness strategies.

  18. Robust position estimation of a mobile vehicle

    International Nuclear Information System (INIS)

    Conan, V.

    1994-01-01

    The ability to estimate the position of a mobile vehicle is a key task for navigation over large distances in complex indoor environments such as nuclear power plants. Schematics of the plants are available, but they are incomplete, as real settings contain many objects, such as pipes, cables or furniture, that mask part of the model. The position estimation method described in this paper matches 3-D data with a simple schematic of a plant. It is basically independent of odometer information and viewpoint, robust to noisy data and spurious points and largely insensitive to occlusions. The method is based on a hypothesis/verification paradigm and its complexity is polynomial; it runs in O(m^4 n^4), where m represents the number of model patches and n the number of scene patches. Heuristics are presented to speed up the algorithm. Results on real 3-D data show good behaviour even when the scene is very occluded. (authors). 16 refs., 3 figs., 1 tab

  19. Robust Indicators of Nonproliferation Performance

    Energy Technology Data Exchange (ETDEWEB)

    Cowan, Mara R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kurzrok, Andrew J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-02-13

    Understanding how the nuclear industry may benefit from self-regulation is closely linked with understanding how to report compliance activities for nonproliferation and export control objectives, as well as how to distinguish high and low compliance performance. Drawing on the corporate sustainability reporting model, nuclear and dual-use commodities industries can frame socially responsible self-regulatory activities to distinguish themselves as good nonproliferators.

  20. Robust Satisficing Decision Making for Unmanned Aerial Vehicle Complex Missions under Severe Uncertainty.

    Directory of Open Access Journals (Sweden)

    Xiaoting Ji

    Full Text Available This paper presents a robust satisficing decision-making method for Unmanned Aerial Vehicles (UAVs) executing complex missions in an uncertain environment. Motivated by the info-gap decision theory, we formulate this problem as a novel robust satisficing optimization problem, of which the objective is to maximize the robustness while satisfying some desired mission requirements. Specifically, a new info-gap based Markov Decision Process (IMDP) is constructed to abstract the uncertain UAV system and specify the complex mission requirements with Linear Temporal Logic (LTL). A robust satisficing policy is obtained to maximize the robustness to the uncertain IMDP while ensuring a desired probability of satisfying the LTL specifications. To this end, we propose a two-stage robust satisficing solution strategy which consists of the construction of a product IMDP and the generation of a robust satisficing policy. In the first stage, a product IMDP is constructed by combining the IMDP with an automaton representing the LTL specifications. In the second, an algorithm based on robust dynamic programming is proposed to generate a robust satisficing policy, while an associated robustness evaluation algorithm is presented to evaluate the robustness. Finally, through Monte Carlo simulation, the effectiveness of our algorithms is demonstrated on a UAV search mission under severe uncertainty so that the resulting policy can maximize the robustness while reaching the desired performance level. Furthermore, by comparing the proposed method with other robust decision-making methods, it can be concluded that our policy can tolerate higher uncertainty so that the desired performance level can be guaranteed, which indicates that the proposed method is much more effective in real applications.

  1. Robust Control Design for Uncertain Nonlinear Dynamic Systems

    Science.gov (United States)

    Kenny, Sean P.; Crespo, Luis G.; Andrews, Lindsey; Giesy, Daniel P.

    2012-01-01

    Robustness to parametric uncertainty is fundamental to successful control system design and as such it has been at the core of many design methods developed over the decades. Despite its prominence, most of the work on robust control design has focused on linear models and uncertainties that are non-probabilistic in nature. Recently, researchers have acknowledged this disparity and have been developing theory to address a broader class of uncertainties. This paper presents an experimental application of robust control design for a hybrid class of probabilistic and non-probabilistic parametric uncertainties. The experimental apparatus is based upon the classic inverted pendulum on a cart. The physical uncertainty is realized by a known additional lumped mass at an unknown location on the pendulum. This unknown location has the effect of substantially altering the nominal frequency and controllability of the nonlinear system, and in the limit has the capability to make the system neutrally stable and uncontrollable. Another uncertainty to be considered is a direct current motor parameter. The control design objective is to design a controller that satisfies stability, tracking error, control power, and transient behavior requirements for the largest range of parametric uncertainties. This paper presents an overview of the theory behind the robust control design methodology and the experimental results.

  2. On robust parameter estimation in brain-computer interfacing

    Science.gov (United States)

    Samek, Wojciech; Nakajima, Shinichi; Kawanabe, Motoaki; Müller, Klaus-Robert

    2017-12-01

    Objective. The reliable estimation of parameters such as mean or covariance matrix from noisy and high-dimensional observations is a prerequisite for successful application of signal processing and machine learning algorithms in brain-computer interfacing (BCI). This challenging task becomes significantly more difficult if the data set contains outliers, e.g. due to subject movements, eye blinks or loose electrodes, as they may heavily bias the estimation and the subsequent statistical analysis. Although various robust estimators have been developed to tackle the outlier problem, they ignore important structural information in the data and thus may not be optimal. Typical structural elements in BCI data are the trials consisting of a few hundred EEG samples and indicating the start and end of a task. Approach. This work discusses the parameter estimation problem in BCI and introduces a novel hierarchical view on robustness which naturally comprises different types of outlierness occurring in structured data. Furthermore, the class of minimum divergence estimators is reviewed and a robust mean and covariance estimator for structured data is derived and evaluated with simulations and on a benchmark data set. Main results. The results show that state-of-the-art BCI algorithms benefit from robustly estimated parameters. Significance. Since parameter estimation is an integral part of various machine learning algorithms, the presented techniques are applicable to many problems beyond BCI.
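
    To illustrate the general idea of exploiting trial structure when estimating a covariance matrix robustly, the sketch below down-weights whole trials according to how far their individual covariance estimates lie from the pooled estimate, so outlying trials contribute less. It is only an illustrative re-weighting scheme with invented names; the minimum divergence estimators derived in the paper are different.

```python
import numpy as np

def robust_trial_covariance(trials, n_iter=10, eps=1e-3):
    """trials: array of shape [n_trials, n_channels, n_samples].
    Iteratively re-weights whole trials by the inverse of the distance between their
    per-trial covariance and the current pooled estimate."""
    covs = np.array([t @ t.T / t.shape[1] for t in trials])   # per-trial spatial covariances
    weights = np.ones(len(covs))
    for _ in range(n_iter):
        C = np.einsum('i,ijk->jk', weights, covs) / weights.sum()
        d = np.array([np.linalg.norm(Ci - C) for Ci in covs])  # Frobenius distance per trial
        weights = 1.0 / (d + eps)                              # outlying trials get small weight
    return C

# usage with random data: C = robust_trial_covariance(np.random.randn(50, 8, 200))
```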

  3. Robust GPS autonomous signal quality monitoring

    Science.gov (United States)

    Ndili, Awele Nnaemeka

    The Global Positioning System (GPS), introduced by the U.S. Department of Defense in 1973, provides unprecedented world-wide navigation capabilities through a constellation of 24 satellites in global orbit, each emitting a low-power radio-frequency signal for ranging. GPS receivers track these transmitted signals, computing position to within 30 meters from range measurements made to four satellites. GPS has a wide range of applications, including aircraft, marine and land vehicle navigation. Each application places demands on GPS for various levels of accuracy, integrity, system availability and continuity of service. Radio frequency interference (RFI), which results from natural sources such as TV/FM harmonics, radar or Mobile Satellite Systems (MSS), presents a challenge in the use of GPS, by posing a threat to the accuracy, integrity and availability of the GPS navigation solution. In order to use GPS for integrity-sensitive applications, it is therefore necessary to monitor the quality of the received signal, with the objective of promptly detecting the presence of RFI, and thus providing a timely warning of degradation of system accuracy. This presents a challenge, since the myriad kinds of RFI affect the GPS receiver in different ways. What is required, then, is a robust method of detecting GPS accuracy degradation, which is effective regardless of the origin of the threat. This dissertation presents a new method of robust signal quality monitoring for GPS. Algorithms for receiver autonomous interference detection and integrity monitoring are demonstrated. Candidate test statistics are derived from fundamental receiver measurements of in-phase and quadrature correlation outputs, and the gain of the Active Gain Controller (AGC). The performance of selected test statistics is evaluated in the presence of RFI: broadband interference, pulsed and non-pulsed interference, coherent CW at different frequencies; and non-RFI: GPS signal fading due to physical blockage and

  4. Robust Real-Time Tracking for Visual Surveillance

    Directory of Open Access Journals (Sweden)

    Aguilera Josep

    2007-01-01

    Full Text Available This paper describes a real-time multi-camera surveillance system that can be applied to a range of application domains. This integrated system is designed to observe crowded scenes and has mechanisms to improve tracking of objects that are in close proximity. The four component modules described in this paper are (i) motion detection using a layered background model, (ii) object tracking based on local appearance, (iii) hierarchical object recognition, and (iv) fused multisensor object tracking using multiple features and geometric constraints. This integrated approach to complex scene tracking is validated against a number of representative real-world scenarios to show that robust, real-time analysis can be performed.

  5. Including robustness in multi-criteria optimization for intensity-modulated proton therapy

    Science.gov (United States)

    Chen, Wei; Unkelbach, Jan; Trofimov, Alexei; Madden, Thomas; Kooy, Hanne; Bortfeld, Thomas; Craft, David

    2012-02-01

    We present a method to include robustness in a multi-criteria optimization (MCO) framework for intensity-modulated proton therapy (IMPT). The approach allows one to simultaneously explore the trade-off between different objectives as well as the trade-off between robustness and nominal plan quality. In MCO, a database of plans, each emphasizing different treatment planning objectives, is pre-computed to approximate the Pareto surface. An IMPT treatment plan that strikes the best balance between the different objectives can be selected by navigating on the Pareto surface. In our approach, robustness is integrated into MCO by adding robustified objectives and constraints to the MCO problem. Uncertainties (or errors) of the robust problem are modeled by pre-calculated dose-influence matrices for a nominal scenario and a number of pre-defined error scenarios (shifted patient positions, proton beam undershoot and overshoot). Objectives and constraints can be defined for the nominal scenario, thus characterizing nominal plan quality. A robustified objective represents the worst objective function value that can be realized for any of the error scenarios and thus provides a measure of plan robustness. The optimization method is based on a linear projection solver and is capable of handling large problem sizes resulting from a fine dose grid resolution, many scenarios, and a large number of proton pencil beams. A base-of-skull case is used to demonstrate the robust optimization method. It is demonstrated that the robust optimization method reduces the sensitivity of the treatment plan to setup and range errors to a degree that is not achieved by a safety margin approach. A chordoma case is analyzed in more detail to demonstrate the involved trade-offs between target underdose and brainstem sparing as well as robustness and nominal plan quality. The latter illustrates the advantage of MCO in the context of robust planning. For all cases examined, the robust optimization for
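
    The robustified objective described above can be sketched as a worst-case evaluation over pre-calculated dose-influence matrices. The snippet below is a minimal illustration, assuming a dictionary of scenario matrices and a simple quadratic deviation-from-prescription objective; it is not the linear projection solver used in the paper, and the scenario names and prescription value are placeholders.

```python
import numpy as np

def robustified_objective(weights, dose_influence, objective, scenarios):
    """weights: pencil-beam weights [n_beams].
    dose_influence: dict mapping scenario name -> dose-influence matrix [n_voxels, n_beams].
    objective: callable mapping a dose vector to a scalar.
    Returns the worst (largest) objective value over the listed scenarios."""
    return max(objective(dose_influence[s] @ weights) for s in scenarios)

# illustrative quadratic deviation from an assumed prescription of 70 Gy
target = 70.0
obj = lambda dose: float(np.mean((dose - target) ** 2))
# worst = robustified_objective(w, D, obj, ["nominal", "shift_x+", "range_undershoot"])
```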

  6. Implicitly Weighted Methods in Robust Image Analysis

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2012-01-01

    Vol. 44, No. 3 (2012), pp. 449-462. ISSN 0924-9907. R&D Projects: GA MŠk(CZ) 1M06014. Institutional research plan: CEZ:AV0Z10300504. Keywords: robustness; high breakdown point; outlier detection; robust correlation analysis; template matching; face recognition. Subject RIV: BB - Applied Statistics, Operational Research. Impact factor: 1.767, year: 2012

  7. What is it to be sturdy (robust)?

    DEFF Research Database (Denmark)

    Nielsen, Niss Skov; Zwisler, Lars Pagter; Bojsen, Ann Kristina Mikkelsen

    Purpose: This paper intends to give a first insight into the concept of being "sturdy/robust"; To develop and test a Danish model of how to measure sturdiness/robustness; To test the scale's ability to identify people in emergency situations who have high risk of developing psychological illness....

  8. Structural Robustness Evaluation of Offshore Wind Turbines

    DEFF Research Database (Denmark)

    Giuliani, Luisa; Bontempi, Franco

    2010-01-01

    in the framework of a safe design: it depends on different factors, like exposure, vulnerability and robustness. Particularly, the requirements of structural vulnerability and robustness are discussed in this paper and a numerical application is presented, in order to evaluate the effects of a ship collision...

  9. In Silico Design of Robust Bolalipid Membranes

    NARCIS (Netherlands)

    Bulacu, Monica; Periole, Xavier; Marrink, Siewert J.; Périole, Xavier

    The robustness of microorganisms used in industrial fermentations is essential for the efficiency and yield of the production process. A viable way to increase this robustness is through engineering of the cell membrane, especially by incorporating lipids from species that survive under harsh

  10. Assessment of Process Robustness for Mass Customization

    DEFF Research Database (Denmark)

    Nielsen, Kjeld; Brunø, Thomas Ditlev

    2013-01-01

    robustness and their capability to develop it. Through a literature study and analysis of robust process design characteristics, a number of metrics are described which can be used for assessment. The metrics are evaluated and analyzed to be applied as KPIs to help MC companies prioritize efforts in business...

  11. Applying Robust Design in an Industrial Context

    DEFF Research Database (Denmark)

    Christensen, Martin Ebro

    mechanical architectures. Furthermore, a set of 15 robust design principles for reducing the variation in functional performance is compiled in a format directly supporting the work of the design engineer. With these foundational methods in place, the existing tools, methods and KPIs of Robust Design...

  12. The importance of robust design methodology

    DEFF Research Database (Denmark)

    Eifler, Tobias; Howard, Thomas J.

    2018-01-01

    infamous recalls in automotive history, that of the GM ignition switch, from the perspective of Robust Design. It is investigated if available Robust Design methods such as sensitivity analysis, tolerance stack-ups, design clarity, etc. would have been suitable to account for the performance variation...

  13. Robust Control Charts for Time Series Data

    NARCIS (Netherlands)

    Croux, C.; Gelper, S.; Mahieu, K.

    2010-01-01

    This article presents a control chart for time series data, based on the one-step-ahead forecast errors of the Holt-Winters forecasting method. We use robust techniques to prevent outliers from affecting the estimation of the control limits of the chart. Moreover, robustness is important to maintain

  14. Efficient reanalysis techniques for robust topology optimization

    DEFF Research Database (Denmark)

    Amir, Oded; Sigmund, Ole; Lazarov, Boyan Stefanov

    2012-01-01

    efficient robust topology optimization procedures based on reanalysis techniques. The approach is demonstrated on two compliant mechanism design problems where robust design is achieved by employing either a worst case formulation or a stochastic formulation. It is shown that the time spent on finite...

  15. Extending the Scope of Robust Quadratic Optimization

    NARCIS (Netherlands)

    Marandi, Ahmadreza; Ben-Tal, A.; den Hertog, Dick; Melenberg, Bertrand

    In this paper, we derive tractable reformulations of the robust counterparts of convex quadratic and conic quadratic constraints with concave uncertainties for a broad range of uncertainty sets. For quadratic constraints with convex uncertainty, it is well-known that the robust counterpart is, in

  16. Security and robustness for collaborative monitors

    NARCIS (Netherlands)

    Testerink, Bas; Bulling, Nils; Dastani, Mehdi

    2016-01-01

    Decentralized monitors can be subject to robustness and security risks. Robustness risks include attacks on the monitor’s infrastructure in order to disable parts of its functionality. Security risks include attacks that try to extract information from the monitor and thereby possibly leak sensitive

  17. How Robust is Your System Resilience?

    Science.gov (United States)

    Homayounfar, M.; Muneepeerakul, R.

    2017-12-01

    Robustness and resilience are concepts in systems thinking that have grown in importance and popularity. For many complex social-ecological systems, however, robustness and resilience are difficult to quantify and the connections and trade-offs between them difficult to study. Most studies have either focused on qualitative approaches to discuss their connections or considered only one of them under particular classes of disturbances. In this study, we present an analytical framework to address the linkage between robustness and resilience more systematically. Our analysis is based on a stylized dynamical model that operationalizes a widely used conceptual framework for social-ecological systems. The model enables us to rigorously define robustness and resilience and consequently investigate their connections. The results reveal the tradeoffs among performance, robustness, and resilience. They also show how the nature of such tradeoffs varies with the choices of certain policies (e.g., taxation and investment in public infrastructure), internal stresses and external disturbances.

  18. International Conference on Robust Statistics 2015

    CERN Document Server

    Basu, Ayanendranath; Filzmoser, Peter; Mukherjee, Diganta

    2016-01-01

    This book offers a collection of recent contributions and emerging ideas in the areas of robust statistics presented at the International Conference on Robust Statistics 2015 (ICORS 2015) held in Kolkata during 12–16 January, 2015. The book explores the applicability of robust methods in other non-traditional areas, which includes the use of new techniques such as skew and mixture of skew distributions, scaled Bregman divergences, and multilevel functional data methods; application areas being circular data models and prediction of mortality and life expectancy. The contributions are both theoretical and applied in nature. Robust statistics is a relatively young branch of statistical sciences that is rapidly emerging as the bedrock of statistical analysis in the 21st century due to its flexible nature and wide scope. Robust statistics supports the application of parametric and other inference techniques over a broader domain than the strictly interpreted model scenarios employed in classical statis...

  19. A kriging metamodel-assisted robust optimization method based on a reverse model

    Science.gov (United States)

    Zhou, Hui; Zhou, Qi; Liu, Congwei; Zhou, Taotao

    2018-02-01

    The goal of robust optimization methods is to obtain a solution that is both optimal and relatively insensitive to uncertainty factors. Most existing robust optimization approaches use outer-inner nested optimization structures, where a large amount of computational effort is required because the robustness of each candidate solution delivered from the outer level must be evaluated in the inner level. In this article, a kriging metamodel-assisted robust optimization method based on a reverse model (K-RMRO) is first proposed, in which the nested optimization structure is reduced to a single-loop optimization structure to ease the computational burden. Because K-RMRO ignores the interpolation uncertainties from kriging, it may yield non-robust optima. Hence, an improved kriging-assisted robust optimization method based on a reverse model (IK-RMRO) is presented to take the interpolation uncertainty of the kriging metamodel into consideration. In IK-RMRO, an objective switching criterion is introduced to determine whether the inner-level robust optimization or the kriging metamodel replacement should be used to evaluate the robustness of design alternatives. The proposed criterion is developed according to whether or not the robust status of the individual can be changed because of the interpolation uncertainties from the kriging metamodel. Numerical and engineering cases are used to demonstrate the applicability and efficiency of the proposed approach.
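
    A rough sketch of this kind of kriging-assisted screening is given below: a Gaussian-process surrogate (scikit-learn's GaussianProcessRegressor) predicts the worst-case objective of a candidate design over a perturbation box, and the predictive standard deviation at that worst point indicates whether the surrogate's robust verdict can be trusted or whether an exact inner-level evaluation is still needed. Function names, the sampling scheme, and the switching rule are illustrative, not the K-RMRO/IK-RMRO formulations.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def surrogate_worst_case(gp, x, delta, n_samples=200, rng=None):
    """Estimate the worst-case objective of design x under box perturbations of size
    delta, using the kriging (GP) surrogate mean and its predictive uncertainty."""
    rng = rng or np.random.default_rng(0)
    X_nb = x + rng.uniform(-delta, delta, size=(n_samples, x.size))
    mean, std = gp.predict(X_nb, return_std=True)
    worst = float(np.max(mean))                 # surrogate worst-case prediction
    margin = float(std[np.argmax(mean)])        # interpolation uncertainty at that point
    return worst, margin                        # large margin -> fall back to exact evaluation

# fitting the surrogate on sampled designs (f is the expensive objective, assumed given):
# X = ...; y = f(X)
# gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0)).fit(X, y)
```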

  20. Specification of Concurrent Objects

    DEFF Research Database (Denmark)

    Sørensen, Morten U.

    relation over two objects and an event. In the model, objects can be composed by parallel composition, encapsulation, and hiding of operations. Refinement between objects is defined as fair trace inclusion. A specification language is presented where objects can be specified operationally by abstract...

  1. Paradigms in object recognition

    International Nuclear Information System (INIS)

    Mutihac, R.; Mutihac, R.C.

    1999-09-01

    A broad range of approaches has been proposed and applied to the complex and rather difficult task of object recognition, which involves the determination of object characteristics and the classification of objects into one of many a priori object types. Our paper briefly reviews the three main paradigms in pattern recognition, namely Bayesian statistics, neural networks, and expert systems. (author)

  2. BL Lacertae objects

    International Nuclear Information System (INIS)

    Disney, M.J.; Veron, P.

    1977-01-01

    The properties of BL Lacertae objects are discussed, including their spectra, variability, and brightness. The historical development of observations is treated, together with the conclusion that these objects are possibly quasar-related objects rather than variable stars, as originally supposed. The possible mechanisms for the unusual luminosity of these objects are considered.

  3. Designing the Object Game

    DEFF Research Database (Denmark)

    Filip, Diane; Lindegaard, Hanne

    2016-01-01

    The Object Game is an exploratory design game and an experiment in developing a tangible object that can spark dialogue and retrospection between collaborative partners and act as a boundary object. The objective of this article is to show and elaborate on the development of the Object Game......, and to provide case examples of the game in action. The Object Game has two parts – Story-building and Co-rating of objects – with the aim of stimulating a collaborative reflection on knowledge sharing with different objects. In Story-building, the participants visualize their knowledge sharing process...... these facilitated knowledge transfer, knowledge exchange, knowledge generation, and knowledge integration. The participants collaboratively reflected on their use of different objects for knowledge sharing and learned which objects have been effective (and which have not been effective) in their collaborative...

  4. Object tracking using multiple camera video streams

    Science.gov (United States)

    Mehrubeoglu, Mehrube; Rojas, Diego; McLauchlan, Lifford

    2010-05-01

    Two synchronized cameras are utilized to obtain independent video streams to detect moving objects from two different viewing angles. The video frames are directly correlated in time. Moving objects in image frames from the two cameras are identified and tagged for tracking. One advantage of such a system is that it overcomes the effects of occlusions that could leave an object partially or fully occluded in one camera while the same object is fully visible in another camera. Object registration is achieved by determining the location of common features in the moving object across simultaneous frames. Perspective differences are adjusted. Combining information from images from multiple cameras increases the robustness of the tracking process. Motion tracking is achieved by detecting anomalies caused by the objects' movement across frames in time, both in each individual video stream and in the combined video information. The path of each object is determined heuristically. Accuracy of detection is dependent on the speed of the object as well as variations in direction of motion. Fast cameras increase accuracy but limit the speed and complexity of the algorithm. Such an imaging system has applications in traffic analysis, surveillance and security, as well as object modeling from multi-view images. The system can easily be expanded by increasing the number of cameras such that there is an overlap between the scenes from at least two cameras in proximity. An object can then be tracked long distances or across multiple cameras continuously, applicable, for example, in wireless sensor networks for surveillance or navigation.

  5. Defining robustness protocols: a method to include and evaluate robustness in clinical plans

    International Nuclear Information System (INIS)

    McGowan, S E; Albertini, F; Lomax, A J; Thomas, S J

    2015-01-01

    We aim to define a site-specific robustness protocol to be used during the clinical plan evaluation process. The robustness of 16 skull base IMPT plans to systematic range and random set-up errors has been retrospectively and systematically analysed. This was determined by calculating the error-bar dose distribution (ebDD) for all the plans and by defining some metrics used to define protocols aiding the plan assessment. Additionally, an example of how to clinically use the defined robustness database is given, whereby a plan with sub-optimal brainstem robustness was identified. The advantage of using different beam arrangements to improve the plan robustness was analysed. Using the ebDD, it was found that range errors had a smaller effect on the dose distribution than the corresponding set-up errors in a single fraction, and that organs at risk were most robust to the range errors, whereas the target was more robust to set-up errors. A database was created to aid planners in terms of plan robustness aims in these volumes. This resulted in the definition of site-specific robustness protocols. The use of robustness constraints allowed for the identification of a specific patient who may have benefited from a treatment of greater individuality. A new beam arrangement was shown to be preferable when balancing conformality and robustness for this case. The ebDD and error-bar volume histogram proved effective in analysing plan robustness. The process of retrospective analysis could be used to establish site-specific robustness planning protocols in proton therapy. These protocols allow the planner to determine plans that, although delivering a dosimetrically adequate dose distribution, have resulted in sub-optimal robustness to these uncertainties. For these cases the use of different beam start conditions may improve the plan robustness to set-up and range uncertainties. (paper)

  6. Defining robustness protocols: a method to include and evaluate robustness in clinical plans

    Science.gov (United States)

    McGowan, S. E.; Albertini, F.; Thomas, S. J.; Lomax, A. J.

    2015-04-01

    We aim to define a site-specific robustness protocol to be used during the clinical plan evaluation process. The robustness of 16 skull base IMPT plans to systematic range and random set-up errors has been retrospectively and systematically analysed. This was determined by calculating the error-bar dose distribution (ebDD) for all the plans and by defining some metrics used to define protocols aiding the plan assessment. Additionally, an example of how to clinically use the defined robustness database is given, whereby a plan with sub-optimal brainstem robustness was identified. The advantage of using different beam arrangements to improve the plan robustness was analysed. Using the ebDD, it was found that range errors had a smaller effect on the dose distribution than the corresponding set-up errors in a single fraction, and that organs at risk were most robust to the range errors, whereas the target was more robust to set-up errors. A database was created to aid planners in terms of plan robustness aims in these volumes. This resulted in the definition of site-specific robustness protocols. The use of robustness constraints allowed for the identification of a specific patient who may have benefited from a treatment of greater individuality. A new beam arrangement was shown to be preferable when balancing conformality and robustness for this case. The ebDD and error-bar volume histogram proved effective in analysing plan robustness. The process of retrospective analysis could be used to establish site-specific robustness planning protocols in proton therapy. These protocols allow the planner to determine plans that, although delivering a dosimetrically adequate dose distribution, have resulted in sub-optimal robustness to these uncertainties. For these cases the use of different beam start conditions may improve the plan robustness to set-up and range uncertainties.

  7. PET functional volume delineation: a robustness and repeatability study

    International Nuclear Information System (INIS)

    Hatt, Mathieu; Cheze-le Rest, Catherine; Albarghach, Nidal; Pradier, Olivier; Visvikis, Dimitris

    2011-01-01

    Current state-of-the-art algorithms for functional uptake volume segmentation in PET imaging consist of threshold-based approaches, whose parameters often require specific optimization for a given scanner and associated reconstruction algorithms. Different advanced image segmentation approaches previously proposed and extensively validated, such as, among others, fuzzy C-means (FCM) clustering or the fuzzy locally adaptive Bayesian (FLAB) algorithm, have the potential to improve the robustness of functional uptake volume measurements. The objective of this study was to investigate robustness and repeatability with respect to various scanner models, reconstruction algorithms and acquisition conditions. Robustness was evaluated using a series of IEC phantom acquisitions carried out on different PET/CT scanners (Philips Gemini and Gemini Time-of-Flight, Siemens Biograph and GE Discovery LS) with their associated reconstruction algorithms (RAMLA, TF MLEM, OSEM). A range of acquisition parameters (contrast, duration) and reconstruction parameters (voxel size) were considered for each scanner model, and the repeatability of each method was evaluated on simulated and clinical tumours and compared to manual delineation. For all the scanner models, acquisition parameters and reconstruction algorithms considered, the FLAB algorithm demonstrated higher robustness in delineation of the spheres, with low mean errors (10%) and variability (5%), compared with threshold-based methodologies and FCM. The repeatability provided by all segmentation algorithms considered was very high, with a negligible variability of <5% in comparison to that associated with manual delineation (5-35%). The use of advanced image segmentation algorithms may not only allow high accuracy as previously demonstrated, but also provide a robust and repeatable tool to aid physicians as an initial guess in determining functional volumes in PET. (orig.)

  8. A Robust Obstacle Avoidance for Service Robot Using Bayesian Approach

    Directory of Open Access Journals (Sweden)

    Widodo Budiharto

    2011-03-01

    Full Text Available The objective of this paper is to propose a robust obstacle avoidance method for a service robot in an indoor environment. The method uses information about static obstacles on the landmark, obtained using edge detection. The speed and direction of people walking as moving obstacles are obtained by a single camera using a tracking and recognition system, with distance measured by three ultrasonic sensors. A new geometrical model and maneuvering method for moving obstacle avoidance are introduced and combined with a Bayesian approach for state estimation. The obstacle avoidance problem is formulated using decision theory, with prior and posterior distributions and a loss function used to determine an optimal response based on inaccurate sensor data. Algorithms for the moving obstacle avoidance method are proposed, and experimental results of their implementation on a service robot are presented. Various experiments show that the proposed method is fast, robust, and successfully implemented on the service robot Srikandi II, which is equipped with a 4-DOF arm and was developed in our laboratory.
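
    The decision-theoretic core described above can be illustrated with a small Bayes-risk calculation: given a prior over obstacle states, a likelihood for the current (possibly inaccurate) sensor reading, and a loss table over robot actions, the optimal response minimizes the posterior expected loss. The states, actions, and loss values below are hypothetical placeholders, not those used for Srikandi II.

```python
import numpy as np

# Hypothetical obstacle states and candidate robot responses.
states = ["approaching", "crossing", "moving_away"]
actions = ["keep_course", "maneuver_left", "stop"]

# Loss of taking each action in each state (rows: states, columns: actions); illustrative values.
loss = np.array([
    [10.0, 2.0, 1.0],   # approaching: keeping course is costly
    [ 6.0, 1.0, 2.0],   # crossing
    [ 0.0, 1.0, 3.0],   # moving away: stopping wastes time
])

def bayes_action(prior, likelihood):
    """prior: P(state); likelihood: P(sensor reading | state) for the current reading.
    Returns the action minimising the posterior expected loss, plus the posterior itself."""
    posterior = prior * likelihood
    posterior /= posterior.sum()
    expected_loss = posterior @ loss
    return actions[int(np.argmin(expected_loss))], posterior

action, post = bayes_action(prior=np.array([0.3, 0.3, 0.4]),
                            likelihood=np.array([0.7, 0.2, 0.1]))
```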

  9. Robust Design Optimization of an Aerospace Vehicle Propulsion System

    Directory of Open Access Journals (Sweden)

    Muhammad Aamir Raza

    2011-01-01

    Full Text Available This paper proposes a robust design optimization methodology under design uncertainties of an aerospace vehicle propulsion system. The approach consists of 3D geometric design coupled with complex internal ballistics, hybrid optimization, worst-case deviation, and an efficient statistical approach. The uncertainties are propagated through worst-case deviation using first-order orthogonal design matrices. The robustness assessment is measured using the framework of the mean-variance and percentile-difference approach. A parametric sensitivity analysis is carried out to analyze the effects of design variable variations on performance parameters. A hybrid simulated annealing and pattern search approach is used as the optimizer. The results show that the objective of optimizing the mean performance and minimizing the variation of the performance parameters, in terms of thrust ratio and total impulse, could be achieved while adhering to the system constraints.

  10. Finding Objects for Assisting Blind People

    OpenAIRE

    Yi, Chucai; Flores, Roberto W.; Chincha, Ricardo; Tian, YingLi

    2013-01-01

    Computer vision technology has been widely used for blind assistance, such as navigation and wayfinding. However, few camera-based systems are developed for helping blind or visually-impaired people to find daily necessities. In this paper, we propose a prototype system of blind-assistant object finding by camera-based network and matching-based recognition. We collect a dataset of daily necessities and apply Speeded-Up Robust Features (SURF) and Scale Invariant Feature Transform (SIFT) featu...

  11. Robustness Analysis of Dynamic Watermarks

    Directory of Open Access Journals (Sweden)

    Ivan V. Nechta

    2017-06-01

    Full Text Available In this paper we consider a previously known scheme of dynamic watermark embedding (Radix-n) that is used for preventing illegal use of software. According to the scheme, a watermark is a dynamically linked data structure (a graph) which is created in memory during program execution. Hidden data, such as information about the author, can be represented in a different type of graph structure. This data can be extracted and demonstrated in judicial proceedings. Although the above-mentioned scheme was previously considered one of the most reliable, this paper shows that it has a number of features that allow an attacker to detect the watermark construction stage in the program, so the watermark can be corrupted or deleted. The author of this article shows the weakness of the Radix-n scheme, which consists in the fact that the dynamic data structures of a program can be revealed using information received from an API-function hooker that catches calls to dynamic memory allocation functions. One of these data structures is the watermark. Pointers to dynamically created objects (arrays, variables, class items, etc.) of a program can be detected by content analysis of the computer's RAM. Different dynamic objects in memory interconnected by pointers form the dynamic data structures of a program, such as lists, stacks, trees and other graphs (including the watermark). Our experiment shows that in the vast majority of cases the number of data structures in a program is small, which increases the probability of a successful attack. We also present an algorithm for finding the connected components of a graph in linear time in cases where the number of nodes is about 10^6. On the basis of the experimental findings, a new watermarking scheme is presented which is resistant to the proposed attack. It is proposed to use a different graph structure representation of the watermark, where edges are implemented using unique signatures. Our scheme uses content encryption of graph nodes (except signature
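
    The linear-time connected-component step mentioned in the abstract can be illustrated with a plain breadth-first search over a pointer graph recovered from memory: each node and edge is visited once, so the running time is linear in their number. The adjacency input is assumed to come from the (not shown) memory-analysis stage, and the function name is illustrative.

```python
from collections import deque

def connected_components(adjacency):
    """adjacency: dict mapping node -> iterable of neighbour nodes (heap objects linked
    by pointers, in the watermark setting). Returns a list of components, each a list of nodes."""
    seen, components = set(), []
    for start in adjacency:
        if start in seen:
            continue
        comp, queue = [], deque([start])
        seen.add(start)
        while queue:
            u = queue.popleft()
            comp.append(u)
            for v in adjacency[u]:
                if v not in seen:
                    seen.add(v)
                    queue.append(v)
        components.append(comp)
    return components
```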

  12. Robust estimation of seismic coda shape

    Science.gov (United States)

    Nikkilä, Mikko; Polishchuk, Valentin; Krasnoshchekov, Dmitry

    2014-04-01

    We present a new method for estimation of seismic coda shape. It falls into the same class of methods as non-parametric shape reconstruction with the use of neural network techniques, where data are split into training and validation data sets. We particularly pursue the well-known problem of image reconstruction, formulated in this case as shape isolation in the presence of a broadly defined noise. This combined approach is enabled by an intrinsic feature of the seismogram, which can be divided objectively into pre-signal seismic noise lacking the target shape, and the remainder, which contains scattered waveforms compounding the coda shape. In short, we separately apply the shape restoration procedure to the pre-signal seismic noise and the event record, which provides successful delineation of the coda shape in the form of a smooth, almost non-oscillating function of time. The new algorithm uses a recently developed generalization of the classical computational-geometry tool of α-shape. The generalization essentially yields robust shape estimation by locally ignoring a number of points treated as extreme values, noise or non-relevant data. Our algorithm is conceptually simple and enables the desired or pre-determined level of shape detail, constrainable by arbitrary data-fit criteria. The proposed tool for coda shape delineation provides an alternative to moving averaging and/or other smoothing techniques frequently used for this purpose. The new algorithm is illustrated with an application to the problem of estimating the coda duration after a local event. The obtained relation coefficient between coda duration and epicentral distance is consistent with earlier findings in the region of interest.

  13. Robust statistics and geochemical data analysis

    International Nuclear Information System (INIS)

    Di, Z.

    1987-01-01

    Advantages of robust procedures over ordinary least-squares procedures in geochemical data analysis are demonstrated using NURE data from the Hot Springs Quadrangle, South Dakota, USA. Robust principal components analysis with 5% multivariate trimming successfully guarded the analysis against perturbations by outliers and increased the number of interpretable factors. Regression with SINE estimates significantly increased the goodness-of-fit of the regression and improved the correspondence of delineated anomalies with known uranium prospects. Because of the ubiquitous existence of outliers in geochemical data, robust statistical procedures are suggested as routine procedures to replace ordinary least-squares procedures

  14. Design Robust Controller for Rotary Kiln

    Directory of Open Access Journals (Sweden)

    Omar D. Hernández-Arboleda

    2013-11-01

    Full Text Available This paper presents the design of a robust controller for a rotary kiln. The designed controller is a combination of a fractional PID and a linear quadratic regulator (LQR), which have not previously been used to control the kiln. In addition, robustness criteria (gain margin, phase margin, strength gain, high-frequency noise rejection, and sensitivity) are evaluated for the entire controller-plant model, obtaining good results over a frequency range of 0.020 to 90 rad/s, which contributes to the robustness of the system.

  15. Towards distortion-free robust image authentication

    International Nuclear Information System (INIS)

    Coltuc, D

    2007-01-01

    This paper investigates a general framework for distortion-free robust image authentication by multiple marking. First, a subsampled version of the image edges is embedded by robust watermarking. Then, the information needed to recover the original image is inserted by reversible watermarking. The hiding capacity of the reversible watermarking is the essential requirement for this approach. Thus, in the case of no attacks, not only is the image authenticated but the original is also exactly recovered. In the case of attacks, reversibility is lost, but the image can still be authenticated. Preliminary results providing very good robustness against JPEG compression are presented

  16. An Overview of the Adaptive Robust DFT

    Directory of Open Access Journals (Sweden)

    Djurović Igor

    2010-01-01

    Full Text Available This paper overviews the basic principles and applications of the robust DFT (RDFT) approach, which is used for robust processing of frequency-modulated (FM) signals embedded in non-Gaussian heavy-tailed noise. In particular, we concentrate on the spectral analysis and filtering of signals corrupted by impulsive distortions using adaptive and nonadaptive robust estimators. Several adaptive estimators of the location parameter are considered, and it is shown that their application is preferable to that of their non-adaptive counterparts. This fact is demonstrated by an efficiency comparison of adaptive and nonadaptive RDFT methods for different noise environments.
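
    One common nonadaptive variant of the robust DFT replaces the sample mean implicit in the standard DFT coefficient with a sample median, which resists impulsive noise; the sketch below shows only that variant, not the adaptive location estimators surveyed in the paper.

```python
import numpy as np

def median_based_rdft(x):
    """Robust DFT of a 1-D signal x. The standard DFT coefficient equals N times the
    sample mean of x[n] * exp(-2j*pi*k*n/N); replacing that mean with the median of the
    real and imaginary parts gives an estimate far less sensitive to heavy-tailed noise."""
    N = len(x)
    n = np.arange(N)
    X = np.empty(N, dtype=complex)
    for k in range(N):
        z = x * np.exp(-2j * np.pi * k * n / N)
        X[k] = N * (np.median(z.real) + 1j * np.median(z.imag))
    return X
```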

  17. A robust interpretation of duration calculus

    DEFF Research Database (Denmark)

    Franzle, M.; Hansen, Michael Reichhardt

    2005-01-01

    We transfer the concept of robust interpretation from arithmetic first-order theories to metric-time temporal logics. The idea is that the interpretation of a formula is robust iff its truth value does not change under small variation of the constants in the formula. Exemplifying this on Duration Calculus (DC), our findings are that the robust interpretation of DC is equivalent to a multi-valued interpretation that uses the real numbers as semantic domain and assigns Lipschitz-continuous interpretations to all operators of DC. Furthermore, this continuity permits approximation between discrete...

  18. REINA at CLEF 2007 Robust Task

    OpenAIRE

    Zazo Rodríguez, Ángel Francisco; Figuerola, Carlos G.; Alonso Berrocal, José Luis

    2007-01-01

    This paper describes our work at the CLEF 2007 Robust Task. We have participated in the monolingual (English, French and Portuguese) and the bilingual (English to French) subtask. At CLEF 2006 our research group obtained very good results applying local query expansion using windows of terms in the robust task. This year we have used the same expansion technique, but taking into account some criteria of robustness: MAP, GMAP, MMR, GS@10, P@10, number of failed topics, number of topics below 0.1 ...

  19. REINA at CLEF 2007 Robust Track (2007)

    OpenAIRE

    Zazo, Ángel F.; G.-Figuerola, Carlos; Alonso-Berrocal, José-Luis

    2007-01-01

    This paper describes our work at the CLEF 2007 Robust Task. We have participated in the monolingual (English, French and Portuguese) and the bilingual (English to French) subtask. At CLEF 2006 our research group obtained very good results applying local query expansion using windows of terms in the robust task. This year we have used the same expansion technique, but taking into account some criteria of robustness: MAP, GMAP, MMR, GS@10, P@10, number of failed topics, number of topics below 0.1 ...

  20. Danish Requirements for Robustness of Structures

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Christensen, H. H.

    2006-01-01

    This paper describes the background of the revised robustness requirements implemented in the Danish Code of Practice for Safety of Structures in 2003 [1, 2, 3]. According to the Danish design rules, robustness shall be documented for all structures where the consequences of failure are serious. This paper...... describes the background of the design procedure in the Danish codes, which shall be followed in order to document sufficient robustness in the following steps: Step 1: review of loads and possible failure modes/scenarios and determination of acceptable collapse extent. Step 2: review of the structural...

  1. Robustness-related issues in speaker recognition

    CERN Document Server

    Zheng, Thomas Fang

    2017-01-01

    This book presents an overview of speaker recognition technologies with an emphasis on dealing with robustness issues. Firstly, the book gives an overview of speaker recognition, such as the basic system framework, categories under different criteria, performance evaluation and its development history. Secondly, with regard to robustness issues, the book presents three categories, including environment-related issues, speaker-related issues and application-oriented issues. For each category, the book describes the current hot topics, existing technologies, and potential research focuses in the future. The book is a useful reference book and self-learning guide for early researchers working in the field of robust speech recognition.

  2. Robustness and structure of complex networks

    Science.gov (United States)

    Shao, Shuai

    This dissertation covers the two major parts of my PhD research on statistical physics and complex networks: i) modeling a new type of attack -- localized attack, and investigating robustness of complex networks under this type of attack; ii) discovering the clustering structure in complex networks and its influence on the robustness of coupled networks. Complex networks appear in every aspect of our daily life and are widely studied in Physics, Mathematics, Biology, and Computer Science. One important property of complex networks is their robustness under attacks, which depends crucially on the nature of attacks and the structure of the networks themselves. Previous studies have focused on two types of attack: random attack and targeted attack, which, however, are insufficient to describe many real-world damages. Here we propose a new type of attack -- localized attack, and study the robustness of complex networks under this type of attack, both analytically and via simulation. On the other hand, we also study the clustering structure in the network, and its influence on the robustness of a complex network system. In the first part, we propose a theoretical framework to study the robustness of complex networks under localized attack based on percolation theory and the generating function method. We investigate the percolation properties, including the critical threshold of the phase transition p_c and the size of the giant component P_∞. We compare localized attack with random attack and find that while random regular (RR) networks are more robust against localized attack, Erdős–Rényi (ER) networks are equally robust under both types of attacks. As for scale-free (SF) networks, their robustness depends crucially on the degree exponent λ. The simulation results show perfect agreement with theoretical predictions. We also test our model on two real-world networks: a peer-to-peer computer network and an airline network, and find that the real-world networks
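
    A localized attack of the kind studied here can be simulated directly: remove a connected ball of nodes around a seed (the seed, its neighbours, next-nearest neighbours, and so on) and measure the relative size of the giant component that remains. The sketch below uses networkx and illustrative parameters; it reproduces the simulation idea only, not the generating-function theory.

```python
import networkx as nx
from collections import deque

def localized_attack(G, fraction, seed_node=None):
    """Remove a connected ball of nodes around a seed until `fraction` of all nodes are
    gone, then return the size of the largest remaining component relative to the
    original network size."""
    G = G.copy()
    n_total = G.number_of_nodes()
    n_remove = int(fraction * n_total)
    seed_node = seed_node if seed_node is not None else next(iter(G.nodes))
    order, seen, queue = [], {seed_node}, deque([seed_node])
    while queue and len(order) < n_remove:      # BFS visits shells of increasing distance
        u = queue.popleft()
        order.append(u)
        for v in G.neighbors(u):
            if v not in seen:
                seen.add(v)
                queue.append(v)
    G.remove_nodes_from(order)
    if G.number_of_nodes() == 0:
        return 0.0
    giant = max(nx.connected_components(G), key=len)
    return len(giant) / n_total

# e.g. compare networks of equal mean degree under the same attack size (illustrative sizes):
# g_er = localized_attack(nx.erdos_renyi_graph(10000, 4 / 10000), 0.4)
# g_rr = localized_attack(nx.random_regular_graph(4, 10000), 0.4)
```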

  3. Robust Structured Control Design via LMI Optimization

    DEFF Research Database (Denmark)

    Adegas, Fabiano Daher; Stoustrup, Jakob

    2011-01-01

    This paper presents a new procedure for discrete-time robust structured control design. Parameter-dependent nonconvex conditions for stabilizable and induced L2-norm performance controllers are solved by an iterative linear matrix inequality (LMI) optimization. A wide class of controller structures, including decentralized controllers of any order, fixed-order dynamic output feedback, and static output feedback, can be designed to be robust to polytopic uncertainties. Stability is proven by a parameter-dependent Lyapunov function. Numerical examples on robust stability margins show that the proposed procedure can...

  4. Robust K-Median and K-Means Clustering Algorithms for Incomplete Data

    Directory of Open Access Journals (Sweden)

    Jinhua Li

    2016-01-01

    Full Text Available Incomplete data with missing feature values are prevalent in clustering problems. Traditional clustering methods first estimate the missing values by imputation and then apply the classical clustering algorithms for complete data, such as K-median and K-means. However, in practice, it is often hard to obtain accurate estimates of the missing values, which deteriorates the performance of clustering. To enhance the robustness of clustering algorithms, this paper represents the missing values by interval data and introduces the concept of a robust cluster objective function. A minimax robust optimization (RO) formulation is presented to provide clustering results that are insensitive to estimation errors. To solve the proposed RO problem, we propose robust K-median and K-means clustering algorithms with low time and space complexity. Comparisons and analysis of experimental results on both artificially generated and real-world incomplete data sets validate the robustness and effectiveness of the proposed algorithms.
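
    A toy version of the interval idea is sketched below: each missing feature carries an interval (for example the column-wise min and max), points are assigned to clusters by their worst-case distance over those intervals, and centres are refit on interval midpoints. This is only an illustration of representing missing values as intervals; the minimax RO algorithms of the paper are more involved.

```python
import numpy as np

def worst_case_sq_dist(lo, hi, center):
    """Worst-case squared Euclidean distance between boxes [lo, hi] and a center,
    taken coordinate-wise (the farther interval endpoint in each dimension)."""
    return np.sum(np.maximum((lo - center) ** 2, (hi - center) ** 2), axis=-1)

def robust_kmeans_intervals(lo, hi, k, n_iter=50, seed=0):
    """lo, hi: [n, d] interval bounds per sample; observed features have lo == hi,
    missing features carry e.g. the column-wise min/max of that feature."""
    rng = np.random.default_rng(seed)
    mid = (lo + hi) / 2.0
    centers = mid[rng.choice(len(mid), k, replace=False)]
    for _ in range(n_iter):
        d = np.stack([worst_case_sq_dist(lo, hi, c) for c in centers], axis=1)
        labels = d.argmin(axis=1)                    # assign by worst-case distance
        centers = np.array([mid[labels == j].mean(axis=0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels, centers
```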

  5. Towards robust optimal design of storm water systems

    Science.gov (United States)

    Marquez Calvo, Oscar; Solomatine, Dimitri

    2015-04-01

    In this study the focus is on the design of a storm water or a combined sewer system. Such a system should be capable of handling most storms properly, in order to minimize the damage caused by flooding when the system lacks the capacity to cope with rain water at peak times. This problem is a multi-objective optimization problem: we have to take into account the minimization of the construction costs, the minimization of damage costs due to flooding, and possibly other criteria. One of the most important factors influencing the design of storm water systems is the expected amount of water to deal with. It is common that this infrastructure is developed with the capacity to cope with events that occur once in, say, 10 or 20 years - so-called design rainfall events. However, rainfall is a random variable and such uncertainty typically is not taken explicitly into account in optimization. Rainfall design data are based on historical information about rainfall, but often these data rest on unreliable measurements or on insufficient historical records; moreover, rainfall patterns are changing regardless of the historical record. There are also other sources of uncertainty influencing design, for example, leakages in the pipes and accumulation of sediments in pipes. In the context of storm water or combined sewer system design or rehabilitation, a robust optimization technique should be able to find the best design (or rehabilitation plan) within the available budget while taking into account uncertainty in those variables that were used to design the system. In this work we consider various approaches to robust optimization proposed by various authors (Gabrel, Murat, Thiele 2013; Beyer, Sendhoff 2007) and test a novel method ROPAR (Solomatine 2012) to analyze robustness. References: Beyer, H.G., & Sendhoff, B. (2007). Robust optimization - A comprehensive survey. Comput. Methods Appl. Mech. Engrg., 3190-3218. Gabrel, V., Murat, C., & Thiele, A. (2014

  6. Seeing Objects as Faces Enhances Object Detection.

    Science.gov (United States)

    Takahashi, Kohske; Watanabe, Katsumi

    2015-10-01

    The face is a special visual stimulus. Both bottom-up processes for low-level facial features and top-down modulation by face expectations contribute to the advantages of face perception. However, it is hard to dissociate the top-down factors from the bottom-up processes, since facial stimuli mandatorily lead to face awareness. In the present study, using the face pareidolia phenomenon, we demonstrated that face awareness, namely seeing an object as a face, enhances object detection performance. In face pareidolia, some people see a visual stimulus, for example, three dots arranged in V shape, as a face, while others do not. This phenomenon allows us to investigate the effect of face awareness leaving the stimulus per se unchanged. Participants were asked to detect a face target or a triangle target. While target per se was identical between the two tasks, the detection sensitivity was higher when the participants recognized the target as a face. This was the case irrespective of the stimulus eccentricity or the vertical orientation of the stimulus. These results demonstrate that seeing an object as a face facilitates object detection via top-down modulation. The advantages of face perception are, therefore, at least partly, due to face awareness.

  7. Seeing Objects as Faces Enhances Object Detection

    Directory of Open Access Journals (Sweden)

    Kohske Takahashi

    2015-09-01

    Full Text Available The face is a special visual stimulus. Both bottom-up processes for low-level facial features and top-down modulation by face expectations contribute to the advantages of face perception. However, it is hard to dissociate the top-down factors from the bottom-up processes, since facial stimuli mandatorily lead to face awareness. In the present study, using the face pareidolia phenomenon, we demonstrated that face awareness, namely seeing an object as a face, enhances object detection performance. In face pareidolia, some people see a visual stimulus, for example, three dots arranged in V shape, as a face, while others do not. This phenomenon allows us to investigate the effect of face awareness leaving the stimulus per se unchanged. Participants were asked to detect a face target or a triangle target. While target per se was identical between the two tasks, the detection sensitivity was higher when the participants recognized the target as a face. This was the case irrespective of the stimulus eccentricity or the vertical orientation of the stimulus. These results demonstrate that seeing an object as a face facilitates object detection via top-down modulation. The advantages of face perception are, therefore, at least partly, due to face awareness.

  8. A novel non-probabilistic approach using interval analysis for robust design optimization

    International Nuclear Information System (INIS)

    Sun, Wei; Dong, Rongmei; Xu, Huanwei

    2009-01-01

    A technique for formulating the objective and constraint functions under uncertainty plays a crucial role in robust design optimization. This paper presents the first application of interval methods for reformulating the robust optimization problem. Based on interval mathematics, the original real-valued objective and constraint functions are replaced with interval-valued functions, which directly represent the upper and lower bounds of the new functions under uncertainty. The single objective function is converted into two objective functions, minimizing the mean value and the variation, and the constraint functions are reformulated with an acceptable robustness level, resulting in a bi-level mathematical model. Compared with other methods, this method is efficient and does not require a presumed probability distribution of the uncertain factors, or gradient or continuity information for the constraints. Two numerical examples are used to illustrate the validity and feasibility of the presented method
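
    The reformulation can be pictured with a few lines of interval arithmetic: propagating an uncertain coefficient through the objective yields an interval-valued objective whose midpoint and width play the roles of the mean-value and variation objectives in the bi-level model. The Interval class, the example objective, and the design value below are illustrative only.

```python
class Interval:
    """Minimal interval arithmetic: tracks [lo, hi] bounds through + and *."""
    def __init__(self, lo, hi=None):
        self.lo, self.hi = lo, (lo if hi is None else hi)
    def __add__(self, other):
        other = other if isinstance(other, Interval) else Interval(other)
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def __mul__(self, other):
        other = other if isinstance(other, Interval) else Interval(other)
        p = [self.lo * other.lo, self.lo * other.hi, self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))
    __radd__, __rmul__ = __add__, __mul__

def interval_objectives(f_interval):
    """Turn one interval-valued objective into the two objectives of the bi-level model:
    a midpoint (nominal performance proxy) and a width (variation proxy)."""
    return 0.5 * (f_interval.lo + f_interval.hi), f_interval.hi - f_interval.lo

# Illustrative design objective f(x, p) = p * x**2 + 2 * x with uncertain p in [0.9, 1.1].
x = 1.5                       # a candidate design value (assumed)
p = Interval(0.9, 1.1)        # uncertain coefficient
f = p * (x * x) + 2 * x       # interval-valued objective
mean_obj, variation_obj = interval_objectives(f)
```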

  9. Probabilistic active recognition of multiple objects using Hough-based geometric matching features

    CSIR Research Space (South Africa)

    Govender, N

    2015-01-01

    Full Text Available be recognized simultaneously, and occlusion and clutter (through distracter objects) is common. We propose a representation for object viewpoints using Hough transform based geometric matching features, which are robust in such circumstances. We show how...

  10. Object Recognition In HADOOP Using HIPI

    Directory of Open Access Journals (Sweden)

    Ankit Kumar Agrawal

    2015-07-01

    Full Text Available The amount of images and videos being shared by users is increasing exponentially, but applications that perform video analytics are severely lacking or work on limited sets of data. It is also challenging to perform analytics with low time complexity. Object recognition is the primary step in video analytics. We implement a robust method to extract objects from data which is in an unstructured format and cannot be processed directly by relational databases. In this study we present our report with results after performance evaluation and compare them with the results from MATLAB.

  11. Finding Objects for Assisting Blind People.

    Science.gov (United States)

    Yi, Chucai; Flores, Roberto W; Chincha, Ricardo; Tian, Yingli

    2013-07-01

    Computer vision technology has been widely used for blind assistance, such as navigation and wayfinding. However, few camera-based systems are developed for helping blind or visually-impaired people to find daily necessities. In this paper, we propose a prototype system of blind-assistant object finding by camera-based network and matching-based recognition. We collect a dataset of daily necessities and apply Speeded-Up Robust Features (SURF) and Scale Invariant Feature Transform (SIFT) feature descriptors to perform object recognition. Experimental results demonstrate the effectiveness of our prototype system.
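
    A matching-based recognition step of the kind described can be sketched with OpenCV: SIFT keypoints from a reference image of a daily necessity are matched against a scene image using Lowe's ratio test, and the object is declared found when enough consistent matches survive. The snippet assumes OpenCV 4.4 or newer (where SIFT ships by default; SURF still requires the contrib build) and uses illustrative thresholds.

```python
import cv2

def match_object(query_path, scene_path, ratio=0.75, min_matches=10):
    """Return (found, good_matches) for a reference object image matched against a scene."""
    sift = cv2.SIFT_create()
    img_q = cv2.imread(query_path, cv2.IMREAD_GRAYSCALE)
    img_s = cv2.imread(scene_path, cv2.IMREAD_GRAYSCALE)
    kp_q, des_q = sift.detectAndCompute(img_q, None)
    kp_s, des_s = sift.detectAndCompute(img_s, None)
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    matches = matcher.knnMatch(des_q, des_s, k=2)
    # Lowe's ratio test: keep a match only if it is clearly better than the second-best one.
    good = [m for m, *rest in matches if rest and m.distance < ratio * rest[0].distance]
    return len(good) >= min_matches, good
```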

  12. Objectivity And Moral Relativism

    OpenAIRE

    Magni, Sergio Filippo

    2017-01-01

    The relativity of morals has usually been taken as an argument against the objectivity of ethics. However, a more careful analysis can show that there are forms of moral objectivism which have relativistic implications, and that moral relativism can be compatible with the objectivity of ethics. Such an objectivity is not always in contrast to moral relativism and it is possible to be relativists without having to give up the claim of objectivity in ethics

  13. Fusion of Sensors That Interact Dynamically for Exploratory Development of Robust, Fast Object Detection and Recognition

    National Research Council Canada - National Science Library

    Bandera, Cesar

    1996-01-01

    .... The report first lays a theoretical foundation for reinforcement learning. It then introduces particular reinforcement learning algorithms in conjunction with function approximation as an efficient learning control method for visual attention...

  14. Robust synthesis for real-time systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand; Legay, Axel; Traonouez, Luois-Marie

    2014-01-01

    Specification theories for real-time systems allow reasoning about interfaces and their implementation models, using a set of operators that includes satisfaction, refinement, logical and parallel composition. To make such theories applicable throughout the entire design process from an abstract specification to an implementation, we need to reason about the possibility to effectively implement the theoretical specifications on physical systems, despite their limited precision. In the literature, this implementation problem has been linked to the robustness problem that analyzes the consequences of introducing small perturbations into formal models. We address this problem of robust implementations in timed specification theories. We first consider a fixed perturbation and study the robustness of timed specifications with respect to the operators of the theory. To this end we synthesize robust...

  15. Robust adaptive synchronization of general dynamical networks ...

    Indian Academy of Sciences (India)

    Robust adaptive synchronization; dynamical network; multiple delays; multiple uncertainties. ... Networks such as neural networks, communication transmission networks, social relationship networks etc. ..... a very good effect. Pramana – J.

  16. Technical Challenges Hindering Development of Robust Wireless ...

    African Journals Online (AJOL)

    PROF. OLIVER OSUAGWA

    2015-12-01

    Dec 1, 2015 ... challenges remain to be resolved, in designing robust wireless networks that can deliver the performance ... demonstrated the first radio transmission from the Isle of ... distances with better quality, less power, and smaller ...

  17. Multifidelity Robust Aeroelastic Design, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Nielsen Engineering & Research (NEAR) proposes a new method to generate mathematical models of wind-tunnel models and flight vehicles for robust aeroelastic...

  18. The structural robustness of multiprocessor computing system

    Directory of Open Access Journals (Sweden)

    N. Andronaty

    1996-03-01

    Full Text Available The model of a multiprocessor computing system based on transputers, which permits resolving the question of evaluating structural robustness (viability, survivability), is described.

  19. Design principles for robust oscillatory behavior.

    Science.gov (United States)

    Castillo-Hair, Sebastian M; Villota, Elizabeth R; Coronado, Alberto M

    2015-09-01

    Oscillatory responses are ubiquitous in regulatory networks of living organisms, a fact that has led to extensive efforts to study and replicate the circuits involved. However, to date, design principles that underlie the robustness of natural oscillators are not completely known. Here we study a three-component enzymatic network model in order to determine the topological requirements for robust oscillation. First, by simulating every possible topological arrangement and varying their parameter values, we demonstrate that robust oscillators can be obtained by augmenting the number of both negative feedback loops and positive autoregulations while maintaining an appropriate balance of positive and negative interactions. We then identify network motifs, whose presence in more complex topologies is a necessary condition for obtaining oscillatory responses. Finally, we pinpoint a series of simple architectural patterns that progressively render more robust oscillators. Together, these findings can help in the design of more reliable synthetic biomolecular networks and may also have implications in the understanding of other oscillatory systems.
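
    As a rough companion to the screening procedure described in the abstract, the sketch below simulates one candidate three-node negative-feedback topology (a repressilator-style loop, used here purely as a stand-in) for randomly sampled parameters and counts how often sustained oscillations appear. The equations, parameter ranges, and peak-counting test are illustrative assumptions, not the enzymatic model used in the paper.

```python
# Minimal sketch (illustrative assumptions): count how often a three-node
# negative-feedback loop oscillates under randomly sampled parameters.
import numpy as np
from scipy.integrate import odeint
from scipy.signal import find_peaks

def loop(x, t, beta, n):
    """Three-node cyclic repression ring (a stand-in for one candidate topology)."""
    x1, x2, x3 = np.maximum(x, 0.0)           # guard against tiny negative overshoot
    return [beta / (1.0 + x3**n) - x1,
            beta / (1.0 + x1**n) - x2,
            beta / (1.0 + x2**n) - x3]

def oscillates(beta, n, t_end=200.0):
    t = np.linspace(0.0, t_end, 4000)
    x = odeint(loop, [1.0, 1.5, 0.5], t, args=(beta, n))
    tail = x[len(t) // 2:, 0]                 # discard the transient
    peaks, _ = find_peaks(tail, prominence=0.05)
    return len(peaks) >= 3                    # several sustained peaks

rng = np.random.default_rng(0)
samples = [(rng.uniform(1, 50), rng.uniform(1, 4)) for _ in range(200)]
hits = sum(oscillates(b, n) for b, n in samples)
print(f"oscillatory fraction over sampled parameters: {hits / len(samples):.2f}")
```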

  20. Framework for Robustness Assessment of Timber Structures

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    2011-01-01

    This paper presents a theoretical framework for the design and analysis of robustness of timber structures. This is actualized by a more frequent use of advanced types of timber structures with limited redundancy and serious consequences in the case of failure. Combined with increased requirements...... to efficiency in design and execution followed by increased risk of human errors, this has made requirements to the robustness of new structures essential. Further, the collapse of the Ballerup Super Arena, the Bad Reichenhall Ice-Arena and a number of other structural systems during the last 10 years has...... increased the interest in robustness. Typically, modern structural design codes require that ‘the consequence of damages to structures should not be disproportional to the causes of the damages’. However, although the importance of robustness for structural design is widely recognized, the code requirements...

  1. Robust Analysis and Design of Multivariable Systems

    National Research Council Canada - National Science Library

    Tannenbaum, Allen

    1998-01-01

    In this Final Report, we will describe the work we have performed in robust control theory and nonlinear control, and the utilization of techniques in image processing and computer vision for problems in visual tracking...

  2. Robust Tracking Control for a Piezoelectric Actuator

    National Research Council Canada - National Science Library

    Salah, M; McIntyre, M; Dawson, D; Wagner, J

    2006-01-01

    In this paper, a hysteresis model-based nonlinear robust controller is developed for a piezoelectric actuator, utilizing a Lyapunov-based stability analysis, which ensures that a desired displacement...

  3. Objects in Motion

    Science.gov (United States)

    Damonte, Kathleen

    2004-01-01

    One thing scientists study is how objects move. A famous scientist named Sir Isaac Newton (1642-1727) spent a lot of time observing objects in motion and came up with three laws that describe how things move. This explanation only deals with the first of his three laws of motion. Newton's First Law of Motion says that moving objects will continue…

  4. Survivability via Control Objectives

    Energy Technology Data Exchange (ETDEWEB)

    CAMPBELL,PHILIP L.

    2000-08-11

    Control objectives open an additional front in the survivability battle. A given set of control objectives is valuable if it represents good practices, it is complete (it covers all the necessary areas), and it is auditable. CobiT and BS 7799 are two examples of control objective sets.

  5. Repurposing learning object components

    NARCIS (Netherlands)

    Verbert, K.; Jovanovic, J.; Gasevic, D.; Duval, E.; Meersman, R.

    2005-01-01

    This paper presents an ontology-based framework for repurposing learning object components. Unlike the usual practice where learning object components are assembled manually, the proposed framework enables on-the-fly access and repurposing of learning object components. The framework supports two

  6. Antecedents and Dimensions of Supply Chain Robustness

    OpenAIRE

    Durach, Christian F.; Wieland, Andreas; Machuca, Jose A.D.

    2015-01-01

    Purpose – The purpose of this paper is to provide groundwork for an emerging theory of supply chain robustness – which has been conceptualized as a dimension of supply chain resilience – through reviewing and synthesizing related yet disconnected studies. The paper develops a formal definition of supply chain robustness to build a framework that captures the dimensions, antecedents and moderators of the construct as discussed in the literature. Design/methodology/approach – The...

  7. A Unifying Mathematical Framework for Genetic Robustness, Environmental Robustness, Network Robustness and their Trade-offs on Phenotype Robustness in Biological Networks. Part III: Synthetic Gene Networks in Synthetic Biology

    Science.gov (United States)

    Chen, Bor-Sen; Lin, Ying-Po

    2013-01-01

    Robust stabilization and environmental disturbance attenuation are ubiquitous systematic properties that are observed in biological systems at many different levels. The underlying principles for robust stabilization and environmental disturbance attenuation are universal to both complex biological systems and sophisticated engineering systems. In many biological networks, network robustness should be large enough to confer: intrinsic robustness for tolerating intrinsic parameter fluctuations; genetic robustness for buffering genetic variations; and environmental robustness for resisting environmental disturbances. Network robustness is needed so that the phenotype stability of the biological network can be maintained, guaranteeing phenotype robustness. Synthetic biology is foreseen to have important applications in biotechnology and medicine; it is expected to contribute significantly to a better understanding of the functioning of complex biological systems. This paper presents a unifying mathematical framework for investigating the principles of both robust stabilization and environmental disturbance attenuation for synthetic gene networks in synthetic biology. Further, from the unifying mathematical framework, we found that the phenotype robustness criterion for synthetic gene networks is the following: if intrinsic robustness + genetic robustness + environmental robustness ≦ network robustness, then the phenotype robustness can be maintained in spite of intrinsic parameter fluctuations, genetic variations, and environmental disturbances. Therefore, the trade-offs between intrinsic robustness, genetic robustness, environmental robustness, and network robustness in synthetic biology can also be investigated through corresponding phenotype robustness criteria from the systematic point of view. Finally, a robust synthetic design that involves network evolution algorithms with desired behavior under intrinsic parameter fluctuations, genetic variations, and environmental
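
    The verbal criterion above can be written compactly; the symbols below are generic labels introduced only for readability and are not notation taken from the paper.

```latex
% Phenotype robustness criterion, restated from the abstract above.
\[
\underbrace{R_{\mathrm{intrinsic}} + R_{\mathrm{genetic}} + R_{\mathrm{environmental}}}_{\text{required tolerance}}
\;\le\; R_{\mathrm{network}}
\quad\Longrightarrow\quad \text{phenotype robustness is maintained.}
\]
```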

  8. Adaptive Critic Nonlinear Robust Control: A Survey.

    Science.gov (United States)

    Wang, Ding; He, Haibo; Liu, Derong

    2017-10-01

    Adaptive dynamic programming (ADP) and reinforcement learning are quite relevant to each other when performing intelligent optimization. They are both regarded as promising methods involving important components of evaluation and improvement, against the background of information technology, such as artificial intelligence, big data, and deep learning. Although great progress has been achieved and surveyed in addressing nonlinear optimal control problems, research on the robustness of ADP-based control strategies in uncertain environments has not been fully summarized. Hence, this survey reviews the recent main results of adaptive-critic-based robust control design of continuous-time nonlinear systems. The ADP-based nonlinear optimal regulation is reviewed, followed by robust stabilization of nonlinear systems with matched uncertainties, guaranteed cost control design of unmatched plants, and decentralized stabilization of interconnected systems. Additionally, further comprehensive discussions are presented, including event-based robust control design, improvement of the critic learning rule, nonlinear H∞ control design, and several notes on future perspectives. By applying the ADP-based optimal and robust control methods to a practical power system and an overhead crane plant, two typical examples are provided to verify the effectiveness of the theoretical results. Overall, this survey is beneficial for promoting the development of adaptive critic control methods with robustness guarantees and the construction of higher level intelligent systems.

  9. Benefits and Challenges when Performing Robust Topology Optimization for Interior Acoustic Problems

    DEFF Research Database (Denmark)

    Christiansen, Rasmus Ellebæk; Jensen, Jakob Søndergaard; Lazarov, Boyan Stefanov

    The objective of this work is to present benefits and challenges of using robust topology optimization techniques for minimizing the sound pressure in interior acoustic problems. The focus is on creating designs which maintain high performance under uniform spatial variations. This work takes offset...... in previous work considering topology optimization for interior acoustic problems, [1]. However in the previous work the robustness of the designs was not considered....

  10. Benefits and Challenges when Performing Robust Topology Optimization for Interior Acoustic Problems

    OpenAIRE

    Christiansen, Rasmus Ellebæk; Jensen, Jakob Søndergaard; Lazarov, Boyan Stefanov; Sigmund, Ole

    2014-01-01

    The objective of this work is to present benefits and challenges of using robust topology optimization techniques for minimizing the sound pressure in interior acoustic problems. The focus is on creating designs which maintain high performance under uniform spatial variations. This work takes offset in previous work considering topology optimization for interior acoustic problems, [1]. However in the previous work the robustness of the designs was not considered.

  11. Physics- and engineering knowledge-based geometry repair system for robust parametric CAD geometries

    OpenAIRE

    Li, Dong

    2012-01-01

    In modern multi-objective design optimisation, an effective geometry engine is becoming an essential tool and its performance has a significant impact on the entire process. Building a parametric geometry requires difficult compromises between the conflicting goals of robustness and flexibility. The work presents a solution for improving the robustness of parametric geometry models by capturing and modelling relative engineering knowledge into a surrogate model, and deploying it automatically...

  12. Robust Distributed Model Predictive Load Frequency Control of Interconnected Power System

    Directory of Open Access Journals (Sweden)

    Xiangjie Liu

    2013-01-01

    Full Text Available Considering the load frequency control (LFC) of a large-scale power system, a robust distributed model predictive control (RDMPC) is presented. The system uncertainty due to power system parameter variation, along with the generation rate constraints (GRC), is included in the synthesis procedure. The entire power system is composed of several control areas, and the problem is formulated as a convex optimization problem with linear matrix inequalities (LMI) that can be solved efficiently. It minimizes an upper bound on a robust performance objective for each subsystem. Simulation results show good dynamic response and robustness in the presence of power system dynamic uncertainties.

  13. Optimisation in the Design of Environmental Sensor Networks with Robustness Consideration

    Science.gov (United States)

    Budi, Setia; de Souza, Paulo; Timms, Greg; Malhotra, Vishv; Turner, Paul

    2015-01-01

    This work proposes the design of Environmental Sensor Networks (ESN) through balancing robustness and redundancy. An Evolutionary Algorithm (EA) is employed to find the optimal placement of sensor nodes in the Region of Interest (RoI). Data quality issues are introduced to simulate their impact on the performance of the ESN. Spatial Regression Test (SRT) is also utilised to promote robustness in data quality of the designed ESN. The proposed method provides high network representativeness (fit for purpose) with minimum sensor redundancy (cost), and ensures robustness by enabling the network to continue to achieve its objectives when some sensors fail. PMID:26633392
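
    A toy version of this placement search is sketched below: a small evolutionary loop selects sensor locations in a gridded region of interest so as to maximize coverage while penalizing redundant (doubly covered) points. The fitness function, sensing radius, and mutation scheme are illustrative assumptions; the SRT-based data-quality model from the paper is not reproduced.

```python
# Toy evolutionary search for sensor placement (illustrative assumptions only).
import numpy as np

rng = np.random.default_rng(1)
ROI = np.array([(x, y) for x in range(20) for y in range(20)], dtype=float)
K, RADIUS, POP, GENS = 8, 4.0, 40, 60        # sensors, range, population, generations

def fitness(sites):
    """Coverage of the RoI minus a penalty for overlapping (redundant) sensors."""
    d = np.linalg.norm(ROI[:, None, :] - sites[None, :, :], axis=2)
    coverage = np.mean(d.min(axis=1) <= RADIUS)
    overlap = np.mean(np.sort(d, axis=1)[:, 1] <= RADIUS)   # covered by >= 2 sensors
    return coverage - 0.3 * overlap

pop = [rng.uniform(0, 20, size=(K, 2)) for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    parents = pop[: POP // 2]
    children = [np.clip(p + rng.normal(0, 0.5, p.shape), 0, 20) for p in parents]
    pop = parents + children                  # elitism + Gaussian mutation

best = max(pop, key=fitness)
print("best fitness:", round(fitness(best), 3))
```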

  14. Robust optimization of supersonic ORC nozzle guide vanes

    Science.gov (United States)

    Bufi, Elio A.; Cinnella, Paola

    2017-03-01

    An efficient Robust Optimization (RO) strategy is developed for the design of 2D supersonic Organic Rankine Cycle turbine expanders. The dense gas effects are non-negligible for this application and they are taken into account by describing the thermodynamics by means of the Peng-Robinson-Stryjek-Vera equation of state. The design methodology combines an Uncertainty Quantification (UQ) loop based on a Bayesian kriging model of the system response to the uncertain parameters, used to approximate statistics (mean and variance) of the uncertain system output, a CFD solver, and a multi-objective non-dominated sorting algorithm (NSGA), also based on a Kriging surrogate of the multi-objective fitness function, along with an adaptive infill strategy for surrogate enrichment at each generation of the NSGA. The objective functions are the average and variance of the isentropic efficiency. The blade shape is parametrized by means of a Free Form Deformation (FFD) approach. The robust optimal blades are compared to the baseline design (based on the Method of Characteristics) and to a blade obtained by means of a deterministic CFD-based optimization.

  15. Objects, materiality and meaning

    DEFF Research Database (Denmark)

    Lenau, Torben Anker; Lindegaard, Hanne

    2008-01-01

    The present research work investigates the relation between physical objects, their materiality, understood as the physical substances they are made from, and the communication from the objects. In product design of physical objects the communicative aspects are just as important as the function...... of the object, and the designer's aim is therefore to tune both in order to achieve a desired goal. To do so, the designer basically has two options: alteration of the physical shape of the object and the selection of materials. Through the manipulation of shape and materials, symbolic and sensory information...... can be written into the object. The materials are therefore carriers of communication, even though this is dependent on the cultural context and the environment which the object will be part of. However, the designer has only minor influence on those....

  16. Robust reflective ghost imaging against different partially polarized thermal light

    Science.gov (United States)

    Li, Hong-Guo; Wang, Yan; Zhang, Rui-Xue; Zhang, De-Jian; Liu, Hong-Chao; Li, Zong-Guo; Xiong, Jun

    2018-03-01

    We theoretically study the influence of degree of polarization (DOP) of thermal light on the contrast-to-noise ratio (CNR) of the reflective ghost imaging (RGI), which is a novel and indirect imaging modality. An expression for the CNR of RGI with partially polarized thermal light is carefully derived, which suggests a weak dependence of CNR on the DOP, especially when the ratio of the object size to the speckle size of thermal light has a large value. Different from conventional imaging approaches, our work reveals that RGI is much more robust against the DOP of the light source, which thereby has advantages in practical applications, such as remote sensing.

  17. Robust topology optimization accounting for spatially varying manufacturing errors

    DEFF Research Database (Denmark)

    Schevenels, M.; Lazarov, Boyan Stefanov; Sigmund, Ole

    2011-01-01

    This paper presents a robust approach for the design of macro-, micro-, or nano-structures by means of topology optimization, accounting for spatially varying manufacturing errors. The focus is on structures produced by milling or etching; in this case over- or under-etching may cause parts...... optimization problem is formulated in a probabilistic way: the objective function is defined as a weighted sum of the mean value and the standard deviation of the structural performance. The optimization problem is solved by means of a Monte Carlo method: in each iteration of the optimization scheme, a Monte...
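
    The probabilistic objective described above (a weighted sum of the mean and standard deviation of the performance, estimated by Monte Carlo sampling of the manufacturing error) can be illustrated generically; the performance function, uniform error model, and weight below are placeholders rather than the paper's formulation.

```python
# Generic Monte Carlo estimate of a robust objective: mean + k * std of the
# performance under a uniform over-/under-etching error (placeholder model).
import numpy as np

rng = np.random.default_rng(0)

def performance(design, erosion):
    """Placeholder performance of an eroded/dilated design (toy quadratic)."""
    return float(np.sum((design - erosion) ** 2))

def robust_objective(design, n_samples=500, k=1.0, err_level=0.05):
    erosions = rng.uniform(-err_level, err_level, size=n_samples)
    values = np.array([performance(design, e) for e in erosions])
    return values.mean() + k * values.std()    # weighted sum used as the objective

design = rng.uniform(0.0, 1.0, size=100)       # toy density field
print("robust objective estimate:", round(robust_objective(design), 3))
```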

  18. Measure of robustness for complex networks

    Science.gov (United States)

    Youssef, Mina Nabil

    Critical infrastructures are repeatedly attacked by external triggers causing tremendous amount of damages. Any infrastructure can be studied using the powerful theory of complex networks. A complex network is composed of extremely large number of different elements that exchange commodities providing significant services. The main functions of complex networks can be damaged by different types of attacks and failures that degrade the network performance. These attacks and failures are considered as disturbing dynamics, such as the spread of viruses in computer networks, the spread of epidemics in social networks, and the cascading failures in power grids. Depending on the network structure and the attack strength, every network differently suffers damages and performance degradation. Hence, quantifying the robustness of complex networks becomes an essential task. In this dissertation, new metrics are introduced to measure the robustness of technological and social networks with respect to the spread of epidemics, and the robustness of power grids with respect to cascading failures. First, we introduce a new metric called the Viral Conductance (VCSIS ) to assess the robustness of networks with respect to the spread of epidemics that are modeled through the susceptible/infected/susceptible (SIS) epidemic approach. In contrast to assessing the robustness of networks based on a classical metric, the epidemic threshold, the new metric integrates the fraction of infected nodes at steady state for all possible effective infection strengths. Through examples, VCSIS provides more insights about the robustness of networks than the epidemic threshold. In addition, both the paradoxical robustness of Barabasi-Albert preferential attachment networks and the effect of the topology on the steady state infection are studied, to show the importance of quantifying the robustness of networks. Second, a new metric VCSIR is introduced to assess the robustness of networks with respect
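
    As a rough numerical illustration of integrating the steady-state infection level over a range of infection strengths, the sketch below uses a standard mean-field (NIMFA-style) fixed point for SIS on a small graph and a trapezoidal integral over a grid of effective infection rates. The precise integration variable, bounds, and normalization of the viral conductance follow the dissertation and are not reproduced here.

```python
# Rough illustration: steady-state SIS infection (mean-field fixed point)
# integrated over effective infection rates to give a VC-like robustness score.
import numpy as np
import networkx as nx

G = nx.barabasi_albert_graph(200, 3, seed=0)
A = nx.to_numpy_array(G)
N = A.shape[0]

def steady_state_fraction(tau, iters=400):
    """Mean-field fixed point: v_i = tau*(A v)_i / (1 + tau*(A v)_i)."""
    v = np.full(N, 0.5)
    for _ in range(iters):
        rate = tau * (A @ v)
        v = rate / (1.0 + rate)
    return v.mean()

taus = np.linspace(0.0, 1.0, 50)               # grid of effective infection rates
y = np.array([steady_state_fraction(t) for t in taus])
vc_like = np.trapz(y, taus)                    # area under the infection curve
print("VC-like robustness score:", round(vc_like, 4))
```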

  19. Robust pattern decoding in shape-coded structured light

    Science.gov (United States)

    Tang, Suming; Zhang, Xu; Song, Zhan; Song, Lifang; Zeng, Hai

    2017-09-01

    Decoding is a challenging and complex problem in a coded structured light system. In this paper, a robust pattern decoding method is proposed for the shape-coded structured light in which the pattern is designed as a grid shape with embedded geometrical shapes. In our decoding method, advancements are made in three steps. First, a multi-template feature detection algorithm is introduced to detect the feature points, which are the intersections of pairs of orthogonal grid-lines. Second, pattern element identification is modelled as a supervised classification problem and the deep neural network technique is applied for the accurate classification of pattern elements. Before that, a training dataset is established, which contains a large number of pattern elements with various blurring and distortions. Third, an error correction mechanism based on epipolar constraint, coplanarity constraint and topological constraint is presented to reduce the false matches. In the experiments, several complex objects including a human hand are chosen to test the accuracy and robustness of the proposed method. The experimental results show that our decoding method not only has high decoding accuracy, but also shows strong robustness to surface color and complex textures.

  20. A robust sound perception model suitable for neuromorphic implementation.

    Science.gov (United States)

    Coath, Martin; Sheik, Sadique; Chicca, Elisabetta; Indiveri, Giacomo; Denham, Susan L; Wennekers, Thomas

    2013-01-01

    We have recently demonstrated the emergence of dynamic feature sensitivity through exposure to formative stimuli in a real-time neuromorphic system implementing a hybrid analog/digital network of spiking neurons. This network, inspired by models of auditory processing in mammals, includes several mutually connected layers with distance-dependent transmission delays and learning in the form of spike timing dependent plasticity, which effects stimulus-driven changes in the network connectivity. Here we present results that demonstrate that the network is robust to a range of variations in the stimulus pattern, such as are found in naturalistic stimuli and neural responses. This robustness is a property critical to the development of realistic, electronic neuromorphic systems. We analyze the variability of the response of the network to "noisy" stimuli which allows us to characterize the acuity in information-theoretic terms. This provides an objective basis for the quantitative comparison of networks, their connectivity patterns, and learning strategies, which can inform future design decisions. We also show, using stimuli derived from speech samples, that the principles are robust to other challenges, such as variable presentation rate, that would have to be met by systems deployed in the real world. Finally we demonstrate the potential applicability of the approach to real sounds.

  1. Robust stabilization of nonlinear systems: The LMI approach

    Directory of Open Access Journals (Sweden)

    Šiljak D. D.

    2000-01-01

    Full Text Available This paper presents a new approach to robust quadratic stabilization of nonlinear systems within the framework of Linear Matrix Inequalities (LMI). The systems are composed of a linear constant part perturbed by an additive nonlinearity which depends discontinuously on both time and state. The only information about the nonlinearity is that it satisfies a quadratic constraint. Our major objective is to show how linear constant feedback laws can be formulated to stabilize this type of system and, at the same time, maximize the bounds on the nonlinearity which the system can tolerate without going unstable. We shall broaden the new setting to include design of decentralized control laws for robust stabilization of interconnected systems. Again, the LMI methods will be used to maximize the class of uncertain interconnections which leave the overall system connectively stable. It is useful to learn that the proposed LMI formulation “recognizes” the matching conditions by returning a feedback gain matrix for any prescribed bound on the interconnection terms. More importantly, the new formulation provides a suitable setting for robust stabilization of nonlinear systems where the nonlinear perturbations satisfy the generalized matching conditions.
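
    A minimal numerical sketch of the LMI machinery is given below for the nominal linear part only: a constant state-feedback gain K = Y Q^{-1} is obtained from the standard quadratic-stabilization inequality A Q + Q A' + B Y + Y' B' < 0 using CVXPY. The example matrices are arbitrary, and the additional terms that maximize the tolerable bound on the nonlinearity are described in the paper and omitted here.

```python
# Minimal sketch: quadratic stabilization of the nominal linear part via an LMI.
# (The paper's formulation adds terms that maximize the tolerable nonlinearity bound.)
import numpy as np
import cvxpy as cp

A = np.array([[0.0, 1.0],
              [2.0, -1.0]])                    # open-loop unstable example
B = np.array([[0.0],
              [1.0]])
n, m = B.shape

Q = cp.Variable((n, n), symmetric=True)
Y = cp.Variable((m, n))
eps = 1e-3
lmi = A @ Q + Q @ A.T + B @ Y + Y.T @ B.T
constraints = [Q >> eps * np.eye(n),
               0.5 * (lmi + lmi.T) << -eps * np.eye(n)]   # symmetrized for safety
cp.Problem(cp.Minimize(0), constraints).solve()

K = Y.value @ np.linalg.inv(Q.value)           # stabilizing state feedback u = K x
print("closed-loop eigenvalues:", np.linalg.eigvals(A + B @ K))
```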

  2. Robust Visual Tracking Using the Bidirectional Scale Estimation

    Directory of Open Access Journals (Sweden)

    An Zhiyong

    2017-01-01

    Full Text Available Object tracking with robust scale estimation is a challenging task in computer vision. This paper presents a novel tracking algorithm that learns the translation and scale filters with a complementary scheme. The translation filter is constructed using ridge regression and multidimensional features. A robust scale filter is constructed by bidirectional scale estimation, including the forward scale and backward scale. Firstly, we learn the scale filter using the forward tracking information. Then the forward scale and backward scale can be estimated using the respective scale filter. Secondly, a conservative strategy is adopted to compromise the forward and backward scales. Finally, the scale filter is updated based on the final scale estimation. Updating the scale filter is effective, since a stable scale estimate improves the filter's performance. To reveal the effectiveness of our tracker, experiments are performed on 32 sequences with significant scale variation and on the benchmark dataset with 50 challenging videos. Our results show that the proposed tracker outperforms several state-of-the-art trackers in terms of robustness and accuracy.

  3. Robust linear discriminant analysis with distance based estimators

    Science.gov (United States)

    Lim, Yai-Fung; Yahaya, Sharipah Soaad Syed; Ali, Hazlina

    2017-11-01

    Linear discriminant analysis (LDA) is one of the supervised classification techniques concerning the relationship between a categorical variable and a set of continuous variables. The main objective of LDA is to create a function to distinguish between populations and to allocate future observations to previously defined populations. Under the assumptions of normality and homoscedasticity, LDA yields the optimal linear discriminant rule (LDR) between two or more groups. However, the optimality of LDA highly relies on the sample mean and pooled sample covariance matrix, which are known to be sensitive to outliers. To alleviate these problems, a new robust LDA using distance based estimators known as minimum variance vector (MVV) has been proposed in this study. The MVV estimators were used to substitute the classical sample mean and classical sample covariance to form a robust linear discriminant rule (RLDR). Simulation and real data studies were conducted to examine the performance of the proposed RLDR measured in terms of misclassification error rates. The computational results showed that the proposed RLDR is better than the classical LDR and comparable with the existing robust LDR.
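
    Since the MVV estimator itself is not reproduced here, the sketch below illustrates the same substitution idea with scikit-learn's Minimum Covariance Determinant (MCD) as a stand-in robust location/scatter estimator inside a two-group linear discriminant rule.

```python
# Sketch of a robust linear discriminant rule: replace the classical mean and
# pooled covariance with robust estimates (MCD used here as a stand-in for MVV).
import numpy as np
from sklearn.covariance import MinCovDet

def robust_ldr(X1, X2):
    """Return a classifier h(x) in {1, 2} built from robust location/scatter."""
    est1, est2 = MinCovDet().fit(X1), MinCovDet().fit(X2)
    m1, m2 = est1.location_, est2.location_
    n1, n2 = len(X1), len(X2)
    pooled = ((n1 - 1) * est1.covariance_ + (n2 - 1) * est2.covariance_) / (n1 + n2 - 2)
    w = np.linalg.solve(pooled, m1 - m2)        # discriminant direction
    c = w @ (m1 + m2) / 2.0                     # midpoint threshold
    return lambda x: 1 if x @ w > c else 2

rng = np.random.default_rng(0)
X1 = rng.normal([0.0, 0.0], 1.0, size=(100, 2))
X2 = rng.normal([3.0, 3.0], 1.0, size=(100, 2))
X1[:5] += 20.0                                  # gross outliers in group 1
classify = robust_ldr(X1, X2)
print(classify(np.array([0.2, 0.1])), classify(np.array([2.9, 3.2])))  # 1, 2
```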

  4. A Robust Sound Perception Model Suitable for Neuromorphic Implementation

    Directory of Open Access Journals (Sweden)

    Martin eCoath

    2014-01-01

    Full Text Available We have recently demonstrated the emergence of dynamic feature sensitivity through exposure to formative stimuli in a real-time neuromorphic system implementing a hybrid analogue/digital network of spiking neurons. This network, inspired by models of auditory processing in mammals, includes several mutually connected layers with distance-dependent transmission delays and learning in the form of spike timing dependent plasticity, which effects stimulus-driven changes in the network connectivity. Here we present results that demonstrate that the network is robust to a range of variations in the stimulus pattern, such as are found in naturalistic stimuli and neural responses. This robustness is a property critical to the development of realistic, electronic neuromorphic systems. We analyse the variability of the response of the network to `noisy' stimuli which allows us to characterize the acuity in information-theoretic terms. This provides an objective basis for the quantitative comparison of networks, their connectivity patterns, and learning strategies, which can inform future design decisions. We also show, using stimuli derived from speech samples, that the principles are robust to other challenges, such as variable presentation rate, that would have to be met by systems deployed in the real world. Finally we demonstrate the potential applicability of the approach to real sounds.

  5. Early object relations into new objects.

    Science.gov (United States)

    Downey, T W

    2001-01-01

    Two strands of change are suggested by this review, one maturational, the other therapeutic or developmental (Hartmann and Kris, 1945). By "maturational" I mean to suggest energies that infuse the individual from earliest life in a manner that includes object relations, but for the healthy exercise of which object relations per se need not be of central and crucial importance. Within wide limits such energies may be delayed until growth conditions prevail without significant distortion of certain of the organism's ego functions. Therapeutic change is analogous to developmental change in that both involve the crucial presence of another to release energies. In therapeutic change these are energies that have been repressed beyond the reach of developmental dynamics. In everyday development crisis and synthesis alternate in conjunction with new and emerging objects to add to the psychological structures brought to the fore by maturation. In many instances, as we see with John, over time and in a less focussed manner, developmental changes can approximate therapeutic change and vice versa. Freud-Dann in their "experiment" pursued one line, in which the equipmental delay brought on by extremely adverse living circumstances was redressed by providing an interpersonally enriching, loving, developmentally facilitating milieu. The sketches of individual children and John's subsequent story provide a perspective into what becomes the stuff of growth and what remains the stuff of neurosis. The developmental reserves and ego resilience of these children were impressive but probably not extraordinary. Usual growth ensued as soon as they were provided with the rich soil of Bulldogs Bank instead of the desert sand of the Tereszin concentration camp. However, no one can escape such adverse circumstances without having taken in the stuff of neurosis. Affects and percepts that were not assimilatable or even available to consciousness at the time remain buried in the unconscious

  6. Reasoning about Function Objects

    Science.gov (United States)

    Nordio, Martin; Calcagno, Cristiano; Meyer, Bertrand; Müller, Peter; Tschannen, Julian

    Modern object-oriented languages support higher-order implementations through function objects such as delegates in C#, agents in Eiffel, or closures in Scala. Function objects bring a new level of abstraction to the object-oriented programming model, and require a comparable extension to specification and verification techniques. We introduce a verification methodology that extends function objects with auxiliary side-effect free (pure) methods to model logical artifacts: preconditions, postconditions and modifies clauses. These pure methods can be used to specify client code abstractly, that is, independently from specific instantiations of the function objects. To demonstrate the feasibility of our approach, we have implemented an automatic prover, which verifies several non-trivial examples.
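
    The idea of attaching pure specification methods to a function object can be mimicked outside the languages mentioned above; the sketch below is a loose Python illustration (not the paper's formalism or prover), where a wrapper object carries side-effect-free pre- and postcondition predicates that clients can consult independently of the wrapped implementation.

```python
# Loose illustration: a function object carrying pure precondition/postcondition
# predicates alongside its implementation (not the paper's verification methodology).
class SpecifiedFunction:
    def __init__(self, impl, pre, post):
        self.impl = impl              # the wrapped callable
        self.pre = pre                # pure predicate over the arguments
        self.post = post              # pure predicate over (arguments, result)

    def __call__(self, *args):
        assert self.pre(*args), "precondition violated"
        result = self.impl(*args)
        assert self.post(args, result), "postcondition violated"
        return result

# A client can reason about any SpecifiedFunction abstractly via pre/post alone.
safe_sqrt = SpecifiedFunction(
    impl=lambda x: x ** 0.5,
    pre=lambda x: x >= 0,
    post=lambda args, r: abs(r * r - args[0]) < 1e-9,
)

print(safe_sqrt(2.0))                 # passes both checks
# safe_sqrt(-1.0)                     # would raise: precondition violated
```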

  7. Birth of the Object: Detection of Objectness and Extraction of Object Shape through Object Action Complexes

    DEFF Research Database (Denmark)

    Kraft, Dirk; Pugeault, Nicolas; Baseski, Emre

    2008-01-01

    We describe a process in which the segmentation of objects as well as the extraction of the object shape becomes realized through active exploration of a robot vision system. In the exploration process, two behavioral modules that link robot actions to the visual and haptic perception of objects...... interact. First, by making use of an object independent grasping mechanism, physical control over potential objects can be gained. Having evaluated the initial grasping mechanism as being successful, a second behavior extracts the object shape by making use of prediction based on the motion induced...... system, knowledge about its own embodiment as well as knowledge about geometric relationships such as rigid body motion. This prior knowledge allows the extraction of representations that are semantically richer compared to many other approaches....

  8. Herbig-Haro objects

    International Nuclear Information System (INIS)

    Schwartz, R.D.

    1983-01-01

    Progress in the understanding of Herbig-Haro (HH) objects is reviewed. The results of optical studies of the proper motions and alignments, variability, and polarization of HH objects and the results of spectroscopic studies are discussed. Ground-based infrared studies and far-infrared observations are reviewed. Findings on the properties of molecular clouds associated with HH objects, on gas flows associated with HH IR stars, on maser emission, and on radio continuum observations are considered. A history of proposed excitation mechanisms for HH objects is briefly presented, and the salient shock-wave calculations aimed at synthesizing the spectra of HH objects are summarized along with hypotheses that have been advanced about the origin of the objects. 141 references

  9. Propelling Extended Objects

    Science.gov (United States)

    Humbert, Richard

    2010-01-01

    A force acting on just part of an extended object (either a solid or a volume of a liquid) can cause all of it to move. That motion is due to the transmission of the force through the object by its material. This paper discusses how the force is distributed to all of the object by a gradient of stress or pressure in it, which creates the local…

  10. BL Lacertae objects

    International Nuclear Information System (INIS)

    Miller, J.S.

    1978-01-01

    An overview is given of the principal characteristics and problems associated with the prototype BL Lacertae. The most important characteristics of this group and its relevance, the consideration of a few particular objects in moderate detail, the relation between these objects, QSOs, and normal galaxies, and finally the possible physical nature of BL Lac objects and the important questions they raise are treated. 15 references

  11. Objective-C

    CERN Document Server

    DeVoe, Jiva

    2011-01-01

    A soup-to-nuts guide on the Objective-C programming language. Objective-C is the language behind Cocoa and Cocoa Touch, the framework for applications written for the Macintosh, iPod touch, iPhone, and iPad platforms. Part of the Developer Reference series covering the hottest Apple topics, this book covers everything from the basics of the C language to advanced aspects of Apple development. You'll examine Objective-C and high-level subjects of frameworks, threading, networking, and much more: covers the basics of the C language and then quickly moves on to Objective-C and more advanc

  12. Abstract Objects of Verbs

    DEFF Research Database (Denmark)

    2014-01-01

    Verbs do often take arguments of quite different types. In an orthodox type-theoretic framework this results in an extreme polysemy of many verbs. In this article, it is shown that this unwanted consequence can be avoided when a theory of "abstract objects" is adopted according to which...... these objects represent non-objectual entities in contexts from which they are excluded by type restrictions. Thus these objects are "abstract" in a functional rather than in an ontological sense: they function as representatives of other entities but they are otherwise quite normal objects. Three examples...

  13. Robust adaptive control for Unmanned Aerial Vehicles

    Science.gov (United States)

    Kahveci, Nazli E.

    The objective of meeting higher endurance requirements remains a challenging task for any type and size of Unmanned Aerial Vehicles (UAVs). According to recent research studies significant energy savings can be realized through utilization of thermal currents. The navigation strategies followed across thermal regions, however, are based on rather intuitive assessments of remote pilots and lack any systematic path planning approaches. Various methods to enhance the autonomy of UAVs in soaring applications are investigated while seeking guarantees for flight performance improvements. The dynamics of the aircraft, small UAVs in particular, are affected by the environmental conditions, whereas unmodeled dynamics possibly become significant during aggressive flight maneuvers. Besides, the demanded control inputs might have a magnitude range beyond the limits dictated by the control surface actuators. The consequences of ignoring these issues can be catastrophic. Supporting this claim NASA Dryden Flight Research Center reports considerable performance degradation and even loss of stability in autonomous soaring flight tests with the subsequent risk of an aircraft crash. The existing control schemes are concluded to suffer from limited performance. Considering the aircraft dynamics and the thermal characteristics we define a vehicle-specific trajectory optimization problem to achieve increased cross-country speed and extended range of flight. In an environment with geographically dispersed set of thermals of possibly limited lifespan, we identify the similarities to the Vehicle Routing Problem (VRP) and provide both exact and approximate guidance algorithms for the navigation of automated UAVs. An additional stochastic approach is used to quantify the performance losses due to incorrect thermal data while dealing with random gust disturbances and onboard sensor measurement inaccuracies. One of the main contributions of this research is a novel adaptive control design with

  14. Robustness analysis of chiller sequencing control

    International Nuclear Information System (INIS)

    Liao, Yundan; Sun, Yongjun; Huang, Gongsheng

    2015-01-01

    Highlights: • Uncertainties with chiller sequencing control were systematically quantified. • Robustness of chiller sequencing control was systematically analyzed. • Different sequencing control strategies were sensitive to different uncertainties. • A numerical method was developed for easy selection of chiller sequencing control. - Abstract: Multiple-chiller plants are commonly employed in heating, ventilating and air-conditioning systems to increase operational feasibility and energy efficiency under part-load conditions. In a multiple-chiller plant, chiller sequencing control plays a key role in achieving overall energy efficiency while not sacrificing the cooling sufficiency for indoor thermal comfort. Various sequencing control strategies have been developed and implemented in practice. Based on the observation that (i) uncertainty, which cannot be avoided in chiller sequencing control, has a significant impact on the control performance and may cause the control to fail to achieve the expected control and/or energy performance; and (ii) in the current literature few studies have systematically addressed this issue, this paper therefore presents a study on robustness analysis of chiller sequencing control in order to understand the robustness of various chiller sequencing control strategies under different types of uncertainty. Based on the robustness analysis, a simple and applicable method is developed to select the most robust control strategy for a given chiller plant in the presence of uncertainties, which will be verified using case studies

  15. Information theory perspective on network robustness

    International Nuclear Information System (INIS)

    Schieber, Tiago A.; Carpi, Laura; Frery, Alejandro C.; Rosso, Osvaldo A.; Pardalos, Panos M.; Ravetti, Martín G.

    2016-01-01

    A crucial challenge in network theory is the study of the robustness of a network when facing a sequence of failures. In this work, we propose a dynamical definition of network robustness based on Information Theory that considers measurements of the structural changes caused by failures of the network's components. Failures are defined here as a temporal process defined in a sequence. Robustness is then evaluated by measuring dissimilarities between topologies after each time step of the sequence, providing dynamical information about the topological damage. We thoroughly analyze the efficiency of the method in capturing small perturbations by considering different probability distributions on networks. In particular, we find that distributions based on distances are more consistent in capturing network structural deviations, as they better reflect the consequences of the failures. Theoretical examples and real networks are used to study the performance of this methodology. - Highlights: • A novel methodology to measure the robustness of a network to component failure or targeted attacks is proposed. • The use of the network's distance PDF allows a precise analysis. • The method provides a dynamic robustness profile showing the response of the topology to each failure event. • The measure is capable of detecting the network's critical elements.
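
    A rough sketch of the distance-distribution idea: remove nodes one at a time and measure the dissimilarity between the shortest-path-length distributions before and after each failure. The graph, the hub-first failure sequence, and the use of the Jensen-Shannon distance are illustrative assumptions rather than the paper's exact measure.

```python
# Sketch: dynamic robustness profile from the dissimilarity between shortest-path
# length distributions before and after each node failure (illustrative only).
import numpy as np
import networkx as nx
from scipy.spatial.distance import jensenshannon

def distance_pdf(G, max_len=15):
    """Histogram of finite shortest-path lengths on a fixed support."""
    counts = np.zeros(max_len + 1)
    for _, lengths in nx.shortest_path_length(G):
        for d in lengths.values():
            if d > 0:
                counts[min(d, max_len)] += 1
    total = counts.sum()
    return counts / total if total > 0 else counts

G = nx.erdos_renyi_graph(100, 0.06, seed=0)
previous = distance_pdf(G)
profile = []
for node, _deg in sorted(G.degree, key=lambda kv: -kv[1])[:10]:   # fail hubs first
    G.remove_node(node)
    current = distance_pdf(G)
    profile.append(jensenshannon(previous, current))
    previous = current                          # compare consecutive topologies

print("dissimilarity after each failure:", [round(v, 3) for v in profile])
```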

  16. Replication and robustness in developmental research.

    Science.gov (United States)

    Duncan, Greg J; Engel, Mimi; Claessens, Amy; Dowsett, Chantelle J

    2014-11-01

    Replications and robustness checks are key elements of the scientific method and a staple in many disciplines. However, leading journals in developmental psychology rarely include explicit replications of prior research conducted by different investigators, and few require authors to establish in their articles or online appendices that their key results are robust across estimation methods, data sets, and demographic subgroups. This article makes the case for prioritizing both explicit replications and, especially, within-study robustness checks in developmental psychology. It provides evidence on variation in effect sizes in developmental studies and documents strikingly different replication and robustness-checking practices in a sample of journals in developmental psychology and a sister behavioral science, applied economics. Our goal is not to show that any one behavioral science has a monopoly on best practices, but rather to show how journals from a related discipline address vital concerns of replication and generalizability shared by all social and behavioral sciences. We provide recommendations for promoting graduate training in replication and robustness-checking methods and for editorial policies that encourage these practices. Although some of our recommendations may shift the form and substance of developmental research articles, we argue that they would generate considerable scientific benefits for the field. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  17. Emergence of robustness in networks of networks

    Science.gov (United States)

    Roth, Kevin; Morone, Flaviano; Min, Byungjoon; Makse, Hernán A.

    2017-06-01

    A model of interdependent networks of networks (NONs) was introduced recently [Proc. Natl. Acad. Sci. (USA) 114, 3849 (2017), 10.1073/pnas.1620808114] in the context of brain activation to identify the neural collective influencers in the brain NON. Here we investigate the emergence of robustness in such a model, and we develop an approach to derive an exact expression for the random percolation transition in Erdös-Rényi NONs of this kind. Analytical calculations are in agreement with numerical simulations, and highlight the robustness of the NON against random node failures, which thus presents a new robust universality class of NONs. The key aspect of this robust NON model is that a node can be activated even if it does not belong to the giant mutually connected component, thus allowing the NON to be built from below the percolation threshold, which is not possible in previous models of interdependent networks. Interestingly, the phase diagram of the model unveils particular patterns of interconnectivity for which the NON is most vulnerable, thereby marking the boundary above which the robustness of the system improves with increasing dependency connections.

  18. Primal and dual approaches to adjustable robust optimization

    NARCIS (Netherlands)

    de Ruiter, Frans

    2018-01-01

    Robust optimization has become an important paradigm to deal with optimization under uncertainty. Adjustable robust optimization is an extension that deals with multistage problems. This thesis starts with a short but comprehensive introduction to adjustable robust optimization. Then the two

  19. Manifold-Based Visual Object Counting.

    Science.gov (United States)

    Wang, Yi; Zou, Yuexian; Wang, Wenwu

    2018-07-01

    Visual object counting (VOC) is an emerging area in computer vision which aims to estimate the number of objects of interest in a given image or video. Recently, object-density-based estimation methods have been shown to be promising for object counting as well as rough instance localization. However, the performance of this method tends to degrade when dealing with new objects and scenes. To address this limitation, we propose a manifold-based method for visual object counting (M-VOC), based on the manifold assumption that similar image patches share similar object densities. Firstly, the local geometry of a given image patch is represented linearly by its neighbors using a predefined patch training set, and the object density of this given image patch is reconstructed by preserving the local geometry using locally linear embedding. To improve the characterization of local geometry, additional constraints such as sparsity and non-negativity are also considered via regularization, nonlinear mapping, and the kernel trick. Compared with the state-of-the-art VOC methods, our proposed M-VOC methods achieve competitive performance on seven benchmark datasets. Experiments verify that the proposed M-VOC methods have several favorable properties, such as robustness to the variation in the size of the training dataset and image resolution, as often encountered in real-world VOC applications.
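
    The local reconstruction step can be sketched in a few lines: solve for locally linear embedding weights of a query patch over its k nearest training patches (sum-to-one constrained least squares) and transfer the same weights to the neighbours' density maps. Feature extraction, the extra sparsity/non-negativity constraints, and the benchmark data are outside this sketch, and the regularization below is a common default rather than the paper's choice.

```python
# Sketch of the LLE-style transfer: reconstruct a query patch from its k nearest
# training patches and reuse the same weights to predict its object density.
import numpy as np

def lle_weights(query, neighbors, reg=1e-3):
    """Sum-to-one weights minimizing ||query - w @ neighbors||^2 (regularized)."""
    Z = neighbors - query                       # shift neighbors to the query point
    C = Z @ Z.T
    C += reg * np.trace(C) * np.eye(len(neighbors))    # regularize the local Gram
    w = np.linalg.solve(C, np.ones(len(neighbors)))
    return w / w.sum()

def predict_density(query_patch, train_patches, train_densities, k=5):
    idx = np.argsort(np.linalg.norm(train_patches - query_patch, axis=1))[:k]
    w = lle_weights(query_patch, train_patches[idx])
    return w @ train_densities[idx]             # weighted sum of neighbor densities

rng = np.random.default_rng(0)
train_patches = rng.random((500, 64))           # toy patch features
train_densities = rng.random((500, 64))         # toy per-patch density maps
query = rng.random(64)
pred = predict_density(query, train_patches, train_densities)
print("predicted density sum:", round(float(pred.sum()), 3))
```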

  20. Programs as Data Objects

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the Second Symposium on Programs as Data Objects, PADO 2001, held in Aarhus, Denmark, in May 2001. The 14 revised full papers presented were carefully reviewed and selected from 30 submissions. Various aspects of looking at programs as data objects...... are covered from the point of view of program analysis, program transformation, computational complexity, etc....

  1. Exhibiting Epistemic Objects

    DEFF Research Database (Denmark)

    Tybjerg, Karin

    2017-01-01

    of exhibiting epistemic objects that utilize their knowledge-generating potential and allow them to continue to stimulate curiosity and generate knowledge in the exhibition. The epistemic potential of the objects can then be made to work together with the function of the exhibition as a knowledge-generating set...

  2. Object permanence in lemurs.

    Science.gov (United States)

    Deppe, Anja M; Wright, Patricia C; Szelistowski, William A

    2009-03-01

    Object permanence, the ability to mentally represent objects that have disappeared from view, should be advantageous to animals in their interaction with the natural world. The objective of this study was to examine whether lemurs possess object permanence. Thirteen adult subjects representing four species of diurnal lemur (Eulemur fulvus rufus, Eulemur mongoz, Lemur catta and Hapalemur griseus) were presented with seven standard Piagetian visible and invisible object displacement tests, plus one single visible test where the subject had to wait predetermined times before allowed to search, and two invisible tests where each hiding place was made visually unique. In all visible tests lemurs were able to find an object that had been in clear view before being hidden. However, when lemurs were not allowed to search for up to 25-s, performance declined with increasing time-delay. Subjects did not outperform chance on any invisible displacements regardless of whether hiding places were visually uniform or unique, therefore the upper limit of object permanence observed was Stage 5b. Lemur species in this study eat stationary foods and are not subject to stalking predators, thus Stage 5 object permanence is probably sufficient to solve most problems encountered in the wild.

  3. Investigating Music Information Objects

    Science.gov (United States)

    Weissenberger, Lynnsey K.

    2016-01-01

    This dissertation, titled "Investigating Music Information Objects," is a study of the nature, description, representations, and ideas related to music information objects (MIOs). This research study investigates how music practitioners from various traditions describe and conceptualize MIOs, using a theoretical framework to classify…

  4. Gamifying Video Object Segmentation.

    Science.gov (United States)

    Spampinato, Concetto; Palazzo, Simone; Giordano, Daniela

    2017-10-01

    Video object segmentation can be considered as one of the most challenging computer vision problems. Indeed, so far, no existing solution is able to effectively deal with the peculiarities of real-world videos, especially in cases of articulated motion and object occlusions; limitations that appear more evident when we compare the performance of automated methods with the human one. However, manually segmenting objects in videos is largely impractical as it requires a lot of time and concentration. To address this problem, in this paper we propose an interactive video object segmentation method, which exploits, on one hand, the capability of humans to identify correctly objects in visual scenes, and on the other hand, the collective human brainpower to solve challenging and large-scale tasks. In particular, our method relies on a game with a purpose to collect human inputs on object locations, followed by an accurate segmentation phase achieved by optimizing an energy function encoding spatial and temporal constraints between object regions as well as human-provided location priors. Performance analysis carried out on complex video benchmarks, and exploiting data provided by over 60 users, demonstrated that our method shows a better trade-off between annotation times and segmentation accuracy than interactive video annotation and automated video object segmentation approaches.

  5. Objects of Desire.

    Science.gov (United States)

    Zielinski, Dave

    2000-01-01

    Describes learning objects, also known as granules, chunks, or information nuggets, and likens them to help screens. Discusses concerns about how they can go wrong: (1) faulty pretest questions; (2) missing links in the learning object chain; (3) poor frames of reference; and (4) lack of customization. (JOW)

  6. Per Object statistical analysis

    DEFF Research Database (Denmark)

    2008-01-01

    of a specific class in turn, and uses a pair of PPO stages to derive the statistics and then assign them to the objects' Object Variables. It may be that this could all be done in some other, simpler way, but several other ways that were tried did not succeed. The procedure output has been tested against...

  7. On Objects and Events

    DEFF Research Database (Denmark)

    Eugster, Patrick Thomas; Guerraoui, Rachid; Damm, Christian Heide

    2001-01-01

    This paper presents linguistic primitives for publish/subscribe programming using events and objects. We integrate our primitives into a strongly typed object-oriented language through four mechanisms: (1) serialization, (2) multiple subtyping, (3) closures, and (4) deferred code evaluation. We...

  8. Stability of multihypernuclear objects

    International Nuclear Information System (INIS)

    Ikram, M.; Rather, Asloob A.; Usmani, A.A.; Patra, S.K.

    2016-01-01

    In the present work, we analyze the stability of multi-hypernuclear objects with a high strangeness content. The aim of this work is to test the stability of such objects, which might be produced in heavy-ion reactions. Studies of such systems might have great implications for nuclear astrophysics.

  9. Cultivating objects in interaction

    DEFF Research Database (Denmark)

    Hazel, Spencer

    2014-01-01

    This chapter explores patterns of repeated orientations to physical objects in interactants' visuo-spatial and haptic surround. A number of examples are presented from advice-giving activities in various institutional settings, where participants-in-interaction initially draw on material objects...

  10. Piles of objects

    KAUST Repository

    Hsu, Shu-Wei; Keyser, John

    2010-01-01

    We present a method for directly modeling piles of objects in multi-body simulations. Piles of objects represent some of the more interesting, but also most time-consuming portion of simulation. We propose a method for reducing computation in many

  11. Object oriented programming

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1990-01-01

    This paper is an introduction to object oriented programming techniques. It tries to explain the concepts by using analogies with traditional programming. The object oriented approach is not inherently difficult, but most programmers find a relatively high threshold in learning it. Thus, this paper will attempt to convey the concepts with examples rather than explain the formal theory.

  12. Occupant behaviour and robustness of building design

    DEFF Research Database (Denmark)

    Buso, Tiziana; Fabi, Valentina; Andersen, Rune Korsholm

    2015-01-01

    in a dynamic building energy simulation tool (IDA ICE). The analysis was carried out by simulating 15 building envelope designs in different thermal zones of an Office Reference Building in 3 climates: Stockholm, Frankfurt and Athens.In general, robustness towards changes in occupants' behaviour increased......Occupant behaviour can cause major discrepancies between the designed and the real total energy use in buildings. A possible solution to reduce the differences between predictions and actual performances is designing robust buildings, i.e. buildings whose performances show little variations...... with alternating occupant behaviour patterns. The aim of this work was to investigate how alternating occupant behaviour patterns impact the performance of different envelope design solutions in terms of building robustness. Probabilistic models of occupants' window opening and use of shading were implemented...

  13. Robust Mediation Analysis Based on Median Regression

    Science.gov (United States)

    Yuan, Ying; MacKinnon, David P.

    2014-01-01

    Mediation analysis has many applications in psychology and the social sciences. The most prevalent methods typically assume that the error distribution is normal and homoscedastic. However, this assumption may rarely be met in practice, which can affect the validity of the mediation analysis. To address this problem, we propose robust mediation analysis based on median regression. Our approach is robust to various departures from the assumption of homoscedasticity and normality, including heavy-tailed, skewed, contaminated, and heteroscedastic distributions. Simulation studies show that under these circumstances, the proposed method is more efficient and powerful than standard mediation analysis. We further extend the proposed robust method to multilevel mediation analysis, and demonstrate through simulation studies that the new approach outperforms the standard multilevel mediation analysis. We illustrate the proposed method using data from a program designed to increase reemployment and enhance mental health of job seekers. PMID:24079925
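
    A minimal sketch of the median-regression version of the classic two-equation mediation estimate is shown below, using statsmodels' quantile regression at the 0.5 quantile; the data are simulated with heavy-tailed errors, and inference (e.g., bootstrap confidence intervals for the indirect effect) is omitted.

```python
# Sketch: indirect (mediated) effect a*b estimated with median regression.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)                                   # treatment
m = 0.6 * x + rng.standard_t(df=3, size=n)               # mediator, heavy-tailed noise
y = 0.5 * m + 0.2 * x + rng.standard_t(df=3, size=n)     # outcome

a_fit = sm.QuantReg(m, sm.add_constant(x)).fit(q=0.5)                        # path a
b_fit = sm.QuantReg(y, sm.add_constant(np.column_stack([x, m]))).fit(q=0.5)  # path b

a = a_fit.params[1]          # coefficient of x in the mediator equation
b = b_fit.params[2]          # coefficient of m in the outcome equation
print("indirect effect a*b:", round(a * b, 3))
```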

  14. Robustness of Distance-to-Default

    DEFF Research Database (Denmark)

    Jessen, Cathrine; Lando, David

    2013-01-01

    Distance-to-default is a remarkably robust measure for ranking firms according to their risk of default. The ranking seems to work despite the fact that the Merton model from which the measure is derived produces default probabilities that are far too small when applied to real data. We use...... simulations to investigate the robustness of the distance-to-default measure to different model specifications. Overall we find distance-to-default to be robust to a number of deviations from the simple Merton model that involve different asset value dynamics and different default triggering mechanisms....... A notable exception is a model with stochastic volatility of assets. In this case both the ranking of firms and the estimated default probabilities using distance-to-default perform significantly worse. We therefore propose a volatility adjustment of the distance-to-default measure, that significantly...
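
    For reference, the basic Merton-style distance-to-default that the paper perturbs can be computed directly; the numbers below are arbitrary, and in practice the asset value and volatility would be backed out from equity data rather than taken as given.

```python
# Basic Merton-style distance-to-default (inputs are illustrative only).
import numpy as np
from scipy.stats import norm

def distance_to_default(V, D, mu, sigma, T=1.0):
    """DD = [ln(V/D) + (mu - 0.5*sigma^2) * T] / (sigma * sqrt(T))."""
    return (np.log(V / D) + (mu - 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))

V, D, mu, sigma = 120.0, 100.0, 0.06, 0.25   # assets, debt face value, drift, volatility
dd = distance_to_default(V, D, mu, sigma)
print("distance-to-default:", round(dd, 3))
print("implied default probability:", round(float(norm.cdf(-dd)), 4))
```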

  15. Robust Portfolio Optimization using CAPM Approach

    Directory of Open Access Journals (Sweden)

    mohsen gharakhani

    2013-08-01

    Full Text Available In this paper, a new robust model of multi-period portfolio problem has been developed. One of the key concerns in any asset allocation problem is how to cope with uncertainty about future returns. There are some approaches in the literature for this purpose including stochastic programming and robust optimization. Applying these techniques to multi-period portfolio problem may increase the problem size in a way that the resulting model is intractable. In this paper, a novel approach has been proposed to formulate multi-period portfolio problem as an uncertain linear program assuming that asset return follows the single-index factor model. Robust optimization technique has been also used to solve the problem. In order to evaluate the performance of the proposed model, a numerical example has been applied using simulated data.

  16. Robustness of quantum correlations against linear noise

    International Nuclear Information System (INIS)

    Guo, Zhihua; Cao, Huaixin; Qu, Shixian

    2016-01-01

    Relative robustness of quantum correlations (RRoQC) of a bipartite state is firstly introduced relative to a classically correlated state. Robustness of quantum correlations (RoQC) of a bipartite state is then defined as the minimum of RRoQC of the state relative to all classically correlated ones. It is proved that as a function on quantum states, RoQC is nonnegative, lower semi-continuous and neither convex nor concave; especially, it is zero if and only if the state is classically correlated. Thus, RoQC not only quantifies the endurance of quantum correlations of a state against linear noise, but also can be used to distinguish between quantum and classically correlated states. Furthermore, the effects of local quantum channels on the robustness are explored and characterized. (paper)

  17. Parametric uncertainty modeling for robust control

    DEFF Research Database (Denmark)

    Rasmussen, K.H.; Jørgensen, Sten Bay

    1999-01-01

    The dynamic behaviour of a non-linear process can often be approximated with a time-varying linear model. In the presented methodology the dynamics is modeled non-conservatively as parametric uncertainty in linear time invariant models. The obtained uncertainty description makes it possible to perform robustness analysis on a control system using the structured singular value. The idea behind the proposed method is to fit a rational function to the parameter variation. The parameter variation can then be expressed as a linear fractional transformation (LFT). It is discussed how the proposed... point changes. It is shown that a diagonal PI control structure provides robust performance towards variations in feed flow rate or feed concentrations. However, including both liquid and vapor flow delays, robust performance specifications cannot be satisfied with this simple diagonal control structure...

  18. Incentive-Compatible Robust Line Planning

    Science.gov (United States)

    Bessas, Apostolos; Kontogiannis, Spyros; Zaroliagis, Christos

    The problem of robust line planning requests for a set of origin-destination paths (lines) along with their frequencies in an underlying railway network infrastructure, which are robust to fluctuations of real-time parameters of the solution. In this work, we investigate a variant of robust line planning stemming from recent regulations in the railway sector that introduce competition and free railway markets, and set up a new application scenario: there is a (potentially large) number of line operators that have their lines fixed and operate as competing entities issuing frequency requests, while the management of the infrastructure itself remains the responsibility of a single entity, the network operator. The line operators are typically unwilling to reveal their true incentives, while the network operator strives to ensure a fair (or socially optimal) usage of the infrastructure, e.g., by maximizing the (unknown to him) aggregate incentives of the line operators.

  19. Beginning Objective-C

    CERN Document Server

    Dovey, James

    2012-01-01

    Objective-C is today's fastest growing programming language, at least in part due to the popularity of Apple's Mac, iPhone and iPad. Beginning Objective-C is for you if you have some programming experience, but you're new to the Objective-C programming language and you want a modern and fast way forward to your own coding projects. Beginning Objective-C offers you a modern programmer's perspective on Objective-C courtesy of two of the best iOS and Mac developers in the field today, and gets you programming to the best of your ability in this important language. It gets you rolling fast into...

  20. Hardware Objects for Java

    DEFF Research Database (Denmark)

    Schoeberl, Martin; Thalinger, Christian; Korsholm, Stephan

    2008-01-01

    Java, as a safe and platform independent language, avoids access to low-level I/O devices or direct memory access. In standard Java, low-level I/O is not a concern; it is handled by the operating system. However, in the embedded domain resources are scarce and a Java virtual machine (JVM) without an underlying middleware is an attractive architecture. When running the JVM on bare metal, we need access to I/O devices from Java; therefore we investigate a safe and efficient mechanism to represent I/O devices as first class Java objects, where device registers are represented by object fields. Access to those registers is safe as Java's type system regulates it. The access is also fast as it is directly performed by the bytecodes getfield and putfield. Hardware objects thus provide an object-oriented abstraction of low-level hardware devices. As a proof of concept, we have implemented hardware objects...

  1. Abstract Objects of Verbs

    DEFF Research Database (Denmark)

    Robering, Klaus

    2014-01-01

    Verbs do often take arguments of quite different types. In an orthodox type-theoretic framework this results in an extreme polysemy of many verbs. In this article, it is shown that this unwanted consequence can be avoided when a theory of "abstract objects" is adopted according to which these objects represent non-objectual entities in contexts from which they are excluded by type restrictions. Thus these objects are "abstract" in a functional rather than in an ontological sense: they function as representatives of other entities but they are otherwise quite normal objects. Three examples...

  2. Object-based attention: strength of object representation and attentional guidance.

    Science.gov (United States)

    Shomstein, Sarah; Behrmann, Marlene

    2008-01-01

    Two or more features belonging to a single object are identified more quickly and more accurately than are features belonging to different objects--a finding attributed to sensory enhancement of all features belonging to an attended or selected object. However, several recent studies have suggested that this "single-object advantage" may be a product of probabilistic and configural strategic prioritizations rather than of object-based perceptual enhancement per se, challenging the underlying mechanism that is thought to give rise to object-based attention. In the present article, we further explore constraints on the mechanisms of object-based selection by examining the contribution of the strength of object representations to the single-object advantage. We manipulated factors such as exposure duration (i.e., preview time) and salience of configuration (i.e., objects). Varying preview time changes the magnitude of the object-based effect, so that if there is ample time to establish an object representation (i.e., preview time of 1,000 msec), then both probability and configuration (i.e., objects) guide attentional selection. If, however, insufficient time is provided to establish a robust object-based representation, then only probabilities guide attentional selection. Interestingly, at a short preview time of 200 msec, when the two objects were sufficiently different from each other (i.e., different colors), both configuration and probability guided attention selection. These results suggest that object-based effects can be explained both in terms of strength of object representations (established at longer exposure durations and by pictorial cues) and probabilistic contingencies in the visual environment.

  3. Competition improves robustness against loss of information

    Directory of Open Access Journals (Sweden)

    Arash eKermani Kolankeh

    2015-03-01

    Full Text Available A substantial number of works have aimed at modeling the receptive field properties of the primary visual cortex (V1). Their evaluation criterion is usually the similarity of the model response properties to the recorded responses from biological organisms. However, as several algorithms were able to demonstrate some degree of similarity to biological data based on the existing criteria, we focus on the robustness against loss of information in the form of occlusions as an additional constraint for better understanding the algorithmic level of early vision in the brain. We investigate the influence of competition mechanisms on this robustness. Therefore, we compared four methods employing different competition mechanisms, namely, independent component analysis, non-negative matrix factorization with a sparseness constraint, predictive coding/biased competition, and a Hebbian neural network with lateral inhibitory connections. Each of these methods is known to be capable of developing receptive fields comparable to those of V1 simple cells. Since measuring the robustness of methods having simple-cell-like receptive fields against occlusion is difficult, we measure the robustness using the classification accuracy on the MNIST handwritten digit dataset. For this we trained all methods on the training set of the MNIST handwritten digits dataset and tested them on a MNIST test set with different levels of occlusion. We observe that methods which employ competitive mechanisms have higher robustness against loss of information. Also, the kind of competition mechanism plays an important role in robustness. Global feedback inhibition as employed in predictive coding/biased competition has an advantage compared to local lateral inhibition learned by an anti-Hebb rule.

  4. Return Predictability, Model Uncertainty, and Robust Investment

    DEFF Research Database (Denmark)

    Lukas, Manuel

    Stock return predictability is subject to great uncertainty. In this paper we use the model confidence set approach to quantify uncertainty about expected utility from investment, accounting for potential return predictability. For monthly US data and six representative return prediction models, we find that confidence sets are very wide, change significantly with the predictor variables, and frequently include expected utilities for which the investor prefers not to invest. The latter motivates a robust investment strategy maximizing the minimal element of the confidence set. The robust investor allocates a much lower share of wealth to stocks compared to a standard investor.

  5. Three Contributions to Robust Regression Diagnostics

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2015-01-01

    Roč. 11, č. 2 (2015), s. 69-78 ISSN 1336-9180 Grant - others:GA ČR(CZ) GA13-01930S; Nadační fond na podporu vědy(CZ) Neuron Institutional support: RVO:67985807 Keywords : robust regression * robust econometrics * hypothesis testing Subject RIV: BA - General Mathematics http://www.degruyter.com/view/j/jamsi.2015.11.issue-2/jamsi-2015-0013/jamsi-2015-0013.xml?format=INT

  6. Robust Inference with Multi-way Clustering

    OpenAIRE

    A. Colin Cameron; Jonah B. Gelbach; Douglas L. Miller; Doug Miller

    2009-01-01

    In this paper we propose a variance estimator for the OLS estimator as well as for nonlinear estimators such as logit, probit and GMM. This variance estimator enables cluster-robust inference when there is two-way or multi-way clustering that is non-nested. The variance estimator extends the standard cluster-robust variance estimator or sandwich estimator for one-way clustering (e.g. Liang and Zeger (1986), Arellano (1987)) and relies on similar relatively weak distributional assumptions. Our...
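
    A minimal numpy sketch of the OLS case (a hypothetical illustration, not the authors' code): compute the one-way cluster-robust sandwich for each clustering dimension and for their intersection, and combine them as V(G) + V(H) - V(G∩H).

      # Hypothetical sketch of two-way cluster-robust OLS variance:
      # V_twoway = V_cluster(G) + V_cluster(H) - V_cluster(G intersect H).
      import numpy as np

      def cluster_sandwich(X, resid, clusters):
          """One-way cluster-robust covariance matrix for OLS."""
          bread = np.linalg.inv(X.T @ X)
          meat = np.zeros((X.shape[1], X.shape[1]))
          for g in np.unique(clusters):
              score = X[clusters == g].T @ resid[clusters == g]
              meat += np.outer(score, score)
          return bread @ meat @ bread

      rng = np.random.default_rng(1)
      n = 1000
      X = np.column_stack([np.ones(n), rng.normal(size=n)])
      firm = rng.integers(0, 50, size=n)      # clustering dimension G
      year = rng.integers(0, 20, size=n)      # clustering dimension H
      y = X @ np.array([1.0, 0.5]) + rng.normal(size=n)
      resid = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]

      V = (cluster_sandwich(X, resid, firm)
           + cluster_sandwich(X, resid, year)
           - cluster_sandwich(X, resid, firm * 1000 + year))  # intersection clusters
      print("two-way cluster-robust standard errors:", np.sqrt(np.diag(V)))

    In practice the combined matrix may need an eigenvalue correction to stay positive semi-definite; that detail is omitted from the sketch.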

  7. Robust control synthesis for uncertain dynamical systems

    Science.gov (United States)

    Byun, Kuk-Whan; Wie, Bong; Sunkel, John

    1989-01-01

    This paper presents robust control synthesis techniques for uncertain dynamical systems subject to structured parameter perturbation. Both QFT (quantitative feedback theory) and H-infinity control synthesis techniques are investigated. Although most H-infinity-related control techniques are not concerned with the structured parameter perturbation, a new way of incorporating the parameter uncertainty in the robust H-infinity control design is presented. A generic model of uncertain dynamical systems is used to illustrate the design methodologies investigated in this paper. It is shown that, for a certain noncolocated structural control problem, use of both techniques results in nonminimum phase compensation.

  8. Robustness of multiparty nonlocality to local decoherence

    International Nuclear Information System (INIS)

    Jang, Sung Soon; Cheong, Yong Wook; Kim, Jaewan; Lee, Hai-Woong

    2006-01-01

    We investigate the robustness of multiparty nonlocality under local decoherence, acting independently and equally on each subsystem. To be specific, we consider an N-qubit Greenberger-Horne-Zeilinger (GHZ) state under a depolarization, dephasing, or dissipation channel, and examine nonlocality by testing violation of the Mermin-Klyshko inequality, which is one of Bell's inequalities for multiqubit systems. The results show that the robustness of nonlocality increases with the number of qubits, and that the nonlocality of an N-qubit GHZ state with even N is extremely persistent against dephasing
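
    As a small numerical illustration of this kind of test (a sketch for N = 3 only, not the paper's general calculation), the code below builds a three-qubit GHZ state, applies an independent phase-flip (dephasing) channel with probability p to each qubit, and evaluates the Mermin operator M3 = XXX - XYY - YXY - YYX, whose local-hidden-variable bound is 2.

      # Sketch: Mermin-Klyshko test for a 3-qubit GHZ state under local dephasing.
      import numpy as np

      I2 = np.eye(2, dtype=complex)
      X = np.array([[0, 1], [1, 0]], dtype=complex)
      Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
      Z = np.array([[1, 0], [0, -1]], dtype=complex)

      def kron(*ops):
          out = np.array([[1.0 + 0j]])
          for op in ops:
              out = np.kron(out, op)
          return out

      # GHZ state (|000> + |111>)/sqrt(2) as a density matrix
      psi = np.zeros(8, dtype=complex)
      psi[0] = psi[7] = 1 / np.sqrt(2)
      rho0 = np.outer(psi, psi.conj())

      # Mermin operator for N = 3 (LHV bound 2, quantum maximum 4)
      M3 = kron(X, X, X) - kron(X, Y, Y) - kron(Y, X, Y) - kron(Y, Y, X)

      def dephase_all(rho, p):
          """Apply rho -> (1 - p) rho + p Z rho Z independently to each qubit."""
          for k in range(3):
              Zk = kron(*[Z if j == k else I2 for j in range(3)])
              rho = (1 - p) * rho + p * (Zk @ rho @ Zk)
          return rho

      for p in (0.0, 0.1, 0.25, 0.5):
          val = np.trace(dephase_all(rho0, p) @ M3).real
          print(f"p = {p:.2f}: <M3> = {val:.3f} (analytic 4*(1-2p)^3 = {4*(1-2*p)**3:.3f})")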

  9. Robust speaker recognition in noisy environments

    CERN Document Server

    Rao, K Sreenivasa

    2014-01-01

    This book discusses speaker recognition methods that deal with realistic, variable noisy environments. The text covers authentication systems that are robust to noisy background environments, function in real time, and can be incorporated in mobile devices. The book focuses on different approaches to enhance the accuracy of speaker recognition in the presence of varying background environments. The authors examine: (a) feature compensation using multiple background models, (b) feature mapping using data-driven stochastic models, (c) design of a super vector-based GMM-SVM framework for robust speaker recognition, (d) total variability modeling (i-vectors) in a discriminative framework, and (e) a boosting method to fuse evidence from multiple SVM models.

  10. Robust Parametric Control of Spacecraft Rendezvous

    Directory of Open Access Journals (Sweden)

    Dake Gu

    2014-01-01

    Full Text Available This paper proposes a method to design robust parametric control for autonomous rendezvous of spacecraft with uncertain inertial information. We consider model uncertainty of the traditional C-W equation to formulate the dynamic model of the relative motion. Based on eigenstructure assignment and model reference theory, a concise control law for spacecraft rendezvous is proposed, whose parameters can be determined by solving an optimization problem. The cost function considers the stabilization of the system and other performances. Simulation results illustrate the robustness and effectiveness of the proposed control.

  11. Robust cluster analysis and variable selection

    CERN Document Server

    Ritter, Gunter

    2014-01-01

    Clustering remains a vibrant area of research in statistics. Although there are many books on this topic, there are relatively few that are well founded in the theoretical aspects. In Robust Cluster Analysis and Variable Selection, Gunter Ritter presents an overview of the theory and applications of probabilistic clustering and variable selection, synthesizing the key research results of the last 50 years. The author focuses on the robust clustering methods he found to be the most useful on simulated data and real-time applications. The book provides clear guidance for the varying needs of bot

  12. Robust median estimator in logistic regression

    Czech Academy of Sciences Publication Activity Database

    Hobza, T.; Pardo, L.; Vajda, Igor

    2008-01-01

    Roč. 138, č. 12 (2008), s. 3822-3840 ISSN 0378-3758 R&D Projects: GA MŠk 1M0572 Grant - others:Instituto Nacional de Estadistica (ES) MPO FI - IM3/136; GA MŠk(CZ) MTM 2006-06872 Institutional research plan: CEZ:AV0Z10750506 Keywords : Logistic regression * Median * Robustness * Consistency and asymptotic normality * Morgenthaler * Bianco and Yohai * Croux and Hasellbroeck Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.679, year: 2008 http://library.utia.cas.cz/separaty/2008/SI/vajda-robust%20median%20estimator%20in%20logistic%20regression.pdf

  13. Introduction to Robust Estimation and Hypothesis Testing

    CERN Document Server

    Wilcox, Rand R

    2012-01-01

    This revised book provides a thorough explanation of the foundation of robust methods, incorporating the latest updates on R and S-Plus, robust ANOVA (Analysis of Variance) and regression. It guides advanced students and other professionals through the basic strategies used for developing practical solutions to problems, and provides a brief background on the foundations of modern methods, placing the new methods in historical context. Author Rand Wilcox includes chapter exercises and many real-world examples that illustrate how various methods perform in different situations.Introduction to R

  14. Robust-mode analysis of hydrodynamic flows

    Science.gov (United States)

    Roy, Sukesh; Gord, James R.; Hua, Jia-Chen; Gunaratne, Gemunu H.

    2017-04-01

    The emergence of techniques to extract high-frequency high-resolution data introduces a new avenue for modal decomposition to assess the underlying dynamics, especially of complex flows. However, this task requires the differentiation of robust, repeatable flow constituents from noise and other irregular features of a flow. Traditional approaches involving low-pass filtering and principal components analysis have shortcomings. The approach outlined here, referred to as robust-mode analysis, is based on Koopman decomposition. Three applications to (a) a counter-rotating cellular flame state, (b) variations in financial markets, and (c) turbulent injector flows are provided.
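
    Robust-mode analysis builds on Koopman-type decompositions; as a rough generic illustration (standard dynamic mode decomposition only, without the robust-mode selection across realizations described in the paper), a numpy sketch might look as follows.

      # Generic dynamic mode decomposition (DMD) sketch; the robust-mode step
      # (retaining only repeatable modes) is not implemented here.
      import numpy as np

      def dmd(snapshots, rank=None):
          """snapshots: array of shape (n_features, n_times)."""
          X, Y = snapshots[:, :-1], snapshots[:, 1:]
          U, s, Vh = np.linalg.svd(X, full_matrices=False)
          if rank is not None:
              U, s, Vh = U[:, :rank], s[:rank], Vh[:rank]
          A_tilde = U.conj().T @ Y @ Vh.conj().T / s   # reduced Koopman approximation
          eigvals, W = np.linalg.eig(A_tilde)
          modes = (Y @ Vh.conj().T / s) @ W            # modes lifted to the full space
          return eigvals, modes

      # Toy snapshot matrix: two oscillatory spatial modes plus noise
      t = np.linspace(0, 4 * np.pi, 200)
      x = np.linspace(0, 1, 64)[:, None]
      data = (np.sin(2 * np.pi * x) * np.cos(5 * t)
              + 0.5 * np.cos(4 * np.pi * x) * np.sin(9 * t)
              + 0.01 * np.random.default_rng(0).normal(size=(64, 200)))
      eigvals, modes = dmd(data, rank=6)
      print("leading DMD eigenvalues:", np.round(eigvals[:4], 3))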

  15. Nuclear Energy General Objectives

    International Nuclear Information System (INIS)

    2011-01-01

    One of the IAEA's statutory objectives is to 'seek to accelerate and enlarge the contribution of atomic energy to peace, health and prosperity throughout the world'. One way it achieves this objective is to issue publications in various series. Two of these series are the IAEA Nuclear Energy Series and the IAEA Safety Standards Series. According to Article III, paragraph A.6, of the IAEA Statute, the IAEA safety standards establish 'standards of safety for protection of health and minimization of danger to life and property.' The safety standards include the Safety Fundamentals, Safety Requirements and Safety Guides. These standards are primarily written in a regulatory style, and are binding on the IAEA for its own activities. The principal users are Member State regulatory bodies and other national authorities. The IAEA Nuclear Energy Series consists of reports designed to encourage and assist research on, and development and practical application of, nuclear energy for peaceful uses. This includes practical examples to be used by owners and operators of utilities in Member States, implementing organizations, academia and politicians, among others. The information is presented in guides, reports on the status of technology and advances, and best practices for peaceful uses of nuclear energy based on inputs from international experts. The series complements the IAEA's safety standards, and provides detailed guidance, experience, good practices and examples on the five areas covered in the IAEA Nuclear Energy Series. The Nuclear Energy Basic Principles is the highest level publication in the IAEA Nuclear Energy Series and describes the rationale and vision for the peaceful uses of nuclear energy. It presents eight Basic Principles on which nuclear energy systems should be based to fulfil nuclear energy's potential to help meet growing global energy needs. The Nuclear Energy Series Objectives are the second level publications. They describe what needs to be

  16. Robust algebraic image enhancement for intelligent control systems

    Science.gov (United States)

    Lerner, Bao-Ting; Morrelli, Michael

    1993-01-01

    Robust vision capability for intelligent control systems has been an elusive goal in image processing. The computationally intensive techniques necessary for conventional image processing make real-time applications, such as object tracking and collision avoidance, difficult. In order to endow an intelligent control system with the needed vision robustness, an adequate image enhancement subsystem, capable of compensating for the wide variety of real-world degradations, must exist between the image capturing and the object recognition subsystems. This enhancement stage must be adaptive and must operate with consistency in the presence of both statistical and shape-based noise. To deal with this problem, we have developed an innovative algebraic approach which provides a sound mathematical framework for image representation and manipulation. Our image model provides a natural platform from which to pursue dynamic scene analysis, and its incorporation into a vision system would serve as the front-end to an intelligent control system. We have developed a unique polynomial representation of gray-level imagery and applied this representation to develop polynomial operators on complex gray-level scenes. This approach is highly advantageous since polynomials can be manipulated very easily, and are readily understood, thus providing a very convenient environment for image processing. Our model presents a highly structured and compact algebraic representation of gray-level images which can be viewed as fuzzy sets.

  17. Functional Object Analysis

    DEFF Research Database (Denmark)

    Raket, Lars Lau

    We propose a direction in the field of statistics which we will call functional object analysis. This subfield considers the analysis of functional objects defined on continuous domains. In this setting we will focus on model-based statistics, with a particular emphasis on mixed-effect formulations, where the observed functional signal is assumed to consist of both fixed and random functional effects. This thesis takes the initial steps toward the development of likelihood-based methodology for functional objects. We first consider analysis of functional data defined on high...

  18. Towards C++ object libraries for accelerator physics

    International Nuclear Information System (INIS)

    Michelotti, L.

    1993-01-01

    We must write robust, flexible, portable software that is easier to understand, maintain, modify, reuse, and extend. These attributes are more than mere buzzwords. They are important goals that computer scientists strive to achieve by refining their programming models and devising languages to support them. An important breakthrough was achieved by the introduction of object oriented programming (OOP) as the computing model behind such languages as Smalltalk, ADA, Eiffel, Objective-C, and C++. OOP is not a ''fad'': it is arguably the most significant development in programming since the invention of FORTRAN, and it is the way that the best software will be written well into the next century. An ''object'' comprises structures of data, the functions that manipulate them, and rules for bringing them into and out of scope. OOP is a methodology for realizing and fully utilizing this abstract concept, an extension to programming of the basic technique that has advanced mathematics for centuries

  19. Piles of objects

    KAUST Repository

    Hsu, Shu-Wei

    2010-01-01

    We present a method for directly modeling piles of objects in multi-body simulations. Piles of objects represent some of the more interesting, but also most time-consuming portion of simulation. We propose a method for reducing computation in many of these situations by explicitly modeling the piles that the objects may form into. By modeling pile behavior rather than the behavior of all individual objects, we can achieve realistic results in less time, and without directly modeling the frictional component that leads to desired pile shapes. Our method is simple to implement and can be easily integrated with existing rigid body simulations. We observe notable speedups in several rigid body examples, and generate a wider variety of piled structures than possible with strict impulse-based simulation. © 2010 ACM.

  20. Safety objectives for 2014

    CERN Multimedia

    HSE Unit

    2014-01-01

    This is the third year in which the CERN Management has presented annual safety objectives for the Organization, the “HSE Objectives”.   The HSE objectives for 2014, which were announced by the Director-General at his traditional New Year’s address to the staff and were presented at the first Enlarged Directorate meeting of the year, have been drawn up and agreed in close collaboration between the DSOs, the HSE Unit and the DG himself. From safety in the workplace to radiation and environmental protection, the document emphasises that “Safety is a priority for CERN” and that safety policy is a key element in how the Organization is run. And, like all policies, it generates objectives that “serve as a general framework for action”. The HSE objectives are broken down into the following fields: occupational health and safety on sites and in the workplace, radiation protection, radiation safety, environmental protection, emerge...

  1. Registration of Space Objects

    Science.gov (United States)

    Schmidt-Tedd, Bernhard

    2017-07-01

    Space objects are subject to registration in order to allocate "jurisdiction and control" over those objects in the sovereign-free environment of outer space. This approach is similar to the registration of ships in view of the high sea and for aircrafts with respect to the international airspace. Registration is one of the basic principles of space law, starting with UN General Assembly Resolution 1721 B (XVI) of December 20, 1961, followed by Resolution 1962 (XVIII) of December 13, 1963, then formulated in Article VIII of the Outer Space Treaty of 1967 and as specified in the Registration Convention of 1975. Registration of space objects can be seen today as a principle of customary international law, relevant for each spacefaring state. Registration is divided into a national and an international level. The State Party establishes a national registry for its space objects, and those registrations have to be communicated via diplomatic channel to the UN Register of space objects. This UN Register is handled by the UN Office for Outer Space Affairs (UNOOSA) and is an open source of information for space objects worldwide. Registration is linked to the so-called launching state of the relevant space object. There might be more than one launching state for the specific launch event, but only one state actor can register a specific space object. The state of registry gains "jurisdiction and control" over the space object and therefore no double registration is permissible. Based on the established UN Space Law, registration practice was subject to some adaptions due to technical developments and legal challenges. After the privatization of the major international satellite organizations, a number of non-registrations had to be faced. The state actors reacted with the UN Registration Practice Resolution of 2007 as elaborated in the Legal Subcommittee of UNCOPUOS, the Committee for the Peaceful Use of Outer Space. In this context an UNOOSA Registration Information

  2. Robust optimization model and algorithm for railway freight center location problem in uncertain environment.

    Science.gov (United States)

    Liu, Xing-Cai; He, Shi-Wei; Song, Rui; Sun, Yang; Li, Hao-Dong

    2014-01-01

    The railway freight center location problem is an important issue in railway freight transport programming. This paper focuses on the railway freight center location problem in an uncertain environment. Because the expected value model ignores the negative influence of disadvantageous scenarios, a robust optimization model was proposed. The robust optimization model takes the expected cost and the deviation value over the scenarios as its objective. A cloud adaptive clonal selection algorithm (C-ACSA) was presented; it combines an adaptive clonal selection algorithm with the Cloud Model, which can improve the convergence rate. The encoding design and the steps of the algorithm are described. The results of a numerical example demonstrate that the model and algorithm are effective. Compared with the expected value case, the number of disadvantageous scenarios in the robust model is reduced from 163 to 21, which shows that the robust solution is more reliable.

  3. Robust Optimization Model and Algorithm for Railway Freight Center Location Problem in Uncertain Environment

    Directory of Open Access Journals (Sweden)

    Xing-cai Liu

    2014-01-01

    Full Text Available The railway freight center location problem is an important issue in railway freight transport programming. This paper focuses on the railway freight center location problem in an uncertain environment. Because the expected value model ignores the negative influence of disadvantageous scenarios, a robust optimization model was proposed. The robust optimization model takes the expected cost and the deviation value over the scenarios as its objective. A cloud adaptive clonal selection algorithm (C-ACSA) was presented; it combines an adaptive clonal selection algorithm with the Cloud Model, which can improve the convergence rate. The encoding design and the steps of the algorithm are described. The results of a numerical example demonstrate that the model and algorithm are effective. Compared with the expected value case, the number of disadvantageous scenarios in the robust model is reduced from 163 to 21, which shows that the robust solution is more reliable.
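
    The robust objective combines the expected scenario cost with a penalty on deviations from that expectation. The sketch below only illustrates that scoring idea: it evaluates candidate center sets by brute-force enumeration instead of the paper's C-ACSA metaheuristic, and the cost model, demand scenarios and weight lambda are all illustrative assumptions.

      # Hypothetical sketch: scenario-based robust scoring of candidate freight-center sets.
      # Robust objective: E[cost] + lambda * E[|cost_s - E[cost]|]; brute-force search
      # stands in for the cloud adaptive clonal selection algorithm (C-ACSA).
      from itertools import combinations
      import numpy as np

      rng = np.random.default_rng(42)
      n_nodes, n_scenarios, n_centers, lam = 12, 50, 3, 2.0

      coords = rng.uniform(0, 100, size=(n_nodes, 2))
      demand = rng.uniform(10, 100, size=(n_scenarios, n_nodes))   # uncertain freight demand
      dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)

      def scenario_cost(centers, s):
          # Each node ships its scenario demand to the nearest open center.
          return float((demand[s] * dist[:, list(centers)].min(axis=1)).sum())

      def robust_score(centers):
          costs = np.array([scenario_cost(centers, s) for s in range(n_scenarios)])
          return costs.mean() + lam * np.abs(costs - costs.mean()).mean()

      best = min(combinations(range(n_nodes), n_centers), key=robust_score)
      print("best robust center set:", best, "score:", round(robust_score(best), 1))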

  4. Protected Objects in Java

    DEFF Research Database (Denmark)

    Løvengreen, Hans Henrik; Schwarzer, Jens Christian

    1998-01-01

    We present an implementation of Ada 95's notion of protected objects in Java. The implementation comprises a class library supporting entry queues and a (pre-)compiler translating slightly decorated Java classes to pure Java classes utilizing the library.

  5. Robust giant magnetoresistive effect type multilayer sensor

    NARCIS (Netherlands)

    Lenssen, K.M.H.; Kuiper, A.E.T.; Roozeboom, F.

    2002-01-01

    A robust Giant Magneto Resistive effect type multilayer sensor comprising a free and a pinned ferromagnetic layer, which can withstand high temperatures and strong magnetic fields as required in automotive applications. The GMR multi-layer has an asymmetric magneto-resistive curve and enables

  6. New solutions for NPP robustness improvement

    International Nuclear Information System (INIS)

    Wolski, Alexander

    2013-01-01

    The Fukushima accident has triggered a major re-assessment of the robustness of nuclear stations. The first round of evaluations has been finished. Improvement areas and strategies have been identified. Implementation of upgrades has started worldwide. New solutions can provide substantial benefits.

  7. Robust inference in sample selection models

    KAUST Repository

    Zhelonkin, Mikhail; Genton, Marc G.; Ronchetti, Elvezio

    2015-01-01

    The problem of non-random sample selectivity often occurs in practice in many fields. The classical estimators introduced by Heckman are the backbone of the standard statistical analysis of these models. However, these estimators are very sensitive to small deviations from the distributional assumptions which are often not satisfied in practice. We develop a general framework to study the robustness properties of estimators and tests in sample selection models. We derive the influence function and the change-of-variance function of Heckman's two-stage estimator, and we demonstrate the non-robustness of this estimator and its estimated variance to small deviations from the model assumed. We propose a procedure for robustifying the estimator, prove its asymptotic normality and give its asymptotic variance. Both cases with and without an exclusion restriction are covered. This allows us to construct a simple robust alternative to the sample selection bias test. We illustrate the use of our new methodology in an analysis of ambulatory expenditures and we compare the performance of the classical and robust methods in a Monte Carlo simulation study.

  8. Robust inference in sample selection models

    KAUST Repository

    Zhelonkin, Mikhail

    2015-11-20

    The problem of non-random sample selectivity often occurs in practice in many fields. The classical estimators introduced by Heckman are the backbone of the standard statistical analysis of these models. However, these estimators are very sensitive to small deviations from the distributional assumptions which are often not satisfied in practice. We develop a general framework to study the robustness properties of estimators and tests in sample selection models. We derive the influence function and the change-of-variance function of Heckman's two-stage estimator, and we demonstrate the non-robustness of this estimator and its estimated variance to small deviations from the model assumed. We propose a procedure for robustifying the estimator, prove its asymptotic normality and give its asymptotic variance. Both cases with and without an exclusion restriction are covered. This allows us to construct a simple robust alternative to the sample selection bias test. We illustrate the use of our new methodology in an analysis of ambulatory expenditures and we compare the performance of the classical and robust methods in a Monte Carlo simulation study.
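
    For context, the classical (non-robust) Heckman two-step estimator analyzed here can be sketched as a probit selection equation followed by OLS with the inverse Mills ratio as an additional regressor; the simulated data and variable names below are assumptions, and the robustified estimator proposed in the paper is not implemented.

      # Sketch of the classical Heckman two-step estimator (not the robustified version).
      import numpy as np
      import statsmodels.api as sm
      from scipy.stats import norm

      rng = np.random.default_rng(3)
      n = 2000
      z = rng.normal(size=n)                      # exclusion restriction (selection only)
      x = rng.normal(size=n)
      u, e = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=n).T
      selected = (0.5 + 1.0 * z + u) > 0          # selection equation
      y = np.where(selected, 1.0 + 0.8 * x + e, np.nan)

      # Step 1: probit of the selection indicator, then the inverse Mills ratio
      W = sm.add_constant(z)
      probit = sm.Probit(selected.astype(float), W).fit(disp=0)
      xb = W @ probit.params
      imr = norm.pdf(xb) / norm.cdf(xb)

      # Step 2: OLS of the observed outcome on x and the inverse Mills ratio
      X2 = sm.add_constant(np.column_stack([x[selected], imr[selected]]))
      ols = sm.OLS(y[selected], X2).fit()
      print(ols.params)                           # [intercept, beta_x, coefficient on IMR]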

  9. Highly Robust Methods in Data Mining

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2013-01-01

    Roč. 8, č. 1 (2013), s. 9-24 ISSN 1452-4864 Institutional support: RVO:67985807 Keywords : data mining * robust statistics * high-dimensional data * cluster analysis * logistic regression * neural networks Subject RIV: BB - Applied Statistics, Operational Research

  10. Robust Utility Maximization Under Convex Portfolio Constraints

    International Nuclear Information System (INIS)

    Matoussi, Anis; Mezghani, Hanen; Mnif, Mohamed

    2015-01-01

    We study a robust maximization problem of utility from terminal wealth and consumption under convex constraints on the portfolio. We state the existence and the uniqueness of the consumption–investment strategy by studying the associated quadratic backward stochastic differential equation. We characterize the optimal control by using the duality method and deriving a dynamic maximum principle.

  11. Robust diamond meshes with unique wettability properties.

    Science.gov (United States)

    Yang, Yizhou; Li, Hongdong; Cheng, Shaoheng; Zou, Guangtian; Wang, Chuanxi; Lin, Quan

    2014-03-18

    Robust diamond meshes with excellent superhydrophobic and superoleophilic properties have been fabricated. Superhydrophobicity is observed for water with varying pH from 1 to 14 with good recyclability. Reversible superhydrophobicity and hydrophilicity can be easily controlled. The diamond meshes show highly efficient water-oil separation and water pH droplet transference.

  12. Robust predictions of the interacting boson model

    International Nuclear Information System (INIS)

    Casten, R.F.; Koeln Univ.

    1994-01-01

    While most recognized for its symmetries and algebraic structure, the IBA model has other less-well-known but equally intrinsic properties which give unavoidable, parameter-free predictions. These predictions concern central aspects of low-energy nuclear collective structure. This paper outlines these ''robust'' predictions and compares them with the data

  13. Robust keyword retrieval method for OCRed text

    Science.gov (United States)

    Fujii, Yusaku; Takebe, Hiroaki; Tanaka, Hiroshi; Hotta, Yoshinobu

    2011-01-01

    Document management systems have become important because of the growing popularity of electronic filing of documents and scanning of books, magazines, manuals, etc., through a scanner or a digital camera, for storage or reading on a PC or an electronic book. Text information acquired by optical character recognition (OCR) is usually added to the electronic documents for document retrieval. Since texts generated by OCR generally include character recognition errors, robust retrieval methods have been introduced to overcome this problem. In this paper, we propose a retrieval method that is robust against both character segmentation and recognition errors. In the proposed method, the insertion of noise characters and dropping of characters in the keyword retrieval enables robustness against character segmentation errors, and character substitution in the keyword of the recognition candidate for each character in OCR or any other character enables robustness against character recognition errors. The recall rate of the proposed method was 15% higher than that of the conventional method. However, the precision rate was 64% lower.
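
    As a generic illustration of error-tolerant retrieval (not the authors' method, which also exploits per-character OCR recognition candidates), the sketch below searches OCR output for approximate occurrences of a keyword, allowing a bounded number of substitutions, insertions and deletions through an edit-distance check over a sliding window.

      # Generic edit-distance keyword search over OCR text (illustrative only).
      def edit_distance(a: str, b: str) -> int:
          prev = list(range(len(b) + 1))
          for i, ca in enumerate(a, 1):
              cur = [i]
              for j, cb in enumerate(b, 1):
                  cur.append(min(prev[j] + 1,                 # deletion
                                 cur[j - 1] + 1,              # insertion
                                 prev[j - 1] + (ca != cb)))   # substitution / match
              prev = cur
          return prev[-1]

      def fuzzy_find(text, keyword, max_errors=1):
          """Yield (position, window) pairs close to the keyword in edit distance."""
          k = len(keyword)
          for start in range(len(text) - k + 2):
              for width in (k - 1, k, k + 1):                 # tolerate segmentation errors
                  window = text[start:start + width]
                  if window and edit_distance(window, keyword) <= max_errors:
                      yield start, window
                      break

      ocr_text = "robusl keyword retrieva1 for OCRed documenls"   # typical OCR confusions
      print(list(fuzzy_find(ocr_text, "retrieval")))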

  14. Robust Algebraic Multilevel Methods and Algorithms

    CERN Document Server

    Kraus, Johannes

    2009-01-01

    This book deals with algorithms for the solution of linear systems of algebraic equations with large-scale sparse matrices, with a focus on problems that are obtained after discretization of partial differential equations using finite element methods. Provides a systematic presentation of the recent advances in robust algebraic multilevel methods. Can be used for advanced courses on the topic.

  15. Replication and Robustness in Developmental Research

    Science.gov (United States)

    Duncan, Greg J.; Engel, Mimi; Claessens, Amy; Dowsett, Chantelle J.

    2014-01-01

    Replications and robustness checks are key elements of the scientific method and a staple in many disciplines. However, leading journals in developmental psychology rarely include explicit replications of prior research conducted by different investigators, and few require authors to establish in their articles or online appendices that their key…

  16. A Robust Alternative to the Normal Distribution.

    Science.gov (United States)

    1982-07-07

    OCR residue from the report cover page and reference list: Department of Statistics, Stanford University, Stanford, California; prepared for the United States Government; Stanford University Technical Report. The scanned fragment also cites Bhattacharya, S. K. (1966), A Modified Bessel Function Model in Life Testing, Metrika 10, 133-144.

  17. Robust bayesian inference of generalized Pareto distribution ...

    African Journals Online (AJOL)

    Using an exhaustive Monte Carlo study, we show that, with a suitable generalized loss function, a robust Bayesian estimator of the model can be constructed. Key words: Bayesian estimation; Extreme value; Generalized Fisher information; Generalized Pareto distribution; Monte Carlo; ...

  18. Robust balance shift control with posture optimization

    NARCIS (Netherlands)

    Kavafoglu, Z.; Kavafoglu, Ersan; Egges, J.

    2015-01-01

    In this paper we present a control framework which creates robust and natural balance shifting behaviours during standing. Given high-level features such as the position of the center of mass projection and the foot configurations, a kinematic posture satisfying these features is synthesized using

  19. Finite Algorithms for Robust Linear Regression

    DEFF Research Database (Denmark)

    Madsen, Kaj; Nielsen, Hans Bruun

    1990-01-01

    The Huber M-estimator for robust linear regression is analyzed. Newton type methods for solution of the problem are defined and analyzed, and finite convergence is proved. Numerical experiments with a large number of test problems demonstrate efficiency and indicate that this kind of approach may...
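
    A compact way to see the Huber M-estimator in action is the generic iteratively reweighted least squares sketch below (an illustration only; the paper analyzes finite Newton-type algorithms for the same problem, and the threshold and toy data here are assumptions).

      # Generic IRLS for the Huber M-estimator of linear regression (illustrative).
      import numpy as np

      def huber_irls(X, y, delta=1.345, n_iter=50):
          beta = np.linalg.lstsq(X, y, rcond=None)[0]          # start from OLS
          for _ in range(n_iter):
              r = y - X @ beta
              scale = np.median(np.abs(r - np.median(r))) / 0.6745 + 1e-12     # MAD scale
              w = np.minimum(1.0, delta * scale / np.maximum(np.abs(r), 1e-12))  # Huber weights
              WX = X * w[:, None]
              beta = np.linalg.solve(X.T @ WX, WX.T @ y)       # weighted normal equations
          return beta

      rng = np.random.default_rng(7)
      n = 200
      X = np.column_stack([np.ones(n), rng.normal(size=n)])
      y = X @ np.array([2.0, 1.5]) + rng.normal(size=n)
      y[:10] += 20.0                                           # gross outliers
      print("OLS:  ", np.linalg.lstsq(X, y, rcond=None)[0])
      print("Huber:", huber_irls(X, y))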

  20. Robust network design for multispecies conservation

    Science.gov (United States)

    Ronan Le Bras; Bistra Dilkina; Yexiang Xue; Carla P. Gomes; Kevin S. McKelvey; Michael K. Schwartz; Claire A. Montgomery

    2013-01-01

    Our work is motivated by an important network design application in computational sustainability concerning wildlife conservation. In the face of human development and climate change, it is important that conservation plans for protecting landscape connectivity exhibit a certain level of robustness. While previous work has focused on conservation strategies that result...

  1. Robust control charts in statistical process control

    NARCIS (Netherlands)

    Nazir, H.Z.

    2014-01-01

    The presence of outliers and contaminations in the output of the process highly affects the performance of the design structures of commonly used control charts and hence makes them of less practical use. One of the solutions to deal with this problem is to use control charts which are robust

  2. Some Diagnostic Tools in Robust Econometrics

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2011-01-01

    Roč. 50, č. 2 (2011), s. 55-67 ISSN 0231-9721 Grant - others:GA ČR(CZ) GA402/09/0557 Institutional research plan: CEZ:AV0Z10300504 Keywords : robust regression * autocorrelated errors * heteroscedastic regression * instrumental variables * least weighted squares Subject RIV: BB - Applied Statistics, Operational Research http://dml.cz/handle/10338.dmlcz/141754

  3. Robust estimation for ordinary differential equation models.

    Science.gov (United States)

    Cao, J; Wang, L; Xu, J

    2011-12-01

    Applied scientists often like to use ordinary differential equations (ODEs) to model complex dynamic processes that arise in biology, engineering, medicine, and many other areas. It is interesting but challenging to estimate ODE parameters from noisy data, especially when the data have some outliers. We propose a robust method to address this problem. The dynamic process is represented with a nonparametric function, which is a linear combination of basis functions. The nonparametric function is estimated by a robust penalized smoothing method. The penalty term is defined with the parametric ODE model, which controls the roughness of the nonparametric function and maintains the fidelity of the nonparametric function to the ODE model. The basis coefficients and ODE parameters are estimated in two nested levels of optimization. The coefficient estimates are treated as an implicit function of ODE parameters, which enables one to derive the analytic gradients for optimization using the implicit function theorem. Simulation studies show that the robust method gives satisfactory estimates for the ODE parameters from noisy data with outliers. The robust method is demonstrated by estimating a predator-prey ODE model from real ecological data. © 2011, The International Biometric Society.
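
    A much simpler stand-in for the idea (direct trajectory matching with a Huber loss via scipy, rather than the authors' robust penalized-smoothing scheme with nested optimization) might look like the sketch below; the predator-prey parameters, initial state and noise model are assumptions.

      # Simplified stand-in: robust (Huber-loss) ODE parameter estimation by direct
      # trajectory matching; not the paper's penalized-smoothing method.
      import numpy as np
      from scipy.integrate import solve_ivp
      from scipy.optimize import least_squares

      def lotka_volterra(t, z, a, b, c, d):
          x, y = z
          return [a * x - b * x * y, c * x * y - d * y]

      true_theta = (1.0, 0.5, 0.3, 0.8)
      t_obs = np.linspace(0, 15, 60)
      sol = solve_ivp(lotka_volterra, (0, 15), [2.0, 1.0], t_eval=t_obs, args=true_theta)
      rng = np.random.default_rng(0)
      data = sol.y + 0.1 * rng.normal(size=sol.y.shape)
      data[:, ::10] += 2.0 * rng.normal(size=data[:, ::10].shape)   # sprinkle outliers

      def residuals(theta):
          fit = solve_ivp(lotka_volterra, (0, 15), [2.0, 1.0], t_eval=t_obs, args=tuple(theta))
          if fit.y.shape[1] != t_obs.size:
              return np.full(data.size, 1e3)     # penalize failed integrations
          return (fit.y - data).ravel()

      est = least_squares(residuals, x0=[0.8, 0.4, 0.4, 0.6], loss="huber", f_scale=0.3)
      print("true parameters:", true_theta)
      print("robust estimate:", np.round(est.x, 3))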

  4. Chromatin regulators, phenotypic robustness and autism risk

    Directory of Open Access Journals (Sweden)

    Reut eSuliman

    2014-04-01

    Full Text Available Though extensively characterized clinically, the causes of autism spectrum disorder (ASD remain a mystery. ASD is known to have a strong genetic basis, but it is genetically very heterogeneous. Recent studies have estimated that de novo disruptive mutations in hundreds of genes may contribute to ASD. However, it is unclear how it is possible for mutations in so many different genes to contribute to ASD. Recent findings suggest that many of the mutations disrupt genes involved in transcription regulation that are expressed prenatally in the developing brain. De novo disruptive mutations are also more frequent in girls with ASD, despite the fact that ASD is more prevalent in boys. In this paper, we hypothesize that loss of robustness may contribute to ASD. Loss of phenotypic robustness may be caused by mutations that disrupt capacitors that operate in the developing brain. This may lead to the release of cryptic genetic variation that contributes to ASD. Reduced robustness is consistent with the observed variability in expressivity and incomplete penetrance. It is also consistent with the hypothesis that the development of the female brain is more robust, and it may explain the higher rate and severity of disruptive de novo mutations in girls with ASD.

  5. Robust and Adaptive Control With Aerospace Applications

    CERN Document Server

    Lavretsky, Eugene

    2013-01-01

    Robust and Adaptive Control shows the reader how to produce consistent and accurate controllers that operate in the presence of uncertainties and unforeseen events. Driven by aerospace applications, the focus of the book is primarily on continuous-dynamical systems. The text is a three-part treatment, beginning with robust and optimal linear control methods and moving on to a self-contained presentation of the design and analysis of model reference adaptive control (MRAC) for nonlinear uncertain dynamical systems. Recent extensions and modifications to MRAC design are included, as are guidelines for combining robust optimal and MRAC controllers. Features of the text include: case studies that demonstrate the benefits of robust and adaptive control for piloted, autonomous and experimental aerial platforms; detailed background material for each chapter to motivate theoretical developments; and realistic examples and simulation data illustrating key features...

  6. Robust Pedestrian Navigation for Challenging Applications

    OpenAIRE

    Gilliéron, PY; Renaudin, V

    2009-01-01

    Presentation of a concept for robust indoor navigation. The concept is based on three key elements: the use of an absolute geographical reference, the hybridisation of complementary technologies, and specific motion models. This concept is illustrated by means of two applications: the urban displacement of blind people and the indoor guidance of fire-fighters.

  7. Robust online face tracking-by-detection

    NARCIS (Netherlands)

    Comaschi, F.; Stuijk, S.; Basten, T.; Corporaal, H.

    2016-01-01

    The problem of online face tracking from unconstrained videos is still unresolved. Challenges range from coping with severe online appearance variations to coping with occlusion. We propose RFTD (Robust Face Tracking-by-Detection), a system which combines tracking and detection into a single

  8. Robust control investigations for equipment loaded panels

    DEFF Research Database (Denmark)

    Aglietti, G.S.; Langley, R.S.; Rogers, E.

    1998-01-01

    This paper develops a modelling technique for equipment load panels which directly produces (adequate) models of the underlying dynamics on which to base robust controller design/evaluations. This technique is based on the use of the Lagrange's equations of motion and the resulting models...

  9. Robust Design of Sounds in Mechanical Mechanisms

    DEFF Research Database (Denmark)

    Boegedal Jensen, Annemette; Munch, Natasja; Howard, Thomas J.

    2015-01-01

    mechanism consisting of a toothed rack and a click arm. First several geometries of the teeth and the click arm’s head were investigated to identify the most robust and repeatable design. It was found that a flat surface in the valleys between the teeth is very beneficial in relation to repeatability...

  10. Robust glint detection through homography normalization

    DEFF Research Database (Denmark)

    Hansen, Dan Witzner; Roholm, Lars; García Ferreiros, Iván

    2014-01-01

    A novel normalization principle for robust glint detection is presented. The method is based on geometric properties of corneal reflections and allows for simple and effective detection of glints even in the presence of several spurious and identically appearing reflections. The method is tested...

  11. Robust power spectral estimation for EEG data.

    Science.gov (United States)

    Melman, Tamar; Victor, Jonathan D

    2016-08-01

    Typical electroencephalogram (EEG) recordings often contain substantial artifact. These artifacts, often large and intermittent, can interfere with quantification of the EEG via its power spectrum. To reduce the impact of artifact, EEG records are typically cleaned by a preprocessing stage that removes individual segments or components of the recording. However, such preprocessing can introduce bias, discard available signal, and be labor-intensive. With this motivation, we present a method that uses robust statistics to reduce dependence on preprocessing by minimizing the effect of large intermittent outliers on the spectral estimates. Using the multitaper method (Thomson, 1982) as a starting point, we replaced the final step of the standard power spectrum calculation with a quantile-based estimator, and the Jackknife approach to confidence intervals with a Bayesian approach. The method is implemented in provided MATLAB modules, which extend the widely used Chronux toolbox. Using both simulated and human data, we show that in the presence of large intermittent outliers, the robust method produces improved estimates of the power spectrum, and that the Bayesian confidence intervals yield close-to-veridical coverage factors. The robust method, as compared to the standard method, is less affected by artifact: inclusion of outliers produces fewer changes in the shape of the power spectrum as well as in the coverage factor. In the presence of large intermittent outliers, the robust method can reduce dependence on data preprocessing as compared to standard methods of spectral estimation. Copyright © 2016 Elsevier B.V. All rights reserved.
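
    A bare-bones version of the idea (replacing the mean over tapered eigenspectra with a quantile such as the median) could look like the sketch below; it uses scipy's DPSS tapers and does not reproduce the Chronux extension or the Bayesian confidence intervals described in the paper.

      # Sketch: quantile-based (median) combination of multitaper eigenspectra.
      import numpy as np
      from scipy.signal.windows import dpss

      def multitaper_psd(x, fs, NW=4.0, combine=np.mean):
          n = len(x)
          K = int(2 * NW) - 1                          # number of tapers
          tapers = dpss(n, NW, Kmax=K)                 # shape (K, n)
          eigenspectra = np.abs(np.fft.rfft(tapers * x, axis=1)) ** 2 / fs
          freqs = np.fft.rfftfreq(n, d=1.0 / fs)
          return freqs, combine(eigenspectra, axis=0)  # mean = standard, median = robust

      fs, n = 250.0, 2000
      t = np.arange(n) / fs
      rng = np.random.default_rng(0)
      eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=n)
      eeg[900:950] += 30.0 * rng.normal(size=50)       # large intermittent artifact

      f, psd_mean = multitaper_psd(eeg, fs, combine=np.mean)
      f, psd_med = multitaper_psd(eeg, fs, combine=np.median)
      band = (f > 40) & (f < 60)
      print("40-60 Hz floor, mean combine:  ", psd_mean[band].mean())
      print("40-60 Hz floor, median combine:", psd_med[band].mean())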

  12. Chance constrained uncertain classification via robust optimization

    NARCIS (Netherlands)

    Ben-Tal, A.; Bhadra, S.; Bhattacharayya, C.; Saketha Nat, J.

    2011-01-01

    This paper studies the problem of constructing robust classifiers when the training is plagued with uncertainty. The problem is posed as a Chance-Constrained Program (CCP) which ensures that the uncertain data points are classified correctly with high probability. Unfortunately such a CCP turns out

  13. Quantifying the robustness of metro networks

    NARCIS (Netherlands)

    Wang, X.; Koç, Y.; Derrible, S.; Nasir Ahmad, Sk.; Kooij, R.E.

    2015-01-01

    Metros (heavy rail transit systems) are integral parts of urban transportation systems. Failures in their operations can have serious impacts on urban mobility, and measuring their robustness is therefore critical. Moreover, as physical networks, metros can be viewed as network topological entities,

  14. Robust Refinement as Implemented in TOPAS

    Energy Technology Data Exchange (ETDEWEB)

    Stone, K.; Stephens, P

    2010-01-01

    A robust refinement procedure is implemented in the program TOPAS through an iterative reweighting of the data. Examples are given of the procedure as applied to fitting partially overlapped peaks by full and partial models and also of the structures of ibuprofen and acetaminophen in the presence of unmodeled impurity contributions

  15. Robust Satisfiability of Systems of Equations

    Czech Academy of Sciences Publication Activity Database

    Franek, Peter; Krčál, M.

    2015-01-01

    Roč. 62, č. 4 (2015), Article 26 ISSN 0004-5411 R&D Projects: GA ČR GBP202/12/G061 Grant - others:GA MŠk(CZ) LL1201 Institutional support: RVO:67985807 Keywords : nonlinear equations * satisfiability * undecidability * topological extensions * uncertainty * robustness Subject RIV: IN - Informatics, Computer Science Impact factor: 1.803, year: 2015

  16. Robust fitting of diurnal brightness temperature cycle

    CSIR Research Space (South Africa)

    Udahemuka, G

    2007-11-01

    Full Text Available for a pixel concerned. Robust fitting of the observed Diurnal Temperature Cycle (DTC) taken over a day for a given pixel, without cloud cover and other abnormal conditions such as fire, can give a data-based brightness temperature model for that pixel...

  17. Sexual Orientation and Spatial Position Effects on Selective Forms of Object Location Memory

    Science.gov (United States)

    Rahman, Qazi; Newland, Cherie; Smyth, Beatrice Mary

    2011-01-01

    Prior research has demonstrated robust sex and sexual orientation-related differences in object location memory in humans. Here we show that this sexual variation may depend on the spatial position of target objects and the task-specific nature of the spatial array. We tested the recovery of object locations in three object arrays (object…

  18. CODAS object monitoring service

    International Nuclear Information System (INIS)

    Wheatley, M.R.; Rainford, M.

    2001-01-01

    The primary Control and Data Acquisition System (CODAS) of JET is based on a TCP/IP network of more than 150 computers. The CODAS computers provide the JET machine control and data acquisition for over 70,000 digital and analog signals. The Object Monitoring Service (OMS) is used by applications for monitoring objects for presentation to the JET machine operators and for the operation of individual software components (such as valve state, access control, mimic definition changes and internal data distribution). Each server typically handles connections from around 60 clients monitoring upwards of 2000 objects. Some servers have over 150 clients and 5000 objects. Acquisition libraries are dynamically linked into a running server as required either to acquire data values for objects or to forward requests to other OMS servers. A mechanism involving dynamic linking allows new libraries to be integrated without stopping or changing running software. OMS provides a very reliable and highly successful 'data-type independent' means of monitoring many different objects. It allows applications to take advantage of new data sources, without the need to change existing code

  19. Nuclear Fuel Cycle Objectives

    International Nuclear Information System (INIS)

    2013-01-01

    One of the IAEA's statutory objectives is to 'seek to accelerate and enlarge the contribution of atomic energy to peace, health and prosperity throughout the world'. One way this objective is achieved is through the publication of a range of technical series. Two of these are the IAEA Nuclear Energy Series and the IAEA Safety Standards Series. According to Article III.A.6 of the IAEA Statute, the safety standards establish 'standards of safety for protection of health and minimization of danger to life and property'. The safety standards include the Safety Fundamentals, Safety Requirements and Safety Guides. These standards are written primarily in a regulatory style, and are binding on the IAEA for its own programmes. The principal users are the regulatory bodies in member States and other national authorities. The IAEA Nuclear Energy Series comprises reports designed to encourage and assist R and D on, and application of, nuclear energy for peaceful uses. This includes practical examples to be used by owners and operators of utilities in member States, implementing organizations, academia and government officials, among others. This information is presented in guides, reports on technology status and advances, and best practices for peaceful uses of nuclear energy based on inputs from international experts. The IAEA Nuclear Energy Series complements the IAEA Safety Standards Series. The Nuclear Energy Basic Principles is the highest level publication in the IAEA Nuclear Energy Series, and describes the rationale and vision for the peaceful uses of nuclear energy. It presents eight Basic Principles on which nuclear energy systems should be based to fulfil nuclear energy's potential to help meet growing global energy needs. The Nuclear Energy Series Objectives are the second level publications. They describe what needs to be considered and the specific goals to be achieved at different stages of implementation, all of which are consistent with the Basic Principles

  20. Objects of consciousness

    Directory of Open Access Journals (Sweden)

    Donald David Hoffman

    2014-06-01

    Full Text Available Current models of visual perception typically assume that human vision estimates true properties of physical objects, properties that exist even if unperceived. However, recent studies of perceptual evolution, using evolutionary games and genetic algorithms, reveal that natural selection often drives true perceptions to extinction when they compete with perceptions tuned to fitness rather than truth: Perception guides adaptive behavior; it does not estimate a preexisting physical truth. Moreover, shifting from evolutionary biology to quantum physics, there is reason to disbelieve in preexisting physical truths: Certain interpretations of quantum theory deny that dynamical properties of physical objects have definite values when unobserved. In some of these interpretations the observer is fundamental, and wave functions are compendia of subjective probabilities, not preexisting elements of physical reality. These two considerations, from evolutionary biology and quantum physics, suggest that current models of object perception require fundamental reformulation. Here we begin such a reformulation, starting with a formal model of consciousness that we call a conscious agent. We develop the dynamics of interacting conscious agents, and study how the perception of objects and space-time can emerge from such dynamics. We show that one particular object, the quantum free particle, has a wave function that is identical in form to the harmonic functions that characterize the asymptotic dynamics of conscious agents; particles are vibrations not of strings but of interacting conscious agents. This allows us to reinterpret physical properties such as position, momentum, and energy as properties of interacting conscious agents, rather than as preexisting physical truths. We sketch how this approach might extend to the perception of relativistic quantum objects, and to classical objects of macroscopic scale.

  1. Radioactive Waste Management Objectives

    International Nuclear Information System (INIS)

    2011-01-01

    One of the IAEA's statutory objectives is to 'seek to accelerate and enlarge the contribution of atomic energy to peace, health and prosperity throughout the world'. One way it achieves this objective is to issue publications in various series. Two of these series are the IAEA Nuclear Energy Series and the IAEA Safety Standards Series. According to Article III, paragraph A.6, of the IAEA Statute, the IAEA safety standards establish 'standards of safety for protection of health and minimization of danger to life and property.' The safety standards include the Safety Fundamentals, Safety Requirements and Safety Guides. These standards are primarily written in a regulatory style, and are binding on the IAEA for its own activities. The principal users are Member State regulatory bodies and other national authorities. The IAEA Nuclear Energy Series consists of reports designed to encourage and assist research on, and development and practical application of, nuclear energy for peaceful uses. This includes practical examples to be used by owners and operators of utilities in Member States, implementing organizations, academia and politicians, among others. The information is presented in guides, reports on the status of technology and advances, and best practices for peaceful uses of nuclear energy based on inputs from international experts. The series complements the IAEA's safety standards, and provides detailed guidance, experience, good practices and examples on the five areas covered in the IAEA Nuclear Energy Series. The Nuclear Energy Basic Principles is the highest level publication in the IAEA Nuclear Energy Series and describes the rationale and vision for the peaceful uses of nuclear energy. It presents eight Basic Principles on which nuclear energy systems should be based to fulfil nuclear energy's potential to help meet growing global energy needs. The Nuclear Energy Series Objectives are the second level publications. They describe what needs to be

  2. Robust feedback zoom tracking for digital video surveillance.

    Science.gov (United States)

    Zou, Tengyue; Tang, Xiaoqi; Song, Bao; Wang, Jin; Chen, Jihong

    2012-01-01

    Zoom tracking is an important function in video surveillance, particularly in traffic management and security monitoring. It involves keeping an object of interest in focus during the zoom operation. Zoom tracking is typically achieved by moving the zoom and focus motors in lenses following the so-called "trace curve", which shows the in-focus motor positions versus the zoom motor positions for a specific object distance. The main task of a zoom tracking approach is to accurately estimate the trace curve for the specified object. Because a proportional-integral-derivative (PID) controller has historically been regarded as the best choice when no model of the underlying process is available, and because it performs well in motor control, in this paper we propose a novel feedback zoom tracking (FZT) approach based on geometric trace curve estimation and a PID feedback controller. The performance of this approach is compared with existing zoom tracking methods in digital video surveillance. The real-time implementation results obtained on an actual digital video platform indicate that the developed FZT approach not only solves the traditional one-to-many mapping problem without pre-training but also improves robustness when tracking moving or switching objects, which is the key challenge in video surveillance.
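
    As a minimal sketch of the feedback principle described above (not the paper's implementation), the following Python fragment uses a trace-curve estimate as the set-point of a textbook PID loop driving the focus motor; the gains, time step and motor interface (read_focus, move_focus) are purely illustrative.

        class PID:
            """Textbook discrete PID controller (illustrative gains, not from the paper)."""

            def __init__(self, kp: float, ki: float, kd: float, dt: float) -> None:
                self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
                self.integral = 0.0
                self.prev_error = 0.0

            def step(self, setpoint: float, measurement: float) -> float:
                error = setpoint - measurement
                self.integral += error * self.dt
                derivative = (error - self.prev_error) / self.dt
                self.prev_error = error
                return self.kp * error + self.ki * self.integral + self.kd * derivative

        def track_zoom(zoom_positions, trace_curve, read_focus, move_focus, dt=0.02):
            """Follow an estimated trace curve during a zoom sweep.

            trace_curve(z) returns the estimated in-focus motor position for zoom
            position z; read_focus()/move_focus() wrap a hypothetical motor API.
            """
            pid = PID(kp=0.8, ki=0.1, kd=0.05, dt=dt)
            for z in zoom_positions:
                target = trace_curve(z)              # geometric estimate of in-focus position
                command = pid.step(target, read_focus())
                move_focus(command)                  # feedback corrects the open-loop estimate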

  3. TESS Objects of Interest

    Science.gov (United States)

    Guerrero, Natalia; Glidden, Ana; Fausnaugh, Michael; TESS Team

    2018-01-01

    We describe the search for TESS Objects of Interest (TOIs), led by the MIT branch of the TESS Science Office (TSO). TSO has developed a tool called TESS Exoplanet Vetter (TEV) to facilitate this process. Individuals independently examine data validation products for each target and assign a category to the object: planet candidate, eclipsing binary, other astrophysical, stellar variability, or instrument noise/systematic. TEV assigns a preliminary follow-up priority designation to each object and allows for modification when final dispositions are decided on in a group setting. When all targets are vetted, TEV exports a catalogue of TOIs which is delivered to the TESS Follow-Up Observing Program (TFOP), working with ExoFOP-TESS, and made publicly available on the official TESS website and the Mikulski Archive for Space Telescopes (MAST).
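
    A toy sketch of how the vetting dispositions listed above might be represented in code; the enum values mirror the categories named in the abstract, while the class and field names (Disposition, VettingRecord, followup_priority) are invented for illustration and are not the actual TEV data model.

        from dataclasses import dataclass
        from enum import Enum

        class Disposition(Enum):
            PLANET_CANDIDATE = "planet candidate"
            ECLIPSING_BINARY = "eclipsing binary"
            OTHER_ASTROPHYSICAL = "other astrophysical"
            STELLAR_VARIABILITY = "stellar variability"
            INSTRUMENT_NOISE = "instrument noise/systematic"

        @dataclass
        class VettingRecord:
            tic_id: int
            disposition: Disposition
            followup_priority: int   # preliminary priority, revisited in the group session

        def export_tois(records):
            """Keep only planet candidates for the catalogue delivered to follow-up."""
            return [r for r in records if r.disposition is Disposition.PLANET_CANDIDATE]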

  4. [Medicine and conscientious objection].

    Science.gov (United States)

    Martínez, K

    2007-01-01

    Conscientious objection to democratically accepted laws in democratic societies is a fact, both among citizens and among professionals. Due respect for laws is a prima facie duty in these societies. But democratic justice must at the same time respect people's conscience, for it constitutes the ethical identity of individuals. And both law and ethics are necessary - although neither of them is sufficient - for its realization. The problem of conscientious objection among healthcare professionals is analysed from this standpoint, and the conclusion is that objection is not an absolute right to exemption from certain duties, but that the responsibility of the professional and of the institutions towards the citizenry must always be taken into account. Some solutions are suggested that try to protect both the professionals and the citizens in a bi-directional way.

  5. Media, journalism, objectivity

    Directory of Open Access Journals (Sweden)

    Vlajki Emil

    2013-01-01

    Full Text Available This text addresses the themes of media and journalism by confronting two directions of opinion: humanism and elitism. Humanism holds that media and journalism must be metaphysically objective: able to tell the truth regardless of the time, place and circumstances of events. The other approach, elitism, is connected with Hegel's philosophy of history. Hegel's conceptual apparatus includes the Idea, the dialectic of History, the 'cunning of reason', self-development and self-realization. In this context, media and journalism are considered an organic unity, an inseparable part of some dialectical totality. More specifically, media and journalism can be objective only if they defend the concrete ideological assumptions of the society to which they belong. Any other understanding of these two concepts is non-objective, mere moralizing and/or demagoguery.

  6. Robust Optimization on Regional WCO-for-Biodiesel Supply Chain under Supply and Demand Uncertainties

    Directory of Open Access Journals (Sweden)

    Yong Zhang

    2016-01-01

    Full Text Available This paper aims to design a robust waste cooking oil (WCO)-for-biodiesel supply chain under uncertainty in WCO supply and price as well as biodiesel demand and price, so as to improve biorefineries' ability to cope with an unfavourable operating environment. A regional supply chain is first introduced, based on the biggest WCO-for-biodiesel company in Changzhou, Jiangsu province; it comprises three components: WCO suppliers, biorefineries, and demand zones. A robust mixed-integer linear model with multiple objectives (economic, environmental, and social) is then proposed for both biorefinery location and transportation planning. After that, a heuristic algorithm based on a genetic algorithm is proposed to solve this model. Finally, 27 cities in the Yangtze River delta are used to verify the proposed models and methods, and the sustainability and robustness of the biodiesel supply are discussed.
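
    The abstract gives no model details, so the following is only a toy Python sketch of a genetic-algorithm heuristic of the kind mentioned: binary chromosomes select which candidate biorefineries to open, and fitness is a weighted combination of economic, environmental and social terms. All site data, weights and operator choices are made up for illustration.

        import random

        # Toy data: candidate biorefinery sites with (cost, emissions, jobs_created).
        SITES = [(120.0, 30.0, 40), (90.0, 45.0, 25), (150.0, 20.0, 60), (110.0, 35.0, 30)]
        WEIGHTS = (0.5, 0.3, 0.2)   # economic, environmental, social (illustrative)

        def fitness(chromosome):
            """Weighted multi-objective score for a 0/1 site-selection vector (lower is better)."""
            if not any(chromosome):                          # at least one biorefinery must open
                return float("inf")
            cost = sum(s[0] for s, open_ in zip(SITES, chromosome) if open_)
            emissions = sum(s[1] for s, open_ in zip(SITES, chromosome) if open_)
            jobs = sum(s[2] for s, open_ in zip(SITES, chromosome) if open_)
            return WEIGHTS[0] * cost + WEIGHTS[1] * emissions - WEIGHTS[2] * jobs

        def genetic_algorithm(pop_size=20, generations=50, mutation_rate=0.1):
            population = [[random.randint(0, 1) for _ in SITES] for _ in range(pop_size)]
            for _ in range(generations):
                population.sort(key=fitness)
                parents = population[: pop_size // 2]        # truncation selection
                children = []
                while len(children) < pop_size - len(parents):
                    a, b = random.sample(parents, 2)
                    cut = random.randrange(1, len(SITES))    # one-point crossover
                    child = a[:cut] + b[cut:]
                    if random.random() < mutation_rate:      # bit-flip mutation
                        i = random.randrange(len(SITES))
                        child[i] = 1 - child[i]
                    children.append(child)
                population = parents + children
            return min(population, key=fitness)

        print(genetic_algorithm())

    In the full model the fitness would also price the transportation plan and the robustness terms against supply, demand and price uncertainty, which this toy score ignores.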

  7. Pinocchio: Geppetto's transitional object

    Directory of Open Access Journals (Sweden)

    Gabriele Zeloni

    2015-01-01

    Full Text Available The literature has been considered, by Freud and others after him, a form of unwitting exploration of the mind that can lead to discoveries similar to those of psychoanalysis. From this perspective, the author puts forward the following hypothesis: Pinocchio is a puppet who comes to life and is therefore, in a child's perception, a transitional object in Winnicott's sense. Consequently, Geppetto is nothing more than the involuntary representation of any child interacting with the transitional object. The author presents the results of the analysis of the text in support of this hypothesis and reflects on the impact of The Adventures of Pinocchio on the reader.

  8. Object-oriented communications

    International Nuclear Information System (INIS)

    Chapman, L.J.

    1989-01-01

    OOC is a high-level communications protocol based on the object-oriented paradigm. OOC's syntax, semantics, and pragmatics balance simplicity and expressivity for controls environments. While natural languages are too complex, computer protocols are often insufficiently expressive. An object-oriented communications philosophy provides a base for building the necessary high-level communications primitives, such as 'I don't understand' and 'the current value of X is K'. OOC is sufficiently flexible to express data acquisition, control requests, alarm messages, and error messages in a straightforward generic way. It can be used in networks, for inter-task communication, and even for intra-task communication
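
    A minimal Python sketch of what such generic message primitives could look like; the class names (Message, CurrentValue, NotUnderstood) are invented for illustration and are not the OOC wire format.

        from dataclasses import dataclass
        from typing import Any, Optional

        @dataclass
        class Message:
            """Base class for generic, data-type-independent protocol messages."""
            sender: str
            recipient: str

        @dataclass
        class CurrentValue(Message):
            """Carries 'the current value of X is K' for any object and value type."""
            object_name: str
            value: Any

        @dataclass
        class NotUnderstood(Message):
            """Carries 'I don't understand' for a message the receiver cannot handle."""
            original: Message

        def handle(msg: Message) -> Optional[Message]:
            # A receiver that cannot interpret a message answers generically instead
            # of failing, which keeps the protocol usable across tasks and networks.
            if isinstance(msg, CurrentValue):
                print(f"{msg.object_name} = {msg.value}")
                return None
            return NotUnderstood(sender=msg.recipient, recipient=msg.sender, original=msg)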

  9. Quantum objective realism

    International Nuclear Information System (INIS)

    Bednorz, Adam

    2015-01-01

    The question of whether quantum measurements reflect some underlying objective reality has no generally accepted answer. We show that a description of such reality is possible under natural conditions such as linearity and causality, although in terms of moments and cumulants of finite order and without relativistic invariance. The proposed construction of observations’ probability distribution originates from weak, noninvasive measurements, with detection error replaced by some external finite noise. The noise allows us to construct microscopic objective reality, but remains dynamically decoupled and hence unobservable at the macroscopic level. (paper)

  10. Learning Objects Web

    DEFF Research Database (Denmark)

    Blåbjerg, Niels Jørgen

    2005-01-01

    Learning Objects Web is a DEFF project initiated by Aalborg University Library. The project builds on the results and experience gained in our earlier project, Streaming Webbased Information Modules (SWIM). We have an international network of stakeholders who give us...... sparring and feedback on the development concept, both with regard to the theoretical framework and with regard to the practical application of our teaching concept. With this backing and input we have pursued the wish to develop SWIM further in the new project Learning Objects Web. Publication date: June...

  11. Robustness of Linear Systems towards Multi-Dissipative Perturbations

    DEFF Research Database (Denmark)

    Thygesen, Uffe Høgsbro; Poulsen, Niels Kjølstad

    1997-01-01

    We consider the question of robust stability of a linear time invariant plant subject to dynamic perturbations, which are dissipative in the sense of Willems with respect to several quadratic supply rates. For instance, parasitic dynamics are often both small gain and passive. We reduce several robustness analysis questions to linear matrix inequalities: robust stability, robust H2 performance and robust performance in the presence of disturbances with finite signal-to-noise ratios...
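
    As a hedged illustration of the kind of condition involved (not necessarily the paper's exact formulation), suppose the nominal plant is \dot{x} = Ax + Bw, z = Cx + Dw, and the perturbation mapping z to w is dissipative with respect to quadratic supply rates s_i(z, w) = (z, w)^T Q_i (z, w). A standard S-procedure argument then gives robust stability of the interconnection if there exist a matrix P and multipliers \tau_i satisfying the linear matrix inequality

        \begin{bmatrix} A^{\top}P + PA & PB \\ B^{\top}P & 0 \end{bmatrix}
        + \sum_{i} \tau_i
        \begin{bmatrix} C & D \\ 0 & I \end{bmatrix}^{\top} Q_i
        \begin{bmatrix} C & D \\ 0 & I \end{bmatrix} \prec 0,
        \qquad P \succ 0, \quad \tau_i \ge 0,

    whose feasibility certifies that the storage function V(x) = x^{\top}Px decreases along every trajectory consistent with any perturbation satisfying the listed supply rates.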

  12. Intelligent and robust optimization frameworks for smart grids

    Science.gov (United States)

    Dhansri, Naren Reddy

    A smart grid implies a cyberspace real-time distributed power control system to optimally deliver electricity based on varying consumer characteristics. Although smart grids solve many contemporary problems, they give rise to new control and optimization problems as renewable energy sources such as wind or solar energy play a growing role. Given the highly dynamic nature of distributed power generation and the varying consumer demand and cost requirements, the total power output of the grid should be controlled such that the load demand is met while giving a higher priority to renewable energy sources. Hence, the power generated from renewable energy sources should be maximized while the generation from non-renewable energy sources is minimized. This research develops a demand-based automatic generation control and optimization framework for real-time smart grid operations by integrating conventional and renewable energy sources under varying consumer demand and cost requirements. Focusing on the renewable energy sources, the intelligent and robust control frameworks optimize power generation by tracking the consumer demand in a closed-loop control framework, yielding superior economic and ecological benefits, circumventing nonlinear model complexities, and handling uncertainties for superior real-time operation. The proposed intelligent system framework optimizes smart grid power generation for maximum economic and ecological benefit under an uncertain renewable wind energy source. The numerical results demonstrate that the proposed framework is a viable approach to integrating various energy sources for real-time smart grid implementations. The robust optimization framework results demonstrate the effectiveness of the robust controllers under bounded power plant model uncertainties and exogenous wind input excitation while maximizing economic and ecological performance objectives. Therefore, the proposed framework offers a new worst-case deterministic
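
    A toy Python sketch of the dispatch priority described above: renewable generation follows the demand first, and conventional generation only covers the remaining shortfall. The function and parameter names, and the numbers in the example, are illustrative and not taken from the dissertation.

        def dispatch(demand_kw: float, wind_available_kw: float, conventional_max_kw: float):
            """Meet demand with priority on the renewable source (illustrative only)."""
            wind_used = min(demand_kw, wind_available_kw)            # renewables dispatched first
            shortfall = demand_kw - wind_used
            conventional_used = min(shortfall, conventional_max_kw)  # conventional fills the gap
            unmet = shortfall - conventional_used                    # load shedding, if any
            return wind_used, conventional_used, unmet

        # Hour-by-hour demand tracking under an uncertain wind profile (made-up data).
        for demand, wind in [(800, 950), (1200, 600), (1000, 150)]:
            print(dispatch(demand, wind, conventional_max_kw=700))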

  13. SU-E-T-625: Robustness Evaluation and Robust Optimization of IMPT Plans Based on Per-Voxel Standard Deviation of Dose Distributions.

    Science.gov (United States)

    Liu, W; Mohan, R

    2012-06-01

    Proton dose distributions, IMPT in particular, are highly sensitive to setup and range uncertainties. We report a novel method, based on the per-voxel standard deviation (SD) of dose distributions, to evaluate the robustness of proton plans and to robustly optimize IMPT plans so as to render them less sensitive to uncertainties. For each optimization iteration, nine dose distributions are computed - the nominal one, and one each for ± setup uncertainties along the x, y and z axes and for ± range uncertainty. The SD of dose in each voxel is used to create an SD-volume histogram (SVH) for each structure. The SVH may be considered a quantitative representation of the robustness of the dose distribution. For optimization, the desired robustness may be specified in terms of an SD-volume (SV) constraint on the CTV and incorporated as a term in the objective function. Results of optimization with and without this constraint were compared in terms of plan optimality and robustness using the so-called 'worst-case' dose distributions, which are obtained by assigning the lowest among the nine doses to each voxel in the clinical target volume (CTV) and the highest to normal tissue voxels outside the CTV. The SVH curve and the area under it for each structure were used as quantitative measures of robustness. The penalty parameter of the SV constraint may be varied to control the trade-off between robustness and plan optimality. We applied these methods to one case each of H&N and lung. In both cases, we found that imposing the SV constraint improved plan robustness, but at the cost of normal tissue sparing. SVH-based optimization and evaluation is an effective tool for robustness evaluation and robust optimization of IMPT plans. Studies need to be conducted to test the methods for larger cohorts of patients and for other sites. This research is supported by National Cancer Institute (NCI) grant P01CA021239, the University Cancer Foundation via the Institutional Research Grant program at the University of Texas MD
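
    A minimal NumPy sketch of the quantities described above: the per-voxel SD over the nine dose scenarios and a cumulative SD-volume histogram for one structure. The array shapes and names are hypothetical.

        import numpy as np

        def sd_volume_histogram(dose_scenarios: np.ndarray, structure_mask: np.ndarray,
                                sd_bins: np.ndarray):
            """Per-voxel SD over dose scenarios and the cumulative SD-volume histogram.

            dose_scenarios: (9, nx, ny, nz) array holding the nominal dose plus the
            eight setup/range-perturbed doses; structure_mask: boolean array of the
            same spatial shape selecting the structure (e.g. the CTV).
            """
            per_voxel_sd = dose_scenarios.std(axis=0)          # SD of dose in each voxel
            sd_in_structure = per_voxel_sd[structure_mask]
            # Fraction of the structure volume whose SD exceeds each threshold.
            svh = np.array([(sd_in_structure >= b).mean() for b in sd_bins])
            return per_voxel_sd, svh

        # Example with synthetic data: 9 scenarios on a 40x40x40 grid.
        doses = np.random.default_rng(0).normal(60.0, 2.0, size=(9, 40, 40, 40))
        mask = np.zeros((40, 40, 40), dtype=bool)
        mask[10:30, 10:30, 10:30] = True
        _, svh = sd_volume_histogram(doses, mask, sd_bins=np.linspace(0.0, 5.0, 11))

    The area under the resulting SVH curve, e.g. np.trapz(svh, np.linspace(0.0, 5.0, 11)), would then serve as the scalar robustness measure mentioned in the abstract.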

  14. Robust Parameter Coordination for Multidisciplinary Design

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    This paper introduces a robust parameter coordination method to analyze parameter uncertainties, so as to predict conflicts and coordinate parameters in multidisciplinary design. The proposed method is based on a constraint network, which gives a formal model for analyzing the coupling effects between design variables and product specifications. In this model, interval boxes are adopted to describe the uncertainty of design parameters quantitatively and thereby enhance design robustness. To solve this constraint network model, a general consistency algorithm framework is designed and implemented with interval arithmetic and a genetic algorithm, which can deal with both algebraic and ordinary differential equations. With the help of this method, designers can infer the consistent solution space from the given specifications. A case study involving the design of a bogie dumping system demonstrates the usefulness of this approach.
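
    A minimal Python sketch of the interval-box idea: design variables are carried as intervals, constraints are evaluated with interval arithmetic, and a box is kept only if its computed range is consistent with the specification. The constraint f(x, y) = x*y + x and all the numbers are invented for illustration.

        from itertools import product

        def interval_add(a, b):
            return (a[0] + b[0], a[1] + b[1])

        def interval_mul(a, b):
            products = [x * y for x, y in product(a, b)]
            return (min(products), max(products))

        def consistent(box, spec):
            """Check whether an interval box on (x, y) can satisfy the specification
            interval for f(x, y) = x*y + x, i.e. whether the ranges overlap."""
            x, y = box
            f_range = interval_add(interval_mul(x, y), x)
            return f_range[0] <= spec[1] and spec[0] <= f_range[1]

        # Design variables as interval boxes, and a target specification interval.
        print(consistent(((1.0, 2.0), (3.0, 4.0)), spec=(4.0, 9.0)))   # True: ranges overlap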

  15. Robust lyapunov controller for uncertain systems

    KAUST Repository

    Laleg-Kirati, Taous-Meriem

    2017-02-23

    Various examples of systems and methods are provided for Lyapunov control for uncertain systems. In one example, a system includes a process plant and a robust Lyapunov controller configured to control an input of the process plant. The robust Lyapunov controller includes an inner closed loop Lyapunov controller and an outer closed loop error stabilizer. In another example, a method includes monitoring a system output of a process plant; generating an estimated system control input based upon a defined output reference; generating a system control input using the estimated system control input and a compensation term; and adjusting the process plant based upon the system control input to force the system output to track the defined output reference. An inner closed loop Lyapunov controller can generate the estimated system control input and an outer closed loop error stabilizer can generate the system control input.

  16. Communicating via robust synchronization of chaotic lasers

    International Nuclear Information System (INIS)

    Lopez-Gutierrez, R.M.; Posadas-Castillo, C.; Lopez-Mancilla, D.; Cruz-Hernandez, C.

    2009-01-01

    In this paper, the robust synchronization problem for coupled chaotic Nd:YAG lasers is addressed. We resort to complex systems theory to achieve chaos synchronization. Based on stability theory, it is shown that the state trajectories of the perturbed synchronization error system are ultimately bounded, provided the unperturbed synchronization error system is exponentially stable and some conditions on the bounds of the perturbation terms are satisfied. On this basis, encoding, transmission, and decoding in chaotic optical communications are presented. We analyze the transmission and recovery of encrypted information when parameter mismatches are considered. Computer simulations are provided to show the effectiveness of this robust synchronization property; we present the encrypted transmission of image messages and show that the transmitted image is faithfully recovered.
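
    As a generic, textbook-style illustration of drive-response chaos synchronization (using the Lorenz system rather than the Nd:YAG laser model of the paper, and without the perturbation analysis), the Python sketch below drives a slave (y, z) subsystem with the master's x signal and monitors the synchronization error.

        SIGMA, RHO, BETA = 10.0, 28.0, 8.0 / 3.0

        def simulate(dt=1e-3, steps=50000):
            """Master Lorenz system driving a slave (y, z) subsystem via its x signal."""
            xm, ym, zm = 1.0, 1.0, 1.0          # master state
            ys, zs = -4.0, 20.0                 # slave state, started far from the master
            errors = []
            for _ in range(steps):
                # Master (explicit Euler; a small dt keeps this toy integration adequate).
                dxm = SIGMA * (ym - xm)
                dym = xm * (RHO - zm) - ym
                dzm = xm * ym - BETA * zm
                # Slave: same (y, z) equations but driven by the transmitted master x.
                dys = xm * (RHO - zs) - ys
                dzs = xm * ys - BETA * zs
                xm, ym, zm = xm + dt * dxm, ym + dt * dym, zm + dt * dzm
                ys, zs = ys + dt * dys, zs + dt * dzs
                errors.append(abs(ym - ys) + abs(zm - zs))
            return errors

        err = simulate()
        print(f"initial error {err[0]:.2f}, final error {err[-1]:.2e}")

    In chaotic masking schemes of this kind, the message is typically added to the transmitted chaotic signal and recovered once the receiver has synchronized; the paper's contribution is showing that the synchronization error stays bounded under perturbations and parameter mismatch.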

  17. Communicating via robust synchronization of chaotic lasers

    Energy Technology Data Exchange (ETDEWEB)

    Lopez-Gutierrez, R.M. [Engineering Faculty, Baja California Autonomous University (UABC), Km. 103 Carret. Tij-Ens., 22860 Ensenada, B.C. (Mexico); Posadas-Castillo, C. [Engineering Faculty, Baja California Autonomous University (UABC), Km. 103 Carret. Tij-Ens., 22860 Ensenada, B.C. (Mexico); FIME, Autonomous University of Nuevo Leon (UANL), Pedro de Alba, S.N., Cd. Universitaria, San Nicolas de los Garza, NL (Mexico); Lopez-Mancilla, D. [Departamento de Ciencias Exactas y Tecnologicas, Centro Universitario de los Lagos, Universidad de Guadalajara (CULagos-UdeG), Enrique Diaz de Leon s/n, 47460 Lagos de Moreno, Jal. (Mexico); Cruz-Hernandez, C. [Electronics and Telecommunications Department, Scientific Research and Advanced Studies of Ensenada (CICESE), Km. 107 Carret. Tij-Ens., 22860 Ensenada, B.C. (Mexico)], E-mail: ccruz@cicese.mx

    2009-10-15

    In this paper, the robust synchronization problem for coupled chaotic Nd:YAG lasers is addressed. We resort to complex systems theory to achieve chaos synchronization. Based on stability theory, it is shown that the state trajectories of the perturbed synchronization error system are ultimately bounded, provided the unperturbed synchronization error system is exponentially stable and some conditions on the bounds of the perturbation terms are satisfied. On this basis, encoding, transmission, and decoding in chaotic optical communications are presented. We analyze the transmission and recovery of encrypted information when parameter mismatches are considered. Computer simulations are provided to show the effectiveness of this robust synchronization property; we present the encrypted transmission of image messages and show that the transmitted image is faithfully recovered.

  18. Robust small area prediction for counts.

    Science.gov (United States)

    Tzavidis, Nikos; Ranalli, M Giovanna; Salvati, Nicola; Dreassi, Emanuela; Chambers, Ray

    2015-06-01

    A new semiparametric approach to model-based small area prediction for counts is proposed and used for estimating the average number of visits to physicians for Health Districts in Central Italy. The proposed small area predictor can be viewed as an outlier robust alternative to the more commonly used empirical plug-in predictor that is based on a Poisson generalized linear mixed model with Gaussian random effects. Results from the real data application and from a simulation experiment confirm that the proposed small area predictor has good robustness properties and in some cases can be more efficient than alternative small area approaches.

  19. Robust quantum optimizer with full connectivity.

    Science.gov (United States)

    Nigg, Simon E; Lörch, Niels; Tiwari, Rakesh P

    2017-04-01

    Quantum phenomena have the potential to speed up the solution of hard optimization problems. For example, quantum annealing, based on the quantum tunneling effect, has recently been shown to scale exponentially better with system size than classical simulated annealing. However, current realizations of quantum annealers with superconducting qubits face two major challenges. First, the connectivity between the qubits is limited, excluding many optimization problems from a direct implementation. Second, decoherence degrades the success probability of the optimization. We address both of these shortcomings and propose an architecture in which the qubits are robustly encoded in continuous variable degrees of freedom. By leveraging the phenomenon of flux quantization, all-to-all connectivity with sufficient tunability to implement many relevant optimization problems is obtained without overhead. Furthermore, we demonstrate the robustness of this architecture by simulating the optimal solution of a small instance of the nondeterministic polynomial-time hard (NP-hard) and fully connected number partitioning problem in the presence of dissipation.

  20. Robust haptic large distance telemanipulation for ITER

    International Nuclear Information System (INIS)

    Heck, D.J.F.; Heemskerk, C.J.M.; Koning, J.F.; Abbasi, A.; Nijmeijer, H.

    2013-01-01

    Highlights: • ITER remote handling maintenance can be controlled safely over a large distance. • Bilateral teleoperation experiments were performed in a local network. • Wave variables make the controller robust against constant communication delays. • Master and slave position synchronization is guaranteed by proportional action. -- Abstract: During shutdowns, maintenance crews are expected to work in 24/6 shifts to perform critical remote handling maintenance tasks on the ITER system. In this article, we investigate the possibility of safely performing these haptic maintenance tasks remotely, from control stations located anywhere in the world. To guarantee stability in time-delayed bilateral teleoperation, the symmetric position tracking controller using wave variables is selected. This algorithm guarantees robustness against communication delays, can eliminate wave reflections and provides position synchronization of the master and slave devices. Experiments were conducted under realistic local network bandwidth, latency and jitter constraints. They show sufficient transparency even for substantial communication delays
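
    For reference, a standard form of the wave-variable transformation underlying such schemes (sign conventions vary across the literature, so this is not necessarily the article's exact notation) maps velocity \dot{x} and force F into the waves

        u = \frac{b\,\dot{x} + F}{\sqrt{2b}}, \qquad
        v = \frac{b\,\dot{x} - F}{\sqrt{2b}}, \qquad
        F\,\dot{x} = \tfrac{1}{2}\left(u^{2} - v^{2}\right),

    where b > 0 is the wave impedance. Because delaying u and v by a constant amount cannot increase the energy exchanged through the channel, the communication link remains passive for any constant delay, which is the robustness property cited in the highlights; the proportional action mentioned there is what restores master-slave position synchronization, since the wave transformation alone only exchanges velocity and force information.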