Interior point decoding for linear vector channels
Energy Technology Data Exchange (ETDEWEB)
Wadayama, T [Nagoya Institute of Technology, Gokiso, Showa-ku, Nagoya, Aichi, 466-8555 (Japan)], E-mail: wadayama@nitech.ac.jp
2008-01-15
In this paper, a novel decoding algorithm for low-density parity-check (LDPC) codes based on convex optimization is presented. The decoding algorithm, called interior point decoding, is designed for linear vector channels. The linear vector channels include many practically important channels such as inter-symbol interference channels and partial response channels. It is shown that the maximum likelihood decoding (MLD) rule for a linear vector channel can be relaxed to a convex optimization problem, which is called a relaxed MLD problem.
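The relaxed-MLD idea above can be made concrete with the classic LP relaxation of ML decoding. The sketch below is a generic Feldman-style LP decoder in Python, not the paper's interior point solver; the toy parity-check matrix and LLR values are made up.

```python
import itertools
import numpy as np
from scipy.optimize import linprog

def lp_decode(H, llr):
    """LP relaxation of ML decoding: minimize llr.x over the fundamental
    polytope, x in [0,1]^n (a generic stand-in for the relaxed-MLD
    problem; the interior point machinery itself is not reproduced)."""
    m, n = H.shape
    A_ub, b_ub = [], []
    for chk in range(m):
        nbrs = np.flatnonzero(H[chk])
        for k in range(1, len(nbrs) + 1, 2):            # odd-sized subsets
            for S in itertools.combinations(nbrs, k):
                row = np.zeros(n)
                row[list(S)] = 1.0
                row[[i for i in nbrs if i not in S]] = -1.0
                A_ub.append(row)
                b_ub.append(len(S) - 1.0)
    res = linprog(llr, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(0.0, 1.0)] * n, method="highs")
    return res.x

# Toy (7,4) Hamming code and an arbitrary LLR vector.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])
llr = np.array([-1.2, 0.8, -0.3, 1.5, -0.9, 0.4, -0.1])
print(np.round(lp_decode(H, llr), 3))
```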
Robust point matching via vector field consensus.
Jiayi Ma; Ji Zhao; Jinwen Tian; Yuille, Alan L; Zhuowen Tu
2014-04-01
In this paper, we propose an efficient algorithm, called vector field consensus, for establishing robust point correspondences between two sets of points. Our algorithm starts by creating a set of putative correspondences which can contain a very large number of false correspondences, or outliers, in addition to a limited number of true correspondences (inliers). Next, we solve for correspondence by interpolating a vector field between the two point sets, which involves estimating a consensus of inlier points whose matching follows a nonparametric geometrical constraint. We formulate this as a maximum a posteriori (MAP) estimation of a Bayesian model with hidden/latent variables indicating whether matches in the putative set are outliers or inliers. We impose nonparametric geometrical constraints on the correspondence, as a prior distribution, using Tikhonov regularizers in a reproducing kernel Hilbert space. MAP estimation is performed by the EM algorithm, which, by also estimating the variance of the prior model (initialized to a large value), is able to obtain good estimates very quickly (e.g., avoiding many of the local minima inherent in this formulation). We illustrate this method on data sets in 2D and 3D and demonstrate that it is robust to a very large number of outliers (even up to 90%). We also show that, in the special case where there is an underlying parametric geometrical model (e.g., the epipolar line constraint), we obtain better results than standard alternatives like RANSAC if a large number of outliers are present. This suggests a two-stage strategy, where we use our nonparametric model to reduce the size of the putative set and then apply a parametric variant of our approach to estimate the geometric parameters. Our algorithm is computationally efficient and we provide code for others to use it. In addition, our approach is general and can be applied to other problems, such as learning with a badly corrupted training data set.
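As a rough illustration of the EM loop just described, here is a heavily stripped-down vector field consensus sketch in Python/NumPy; the Gaussian kernel width, regularization weight, outlier-component volume, and initialization are all assumed values, and the paper's annealing and convergence details are omitted.

```python
import numpy as np

def vfc(X, Y, beta=0.1, lam=3.0, gamma=0.9, a=10.0, n_iter=50, tau=0.75):
    """Stripped-down vector field consensus. X, Y: (N, D) matched points.
    beta (kernel width), lam (Tikhonov weight) and a (volume of the
    uniform outlier component) are assumed values."""
    N, D = X.shape
    V = Y - X                                            # displacement samples
    G = np.exp(-beta * ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
    P = np.full(N, gamma)                                # inlier posteriors
    sigma2 = (V ** 2).sum() / (N * D)
    for _ in range(n_iter):
        # M-step: weighted, regularized least squares for kernel coefficients
        C = np.linalg.solve(P[:, None] * G + lam * sigma2 * np.eye(N),
                            P[:, None] * V)
        F = G @ C                                        # field at the points
        # E-step: posterior that each putative match is an inlier
        r2 = ((V - F) ** 2).sum(-1)
        pin = gamma * np.exp(-r2 / (2 * sigma2)) / (2 * np.pi * sigma2) ** (D / 2)
        P = pin / (pin + (1 - gamma) / a)
        sigma2 = (P * r2).sum() / (P.sum() * D) + 1e-12
        gamma = P.mean()
    return P > tau                                       # inlier mask
```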
Twisted Vector Bundles on Pointed Nodal Curves
Indian Academy of Sciences (India)
Abstract. Motivated by the quest for a good compactification of the moduli space of -bundles on a nodal curve we establish a striking relationship between Abramovich's and Vistoli's twisted bundles and Gieseker vector bundles.
Estimation and Forecasting in Vector Autoregressive Moving Average Models for Rich Datasets
DEFF Research Database (Denmark)
Dias, Gustavo Fruet; Kapetanios, George
We address the issue of modelling and forecasting macroeconomic variables using rich datasets, by adopting the class of Vector Autoregressive Moving Average (VARMA) models. We overcome the estimation issue that arises with this class of models by implementing an iterative ordinary least squares (...
Vector Nonlinear Time-Series Analysis of Gamma-Ray Burst Datasets on Heterogeneous Clusters
Directory of Open Access Journals (Sweden)
Ioana Banicescu
2005-01-01
The simultaneous analysis of a number of related datasets using a single statistical model is an important problem in statistical computing. A parameterized statistical model is to be fitted on multiple datasets and tested for goodness of fit within a fixed analytical framework. Definitive conclusions are hopefully achieved by analyzing the datasets together. This paper proposes a strategy for the efficient execution of this type of analysis on heterogeneous clusters. Based on partitioning processors into groups for efficient communications and a dynamic loop scheduling approach for load balancing, the strategy addresses the variability of the computational loads of the datasets, as well as the unpredictable irregularities of the cluster environment. Results from preliminary tests of using this strategy to fit gamma-ray burst time profiles with vector functional coefficient autoregressive models on 64 processors of a general purpose Linux cluster demonstrate the effectiveness of the strategy.
New fuzzy support vector machine for the class imbalance problem in medical datasets classification.
Gu, Xiaoqing; Ni, Tongguang; Wang, Hongyuan
2014-01-01
In medical datasets classification, the support vector machine (SVM) is considered to be one of the most successful methods. However, most real-world medical datasets contain some outliers/noise, and the data often have class imbalance problems. In this paper, a fuzzy support vector machine (FSVM) for the class imbalance problem (called FSVM-CIP) is presented, which can be seen as a modified class of FSVM obtained by extending manifold regularization and assigning two misclassification costs for the two classes. The proposed FSVM-CIP can be used to handle the class imbalance problem in the presence of outliers/noise, and enhances the locality maximum margin. Five real-world medical datasets, breast, heart, hepatitis, BUPA liver, and pima diabetes, from the UCI medical database are employed to illustrate the method presented in this paper. Experimental results on these datasets show that FSVM-CIP outperforms, or is comparable to, existing methods.
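The two ingredients easiest to reproduce from the description above are per-class misclassification costs and per-sample (fuzzy) weights; the scikit-learn sketch below combines them, with a naive distance-to-centroid membership heuristic standing in for the paper's actual membership design and manifold regularizer.

```python
import numpy as np
from sklearn.svm import SVC

def fuzzy_weighted_svm(X, y, minority_cost=5.0):
    # Per-class misclassification costs address the class imbalance.
    clf = SVC(kernel="rbf", class_weight={0: 1.0, 1: minority_cost})
    # Naive fuzzy membership: points far from their class centroid
    # (possible outliers/noise) get smaller weights.
    w = np.empty(len(y))
    for c in np.unique(y):
        d = np.linalg.norm(X[y == c] - X[y == c].mean(axis=0), axis=1)
        w[y == c] = 1.0 - 0.9 * d / (d.max() + 1e-12)
    clf.fit(X, y, sample_weight=w)
    return clf

# Imbalanced toy data (180 majority vs 20 minority samples).
X = np.vstack([np.random.randn(180, 2), np.random.randn(20, 2) + 2.0])
y = np.array([0] * 180 + [1] * 20)
clf = fuzzy_weighted_svm(X, y)
```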
Vector boson excitations near deconfined quantum critical points.
Huh, Yejin; Strack, Philipp; Sachdev, Subir
2013-10-18
We show that the Néel states of two-dimensional antiferromagnets have low energy vector boson excitations in the vicinity of deconfined quantum critical points. We compute the universal damping of these excitations arising from spin-wave emission. Detection of such a vector boson will demonstrate the existence of emergent topological gauge excitations in a quantum spin system.
Critical Point Cancellation in 3D Vector Fields: Robustness and Discussion.
Skraba, Primoz; Rosen, Paul; Wang, Bei; Chen, Guoning; Bhatia, Harsh; Pascucci, Valerio
2016-02-29
Vector field topology has been successfully applied to represent the structure of steady vector fields. Critical points, one of the essential components of vector field topology, play an important role in describing the complexity of the extracted structure. Simplifying vector fields via critical point cancellation has practical merit for interpreting the behaviors of complex vector fields such as turbulence. However, there is no effective technique that allows direct cancellation of critical points in 3D. This work fills this gap and introduces the first framework to directly cancel pairs or groups of 3D critical points in a hierarchical manner with a guaranteed minimum amount of perturbation based on their robustness, a quantitative measure of their stability. In addition, our framework does not require the extraction of the entire 3D topology, which contains non-trivial separation structures, and thus is computationally effective. Furthermore, our algorithm can remove critical points in any subregion of the domain whose degree is zero and handle complex boundary configurations, making it capable of addressing challenging scenarios that may not be resolved otherwise. We apply our method to synthetic and simulation datasets to demonstrate its effectiveness.
Integration of Point Clouds Dataset from Different Sensors
Abdullah, C. K. A. F. Che Ku; Baharuddin, N. Z. S.; Ariff, M. F. M.; Majid, Z.; Lau, C. L.; Yusoff, A. R.; Idris, K. M.; Aspuri, A.
2017-02-01
Laser scanner technology has become an option in the process of collecting data nowadays. It comprises Airborne Laser Scanning (ALS) and Terrestrial Laser Scanning (TLS). An ALS such as the Phoenix AL3-32 can provide accurate information from the rooftop viewpoint, while a TLS such as the Leica C10 can provide complete data for the building facade. If both are integrated, they can produce more accurate data. The focus of this study is to integrate both types of data acquisition, ALS and TLS, and to determine the accuracy of the data obtained. The final results are used to generate three-dimensional (3D) building models. The scope of this study is data acquisition of the UTM Eco-home through laser scanning methods: ALS scanning the roof and TLS scanning the building facade. Both devices are used to ensure that no part of the building is left unscanned. In the data integration process, both clouds are registered using points selected among man-made features which are clearly visible, in Cyclone 7.3 software. The accuracy of the integrated data is determined by an accuracy assessment carried out using man-made registration methods. The registration error of the integration process is below 0.04 m. The integrated data are then used to generate a 3D model of the UTM Eco-home building using SketchUp software. In conclusion, the integration of ALS and TLS data acquisition produces accurate integrated data which can be used to generate a 3D model of the UTM Eco-home. For visualization purposes, the generated 3D building model is prepared in Level of Detail 3 (LOD3), as recommended by the City Geography Markup Language (CityGML).
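The registration step above reduces tie points picked on man-made features to a rigid transform; a standard SVD-based (Kabsch) solution is sketched below as a stand-in for the Cyclone workflow, with illustrative coordinates.

```python
import numpy as np

def rigid_transform(P, Q):
    """Least-squares rotation R and translation t with R @ p + t ~ q,
    from matched tie points P, Q of shape (n, 3) (Kabsch solution)."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cQ - R @ cP

# Illustrative tie points (e.g., TLS coordinates mapped to ALS coordinates).
P = np.random.rand(6, 3)
theta = 0.3
Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
               [np.sin(theta),  np.cos(theta), 0],
               [0, 0, 1]])
Q = P @ Rz.T + np.array([10.0, -4.0, 1.5])
R, t = rigid_transform(P, Q)
print(np.abs(R @ P[0] + t - Q[0]).max())        # residual ~ 0
```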
Visualizing Robustness of Critical Points for 2D Time-Varying Vector Fields
Wang, B.; Rosen, P.; Skraba, P.; Bhatia, H.; Pascucci, V.
2013-01-01
Analyzing critical points and their temporal evolutions plays a crucial role in understanding the behavior of vector fields. A key challenge is to quantify the stability of critical points: more stable points may represent more important phenomena or vice versa. The topological notion of robustness is a tool which allows us to quantify rigorously the stability of each critical point. Intuitively, the robustness of a critical point is the minimum amount of perturbation necessary to cancel it within a local neighborhood, measured under an appropriate metric. In this paper, we introduce a new analysis and visualization framework which enables interactive exploration of robustness of critical points for both stationary and time-varying 2D vector fields. This framework allows the end-users, for the first time, to investigate how the stability of a critical point evolves over time. We show that this depends heavily on the global properties of the vector field and that structural changes can correspond to interesting behavior. We demonstrate the practicality of our theories and techniques on several datasets involving combustion and oceanic eddy simulations and obtain some key insights regarding their stable and unstable features.
Point-based warping with optimized weighting factors of displacement vectors
Pielot, Ranier; Scholz, Michael; Obermayer, Klaus; Gundelfinger, Eckart D.; Hess, Andreas
2000-06-01
The accurate comparison of inter-individual 3D brain image datasets requires non-affine transformation techniques (warping) to reduce geometric variations. Constrained by the biological prerequisites, we use in this study a landmark-based warping method with weighted sums of displacement vectors, which is enhanced by an optimization process. Furthermore, we investigate fast automatic procedures for determining landmarks to improve the practicability of 3D warping. This combined approach was tested on 3D autoradiographs of gerbil brains. The autoradiographs were obtained after injecting a non-metabolized radioactive glucose derivative into the gerbil, thereby visualizing neuronal activity in the brain. Afterwards the brain was processed with standard autoradiographical methods. The landmark generator computes corresponding reference points simultaneously within a given number of datasets by Monte Carlo techniques. The warping function is a distance-weighted exponential function with a landmark-specific weighting factor. These weighting factors are optimized by a computational evolution strategy. The warping quality is quantified by several coefficients (correlation coefficient, overlap index, and registration error). The described approach combines a highly suitable procedure to automatically detect landmarks in autoradiographical brain images and an enhanced point-based warping technique, optimizing the local weighting factors. This optimization process significantly improves the similarity between the warped and the target dataset.
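A minimal sketch of the warping function described above, with a distance-weighted exponential kernel and per-landmark weighting factors; the factors are fixed here rather than optimized by an evolution strategy, and the kernel width is an assumption.

```python
import numpy as np

def warp(points, landmarks_src, landmarks_dst, alpha=None, sigma=10.0):
    """Displace each point by a distance-weighted sum of landmark
    displacement vectors. alpha holds per-landmark weighting factors
    (all 1.0 by default; the paper optimizes these)."""
    disp = landmarks_dst - landmarks_src                  # (L, 3)
    if alpha is None:
        alpha = np.ones(len(disp))
    d2 = ((points[:, None, :] - landmarks_src[None, :, :]) ** 2).sum(-1)
    w = alpha * np.exp(-d2 / (2.0 * sigma ** 2))          # (N, L) weights
    w = w / (w.sum(axis=1, keepdims=True) + 1e-12)
    return points + w @ disp
```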
Reducing and filtering point clouds with enhanced vector quantization.
Ferrari, Stefano; Ferrigno, Giancarlo; Piuri, Vincenzo; Borghese, N Alberto
2007-01-01
Modern scanners are able to deliver huge quantities of three-dimensional (3-D) data points sampled on an object's surface in a short time. These data have to be filtered and their cardinality reduced to come up with a mesh manageable at interactive rates. We introduce here a novel procedure to accomplish these two tasks, which is based on an optimized version of soft vector quantization (VQ). The resulting technique has been termed enhanced vector quantization (EVQ) since it introduces several improvements with respect to the classical soft VQ approaches. These are based on computationally expensive iterative optimization; local computation is introduced here, by means of an adequate partitioning of the data space called hyperbox (HB), to reduce the computational time so as to be linear in the number of data points N, saving more than 80% of time in real applications. Moreover, the algorithm can be fully parallelized, thus leading to an implementation that is sublinear in N. The voxel side and the other parameters are automatically determined from the data distribution on the basis of Zador's criterion. This makes the algorithm completely automatic. Because the only parameter to be specified is the compression rate, the procedure is suitable even for nontrained users. Results obtained in reconstructing faces of both humans and puppets as well as artifacts from point clouds publicly available on the web are reported and discussed, in comparison with other methods available in the literature. EVQ has been conceived as a general procedure, suited for VQ applications with large data sets whose data space has relatively low dimensionality.
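EVQ's hyperbox partitioning and parallelization are beyond a short sketch, but the core move (replacing N samples with a much smaller codebook so that the compression rate is the only user parameter) can be illustrated with an off-the-shelf quantizer:

```python
import numpy as np
from sklearn.cluster import MiniBatchKMeans

def reduce_point_cloud(points, compression=0.05):
    """Vector-quantize an (N, 3) cloud down to ~compression * N points;
    codebook centers act as a reduced, noise-averaged cloud."""
    k = max(1, int(len(points) * compression))
    vq = MiniBatchKMeans(n_clusters=k, random_state=0).fit(points)
    return vq.cluster_centers_

cloud = np.random.rand(20000, 3)
print(reduce_point_cloud(cloud, 0.02).shape)   # -> (400, 3)
```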
Directory of Open Access Journals (Sweden)
Mustafa Serter Uzer
2013-01-01
This paper offers a hybrid approach that uses the artificial bee colony (ABC) algorithm for feature selection and support vector machines for classification. The purpose of this paper is to test the effect of eliminating the unimportant and obsolete features of the datasets on the success of the classification, using the SVM classifier. The approach is applied to the diagnosis of liver diseases and diabetes, which are commonly observed and reduce quality of life. For the diagnosis of these diseases, the hepatitis, liver disorders and diabetes datasets from the UCI database were used, and the proposed system reached classification accuracies of 94.92%, 74.81%, and 79.29%, respectively. For these datasets, the classification accuracies were obtained with the help of the 10-fold cross-validation method. The results show that the performance of the method is highly successful compared with other results attained, and seems very promising for pattern recognition applications.
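A minimal harness for the select-then-classify loop described above; for brevity, a random binary mask search stands in for the ABC optimizer, and a synthetic dataset replaces the UCI hepatitis/liver/diabetes sets.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=20, n_informative=6,
                           random_state=0)

def fitness(mask):
    if not mask.any():
        return 0.0
    # 10-fold CV accuracy of an SVM on the selected feature subset.
    return cross_val_score(SVC(), X[:, mask], y, cv=10).mean()

best_mask, best_acc = None, -1.0
for _ in range(50):            # ABC would search this space more cleverly
    mask = rng.random(X.shape[1]) < 0.5
    acc = fitness(mask)
    if acc > best_acc:
        best_mask, best_acc = mask, acc
print(best_mask.sum(), "features ->", round(best_acc, 3))
```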
Georeferencing UAS Derivatives Through Point Cloud Registration with Archived Lidar Datasets
Magtalas, M. S. L. Y.; Aves, J. C. L.; Blanco, A. C.
2016-10-01
Georeferencing gathered images is a common step before performing spatial analysis and other processes on datasets acquired using unmanned aerial systems (UAS). Spatial information is applied to aerial images or their derivatives through onboard GPS (Global Positioning System) geotagging, or by tying models to GCPs (Ground Control Points) acquired in the field. Currently, UAS derivatives are limited to meter-level accuracy when their generation is unaided by points of known position on the ground. The use of ground control points established using survey-grade GPS or GNSS receivers can greatly reduce model errors to centimeter levels. However, this comes with additional costs, not only in instrument acquisition and survey operations, but also in actual time spent in the field. This study uses a workflow for cloud-based post-processing of UAS data in combination with already existing LiDAR data. The georeferencing of the UAV point cloud is executed using the Iterative Closest Point (ICP) algorithm. It is applied through the open-source CloudCompare software (Girardeau-Montaut, 2006) on a 'skeleton point cloud'. This skeleton point cloud consists of manually extracted features consistent in both the LiDAR and UAV data. For this cloud, roads and buildings with minimal deviations given their differing dates of acquisition are considered consistent. Transformation parameters are computed for the skeleton cloud, which can then be applied to the whole UAS dataset. In addition, a separate cloud consisting of non-vegetation features automatically derived using the CANUPO classification algorithm (Brodu and Lague, 2012) was used to generate a separate set of parameters. A ground survey was done to validate the transformed cloud. An RMSE value of around 16 centimeters was found when comparing validation data to the models georeferenced using the CANUPO cloud and the manual skeleton cloud. Cloud-to-cloud distance computations of
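The CloudCompare step can be emulated with a bare point-to-point ICP run on the skeleton subset, whose resulting transform is then applied to the full UAS cloud; the sketch below omits convergence checks and the CANUPO variant.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp(source, target, n_iter=30):
    """Point-to-point ICP: returns R, t such that R @ s + t aligns to target."""
    R, t = np.eye(3), np.zeros(3)
    tree = cKDTree(target)
    src = source.copy()
    for _ in range(n_iter):
        _, idx = tree.query(src)                 # closest target points
        matched = target[idx]
        cS, cT = src.mean(0), matched.mean(0)
        H = (src - cS).T @ (matched - cT)
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))
        Ri = Vt.T @ np.diag([1, 1, d]) @ U.T
        ti = cT - Ri @ cS
        src = src @ Ri.T + ti                    # update source pose
        R, t = Ri @ R, Ri @ t + ti               # accumulate transform
    return R, t

# R, t estimated on the skeleton cloud are then applied to the full cloud:
# full_georef = full_uas_cloud @ R.T + t        # (hypothetical arrays)
```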
Andrade de Araújo, Hallysson Douglas; Dos Santos Silva, Luanna Ribeiro; de Siqueira, Williams Nascimento; Martins da Fonseca, Caíque Silveira; da Silva, Nicácio Henrique; de Albuquerque Melo, Ana Maria Mendonça; Barroso Martins, Mônica Cristina; de Menezes Lima, Vera Lúcia
2018-04-01
This text presents complementary data corresponding to schistosomiasis mansoni vector control and environmental toxicity using usnic acid. These data support our research article "Toxicity of Usnic Acid from Cladonia substellata (Lichen) to embryos and adults of Biomphalaria glabrata" by Araújo et al. [1], and focus on the analysis of the detailed data regarding the different concentrations of usnic acid and their efficiency in causing B. glabrata mortality and non-viability, as well as on environmental toxicity, evaluated by A. salina mortality.
Directory of Open Access Journals (Sweden)
adi sucipto
2017-09-01
There are many types of investments that can be used to generate income, such as land, houses, gold and precious metals, as well as financial assets such as stocks, mutual funds, bonds, and money-market or capital-market instruments. One of the investments attracting considerable attention today is capital market investment. The purpose of this study is to predict, and improve the accuracy of predicting, foreign exchange rates in the forex business by using the Support Vector Machine model, with a larger dataset than the 1558 records used in previous research. This study uses currency exchange rate data obtained from PT. Best Profit Future Cab. Surabaya, already in the form of records with open, high, low and close attributes, covering the Euro to US Dollar exchange rate at 1-minute intervals from May 12, 2016 at 09:51 until May 13, 2016 at 12:30, a total of 1689 records. Applying the Support Vector Machine model with the kernel trick to this data yielded a considerable prediction accuracy of 97.86%. This high accuracy indicates that the movement of the Euro to US Dollar exchange rate on May 12-13, 2016 can be predicted precisely.
Miladinovich, D.; Datta-Barua, S.; Bust, G. S.; Ramirez, U.
2017-12-01
Understanding physical processes during storm time in the ionosphere-thermosphere (IT) system is limited, in part, due to the inability to obtain accurate estimates of IT states on a global scale. One reason for this inability is the sparsity of spatially distributed high quality data sets. Data assimilation is showing promise toward enabling global estimates by blending high quality observational data sets with established climate models. We are continuing development of an algorithm called Estimating Model Parameters for Ionospheric Reverse Engineering (EMPIRE) to enable assimilation of global datasets for storm time estimates of IT drivers. EMPIRE is a data assimilation algorithm that uses a Kalman filtering routine to ingest model and observational data. The EMPIRE algorithm is based on spherical harmonics which provide a spherically symmetric, smooth, continuous, and orthonormal set of basis functions suitable for a spherical domain such as Earth's IT region (200-600 km altitude). Once the basis function coefficients are determined, the newly fitted function represents the disagreement between observational measurements and models. We apply spherical harmonics to study the March 17, 2015 storm. Data sources include Fabry-Perot interferometer neutral wind measurements and global Ionospheric Data Assimilation 4 Dimensional (IDA4D) assimilated total electron content (TEC). Models include Weimer 2000 electric potential, International Geomagnetic Reference Field (IGRF) magnetic field, and Horizontal Wind Model 2014 (HWM14) neutral winds. We present the EMPIRE assimilation results of Earth's electric potential and thermospheric winds. We also compare EMPIRE storm time E×B ion drift estimates to measured drifts produced from the Super Dual Auroral Radar Network (SuperDARN) and Active Magnetosphere and Planetary Electrodynamics Response Experiment (AMPERE) measurement datasets. The analysis from these results will enable the generation of globally assimilated
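The basis-function step amounts to a linear least-squares fit of spherical-harmonic coefficients to scattered model-observation disagreements; the sketch below (degree cap, sample locations, and data all assumed) shows the mechanics with SciPy.

```python
import numpy as np
from scipy.special import sph_harm

def design_matrix(theta, phi, lmax):
    """Real spherical-harmonic basis at azimuth theta, colatitude phi."""
    cols = []
    for l in range(lmax + 1):
        for m in range(-l, l + 1):
            Y = sph_harm(abs(m), l, theta, phi)   # order, degree, azimuth, polar
            cols.append(np.sqrt(2) * Y.imag if m < 0 else
                        Y.real if m == 0 else np.sqrt(2) * Y.real)
    return np.column_stack(cols)

# Scattered residuals (observation minus model) at synthetic station locations:
rng = np.random.default_rng(1)
theta = rng.uniform(0, 2 * np.pi, 200)       # longitude-like angle
phi = rng.uniform(0.1, np.pi - 0.1, 200)     # colatitude
resid = np.sin(phi) * np.cos(theta) + 0.05 * rng.standard_normal(200)

A = design_matrix(theta, phi, lmax=4)
coef, *_ = np.linalg.lstsq(A, resid, rcond=None)
fitted = A @ coef                            # smooth global correction field
```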
Xu, Y.; Sun, Z.; Boerner, R.; Koch, T.; Hoegner, L.; Stilla, U.
2018-04-01
In this work, we report a novel way of generating ground truth datasets for analyzing point clouds from different sensors and for the validation of algorithms. Instead of directly labeling large amounts of 3D points, which requires time-consuming manual work, a multi-resolution 3D voxel grid for the testing site is generated. Then, with the help of a set of basic labeled points from the reference dataset, we can generate a 3D labeled space of the entire testing site at different resolutions. Specifically, an octree-based voxel structure is applied to voxelize the annotated reference point cloud, by which all the points are organized in 3D grids of multiple resolutions. When automatically annotating new testing point clouds, a voting-based approach is applied to the labeled points within the multi-resolution voxels, in order to assign a semantic label to the 3D space represented by each voxel. Lastly, robust line- and plane-based fast registration methods are developed for aligning point clouds obtained via various sensors. Benefiting from the labeled 3D spatial information, we can easily create new annotated 3D point clouds from different sensors of the same scene directly, by considering the labels of the 3D spaces in which the points are located, which is convenient for the validation and evaluation of algorithms related to point cloud interpretation and semantic segmentation.
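A toy, single-resolution version of the voting scheme: reference points carrying labels are hashed into a voxel grid, each voxel takes the majority label, and new sensor points inherit the label of the voxel they fall in (the paper uses an octree over multiple resolutions).

```python
import numpy as np
from collections import Counter, defaultdict

def build_label_grid(ref_pts, ref_labels, voxel=0.5):
    votes = defaultdict(Counter)
    for p, lab in zip(ref_pts, ref_labels):
        votes[tuple((p // voxel).astype(int))][lab] += 1
    # Majority vote per voxel assigns a semantic label to that 3D cell.
    return {k: c.most_common(1)[0][0] for k, c in votes.items()}

def annotate(new_pts, grid, voxel=0.5, unknown=-1):
    keys = (new_pts // voxel).astype(int)
    return np.array([grid.get(tuple(k), unknown) for k in keys])

ref = np.random.rand(1000, 3) * 10
labels = (ref[:, 2] > 5).astype(int)          # toy semantic labels
grid = build_label_grid(ref, labels)
print(annotate(np.random.rand(5, 3) * 10, grid))
```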
Datum Feature Extraction and Deformation Analysis Method Based on Normal Vector of Point Cloud
Sun, W.; Wang, J.; Jin, F.; Liang, Z.; Yang, Y.
2018-04-01
In order to address the lack of an applicable analysis method when applying three-dimensional laser scanning technology to deformation monitoring, an efficient method for extracting datum features and analysing deformation based on the normal vectors of a point cloud is proposed. Firstly, a kd-tree is used to establish the topological relation. Datum points are detected by tracking the point cloud normal vectors determined from the normal vector of the local plane. Then, cubic B-spline curve fitting is performed on the datum points. Finally, the datum elevation and the inclination angle of each radial point are calculated according to the fitted curve, and the deformation information is analyzed. The proposed approach was verified on a real large-scale tank dataset captured with a terrestrial laser scanner in a chemical plant. The results show that the method can obtain the entire information of the monitored object quickly and comprehensively, and accurately reflect the deformation of the datum features.
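The datum tracking above presupposes per-point normal vectors; a standard way to obtain them is local-plane PCA over kd-tree neighborhoods, sketched here with an assumed neighborhood size.

```python
import numpy as np
from scipy.spatial import cKDTree

def estimate_normals(points, k=20):
    """Per-point unit normals from PCA of k nearest neighbors:
    the normal of the locally fitted plane is the direction of
    least variance of the neighborhood."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)
    normals = np.empty_like(points)
    for i, nb in enumerate(idx):
        nbrs = points[nb] - points[nb].mean(axis=0)
        _, _, Vt = np.linalg.svd(nbrs, full_matrices=False)
        normals[i] = Vt[-1]                   # smallest-variance direction
    return normals
```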
Topological Vector Space-Valued Cone Metric Spaces and Fixed Point Theorems
Directory of Open Access Journals (Sweden)
Radenović Stojan
2010-01-01
We develop the theory of topological vector space valued cone metric spaces with nonnormal cones. We prove three general fixed point results in these spaces and deduce as corollaries several extensions of theorems about fixed points and common fixed points, known from the theory of (normed-valued) cone metric spaces. Examples are given to distinguish our results from the known ones.
Space vector modulation strategy for neutral-point voltage balancing in three-level inverter systems
DEFF Research Database (Denmark)
Choi, Uimin; Lee, Kyo Beum
2013-01-01
This study proposes a space vector modulation (SVM) strategy to balance the neutral-point voltage of three-level inverter systems. The proposed method is implemented by combining conventional symmetric SVM with nearest three-vector (NTV) modulation. The conventional SVM is converted to NTV modulation by properly adding or subtracting a minimum gate-on time. In addition, this method reduces the switching frequency and yields a decrease in switching loss. The neutral-point voltage is balanced by the proposed SVM strategy without additional hardware or complex calculations. Simulation and experimental results verify the validity and feasibility of the proposed SVM strategy.
Dogon-yaro, M. A.; Kumar, P.; Rahman, A. Abdul; Buyuksalih, G.
2016-10-01
Timely and accurate acquisition of information on the condition and structural changes of urban trees serves as a tool for decision makers to better appreciate urban ecosystems and their numerous values, which are critical to building up strategies for sustainable development. The conventional techniques used for extracting tree features include ground surveying and interpretation of aerial photography. However, these techniques are associated with constraints such as labour-intensive field work, high financial requirements, and the influence of weather conditions and topographic cover, which can be overcome by means of integrated airborne LiDAR and very high resolution digital image datasets. This study presents a semi-automated approach for extracting urban trees from integrated airborne LiDAR and multispectral digital image datasets over the city of Istanbul, Turkey. The scheme includes detection and extraction of shadow-free vegetation features based on the spectral properties of the digital images, using shadow index and NDVI techniques, and automated extraction of 3D information about vegetation features from the integrated processing of the shadow-free vegetation image and the LiDAR point cloud. The developed algorithms show promising results as an automated and cost-effective approach to estimating and delineating 3D information of urban trees. The research also shows that integrated datasets are a suitable technology and a viable source of information for city managers to use in urban tree management.
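The shadow-free vegetation detection step is essentially band arithmetic; the sketch below computes NDVI and a crude brightness-based shadow mask from red/NIR reflectance bands (both thresholds are assumptions, and the study's actual shadow index may differ).

```python
import numpy as np

def vegetation_mask(red, nir, ndvi_thresh=0.3, shadow_thresh=0.15):
    """red, nir: reflectance bands in [0, 1] as 2D arrays.
    Returns True where vegetated and not in shadow."""
    ndvi = (nir - red) / (nir + red + 1e-12)
    brightness = (nir + red) / 2.0      # crude stand-in for a shadow index
    return (ndvi > ndvi_thresh) & (brightness > shadow_thresh)
```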
National Cooperative Soil Survey (NCSS) Laboratory Data, NCSS Lab Data Mart Point Dataset
Department of Agriculture — This layer represents the National Cooperative Soil Survey laboratory data of soil properties for soil samples taken at sites or points on the Earth’s globe – mainly...
Kraus, Wayne A; Wagner, Albert F
1986-04-01
A triatomic classical trajectory code has been modified by extensive vectorization of the algorithms to achieve much improved performance on an FPS 164 attached processor. Extensive timings on both the FPS 164 and a VAX 11/780 with floating point accelerator are presented as a function of the number of trajectories simultaneously run. The timing tests involve a potential energy surface of the LEPS variety and trajectories with 1000 time steps. The results indicate that vectorization results in timing improvements on both the VAX and the FPS. For larger numbers of trajectories run simultaneously, up to a factor of 25 improvement in speed occurs between VAX and FPS vectorized code.
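The reported speedups came from advancing many trajectories in lock-step; the same idea in modern array terms is batching the integrator over a trajectory axis, as in this sketch where a harmonic force stands in for the LEPS surface gradient.

```python
import numpy as np

def run_batch(q0, p0, n_steps=1000, dt=0.01, k=1.0, m=1.0):
    """Velocity-Verlet for a batch of independent trajectories at once.
    q0, p0: (n_traj, n_dof) arrays. A harmonic force -k*q stands in
    for the LEPS potential gradient."""
    q, p = q0.copy(), p0.copy()
    f = -k * q
    for _ in range(n_steps):             # all trajectories advance together
        p += 0.5 * dt * f
        q += dt * p / m
        f = -k * q
        p += 0.5 * dt * f
    return q, p

q, p = run_batch(np.random.randn(10000, 3), np.zeros((10000, 3)))
```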
Extensions of vector-valued Baire one functions with preservation of points of continuity
Czech Academy of Sciences Publication Activity Database
Koc, M.; Kolář, Jan
2016-01-01
Vol. 442, No. 1 (2016), pp. 138-148. ISSN 0022-247X. R&D Projects: GA ČR (CZ) GA14-07880S. Institutional support: RVO:67985840. Keywords: vector-valued Baire one functions; extensions; non-tangential limit; continuity points. Subject RIV: BA - General Mathematics. Impact factor: 1.064, year: 2016. http://www.sciencedirect.com/science/article/pii/S0022247X1630097X
DEFF Research Database (Denmark)
Boeriis, Morten; van Leeuwen, Theo
2017-01-01
This article revisits the concept of vectors, which, in Kress and van Leeuwen’s Reading Images (2006), plays a crucial role in distinguishing between ‘narrative’, action-oriented processes and ‘conceptual’, state-oriented processes. The use of this concept in image analysis has usually focused... should be taken into account in discussing ‘reactions’, which Kress and van Leeuwen link only to eyeline vectors. Finally, the question can be raised as to whether actions are always realized by vectors. Drawing on a re-reading of Rudolf Arnheim’s account of vectors, these issues are outlined...
Directory of Open Access Journals (Sweden)
Sara Studwell
2017-04-01
Scientific research is producing ever-increasing amounts of data. Organizing and reflecting relationships across data collections, datasets, publications, and other research objects are essential functionalities of the modern science environment, yet challenging to implement. Landing pages are often used for providing ‘big picture’ contextual frameworks for datasets and data collections, and many large-volume data holders are utilizing them in thoughtful, creative ways. The benefits of their organizational efforts, however, are not realized unless the user eventually sees the landing page at the end point of their search. What if that organization and ‘big picture’ context could benefit the user at the beginning of the search? That is a challenging approach, but The Department of Energy’s (DOE) Office of Scientific and Technical Information (OSTI) is redesigning the database functionality of the DOE Data Explorer (DDE) with that goal in mind. Phase I is focused on redesigning the DDE database to leverage relationships between two existing distinct populations in DDE, data Projects and individual Datasets, and then adding a third intermediate population, data Collections. Mapped, structured linkages, designed to show user relationships, will allow users to make informed search choices. These linkages will be sustainable and scalable, created automatically with the use of new metadata fields and existing authorities. Phase II will study selected DOE Data ID Service clients, analyzing how their landing pages are organized, and how that organization might be used to improve DDE search capabilities. At the heart of both phases is the realization that adding more metadata information for cross-referencing may require additional effort for data scientists. OSTI’s approach seeks to leverage existing metadata and landing page intelligence without imposing an additional burden on the data creators.
Dogon-Yaro, M. A.; Kumar, P.; Rahman, A. Abdul; Buyuksalih, G.
2016-09-01
Mapping of trees plays an important role in modern urban spatial data management, as many benefits and applications derive from this detailed, up-to-date data source. Timely and accurate acquisition of information on the condition of urban trees serves as a tool for decision makers to better appreciate urban ecosystems and their numerous values, which are critical to building up strategies for sustainable development. The conventional techniques used for extracting trees include ground surveying and interpretation of aerial photography. However, these techniques are associated with constraints such as labour-intensive field work and high financial requirements, which can be overcome by means of integrated LiDAR and digital image datasets. Compared to the predominant studies on tree extraction, mainly in purely forested areas, this study concentrates on urban areas, which have a high structural complexity with a multitude of different objects. This paper presents a workflow for a semi-automated approach to extracting urban trees from integrated processing of airborne LiDAR point cloud and multispectral digital image datasets over the city of Istanbul, Turkey. The paper shows that the integrated datasets are a suitable technology and a viable source of information for urban tree management. In conclusion, the extracted information provides a snapshot of the location, composition and extent of trees in the study area, useful to city planners and other decision makers in order to understand how much canopy cover exists, identify new planting, removal, or reforestation opportunities, and determine which locations have the greatest need or potential to maximize the benefits of return on investment. It can also help track trends or changes to the urban trees over time and inform future management decisions.
Directory of Open Access Journals (Sweden)
T. D. Carozzi
2004-07-01
We introduce a technique to determine instantaneous local properties of waves based on discrete-time sampled, real-valued measurements from 4 or more spatial points. The technique is a generalisation to the spatial domain of the notion of instantaneous frequency used in signal processing. The quantities derived by our technique are closely related to those used in geometrical optics, namely the local wave vector and instantaneous phase velocity. Thus, this experimental technique complements ray-tracing. We provide example applications of the technique to electric field and potential data from the EFW instrument on Cluster. Cluster is the first space mission for which direct determination of the full 3-dimensional local wave vector is possible, as described here.
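The idea generalizes instantaneous frequency to space: with analytic signals at four or more sensor positions, pairwise phase differences are linear in the local wave vector, so a least-squares solve recovers it. The sketch below uses a synthetic plane wave, not EFW/Cluster data.

```python
import numpy as np
from scipy.signal import hilbert

# Four sensor positions and a synthetic plane wave cos(k0 . r - w t)
r = np.array([[0., 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
k0, w = np.array([1.2, -0.4, 0.7]), 2 * np.pi * 0.05
t = np.arange(0, 200, 0.5)
sig = np.cos(r @ k0[:, None] - w * t[None, :])        # (4, n_samples)

phase = np.unwrap(np.angle(hilbert(sig, axis=1)), axis=1)
dr = r[1:] - r[0]                                     # baselines to sensor 0
dphi = phase[1:] - phase[0]                           # pairwise phase diffs
# The analytic-signal convention exp(i(w t - k.r)) gives dphi = -dr @ k,
# so a least-squares solve recovers the local wave vector per sample:
k_est = -np.linalg.lstsq(dr, dphi, rcond=None)[0]
print(k_est.mean(axis=1))                             # ~ k0
```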
Directory of Open Access Journals (Sweden)
Xianglin Meng
2018-03-01
The normal vector estimation of large-scale scattered point clouds (LSSPC) plays an important role in point-based shape editing. However, normal vector estimation for LSSPC cannot meet the great challenge of the sharp increase of point cloud size, mainly because of its low computational efficiency. In this paper, a novel, fast method based on bi-linear interpolation is reported for the normal vector estimation of LSSPC. We divide the point set into many small cubes to speed up the local point search, and construct interpolation nodes on the isosurface expressed by the point cloud. After calculating the normal vectors of these interpolation nodes, a bi-linear interpolation of the normal vectors of the points in each cube is realized. The proposed approach has the merits of accuracy, simplicity, and high efficiency, because the algorithm only needs to search neighbors and calculate normal vectors for the interpolation nodes, which are usually far fewer than the points in the cloud. The experimental results on several real and simulated point sets show that our method is over three times faster than the Elliptic Gabriel Graph-based method, with an average deviation of less than 0.01 mm.
Polverari, F.; Talone, M.; Crapolicchio, R.; Levy, G.; Marzano, F.
2013-12-01
The European Remote Sensing (ERS)-2 scatterometer provides wind retrievals over the ocean. To satisfy the need for a high-quality and homogeneous set of scatterometer measurements, the European Space Agency (ESA) has developed the Advanced Scatterometer Processing System (ASPS) project, with which a long-term dataset of new ERS-2 wind products, with an enhanced resolution of 25 km, has been generated by reprocessing the entire ERS mission. This paper presents the main results of the validation of this new dataset using in situ measurements provided by the Prediction and Research Moored Array in the Tropical Atlantic (PIRATA). The comparison indicates that, on average, the scatterometer data agree well with the buoy measurements; however, the scatterometer tends to overestimate lower winds and underestimate higher winds.
International Nuclear Information System (INIS)
Kareim, Ameer A; Mansor, Muhamad Bin
2013-01-01
The aim of this paper is to improve the efficiency of maximum power point tracking (MPPT) for PV systems. A Support Vector Machine (SVM) is proposed to realize the MPPT controller. The theoretical, perturbation and observation (P and O), and incremental conductance (IC) algorithms were compared with the proposed SVM algorithm. MATLAB models for the PV module and the theoretical, SVM, P and O, and IC algorithms were implemented. The improved MPPT uses the SVM method to predict the optimum voltage of the PV system in order to extract the maximum power point (MPP). The SVM technique uses two inputs: the solar radiation and the ambient temperature of the modeled PV module. The results show that the proposed SVM technique has a lower Root Mean Square Error (RMSE) and higher efficiency than the P and O and IC methods.
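A minimal regression stand-in for the controller described above: train an SVM regressor to map (irradiance, temperature) to the MPP voltage and query it online. The training relation here is a made-up trend; a real controller would generate training pairs from the PV model.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
G = rng.uniform(200, 1000, 500)            # irradiance, W/m^2
T = rng.uniform(10, 45, 500)               # ambient temperature, deg C
# Hypothetical MPP voltage trend: rises with log-irradiance, falls with T.
Vmpp = 30 + 2.0 * np.log(G / 1000) - 0.08 * (T - 25)

model = SVR(kernel="rbf", C=10.0).fit(np.column_stack([G, T]), Vmpp)
print(model.predict([[800.0, 30.0]]))      # voltage setpoint for the converter
```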
Accelerating simulation for the multiple-point statistics algorithm using vector quantization
Zuo, Chen; Pan, Zhibin; Liang, Hao
2018-03-01
Multiple-point statistics (MPS) is a prominent algorithm to simulate categorical variables based on a sequential simulation procedure. Assuming training images (TIs) as prior conceptual models, MPS extracts patterns from TIs using a template and records their occurrences in a database. However, complex patterns increase the size of the database and require considerable time to retrieve the desired elements. In order to speed up simulation and improve simulation quality over state-of-the-art MPS methods, we propose an accelerated simulation method for MPS using vector quantization (VQ), called VQ-MPS. First, a variable representation is presented to make categorical variables applicable for vector quantization. Second, we adopt a tree-structured VQ to compress the database so that stationary simulations are realized. Finally, a transformed template and classified VQ are used to address nonstationarity. A two-dimensional (2D) stationary channelized reservoir image is used to validate the proposed VQ-MPS. In comparison with several existing MPS programs, our method exhibits significantly better performance in terms of computational time, pattern reproductions, and spatial uncertainty. Further demonstrations consist of a 2D four facies simulation, two 2D nonstationary channel simulations, and a three-dimensional (3D) rock simulation. The results reveal that our proposed method is also capable of solving multifacies, nonstationarity, and 3D simulations based on 2D TIs.
Aging Detection of Electrical Point Machines Based on Support Vector Data Description
Directory of Open Access Journals (Sweden)
Jaewon Sa
2017-11-01
Electrical point machines (EPMs) must be replaced at an appropriate time to prevent the occurrence of operational safety or stability problems in trains resulting from aging or budget constraints. However, it is difficult to replace EPMs effectively because their aging conditions depend on the operating environments, and thus a fixed guideline is typically not suitable for replacing EPMs at the most timely moment. In this study, we propose a classification method for the detection of aging effects, to facilitate the timely replacement of EPMs. We employ support vector data description to segregate data of “aged” and “not-yet-aged” equipment by analyzing the subtle differences in normalized electrical signals resulting from aging. Based on the before- and after-replacement data obtained from experimental studies conducted on EPMs, we confirmed that the proposed method is capable of classifying machines based on exhibited aging effects with adequate accuracy.
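Support vector data description is closely related to the one-class SVM available in scikit-learn, which can sketch the segregation step on normalized signal features; the feature vectors and parameters below are assumptions.

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
not_aged = rng.normal(0.0, 1.0, (200, 8))   # normalized signal features
aged = rng.normal(0.8, 1.2, (50, 8))        # subtly shifted by wear

# Fit a boundary around "not-yet-aged" behaviour (SVDD-like description);
# points falling outside the boundary are flagged as aged.
clf = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(not_aged)
print((clf.predict(aged) == -1).mean())     # fraction flagged as aged
```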
International Nuclear Information System (INIS)
Deo, Ravinesh C.; Wen, Xiaohu; Qi, Feng
2016-01-01
Highlights: • A forecasting model for short- and long-term global incident solar radiation (R_n) has been developed. • The support vector machine and the discrete wavelet transformation algorithm have been integrated. • The precision of the wavelet-coupled hybrid model is assessed using several prediction score metrics. • The proposed model is an appealing tool for forecasting R_n in the present study region. - Abstract: A solar radiation forecasting model is a scientific contrivance for investigating the future viability of solar energy potentials. In this paper, a wavelet-coupled support vector machine (W-SVM) model was adopted to forecast global incident solar radiation based on sunshine hours (S_t), minimum temperature (T_min), maximum temperature (T_max), windspeed (U), evaporation (E) and precipitation (P) as the predictor variables. To ascertain conclusive results, the merit of the W-SVM was benchmarked against the classical SVM model. For daily forecasting, sixteen months of data (01-March-2014 to 30-June-2015), partitioned into train (65%) and test (35%) sets for three metropolitan stations (Brisbane City, Cairns Aero and Townsville Aero), were utilized. Data were decomposed into their wavelet sub-series by the discrete wavelet transformation algorithm and summed up to create new series with one approximation and four levels of detail, using the Daubechies-2 mother wavelet. For daily forecasting, six model scenarios were formulated in which the number of inputs was increased, and the forecasts were assessed by statistical metrics (correlation coefficient r; Willmott's index d; Nash-Sutcliffe coefficient E_NS; peak deviation P_dv), distribution statistics and prediction errors (mean absolute error MAE; root mean square error RMSE; mean absolute percentage error MAPE; relative root mean square error RRMSE). Results for daily forecasts showed that the W-SVM model outperformed the classical SVM model for optimum input combinations. A sensitivity
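The W-SVM coupling reduces to decomposing each predictor with a DWT and feeding the sub-series to an SVM regressor. The sketch below uses PyWavelets with the db2 wavelet and four detail levels from the abstract, and synthetic data for everything else.

```python
import numpy as np
import pywt
from sklearn.svm import SVR

def wavelet_features(x, wavelet="db2", level=4):
    """One approximation + `level` detail sub-series, each reconstructed
    to full length and stacked as feature columns."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    parts = []
    for i in range(len(coeffs)):
        keep = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
        parts.append(pywt.waverec(keep, wavelet)[: len(x)])
    return np.column_stack(parts)

rng = np.random.default_rng(0)
sunshine = rng.uniform(0, 12, 480)                    # daily predictor series
radiation = 2.0 * sunshine + rng.normal(0, 1, 480)    # synthetic target
X = wavelet_features(sunshine)
model = SVR().fit(X[:312], radiation[:312])           # ~65% train split
print(model.score(X[312:], radiation[312:]))
```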
Converting point-wise nuclear cross sections to pole representation using regularized vector fitting
Peng, Xingjie; Ducru, Pablo; Liu, Shichang; Forget, Benoit; Liang, Jingang; Smith, Kord
2018-03-01
Direct Doppler broadening of nuclear cross sections in Monte Carlo codes has been widely sought for coupled reactor simulations. One recent approach proposed analytical broadening using a pole representation of the commonly used resonance models and the introduction of a local windowing scheme to improve performance (Hwang, 1987; Forget et al., 2014; Josey et al., 2015, 2016). This pole representation has been achieved in the past by converting resonance parameters in the evaluated nuclear data library into poles and residues. However, the cross sections of some isotopes are only provided as point-wise data in the ENDF/B-VII.1 library. To convert these isotopes to pole representation, a recent approach has been proposed using the relaxed vector fitting (RVF) algorithm (Gustavsen and Semlyen, 1999; Gustavsen, 2006; Liu et al., 2018). This approach, however, requires the number of poles to be specified ahead of time. This article addresses this issue by adding a pole-and-residue filtering step to the RVF procedure. This regularized VF (ReV-Fit) algorithm is shown to efficiently converge the poles close to the physical ones, eliminating most of the superfluous poles, and thus enabling the conversion of point-wise nuclear cross sections.
Directory of Open Access Journals (Sweden)
Jinliang Huang
2012-12-01
Land use and land cover (LULC) information is an important component influencing watershed modeling with regard to hydrology and water quality in a river basin. In this study, the sensitivity of the Soil and Water Assessment Tool (SWAT) model to LULC datasets with three points in time and three levels of detail was assessed in a coastal subtropical watershed located in Southeast China. The results showed good agreement between observed and simulated values for both monthly and daily streamflow and monthly NH4+-N and TP loads. The three LULC datasets from 2002, 2007 and 2010 had relatively little influence on simulated monthly and daily streamflow, whereas they exhibited greater effects on simulated monthly NH4+-N and TP loads. When using the two LULC datasets from 2007 and 2010 compared with that from 2002, the relative differences in predicted monthly NH4+-N and TP loads were −11.0 to −7.8% and −4.8 to −9.0%, respectively. There were no significant differences in simulated monthly and daily streamflow when using the three LULC datasets with ten, five and three categories. When using the LULC dataset with ten categories compared to five and three categories, the relative differences in predicted monthly NH4+-N and TP loads were −6.6 to −6.5% and −13.3 to −7.3%, respectively. Overall, the sensitivity of the SWAT model to LULC datasets with different points in time and levels of detail was lower for monthly and daily streamflow simulation than for monthly NH4+-N and TP load prediction. This research provides helpful insights into the influence of LULC datasets on watershed modeling.
DEFF Research Database (Denmark)
Lavancier, Frédéric; Møller, Jesper
We consider a dependent thinning of a regular point process with the aim of obtaining aggregation on the large scale and regularity on the small scale in the resulting target point process of retained points. Various parametric models for the underlying processes are suggested and the properties...
Wenying, Wei; Jinyu, Han; Wen, Xu
2004-01-01
The specific position of a group in the molecule has been considered, and a group vector space method for estimating the enthalpy of vaporization at the normal boiling point of organic compounds has been developed. An expression for the enthalpy of vaporization Δ_vapH(T_b) has been established and numerical values of the relative group parameters obtained. The average percent deviation of the estimation of Δ_vapH(T_b) is 1.16, which shows that the present method demonstrates significant improvement in applicability for predicting the enthalpy of vaporization at the normal boiling point, compared with conventional group methods.
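Group-contribution estimation is bookkeeping: counts of groups times fitted parameters, summed. The sketch below shows the mechanics with made-up parameter values; the paper's position-specific vector-space refinement and fitted values are not reproduced.

```python
# Hypothetical group parameters (kJ/mol); the paper fits real values.
group_params = {"CH3": 2.3, "CH2": 4.9, "OH": 23.0}

def dvap_h_tb(groups):
    """Enthalpy of vaporization at the normal boiling point as a sum of
    group contributions: sum_i n_i * g_i."""
    return sum(n * group_params[g] for g, n in groups.items())

# 1-propanol = CH3 + 2x CH2 + OH (illustrative only)
print(dvap_h_tb({"CH3": 1, "CH2": 2, "OH": 1}), "kJ/mol")
```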
The waiting room: vector for health education? The general practitioner's point of view.
Gignon, Maxine; Idris, Hadjila; Manaouil, Cecile; Ganry, Oliver
2012-09-18
General practitioners (GPs) play a central role in disseminating information, and most health policies tend to develop this pivotal role of GPs in the dissemination of health-related information to the public. The objective of this study was to evaluate the use of the waiting room by GPs as a vector for health promotion. A cross-sectional study was conducted on a representative sample of GPs using semi-structured, face-to-face interviews. A structured grid was used to describe the documents. Quantitative and qualitative analysis was performed. Sixty GPs participated in the study. They stated that a waiting room had to be pleasant, but agreed that it was a useful vector for providing health information. The GPs stated that they distributed documents designed to improve patient care by encouraging screening, providing health education information and addressing delicate subjects more easily. However, some physicians believed that this information can sometimes make patients more anxious. A large number of documents were often available, covering a variety of topics. General practitioners intentionally use their waiting rooms to disseminate a broad range of health-related information, but without developing a clearly defined strategy. It would be interesting to correlate the topics addressed by waiting room documents with prevention practices introduced during the visit.
Fixed Point in Topological Vector Space-Valued Cone Metric Spaces
Directory of Open Access Journals (Sweden)
Muhammad Arshad
2010-01-01
We obtain common fixed points of a pair of mappings satisfying a generalized contractive type condition in TVS-valued cone metric spaces. Our results generalize some well-known recent results in the literature.
DEFF Research Database (Denmark)
2008-01-01
Coding/modulating units (200-1 to 200-N) output modulated symbols by modulating coded bit streams based on a certain modulation scheme. A limited perturbation vector is calculated using the distribution of perturbation vectors. The original constellation points of the modulated symbols are extended t...
DEFF Research Database (Denmark)
Bennike, Tue Bjerg; Carlsen, Thomas Gelsing; Ellingsen, Torkell
2017-01-01
The datasets presented in this article are related to the research articles entitled “Neutrophil Extracellular Traps in Ulcerative Colitis: A Proteome Analysis of Intestinal Biopsies” (Bennike et al., 2015 [1]) and “Proteome Analysis of Rheumatoid Arthritis Gut Mucosa” (Bennike et al., 2017 [2]). The data have been deposited to the ProteomeXchange Consortium via the PRIDE partner repository with the dataset identifiers PXD001608 for ulcerative colitis and control samples, and PXD003082 for rheumatoid arthritis samples.
Stanley, Clayton; Byrne, Michael D
2016-12-01
The growth of social media and user-created content on online sites provides unique opportunities to study models of human declarative memory. By framing the task of choosing a hashtag for a tweet and tagging a post on Stack Overflow as a declarative memory retrieval problem, 2 cognitively plausible declarative memory models were applied to millions of posts and tweets and evaluated on how accurately they predict a user's chosen tags. An ACT-R based Bayesian model and a random permutation vector-based model were tested on the large data sets. The results show that past user behavior of tag use is a strong predictor of future behavior. Furthermore, past behavior was successfully incorporated into the random permutation model that previously used only context. Also, ACT-R's attentional weight term was linked to an entropy-weighting natural language processing method used to attenuate high-frequency words (e.g., articles and prepositions). Word order was not found to be a strong predictor of tag use, and the random permutation model performed comparably to the Bayesian model without including word order. This shows that the strength of the random permutation model is not in the ability to represent word order, but rather in the way in which context information is successfully compressed. The results of the large-scale exploration show how the architecture of the 2 memory models can be modified to significantly improve accuracy, and may suggest task-independent general modifications that can help improve model fit to human data in a much wider range of domains.
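The finding that past tag use dominates is what ACT-R's base-level activation encodes; the sketch below ranks candidate tags by the standard base-level learning equation, with the ACT-R default decay d = 0.5 and made-up usage histories.

```python
import numpy as np

def base_level_activation(use_times, now, d=0.5):
    """ACT-R base-level learning: B = ln(sum_j (now - t_j)^-d)."""
    lags = now - np.asarray(use_times, dtype=float)
    return np.log(np.sum(lags ** -d))

history = {                      # hypothetical tag usage timestamps (days)
    "python": [1, 5, 20, 30, 33],
    "numpy": [2, 28],
    "regex": [15],
}
now = 35.0
ranked = sorted(history, key=lambda tag: -base_level_activation(history[tag], now))
print(ranked)                    # most active (most likely) tag first
```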
DEFF Research Database (Denmark)
Bennike, Tue Bjerg; Carlsen, Thomas Gelsing; Ellingsen, Torkell
2017-01-01
patients (Morgan et al., 2012; Abraham and Medzhitov, 2011; Bennike, 2014) [8–10]. Therefore, we characterized the proteome of colon mucosa biopsies from 10 inflammatory bowel disease ulcerative colitis (UC) patients, 11 gastrointestinal healthy rheumatoid arthritis (RA) patients, and 10 controls. ... The datasets have been deposited to the ProteomeXchange Consortium via the PRIDE partner repository with the dataset identifiers PXD001608 for ulcerative colitis and control samples, and PXD003082 for rheumatoid arthritis samples.
NSGIC State | GIS Inventory — Public Access Points dataset current as of 2005. Location of public beach access along the Oregon Coast. Boat ramp locations were added to the dataset to allow users...
Minnesota Department of Natural Resources — This vector dataset is a detailed (1-acre minimum), hierarchically organized vegetation cover map produced by computer classification of combined two-season pairs of...
Kansas Data Access and Support Center — The Kansas Tagged Vector Contour (TVC) dataset consists of digitized contours from the 7.5 minute topographic quadrangle maps. Coverage for the state is incomplete....
Directory of Open Access Journals (Sweden)
Alev Dilek Aydin
2015-01-01
Full Text Available The ANN method has been applied by means of multilayered feedforward neural networks (MLFNs) by using different macroeconomic variables such as the exchange rate of USD/TRY, gold prices, and the Borsa Istanbul (BIST) 100 index based on monthly data over the period of January 2000 to September 2014 for Turkey. The vector autoregressive (VAR) method has also been applied with the same variables for the same period of time. In this study, different from other studies conducted up to the present, the ENCOG machine learning framework has been used along with the JAVA programming language in order to constitute the ANN. The training of the network has been done by the resilient propagation method. The ex post and ex ante estimates obtained by the ANN method have been compared with the results obtained by the econometric forecasting method of VAR. Strikingly, our findings based on the ANN method reveal that there is a possibility of financial distress or a financial crisis in Turkey starting from October 2017. The results which were obtained with the method of VAR also support the results of the ANN method. Additionally, our results indicate that the ANN approach has superior prediction performance to the VAR method.
Aydin, Alev Dilek; Caliskan Cavdar, Seyma
2015-01-01
The ANN method has been applied by means of multilayered feedforward neural networks (MLFNs) by using different macroeconomic variables such as the exchange rate of USD/TRY, gold prices, and the Borsa Istanbul (BIST) 100 index based on monthly data over the period of January 2000 to September 2014 for Turkey. The vector autoregressive (VAR) method has also been applied with the same variables for the same period of time. In this study, different from other studies conducted up to the present, the ENCOG machine learning framework has been used along with the JAVA programming language in order to constitute the ANN. The training of the network has been done by the resilient propagation method. The ex post and ex ante estimates obtained by the ANN method have been compared with the results obtained by the econometric forecasting method of VAR. Strikingly, our findings based on the ANN method reveal that there is a possibility of financial distress or a financial crisis in Turkey starting from October 2017. The results which were obtained with the method of VAR also support the results of the ANN method. Additionally, our results indicate that the ANN approach has superior prediction performance to the VAR method.
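As a rough illustration of the VAR baseline described in this abstract (not the authors' exact setup), the following sketch fits a vector autoregression to three synthetic monthly series standing in for USD/TRY, gold prices and the BIST 100 index, using statsmodels:

```python
# Minimal VAR sketch on placeholder data: 177 months roughly spans
# Jan 2000 to Sep 2014. Real series would replace the random walks.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
data = pd.DataFrame(rng.normal(size=(177, 3)).cumsum(axis=0),
                    columns=["usd_try", "gold", "bist100"])

results = VAR(data).fit(2)   # 2 lags here; ic="aic" could select the order
forecast = results.forecast(data.values[-results.k_ar:], steps=6)  # ex ante
print(forecast.shape)        # (6, 3): six months ahead for three series
```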
Halovic, Shaun; Kroos, Christian
2017-12-01
This data set describes the experimental data collected and reported in the research article "Walking my way? Walker gender and display format confounds the perception of specific emotions" (Halovic and Kroos, in press) [1]. The data set represents perceiver identification rates for different emotions (happiness, sadness, anger, fear and neutral), as displayed by full-light, point-light and synthetic point-light walkers. The perceiver identification scores have been transformed into Ht rates, which represent proportions/percentages of correct identifications above what would be expected by chance. This data set also provides Ht rates separately for male, female and ambiguously gendered walkers.
Zhong, Li; Li, Baozheng; Mah, Cathryn S.; Govindasamy, Lakshmanan; Agbandje-McKenna, Mavis; Cooper, Mario; Herzog, Roland W.; Zolotukhin, Irene; Warrington, Kenneth H.; Weigel-Van Aken, Kirsten A.; Hobbs, Jacqueline A.; Zolotukhin, Sergei; Muzyczka, Nicholas; Srivastava, Arun
2008-01-01
Recombinant adeno-associated virus 2 (AAV2) vectors are in use in several Phase I/II clinical trials, but relatively large vector doses are needed to achieve therapeutic benefits. Large vector doses also trigger an immune response as a significant fraction of the vectors fails to traffic efficiently to the nucleus and is targeted for degradation by the host cell proteasome machinery. We have reported that epidermal growth factor receptor protein tyrosine kinase (EGFR-PTK) signaling negatively...
Directory of Open Access Journals (Sweden)
Palanisamy Ramasamy
2017-11-01
Full Text Available A Unified Power Quality Conditioner (UPQC) is designed using a Neutral Point Clamped (NPC) multilevel inverter to improve the power quality. When designed for high/medium voltage and power applications, the voltage stress across the switches and the harmonic content in the output voltage are increased. A 3-phase 4-wire NPC inverter system is developed as a power quality conditioner using an effectual three-dimensional Space Vector Modulation (3D-SVM) technique. The proposed system behaves like a UPQC with shunt and series active filters under balanced and unbalanced loading conditions. In addition to the improvement of the power quality issues, it also balances the neutral point voltage and the voltage across the capacitors under unbalanced conditions. The hardware and simulation results of the proposed system are compared with 2D-SVM and 3D-SVM. The proposed system is simulated using MATLAB and the hardware is designed using an FPGA. From the results it is evident that the effectual 3D-SVM technique gives better performance compared to other control methods.
Zhong, Li; Li, Baozheng; Mah, Cathryn S.; Govindasamy, Lakshmanan; Agbandje-McKenna, Mavis; Cooper, Mario; Herzog, Roland W.; Zolotukhin, Irene; Warrington, Kenneth H.; Weigel-Van Aken, Kirsten A.; Hobbs, Jacqueline A.; Zolotukhin, Sergei; Muzyczka, Nicholas; Srivastava, Arun
2008-01-01
Recombinant adeno-associated virus 2 (AAV2) vectors are in use in several Phase I/II clinical trials, but relatively large vector doses are needed to achieve therapeutic benefits. Large vector doses also trigger an immune response as a significant fraction of the vectors fails to traffic efficiently to the nucleus and is targeted for degradation by the host cell proteasome machinery. We have reported that epidermal growth factor receptor protein tyrosine kinase (EGFR-PTK) signaling negatively affects transduction by AAV2 vectors by impairing nuclear transport of the vectors. We have also observed that EGFR-PTK can phosphorylate AAV2 capsids at tyrosine residues. Tyrosine-phosphorylated AAV2 vectors enter cells efficiently but fail to transduce effectively, in part because of ubiquitination of AAV capsids followed by proteasome-mediated degradation. We reasoned that mutations of the surface-exposed tyrosine residues might allow the vectors to evade phosphorylation and subsequent ubiquitination and, thus, prevent proteasome-mediated degradation. Here, we document that site-directed mutagenesis of surface-exposed tyrosine residues leads to production of vectors that transduce HeLa cells ≈10-fold more efficiently in vitro and murine hepatocytes nearly 30-fold more efficiently in vivo at a log lower vector dose. Therapeutic levels of human Factor IX (F.IX) are also produced at an ≈10-fold reduced vector dose. The increased transduction efficiency of tyrosine-mutant vectors is due to lack of capsid ubiquitination and improved intracellular trafficking to the nucleus. These studies have led to the development of AAV vectors that are capable of high-efficiency transduction at lower doses, which has important implications in their use in human gene therapy. PMID:18511559
An Annotated Dataset of 14 Meat Images
DEFF Research Database (Denmark)
Stegmann, Mikkel Bille
2002-01-01
This note describes a dataset consisting of 14 annotated images of meat. Points of correspondence are placed on each image. As such, the dataset can be readily used for building statistical models of shape. Further, format specifications and terms of use are given.
U.S. Environmental Protection Agency — EPA Nanorelease Dataset. This dataset is associated with the following publication: Wohlleben, W., C. Kingston, J. Carter, E. Sahle-Demessie, S. Vazquez-Campos, B....
Directory of Open Access Journals (Sweden)
Long Jiao
2015-05-01
Full Text Available The quantitative structure-property relationship (QSPR) for the boiling point (Tb) of polychlorinated dibenzo-p-dioxins and polychlorinated dibenzofurans (PCDD/Fs) was investigated. The molecular distance-edge vector (MDEV) index was used as the structural descriptor. The quantitative relationship between the MDEV index and Tb was modeled by using multivariate linear regression (MLR) and an artificial neural network (ANN), respectively. Leave-one-out cross validation and external validation were carried out to assess the prediction performance of the models developed. For the MLR method, the prediction root mean square relative error (RMSRE) of leave-one-out cross validation and external validation was 1.77 and 1.23, respectively. For the ANN method, the prediction RMSRE of leave-one-out cross validation and external validation was 1.65 and 1.16, respectively. A quantitative relationship between the MDEV index and Tb of PCDD/Fs was demonstrated. Both MLR and ANN are practicable for modeling this relationship. The MLR model and ANN model developed can be used to predict the Tb of PCDD/Fs. Thus, the Tb of each PCDD/F was predicted by the developed models.
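A minimal sketch of the validation procedure named above, leave-one-out cross validation scored by RMSRE, is shown below; the descriptor matrix and boiling points are random placeholders rather than the MDEV data.

```python
# Leave-one-out cross-validation of an MLR model, scored by the root
# mean square relative error (RMSRE) over the held-out predictions.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut

X = np.random.rand(30, 2)                                    # fake descriptors
y = 300 + 200 * X[:, 0] + 50 * X[:, 1] + np.random.randn(30)  # fake Tb values

rel_errors = []
for train_idx, test_idx in LeaveOneOut().split(X):
    model = LinearRegression().fit(X[train_idx], y[train_idx])
    pred = model.predict(X[test_idx])[0]
    rel_errors.append((pred - y[test_idx][0]) / y[test_idx][0])

rmsre = 100 * np.sqrt(np.mean(np.square(rel_errors)))  # in percent
print(f"LOO-CV RMSRE: {rmsre:.2f}%")
```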
Vector field statistical analysis of kinematic and force trajectories.
Pataky, Todd C; Robinson, Mark A; Vanrenterghem, Jos
2013-09-27
When investigating the dynamics of three-dimensional multi-body biomechanical systems it is often difficult to derive spatiotemporally directed predictions regarding experimentally induced effects. A paradigm of 'non-directed' hypothesis testing has emerged in the literature as a result. Non-directed analyses typically consist of ad hoc scalar extraction, an approach which substantially simplifies the original, highly multivariate datasets (many time points, many vector components). This paper describes a commensurately multivariate method as an alternative to scalar extraction. The method, called 'statistical parametric mapping' (SPM), uses random field theory to objectively identify field regions which co-vary significantly with the experimental design. We compared SPM to scalar extraction by re-analyzing three publicly available datasets: 3D knee kinematics, a ten-muscle force system, and 3D ground reaction forces. Scalar extraction was found to bias the analyses of all three datasets by failing to consider sufficient portions of the dataset, and/or by failing to consider covariance amongst vector components. SPM overcame both problems by conducting hypothesis testing at the (massively multivariate) vector trajectory level, with random field corrections simultaneously accounting for temporal correlation and vector covariance. While SPM has been widely demonstrated to be effective for analyzing 3D scalar fields, the current results are the first to demonstrate its effectiveness for 1D vector field analysis. It was concluded that SPM offers a generalized, statistically comprehensive solution to scalar extraction's over-simplification of vector trajectories, thereby making it useful for objectively guiding analyses of complex biomechanical systems. © 2013 Published by Elsevier Ltd. All rights reserved.
Rotations with Rodrigues' vector
International Nuclear Information System (INIS)
Pina, E
2011-01-01
The rotational dynamics was studied from the point of view of Rodrigues' vector. This vector is defined here by its connection with other forms of parametrization of the rotation matrix. The rotation matrix was expressed in terms of this vector. The angular velocity was computed using the components of Rodrigues' vector as coordinates. A fundamental matrix appears that is used to express the components of the angular velocity, the rotation matrix and the angular momentum vector. The Hamiltonian formalism of rotational dynamics in terms of this vector uses the same matrix. The quantization of the rotational dynamics is performed with simple rules if one uses Rodrigues' vector and similar formal expressions for the quantum operators that mimic the classical Hamiltonian dynamics.
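For reference, one common convention for Rodrigues' (Gibbs) vector and the rotation matrix it parametrizes (the paper's exact parametrization may differ) is:

```latex
% Rodrigues (Gibbs) vector b for a rotation by angle theta about the
% unit axis n; [b]_x denotes the skew-symmetric cross-product matrix.
\[
\mathbf{b} = \tan\!\left(\tfrac{\theta}{2}\right)\hat{\mathbf{n}},
\qquad
R = I + \frac{2}{1 + \mathbf{b}\cdot\mathbf{b}}
    \left( [\mathbf{b}]_\times + [\mathbf{b}]_\times^{2} \right).
\]
```

As a quick check, b = ẑ (θ = 90° about z) gives R = I + [b]ₓ + [b]ₓ², the standard quarter-turn about the z-axis.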
CERC Dataset (Full Hadza Data)
DEFF Research Database (Denmark)
2016-01-01
The dataset includes demographic, behavioral, and religiosity data from eight different populations from around the world. The samples were drawn from: (1) Coastal and (2) Inland Tanna, Vanuatu; (3) Hadzaland, Tanzania; (4) Lovu, Fiji; (5) Pointe aux Piment, Mauritius; (6) Pesqueiro, Brazil; (7) Kyzyl, Tyva Republic; and (8) Yasawa, Fiji. Related publication: Purzycki, et al. (2016). Moralistic Gods, Supernatural Punishment and the Expansion of Human Sociality. Nature, 530(7590): 327-330.
Support vector machine (SVM) was applied for land-cover characterization using MODIS time-series data. Classification performance was examined with respect to training sample size, sample variability, and landscape homogeneity (purity). The results were compared to two convention...
An Analysis on Better Testing than Training Performances on the Iris Dataset
Schutten, Marten; Wiering, Marco
2016-01-01
The Iris dataset is a well known dataset containing information on three different types of Iris flowers. A typical and popular method for solving classification problems on datasets such as the Iris set is the support vector machine (SVM). In order to do so the dataset is separated in a set used
Directory of Open Access Journals (Sweden)
Ana Rodríguez
2016-12-01
Full Text Available We have categorized the dataset on the content and emission of terpene volatiles of peel and juice in both Navelina and Pineapple sweet orange cultivars in which D-limonene was either up-regulated (S), down-regulated (AS) or non-altered (EV; control) (“Impact of D-limonene synthase up- or down-regulation on sweet orange fruit and juice odor perception” (A. Rodríguez, J.E. Peris, A. Redondo, T. Shimada, E. Costell, I. Carbonell, C. Rojas, L. Peña, 2016) [1]). Data from volatile identification and quantification by HS-SPME and GC–MS were classified by Principal Component Analysis (PCA) individually or as chemical groups. AS juice was characterized by the higher influence of the oxygen fraction, and S juice by the major influence of ethyl esters. S juices emitted less linalool compared to AS and EV juices.
International Nuclear Information System (INIS)
Yagi, T.; Tatsumi-Miyajima, J.; Sato, M.; Kraemer, K.H.; Takebe, H.
1991-01-01
To assess the contribution to mutagenesis by human DNA repair defects, a UV-treated shuttle vector plasmid, pZ189, was passed through fibroblasts derived from Japanese xeroderma pigmentosum (XP) patients in two different DNA repair complementation groups (A and F). Patients with XP have clinical and cellular UV hypersensitivity, increased frequency of skin cancer, and defects in DNA repair. The XP DNA repair defects represented by complementation groups A (XP-A) and F (XP-F) are more common in Japan than in Europe or the United States. In comparison to results with DNA repair-proficient human cells (W138-VA13), UV-treated pZ189 passed through the XP-A [XP2OS(SV)] or XP-F [XP2YO(SV)] cells showed fewer surviving plasmids (XP-A < XP-F) and a higher frequency of mutated plasmids (XP-A > XP-F). Base sequence analysis of more than 200 mutated plasmids showed the major type of base substitution mutation to be the G:C→A:T transition with all three cell lines. The XP-A and XP-F cells revealed a higher frequency of G:C→A:T transitions and a lower frequency of transversions among plasmids with single or tandem mutations, and a lower frequency of plasmids with multiple point mutations, compared to the normal line. The spectrum of mutations in pZ189 with the XP-A cells was similar to that with the XP-F cells. Seventy-six to 91% of the single base substitution mutations occurred at G:C base pairs in which the 5'-neighboring base of the cytosine was thymine or cytosine. These studies indicate that the DNA repair defects in Japanese XP patients in complementation groups A and F result in different frequencies of plasmid survival and mutagenesis but in similar types of mutagenic abnormalities, despite marked differences in clinical features.
Aaron Journal article datasets
U.S. Environmental Protection Agency — All figures used in the journal article are in netCDF format. This dataset is associated with the following publication: Sims, A., K. Alapaty , and S. Raman....
Integrated Surface Dataset (Global)
National Oceanic and Atmospheric Administration, Department of Commerce — The Integrated Surface Dataset (ISD) is composed of worldwide surface weather observations from over 35,000 stations, though the best spatial coverage is...
U.S. Environmental Protection Agency — The EPA Control Measure Dataset is a collection of documents describing air pollution control available to regulated facilities for the control and abatement of air...
National Hydrography Dataset (NHD)
Kansas Data Access and Support Center — The National Hydrography Dataset (NHD) is a feature-based database that interconnects and uniquely identifies the stream segments or reaches that comprise the...
National Oceanic and Atmospheric Administration, Department of Commerce — This dataset contains ecological information collected on the major adult spawning and juvenile habitats of market squid off California and the US Pacific Northwest....
U.S. Environmental Protection Agency — Soil and air concentrations of asbestos in Sumas study. This dataset is associated with the following publication: Wroble, J., T. Frederick, A. Frame, and D....
Newell, Homer E
2006-01-01
When employed with skill and understanding, vector analysis can be a practical and powerful tool. This text develops the algebra and calculus of vectors in a manner useful to physicists and engineers. Numerous exercises (with answers) not only provide practice in manipulation but also help establish students' physical and geometric intuition in regard to vectors and vector concepts.Part I, the basic portion of the text, consists of a thorough treatment of vector algebra and the vector calculus. Part II presents the illustrative matter, demonstrating applications to kinematics, mechanics, and e
Hoffmann, Banesh
1975-01-01
From his unusual beginning in ""Defining a vector"" to his final comments on ""What then is a vector?"" author Banesh Hoffmann has written a book that is provocative and unconventional. In his emphasis on the unresolved issue of defining a vector, Hoffmann mixes pure and applied mathematics without using calculus. The result is a treatment that can serve as a supplement and corrective to textbooks, as well as collateral reading in all courses that deal with vectors. Major topics include vectors and the parallelogram law; algebraic notation and basic ideas; vector algebra; scalars and scalar p
Kashefpur, Masoud; Kafieh, Rahele; Jorjandi, Sahar; Golmohammadi, Hadis; Khodabande, Zahra; Abbasi, Mohammadreza; Teifuri, Nilufar; Fakharzadeh, Ali Akbar; Kashefpoor, Maryam; Rabbani, Hossein
2017-01-01
An online depository was introduced to share clinical ground truth with the public and provide open access for researchers to evaluate their computer-aided algorithms. PHP was used for web programming and MySQL for database managing. The website was entitled "biosigdata.com." It was a fast, secure, and easy-to-use online database for medical signals and images. Freely registered users could download the datasets and could also share their own supplementary materials while maintaining their privacy (citation and fee). Commenting was also available for all datasets, and automatic sitemap and semi-automatic SEO indexing have been set up for the site. A comprehensive list of available websites for medical datasets is also presented as a Supplementary (http://journalonweb.com/tempaccess/4800.584.JMSS_55_16I3253.pdf).
CompMusic
2014-01-01
The audio examples were recorded from a professional Carnatic percussionist in semi-anechoic studio conditions by Akshay Anantapadmanabhan using SM-58 microphones and an H4n ZOOM recorder. The audio was sampled at 44.1 kHz and stored as 16 bit wav files. The dataset can be used for training models for each Mridangam stroke. A detailed description of the Mridangam and its strokes can be found in the paper below. A part of the dataset was used in the following paper. Akshay Anantapadman...
DEFF Research Database (Denmark)
Sturm, Bob L.
2013-01-01
The GTZAN dataset appears in at least 100 published works, and is the most-used public dataset for evaluation in machine listening research for music genre recognition (MGR). Our recent work, however, shows GTZAN has several faults (repetitions, mislabelings, and distortions), which challenge...... of GTZAN, and provide a catalog of its faults. We review how GTZAN has been used in MGR research, and find few indications that its faults have been known and considered. Finally, we rigorously study the effects of its faults on evaluating five different MGR systems. The lesson is not to banish GTZAN...
Dataset - Adviesregel PPL 2010
Evert, van F.K.; Schans, van der D.A.; Geel, van W.C.A.; Slabbekoorn, J.J.; Booij, R.; Jukema, J.N.; Meurs, E.J.J.; Uenk, D.
2011-01-01
This dataset contains experimental data from a number of field experiments with potato in The Netherlands (Van Evert et al., 2011). The data are presented as an SQL dump of a PostgreSQL database (version 8.4.4). An outline of the entity-relationship diagram of the database is given in an
Wolstenholme, E Œ
1978-01-01
Elementary Vectors, Third Edition serves as an introductory course in vector analysis and is intended to present the theoretical and application aspects of vectors. The book covers topics that rigorously explain and provide definitions, principles, equations, and methods in vector analysis. Applications of vector methods to simple kinematical and dynamical problems; central forces and orbits; and solutions to geometrical problems are discussed as well. This edition of the text also provides an appendix, intended for students, which the author hopes to bridge the gap between theory and appl
Directory of Open Access Journals (Sweden)
J. De Keyser
2007-05-01
Full Text Available This paper describes a general-purpose algorithm for computing the gradients in space and time of a scalar field, a vector field, or a divergence-free vector field, from in situ measurements by one or more spacecraft. The algorithm provides total error estimates on the computed gradient, including the effects of measurement errors, the errors due to a lack of spatio-temporal homogeneity, and errors due to small-scale fluctuations. It also has the ability to diagnose the conditioning of the problem. Optimal use is made of the data, in terms of exploiting the maximum amount of information relative to the uncertainty on the data, by solving the problem in a weighted least-squares sense. The method is illustrated using Cluster magnetic field and electron density data to compute various gradients during a traversal of the inner magnetosphere. In particular, Cluster is shown to cross azimuthal density structure, and the existence of field-aligned currents in the plasmasphere is demonstrated.
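The core of such a gradient computation can be sketched as a weighted least-squares fit of a linear field model to simultaneous multi-point measurements; the positions, values and uncertainties below are synthetic stand-ins, not Cluster data.

```python
# Estimate the spatial gradient of a scalar field from four-point
# measurements: model f(r) ~ f0 + g . (r - r_mean), solve for [f0, g]
# in a weighted least-squares sense (weights = 1 / measurement sigma).
import numpy as np

positions = np.array([[0.0, 0.0, 0.0],      # spacecraft positions (km)
                      [100.0, 0.0, 0.0],
                      [0.0, 100.0, 0.0],
                      [0.0, 0.0, 100.0]])
true_grad = np.array([2e-3, -1e-3, 5e-4])
values = positions @ true_grad + 10.0       # synthetic field samples
sigma = np.full(4, 0.01)                    # measurement uncertainties

A = np.hstack([np.ones((4, 1)), positions - positions.mean(axis=0)])
W = 1.0 / sigma
coeffs, *_ = np.linalg.lstsq(A * W[:, None], values * W, rcond=None)
print("estimated gradient:", coeffs[1:])    # ~ true_grad
```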
Vector grammars and PN machines
Institute of Scientific and Technical Information of China (English)
蒋昌俊
1996-01-01
The concept of vector grammars under the string semantic is introduced. The class of vector grammars is given, which is similar to the class of Chomsky grammars. The regular vector grammar is divided further. The strong and weak relation between the vector grammar and scalar grammar is discussed, so the spectrum system graph of scalar and vector grammars is made. The equivalent relation between the regular vector grammar and Petri nets (also called PN machines) is pointed out. The hybrid PN machine is introduced, and its language is proved equivalent to the language of the context-free vector grammar. So the perfect relation structure between vector grammars and PN machines is formed.
Directory of Open Access Journals (Sweden)
Tihana Jovanic
Full Text Available Somatic hypermutation (SHM) of immunoglobulin genes is currently viewed as a two-step process initiated by the deamination of deoxycytidine (C) to deoxyuridine (U), catalysed by the activation-induced deaminase (AID). Phase 1 mutations arise from DNA replication across the uracil residue or the abasic site, generated by the uracil-DNA glycosylase, yielding transitions or transversions at G:C pairs. Phase 2 mutations result from the recognition of the U:G mismatch by the Msh2/Msh6 complex (MutS Homologue), followed by the excision of the mismatched nucleotide and the repair, by the low-fidelity DNA polymerase eta, of the gap generated by the exonuclease I. These mutations are mainly focused at A:T pairs. Whereas in activated B cells both G:C and A:T pairs are equally targeted, ectopic expression of AID was shown to trigger only G:C mutations on a stably integrated reporter gene. Here we show that when using non-replicative episomal vectors containing a GFP gene, inactivated by the introduction of stop codons at various positions, a high level of EGFP-positive cells was obtained after transient expression in Jurkat cells constitutively expressing AID. We show that mutations at G:C and A:T pairs are produced. EGFP-positive cells are obtained in the absence of vector replication, demonstrating that the mutations are dependent only on the mismatch repair (MMR) pathway. This implies that the generation of phase 1 mutations is not a prerequisite for the expression of phase 2 mutations.
Brand, Louis
2006-01-01
The use of vectors not only simplifies treatments of differential geometry, mechanics, hydrodynamics, and electrodynamics, but also makes mathematical and physical concepts more tangible and easy to grasp. This text for undergraduates was designed as a short introductory course to give students the tools of vector algebra and calculus, as well as a brief glimpse into these subjects' manifold applications. The applications are developed to the extent that the uses of the potential function, both scalar and vector, are fully illustrated. Moreover, the basic postulates of vector analysis are brou
Network Intrusion Dataset Assessment
2013-03-01
International Conference on Computational Intelligence and Natural Computing, volume 2, pages 413–416, June 2009. • Rung Ching Chen, Kai-Fan Cheng, and Chia-Fen Hsieh. “Using rough set and support vector machine for network intrusion detection.” International Journal of Network Security & Its... intrusion detection using FP tree rules.” Journal of Advanced Networking and Applications, 1(1):30–39, 2009. • Ming-Yang Su, Gwo-Jong Yu, and Chun-Yuen
DEFF Research Database (Denmark)
2012-01-01
The present invention relates to a compact, reliable and low-cost vector velocimeter for example for determining velocities of particles suspended in a gas or fluid flow, or for determining velocity, displacement, rotation, or vibration of a solid surface, the vector velocimeter comprising a laser...
Allegheny County Cell Tower Points
Allegheny County / City of Pittsburgh / Western PA Regional Data Center — This dataset portrays cell tower locations as points in Allegheny County. The dataset is based on outbuilding codes in the Property Assessment Parcel Database used...
Comparison of recent SnIa datasets
International Nuclear Information System (INIS)
Sanchez, J.C. Bueno; Perivolaropoulos, L.; Nesseris, S.
2009-01-01
We rank the six latest Type Ia supernova (SnIa) datasets (Constitution (C), Union (U), ESSENCE (Davis) (E), Gold06 (G), SNLS 1yr (S) and SDSS-II (D)) in the context of the Chevalier-Polarski-Linder (CPL) parametrization w(a) = w_0 + w_1(1 − a), according to their Figure of Merit (FoM), their consistency with the cosmological constant (ΛCDM), their consistency with standard rulers (Cosmic Microwave Background (CMB) and Baryon Acoustic Oscillations (BAO)) and their mutual consistency. We find a significant improvement of the FoM (defined as the inverse area of the 95.4% parameter contour) with the number of SnIa of these datasets ((C) highest FoM, (U), (G), (D), (E), (S) lowest FoM). Standard rulers (CMB+BAO) have a better FoM by about a factor of 3, compared to the highest FoM SnIa dataset (C). We also find that the ranking sequence based on consistency with ΛCDM is identical with the corresponding ranking based on consistency with standard rulers ((S) most consistent, (D), (C), (E), (U), (G) least consistent). The ranking sequence of the datasets however changes when we consider the consistency with an expansion history corresponding to evolving dark energy (w_0, w_1) = (−1.4, 2) crossing the phantom divide line w = −1 (it is practically reversed to (G), (U), (E), (S), (D), (C)). The SALT2 and MLCS2k2 fitters are also compared and some peculiar features of the SDSS-II dataset when standardized with the MLCS2k2 fitter are pointed out. Finally, we construct a statistic to estimate the internal consistency of a collection of SnIa datasets. We find that even though there is good consistency among most samples taken from the above datasets, this consistency decreases significantly when the Gold06 (G) dataset is included in the sample.
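As an illustration of the FoM definition used here (inverse area of the 95.4% contour), the following sketch computes it from a hypothetical 2×2 covariance matrix for (w_0, w_1), assuming Gaussian parameter errors:

```python
# For Gaussian errors in 2D, the Delta-chi^2 = 6.17 ellipse encloses
# 95.4% of the probability; its area is pi * Delta-chi^2 * sqrt(det C).
import numpy as np

cov = np.array([[0.04, -0.09],
                [-0.09, 0.36]])   # hypothetical cov(w0, w1), placeholder values

delta_chi2 = 6.17                 # 95.4% level for 2 degrees of freedom
area = np.pi * delta_chi2 * np.sqrt(np.linalg.det(cov))
fom = 1.0 / area
print(f"contour area = {area:.3f}, FoM = {fom:.2f}")
```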
Guilfoyle, Richard A.; Smith, Lloyd M.
1994-01-01
A vector comprising a filamentous phage sequence containing a first copy of filamentous phage gene X and other sequences necessary for the phage to propagate is disclosed. The vector also contains a second copy of filamentous phage gene X downstream from a promoter capable of promoting transcription in a bacterial host. In a preferred form of the present invention, the filamentous phage is M13 and the vector additionally includes a restriction endonuclease site located in such a manner as to substantially inactivate the second gene X when a DNA sequence is inserted into the restriction site.
Guilfoyle, R.A.; Smith, L.M.
1994-12-27
A vector comprising a filamentous phage sequence containing a first copy of filamentous phage gene X and other sequences necessary for the phage to propagate is disclosed. The vector also contains a second copy of filamentous phage gene X downstream from a promoter capable of promoting transcription in a bacterial host. In a preferred form of the present invention, the filamentous phage is M13 and the vector additionally includes a restriction endonuclease site located in such a manner as to substantially inactivate the second gene X when a DNA sequence is inserted into the restriction site. 2 figures.
Levine, Robert
2004-01-01
The cross-product is a mathematical operation that is performed between two 3-dimensional vectors. The result is a vector that is orthogonal or perpendicular to both of them. Students learning about this for the first time in Calculus III are taught that if A×B = A×C, it does not necessarily follow that B = C. This seemed baffling. The…
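A quick numerical counterexample makes the point: adding any multiple of A to B leaves A×B unchanged, since A×(kA) = 0.

```python
# A x B = A x C with B != C: B and C differ by a vector parallel to A.
import numpy as np

A = np.array([1.0, 2.0, 3.0])
B = np.array([4.0, 5.0, 6.0])
C = B + 2.5 * A                    # differs from B along A

print(np.cross(A, B))              # [-3.  6. -3.]
print(np.cross(A, C))              # identical: the A-parallel part drops out
print(np.allclose(np.cross(A, B), np.cross(A, C)), np.allclose(B, C))
```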
Energy Technology Data Exchange (ETDEWEB)
Rejon-Barrera, Fernando [Institute for Theoretical Physics, University of Amsterdam, Science Park 904, Postbus 94485, 1090 GL, Amsterdam (Netherlands); Robbins, Daniel [Department of Physics, Texas A&M University, TAMU 4242, College Station, TX 77843 (United States)
2016-01-22
We work out all of the details required for implementation of the conformal bootstrap program applied to the four-point function of two scalars and two vectors in an abstract conformal field theory in arbitrary dimension. This includes a review of which tensor structures make appearances, a construction of the projectors onto the required mixed symmetry representations, and a computation of the conformal blocks for all possible operators which can be exchanged. These blocks are presented as differential operators acting upon the previously known scalar conformal blocks. Finally, we set up the bootstrap equations which implement crossing symmetry. Special attention is given to the case of conserved vectors, where several simplifications occur.
2002-01-01
The National Elevation Dataset (NED) is a new raster product assembled by the U.S. Geological Survey. NED is designed to provide National elevation data in a seamless form with a consistent datum, elevation unit, and projection. Data corrections were made in the NED assembly process to minimize artifacts, perform edge matching, and fill sliver areas of missing data. NED has a resolution of one arc-second (approximately 30 meters) for the conterminous United States, Hawaii, Puerto Rico and the island territories and a resolution of two arc-seconds for Alaska. NED data sources have a variety of elevation units, horizontal datums, and map projections. In the NED assembly process the elevation values are converted to decimal meters as a consistent unit of measure, NAD83 is consistently used as horizontal datum, and all the data are recast in a geographic projection. Older DEM's produced by methods that are now obsolete have been filtered during the NED assembly process to minimize artifacts that are commonly found in data produced by these methods. Artifact removal greatly improves the quality of the slope, shaded-relief, and synthetic drainage information that can be derived from the elevation data. Figure 2 illustrates the results of this artifact removal filtering. NED processing also includes steps to adjust values where adjacent DEM's do not match well, and to fill sliver areas of missing data between DEM's. These processing steps ensure that NED has no void areas and artificial discontinuities have been minimized. The artifact removal filtering process does not eliminate all of the artifacts. In areas where the only available DEM is produced by older methods, then "striping" may still occur.
An Annotated Dataset of 14 Cardiac MR Images
DEFF Research Database (Denmark)
Stegmann, Mikkel Bille
2002-01-01
This note describes a dataset consisting of 14 annotated cardiac MR images. Points of correspondence are placed on each image at the left ventricle (LV). As such, the dataset can be readily used for building statistical models of shape. Further, format specifications and terms of use are given....
Hyperbolic-symmetry vector fields.
Gao, Xu-Zhen; Pan, Yue; Cai, Meng-Qiang; Li, Yongnan; Tu, Chenghou; Wang, Hui-Tian
2015-12-14
We present and construct a new kind of orthogonal coordinate system, the hyperbolic coordinate system. We present and design a new kind of local linearly polarized vector field, defined as the hyperbolic-symmetry vector field because the points with the same polarization form a series of hyperbolae. We experimentally demonstrate the generation of such hyperbolic-symmetry vector optical fields. In particular, we also study the modified hyperbolic-symmetry vector optical fields with the twofold and fourfold symmetric states of polarization when introducing mirror symmetry. The tight focusing behaviors of these vector fields are also investigated. In addition, we also fabricate micro-structures on K9 glass surfaces with several tightly focused (modified) hyperbolic-symmetry vector field patterns, which demonstrates that the simulated tightly focused fields are in good agreement with the fabricated micro-structures.
U.S. Environmental Protection Agency — Dataset presents concentrations of organic pollutants, such as polyaromatic hydrocarbon compounds, in water samples. Water samples of known volume and concentration...
Allegheny County Address Points
Allegheny County / City of Pittsburgh / Western PA Regional Data Center — This dataset contains address points which represent physical address locations assigned by the Allegheny County addressing authority. Data is updated by County...
Robinson, Gilbert de B
2011-01-01
This brief undergraduate-level text by a prominent Cambridge-educated mathematician explores the relationship between algebra and geometry. An elementary course in plane geometry is the sole requirement for Gilbert de B. Robinson's text, which is the result of several years of teaching and learning the most effective methods from discussions with students. Topics include lines and planes, determinants and linear equations, matrices, groups and linear transformations, and vectors and vector spaces. Additional subjects range from conics and quadrics to homogeneous coordinates and projective geom
Thomas, E. G. F.
2012-01-01
This paper deals with the theory of integration of scalar functions with respect to a measure with values in a, not necessarily locally convex, topological vector space. It focuses on the extension of such integrals from bounded measurable functions to the class of integrable functions, proving
Editorial: Datasets for Learning Analytics
Dietze, Stefan; George, Siemens; Davide, Taibi; Drachsler, Hendrik
2018-01-01
The European LinkedUp and LACE (Learning Analytics Community Exchange) projects have been responsible for setting up a series of data challenges at the LAK conferences 2013 and 2014 around the LAK dataset. The LAK dataset consists of a rich collection of full-text publications in the domain of
Transversals of Complex Polynomial Vector Fields
DEFF Research Database (Denmark)
Dias, Kealey
Vector fields in the complex plane are defined by assigning the vector determined by the value P(z) to each point z in the complex plane, where P is a polynomial of one complex variable. We consider special families of so-called rotated vector fields that are determined by a polynomial multiplied by rotational constants. Transversals are a certain class of curves for such a family of vector fields that represent the bifurcation states for this family of vector fields. More specifically, transversals are curves that coincide with a homoclinic separatrix for some rotation of the vector field. Given a concrete polynomial, it seems to take quite a bit of work to prove that it is generic, i.e. structurally stable. This has been done for a special class of degree d polynomial vector fields having simple equilibrium points at the d roots of unity, d odd. In proving that such vector fields are generic
Open University Learning Analytics dataset.
Kuzilek, Jakub; Hlosta, Martin; Zdrahal, Zdenek
2017-11-28
Learning Analytics focuses on the collection and analysis of learners' data to improve their learning experience by providing informed guidance and to optimise learning materials. To support the research in this area we have developed a dataset, containing data from courses presented at the Open University (OU). What makes the dataset unique is the fact that it contains demographic data together with aggregated clickstream data of students' interactions in the Virtual Learning Environment (VLE). This enables the analysis of student behaviour, represented by their actions. The dataset contains the information about 22 courses, 32,593 students, their assessment results, and logs of their interactions with the VLE represented by daily summaries of student clicks (10,655,280 entries). The dataset is freely available at https://analyse.kmi.open.ac.uk/open_dataset under a CC-BY 4.0 license.
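A hedged sketch of how such a dataset might be explored is given below; the file and column names follow the OULAD description above but should be verified against the actual download.

```python
# Join per-student demographics with total VLE clicks, assuming the
# OULAD-style CSV layout (studentInfo.csv, studentVle.csv with columns
# code_module, code_presentation, id_student, sum_click, final_result).
import pandas as pd

students = pd.read_csv("studentInfo.csv")   # demographics, one row per student
clicks = pd.read_csv("studentVle.csv")      # daily VLE interaction summaries

totals = (clicks.groupby(["code_module", "code_presentation", "id_student"])
                ["sum_click"].sum().reset_index(name="total_clicks"))
merged = students.merge(totals,
                        on=["code_module", "code_presentation", "id_student"],
                        how="left")
print(merged[["id_student", "final_result", "total_clicks"]].head())
```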
Local Patch Vectors Encoded by Fisher Vectors for Image Classification
Directory of Open Access Journals (Sweden)
Shuangshuang Chen
2018-02-01
Full Text Available The objective of this work is image classification, whose purpose is to group images into corresponding semantic categories. Four contributions are made as follows: (i) for computational simplicity and efficiency, we directly adopt raw image patch vectors as local descriptors, encoded by Fisher vectors (FV) subsequently; (ii) for obtaining representative local features within the FV encoding framework, we compare and analyze three typical sampling strategies: random sampling, saliency-based sampling and dense sampling; (iii) in order to embed both global and local spatial information into local features, we construct an improved spatial geometry structure which shows good performance; (iv) for reducing the storage and CPU costs of high dimensional vectors, we adopt a new feature selection method based on supervised mutual information (MI), which chooses features by an importance sorting algorithm. We report experimental results on the dataset STL-10. It shows very promising performance with this simple and efficient framework compared to conventional methods.
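The feature-selection step described in (iv) can be sketched generically as follows; the features and labels are random placeholders rather than FV-encoded patch vectors, and scikit-learn's estimator stands in for the authors' exact MI implementation.

```python
# Rank feature dimensions by mutual information with the class labels
# and keep the top k (an importance-sorting selection).
import numpy as np
from sklearn.feature_selection import mutual_info_classif

X = np.random.rand(200, 512)        # stand-in for Fisher-vector features
y = np.random.randint(0, 10, 200)   # stand-in class labels

mi = mutual_info_classif(X, y, random_state=0)
k = 128
top_k = np.argsort(mi)[::-1][:k]    # indices of the k most informative dims
X_reduced = X[:, top_k]
print(X_reduced.shape)              # (200, 128)
```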
Selecting the Optimal Combination Model of FSSVM for the Imbalance Datasets
Directory of Open Access Journals (Sweden)
Chuandong Qin
2014-01-01
Full Text Available Imbalanced data learning is one of the most active and important fields in machine learning research. Although the existing class imbalance learning methods can make Support Vector Machines (SVMs) less sensitive to class imbalance, they still suffer from the disturbance of outliers and noise present in the datasets. A kind of Fuzzy Smooth Support Vector Machine (FSSVM) is proposed based on the Smooth Support Vector Machine (SSVM) of O. L. Mangasarian. The SSVM can be computed easily by the Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm or the Newton-Armijo algorithm. Two kinds of fuzzy memberships and three smooth functions can be chosen in the algorithms. The fuzzy memberships consider the contribution rate of each sample to the optimal separating hyperplane. The polynomial smooth functions can make the optimization problem more accurate at the inflection point. Those changes play an active role in the trials. The results of the experiments show that the FSSVMs can achieve better accuracy and shorter running time than the SSVMs and some of the other methods.
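The smoothing idea behind SSVM/FSSVM can be illustrated with the commonly used smooth approximation of the plus function x₊ = max(x, 0); the smoothing parameter below is illustrative, and this is a sketch of the smoothing step only, not the authors' full optimization code.

```python
# Smooth plus function p(x, a) = x + log(1 + exp(-a*x)) / a: it is
# differentiable everywhere and converges to max(x, 0) as a grows,
# which is what lets gradient-based solvers such as BFGS be applied.
import numpy as np

def smooth_plus(x, a=5.0):
    # numerically stable via logaddexp: log(1 + exp(-a*x)) = logaddexp(0, -a*x)
    return x + np.logaddexp(0.0, -a * x) / a

x = np.array([-1.0, -0.1, 0.0, 0.1, 1.0])
print(np.maximum(x, 0.0))        # the non-smooth plus function
print(smooth_plus(x, a=5.0))     # smooth approximation
print(smooth_plus(x, a=50.0))    # tighter approximation for larger a
```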
Interactive visualization and analysis of multimodal datasets for surgical applications.
Kirmizibayrak, Can; Yim, Yeny; Wakid, Mike; Hahn, James
2012-12-01
Surgeons use information from multiple sources when making surgical decisions. These include volumetric datasets (such as CT, PET, MRI, and their variants), 2D datasets (such as endoscopic videos), and vector-valued datasets (such as computer simulations). Presenting all the information to the user in an effective manner is a challenging problem. In this paper, we present a visualization approach that displays the information from various sources in a single coherent view. The system allows the user to explore and manipulate volumetric datasets, display analysis of dataset values in local regions, combine 2D and 3D imaging modalities and display results of vector-based computer simulations. Several interaction methods are discussed: in addition to traditional interfaces including mouse and trackers, gesture-based natural interaction methods are shown to control these visualizations with real-time performance. An example of a medical application (medialization laryngoplasty) is presented to demonstrate how the combination of different modalities can be used in a surgical setting with our approach.
An introduction to vectors, vector operators and vector analysis
Joag, Pramod S
2016-01-01
Ideal for undergraduate and graduate students of science and engineering, this book covers fundamental concepts of vectors and their applications in a single volume. The first unit deals with basic formulation, both conceptual and theoretical. It discusses applications of algebraic operations, Levi-Civita notation, and curvilinear coordinate systems like spherical polar and parabolic systems and structures, and analytical geometry of curves and surfaces. The second unit delves into the algebra of operators and their types and also explains the equivalence between the algebra of vector operators and the algebra of matrices. The formulation of eigenvectors and eigenvalues of a linear vector operator is elaborated using vector algebra. The third unit deals with vector analysis, discussing vector-valued functions of a scalar variable and functions of vector argument (both scalar valued and vector valued), thus covering both the scalar vector fields and vector integration.
Turkey Run Landfill Emissions Dataset
U.S. Environmental Protection Agency — landfill emissions measurements for the Turkey run landfill in Georgia. This dataset is associated with the following publication: De la Cruz, F., R. Green, G....
U.S. Environmental Protection Agency — Emissions data from open air oil burns. This dataset is associated with the following publication: Gullett, B., J. Aurell, A. Holder, B. Mitchell, D. Greenwell, M....
Chemical product and function dataset
U.S. Environmental Protection Agency — Merged product weight fraction and chemical function data. This dataset is associated with the following publication: Isaacs , K., M. Goldsmith, P. Egeghy , K....
The NOAA Dataset Identifier Project
de la Beaujardiere, J.; Mccullough, H.; Casey, K. S.
2013-12-01
The US National Oceanic and Atmospheric Administration (NOAA) initiated a project in 2013 to assign persistent identifiers to datasets archived at NOAA and to create informational landing pages about those datasets. The goals of this project are to enable the citation of datasets used in products and results in order to help provide credit to data producers, to support traceability and reproducibility, and to enable tracking of data usage and impact. A secondary goal is to encourage the submission of datasets for long-term preservation, because only archived datasets will be eligible for a NOAA-issued identifier. A team was formed with representatives from the National Geophysical, Oceanographic, and Climatic Data Centers (NGDC, NODC, NCDC) to resolve questions including which identifier scheme to use (answer: Digital Object Identifier - DOI), whether or not to embed semantics in identifiers (no), the level of granularity at which to assign identifiers (as coarsely as reasonable), how to handle ongoing time-series data (do not break into chunks), creation mechanism for the landing page (stylesheet from formal metadata record preferred), and others. Decisions made and implementation experience gained will inform the writing of a Data Citation Procedural Directive to be issued by the Environmental Data Management Committee in 2014. Several identifiers have been issued as of July 2013, with more on the way. NOAA is now reporting the number as a metric to federal Open Government initiatives. This paper will provide further details and status of the project.
Herrero-Bervera, E.; Jicha, B.
2017-12-01
New paleomagnetic measurements, coupled with 40Ar/39Ar dating, are revolutionizing our understanding of the geodynamo by providing terrestrial lava records of the short-term behavior of the paleofield. As part of an investigation of the Koolau volcano, Oahu, and the short-term behavior of the geomagnetic field, we have sampled the exposed flows of a long volcanic section (191 m) located on the volcano's southwest collapsed flank at a locality known as Makapuu Point. Paleomagnetic and K-Ar investigations of the Koolau Volcanic Series have revealed excursional directions for lavas ranging from 2-3 Ma. The easy access and close geographical proximity to the K-Ar dated lava flows made this newly studied 191-m thick sequence of flows an excellent candidate for detailed paleomagnetic analysis. At least 10 samples, collected from each of the successive sites, were stepwise demagnetized by both a.f. (5-100 mT) and thermal (28 to 700 °C) methods. Mean directions were obtained by principal component analysis. All samples yielded a strong and stable ChRM, with vector demagnetization diagrams based on 7 or more demagnetization steps, and with thermal and a.f. results differing insignificantly. k-T analysis conducted on individual lava flows indicated 50% with reversible curves. Curie points from these analyses revealed temperatures close to 150-250 °C, 575 °C and 620 °C, indicative of Ti-poor and Ti-rich magnetite as well as titanomaghemite, ranging from single-domain to pseudo-single-domain grain sizes. The mean directions from the base of the section up to ˜14 m are excursional (10 flows). We have also conducted absolute paleointensity (PI) determinations of the excursional flows using the Thellier-Coe protocol, yielding PI values as low as 19 μT and up to 88 μT within the excursional zone of the record. 40Ar/39Ar incremental heating experiments on the groundmass from at least one flow site at 9 m above sea level yield a plateau with an age of 2.60 ± 0.13 Ma.
The Harvard organic photovoltaic dataset.
Lopez, Steven A; Pyzer-Knapp, Edward O; Simm, Gregor N; Lutzow, Trevor; Li, Kewei; Seress, Laszlo R; Hachmann, Johannes; Aspuru-Guzik, Alán
2016-09-27
The Harvard Organic Photovoltaic Dataset (HOPV15) presented in this work is a collation of experimental photovoltaic data from the literature, and corresponding quantum-chemical calculations performed over a range of conformers, each with quantum chemical results using a variety of density functionals and basis sets. It is anticipated that this dataset will be of use in both relating electronic structure calculations to experimental observations through the generation of calibration schemes, as well as for the creation of new semi-empirical methods and the benchmarking of current and future model chemistries for organic electronic applications.
The Harvard organic photovoltaic dataset
Lopez, Steven A.; Pyzer-Knapp, Edward O.; Simm, Gregor N.; Lutzow, Trevor; Li, Kewei; Seress, Laszlo R.; Hachmann, Johannes; Aspuru-Guzik, Alán
2016-01-01
The Harvard Organic Photovoltaic Dataset (HOPV15) presented in this work is a collation of experimental photovoltaic data from the literature, and corresponding quantum-chemical calculations performed over a range of conformers, each with quantum chemical results using a variety of density functionals and basis sets. It is anticipated that this dataset will be of use in both relating electronic structure calculations to experimental observations through the generation of calibration schemes, as well as for the creation of new semi-empirical methods and the benchmarking of current and future model chemistries for organic electronic applications. PMID:27676312
Violation of vector dominance in the vector manifestation
International Nuclear Information System (INIS)
Sasaki, Chihiro
2003-01-01
The vector manifestation (VM) is a new pattern for realizing the chiral symmetry in QCD. In the VM, the massless vector meson becomes the chiral partner of the pion at the critical point, in contrast with the restoration based on the linear sigma model. Including the intrinsic temperature dependences of the parameters of the hidden local symmetry (HLS) Lagrangian determined from the underlying QCD through the Wilsonian matching, together with the hadronic thermal corrections, we present a new prediction of the VM on the direct photon-π-π coupling, which measures the validity of the vector dominance (VD) of the electromagnetic form factor of the pion. We find that the VD is largely violated at the critical temperature, which indicates that the assumption of the VD made in several analyses of the dilepton spectra in hot matter may need to be weakened for consistently including the effect of the dropping mass of the vector meson. (author)
Generation of arbitrary vector beams
Perez-Garcia, Benjamin; López-Mariscal, Carlos; Hernandez-Aranda, Raul I.; Gutiérrez-Vega, Julio C.
2017-08-01
Optical vector beams arise from point-to-point spatial variations of the electric component of an electromagnetic field over the transverse plane. In this work, we present a novel experimental technique to generate arbitrary vector beams, and provide sufficient evidence to validate their state of polarization. This technique takes advantage of the capability of a Spatial Light Modulator to simultaneously generate two components of an electromagnetic field by halving the screen of the device and subsequently recombining them in a Sagnac interferometer. Our experimental results show the versatility and robustness of this technique for the generation of vector beams.
Querying Large Biological Network Datasets
Gulsoy, Gunhan
2013-01-01
New experimental methods have resulted in an increasing amount of genetic interaction data being generated every day. Biological networks are used to store the genetic interaction data gathered. The increasing amount of data available requires fast, large-scale analysis methods. Therefore, we address the problem of querying large biological network datasets.…
Fluxnet Synthesis Dataset Collaboration Infrastructure
Energy Technology Data Exchange (ETDEWEB)
Agarwal, Deborah A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Humphrey, Marty [Univ. of Virginia, Charlottesville, VA (United States); van Ingen, Catharine [Microsoft, San Francisco, CA (United States); Beekwilder, Norm [Univ. of Virginia, Charlottesville, VA (United States); Goode, Monte [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Jackson, Keith [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Rodriguez, Matt [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Weber, Robin [Univ. of California, Berkeley, CA (United States)
2008-02-06
The Fluxnet synthesis dataset originally compiled for the La Thuile workshop contained approximately 600 site years. Since the workshop, several additional site years have been added and the dataset now contains over 920 site years from over 240 sites. A data refresh update is expected to increase those numbers in the next few months. The ancillary data describing the sites continue to evolve as well. There are on the order of 120 site contacts, and 60 proposals have been approved to use the data. These proposals involve around 120 researchers. The size and complexity of the dataset and collaboration have led to a new approach to providing access to the data and collaboration support. The support team attended the workshop and worked closely with the attendees and the Fluxnet project office to define the requirements for the support infrastructure. As a result of this effort, a new website (http://www.fluxdata.org) has been created to provide access to the Fluxnet synthesis dataset. This new web site is based on a scientific data server which enables browsing of the data on-line, data download, and version tracking. We leverage database and data analysis tools such as OLAP data cubes and web reports to enable browser and Excel pivot table access to the data.
A New Dataset Size Reduction Approach for PCA-Based Classification in OCR Application
Directory of Open Access Journals (Sweden)
Mohammad Amin Shayegan
2014-01-01
Full Text Available A major problem of pattern recognition systems is due to the large volume of training datasets, which include duplicate and similar training samples. In order to overcome this problem, some dataset size reduction and also dimensionality reduction techniques have been introduced. The algorithms presently used for dataset size reduction usually remove samples near the centers of classes or support vector samples between different classes. However, the samples near a class center include valuable information about the class characteristics, and the support vector samples are important for evaluating system efficiency. This paper reports on the use of the Modified Frequency Diagram technique for dataset size reduction. In this new proposed technique, a training dataset is rearranged and then sieved. The sieved training dataset, along with automatic feature extraction/selection using Principal Component Analysis, is used in an OCR application. The experimental results obtained when using the proposed system on one of the biggest handwritten Farsi/Arabic numeral standard OCR datasets, Hoda, show about 97% recognition accuracy. The recognition speed increased by 2.28 times, while the accuracy decreased only by 0.7%, when a sieved version of the dataset, only half the size of the initial training dataset, was used.
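For orientation, here is a generic sketch of the PCA feature-extraction stage of such a pipeline (the Modified Frequency Diagram sieving itself is specific to the paper and is not reproduced); scikit-learn's digits set stands in for the Hoda numerals.

```python
# PCA-based dimensionality reduction before classification: keep enough
# components to explain 95% of the training-set variance.
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

pca = PCA(n_components=0.95).fit(X_tr)   # automatic component selection
clf = KNeighborsClassifier().fit(pca.transform(X_tr), y_tr)
print("accuracy:", clf.score(pca.transform(X_te), y_te))
```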
Measurement of Charmless B to Vector-Vector decays at BaBar
International Nuclear Information System (INIS)
Olaiya, Emmanuel
2011-01-01
The authors present results of B → vector-vector (VV) and B → vector-axial-vector (VA) decays: B0 → φX (X = φ, ρ+, or ρ0), B+ → φK(*)+, B0 → K*K*, B0 → ρ+b1−, and B+ → K*0a1+. The largest dataset used for these results is based on 465 × 10^6 Υ(4S) → BB̄ decays, collected with the BABAR detector at the PEP-II B-meson factory located at the Stanford Linear Accelerator Center (SLAC). Using larger datasets, the BABAR experiment has provided more precise B → VV measurements, further supporting the smaller-than-expected longitudinal polarization fraction of B → φK*. Additional B-meson decays to vector-vector and vector-axial-vector final states have also been studied with a view to shedding light on the polarization anomaly. Taking into account the available errors, we find no disagreement between theory and experiment for these additional decays.
Feature Vector Construction Method for IRIS Recognition
Odinokikh, G.; Fartukov, A.; Korobkin, M.; Yoo, J.
2017-05-01
One of the basic stages of an iris recognition pipeline is the iris feature vector construction procedure. The procedure represents the extraction of iris texture information relevant to its subsequent comparison. A thorough investigation of feature vectors obtained from the iris showed that not all the vector elements are equally relevant. There are two characteristics which determine the utility of a vector element: fragility and discriminability. Conventional iris feature extraction methods treat fragility as feature vector instability without regard to the nature of that instability. This work separates the sources of instability into natural and encoding-induced, which makes it possible to investigate each source independently. Based on this separation, a novel approach to iris feature vector construction is proposed. The approach consists of two steps: iris feature extraction using Gabor filtering with optimal parameters, and quantization with separately pre-optimized fragility thresholds. The proposed method has been tested on two different datasets of iris images captured under changing environmental conditions. The testing results show that the proposed method surpasses all the methods considered as prior art in recognition accuracy on both datasets.
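A rough sketch of the two-step construction is given below, assuming scikit-image's Gabor filter and placeholder parameter values; the paper's optimized parameters and separately tuned fragility thresholds are not reproduced.

```python
# Hedged sketch: Gabor filtering of a normalized iris image followed by sign
# quantization, with a fragility mask that drops bits whose underlying filter
# response is too close to zero. Parameter values are placeholders.
import numpy as np
from skimage.filters import gabor

def iris_code(norm_iris, frequency=0.15, fragility_threshold=0.05):
    real, imag = gabor(norm_iris, frequency=frequency)
    bits = np.stack([real > 0, imag > 0]).astype(np.uint8)
    mask = np.stack([np.abs(real), np.abs(imag)]) > fragility_threshold
    return bits, mask  # compare codes with, e.g., masked Hamming distance
```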
Vector manifestation and violation of vector dominance in hot matter
International Nuclear Information System (INIS)
Harada, Masayasu; Sasaki, Chihiro
2004-01-01
We show the details of the calculation of the hadronic thermal corrections to the two-point functions in the effective field theory of QCD for pions and vector mesons based on the hidden local symmetry (HLS) in hot matter using the background field gauge. We study the temperature dependence of the pion velocity in the low-temperature region determined from the hadronic thermal corrections, and show that, due to the presence of the dynamical vector meson, the pion velocity is smaller than the speed of light already at one-loop level, in contrast to the result obtained in ordinary chiral perturbation theory including only the pion at one loop. Including the intrinsic temperature dependences of the parameters of the HLS Lagrangian determined from the underlying QCD through the Wilsonian matching, we show how the vector manifestation (VM), in which the massless vector meson becomes the chiral partner of the pion, is realized at the critical temperature. We present a new prediction of the VM on the direct photon-π-π coupling which measures the validity of the vector dominance (VD) of the electromagnetic form factor of the pion: we find that the VD is largely violated at the critical temperature, which indicates that the assumption of VD made in several analyses of the dilepton spectra in hot matter may need to be weakened for consistently including the effect of the dropping mass of the vector meson.
2D Vector Field Simplification Based on Robustness
Skraba, Primoz
2014-03-01
Vector field simplification aims to reduce the complexity of the flow by removing features in order of their relevance and importance, to reveal prominent behavior and obtain a compact representation for interpretation. Most existing simplification techniques based on the topological skeleton successively remove pairs of critical points connected by separatrices, using distance or area-based relevance measures. These methods rely on the stable extraction of the topological skeleton, which can be difficult due to instability in numerical integration, especially when processing highly rotational flows. These geometric metrics do not consider the flow magnitude, an important physical property of the flow. In this paper, we propose a novel simplification scheme derived from the recently introduced topological notion of robustness, which provides a complementary view on flow structure compared to the traditional topological-skeleton-based approaches. Robustness enables the pruning of sets of critical points according to a quantitative measure of their stability, that is, the minimum amount of vector field perturbation required to remove them. This leads to a hierarchical simplification scheme that encodes flow magnitude in its perturbation metric. Our novel simplification algorithm is based on degree theory, has fewer boundary restrictions, and so can handle more general cases. Finally, we provide an implementation under the piecewise-linear setting and apply it to both synthetic and real-world datasets. © 2014 IEEE.
Viking Seismometer PDS Archive Dataset
Lorenz, R. D.
2016-12-01
The Viking Lander 2 seismometer operated successfully for over 500 Sols on the Martian surface, recording at least one likely candidate Marsquake. The Viking mission, in an era when data handling hardware (both on board and on the ground) was limited in capability, predated modern planetary data archiving; the ad-hoc repositories of the data, and the very low-level record at NSSDC, were neither convenient to process nor well known. In an effort supported by the NASA Mars Data Analysis Program, we have converted the bulk of the Viking dataset (namely the 49,000 and 270,000 records made in High- and Event-modes at 20 and 1 Hz, respectively) into a simple ASCII table format. Additionally, since wind-generated lander motion is a major component of the signal, contemporaneous meteorological data are included in summary records to facilitate correlation. These datasets are being archived at the PDS Geosciences Node. In addition to brief instrument and dataset descriptions, the archive includes code snippets in the freely available language 'R' to demonstrate plotting and analysis. Further, we present examples of lander-generated noise associated with the sampler arm, instrument dumps and other mechanical operations.
PHYSICS PERFORMANCE AND DATASET (PPD)
L. Silvestris
2013-01-01
The first part of the Long Shutdown period has been dedicated to the preparation of the samples for the analysis targeting the summer conferences. In particular, the 8 TeV data acquired in 2012, including most of the “parked datasets”, have been reconstructed profiting from improved alignment and calibration conditions for all the sub-detectors. A careful planning of the resources was essential in order to deliver the datasets well in time to the analysts, and to schedule the update of all the conditions and calibrations needed at the analysis level. The newly reprocessed data have undergone detailed scrutiny by the Dataset Certification team, allowing the recovery of some of the data for analysis usage and further improving the certification efficiency, which is now at 91% of the recorded luminosity. With the aim of delivering a consistent dataset for 2011 and 2012, both in terms of conditions and release (53X), the PPD team is now working to set up a data re-reconstruction and a new MC pro...
RARD: The Related-Article Recommendation Dataset
Beel, Joeran; Carevic, Zeljko; Schaible, Johann; Neusch, Gabor
2017-01-01
Recommender-system datasets are used for recommender-system evaluations, training machine-learning algorithms, and exploring user behavior. While there are many datasets for recommender systems in the domains of movies, books, and music, there are rather few datasets from research-paper recommender systems. In this paper, we introduce RARD, the Related-Article Recommendation Dataset, from the digital library Sowiport and the recommendation-as-a-service provider Mr. DLib. The dataset contains ...
Akdemir, Bayram; Doǧan, Sercan; Aksoy, Muharrem H.; Canli, Eyüp; Özgören, Muammer
2015-03-01
Liquid behaviors are very important for many areas, especially for mechanical engineering. A fast camera is a way to observe and study liquid behavior: the camera traces dust or colored markers travelling in the liquid and takes as many pictures per second as possible. Every image is a large data structure due to its resolution. For fast liquid velocities, it is not easy to evaluate the captured images or to produce a fluent sequence of frames from them. Artificial intelligence is widely used in science to solve nonlinear problems, and the adaptive neural fuzzy inference system (ANFIS) is a common artificial intelligence technique in the literature. Any particle in a liquid has a two-dimensional velocity and its derivatives. ANFIS has been used offline to create an artificial frame between the previous and the following frame, using velocities and vorticities to create a crossing-point vector between the previous and the following points. In this study, ANFIS has been used to fill in virtual frames among the real frames in order to improve image continuity, which makes the images much more understandable at chaotic or high-vorticity points. After ANFIS is executed, the image dataset doubles in size and alternates virtual and real frames. The obtained success is evaluated using R² testing and the mean squared error. R² testing indicates statistical similarity; values of 0.82, 0.81, 0.85, and 0.8 were obtained for the velocities and their derivatives, respectively.
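The evaluation step is simple to reproduce in outline. The sketch below computes R² and the mean squared error between a reference velocity component and a hypothetical interpolated one; the ANFIS interpolator itself is not implemented, and the arrays are synthetic stand-ins.

```python
# Evaluation-only sketch: R^2 and MSE between reference velocities and the
# (hypothetical) ANFIS-interpolated values; data here are synthetic.
import numpy as np
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(0)
v_true = rng.normal(size=500)                        # reference component
v_pred = v_true + rng.normal(scale=0.3, size=500)    # stand-in ANFIS output

print("R^2:", r2_score(v_true, v_pred))
print("MSE:", mean_squared_error(v_true, v_pred))
```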
Developing a Data-Set for Stereopsis
Directory of Open Access Journals (Sweden)
D.W Hunter
2014-08-01
Full Text Available Current research on binocular stereopsis in humans and non-human primates has been limited by a lack of available data-sets. Current data-sets fall into two categories: stereo-image sets with vergence but no ranging information (Hibbard, 2008, Vision Research, 48(12), 1427-1439) or combinations of depth information with binocular images and video taken from cameras in fixed fronto-parallel configurations exhibiting neither vergence nor focus effects (Hirschmuller & Scharstein, 2007, IEEE Conf. Computer Vision and Pattern Recognition). The techniques for generating depth information are also imperfect. Depth information is normally inaccurate or simply missing near edges and on partially occluded surfaces. For many areas of vision research these are the most interesting parts of the image (Goutcher, Hunter, Hibbard, 2013, i-Perception, 4(7), 484; Scarfe & Hibbard, 2013, Vision Research). Using state-of-the-art open-source ray-tracing software (PBRT) as a back-end, our intention is to release a set of tools that will allow researchers in this field to generate artificial binocular stereoscopic data-sets. Although not as realistic as photographs, computer-generated images have significant advantages in terms of control over the final output, and ground-truth information about scene depth is easily calculated at all points in the scene, even in partially occluded areas. While individual researchers have been developing similar stimuli by hand for many decades, we hope that our software will greatly reduce the time and difficulty of creating naturalistic binocular stimuli. Our intention in making this presentation is to elicit feedback from the vision community about what sort of features would be desirable in such software.
Credit Scoring by Fuzzy Support Vector Machines with a Novel Membership Function
Directory of Open Access Journals (Sweden)
Jian Shi
2016-11-01
Full Text Available Due to the recent financial crisis and the European debt crisis, credit risk evaluation has become an increasingly important issue for financial institutions. Reliable credit scoring models are crucial for commercial banks to evaluate the financial performance of clients and have been widely studied in the fields of statistics and machine learning. In this paper a novel fuzzy support vector machine (SVM) credit scoring model is proposed for credit risk analysis, in which fuzzy membership is adopted to indicate the different contribution of each input point to the learning of the SVM classification hyperplane. For methodological consistency, support vector data description (SVDD) is introduced to construct the fuzzy membership function and to reduce the effect of outliers and noise. The SVDD-based fuzzy SVM model is tested against the traditional fuzzy SVM on two real-world datasets, and the results confirm the effectiveness of the presented method.
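The core idea can be approximated in a few lines: derive a per-sample membership from a one-class boundary model fitted to each class, and pass it to a standard SVM as a sample weight. The sketch below uses scikit-learn's OneClassSVM as a stand-in for SVDD and is not the paper's exact formulation.

```python
# Hedged sketch of SVDD-style fuzzy memberships feeding a weighted SVM.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC, OneClassSVM

X, y = make_classification(n_samples=400, random_state=0)

weights = np.zeros(len(y))
for c in (0, 1):
    idx = np.where(y == c)[0]
    d = OneClassSVM(gamma="scale", nu=0.1).fit(X[idx]).decision_function(X[idx])
    # Points deep inside the class boundary get weight near 1; points near
    # the boundary (likely outliers or noise) get smaller weight.
    weights[idx] = (d - d.min()) / (d.max() - d.min()) + 1e-3

clf = SVC(kernel="rbf").fit(X, y, sample_weight=weights)
```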
Columbia River ESI: SOCECON (Socioeconomic Resource Points and Lines)
National Oceanic and Atmospheric Administration, Department of Commerce — This data set contains vector points and lines representing human-use resource data for Columbia River. In the data set, vector points represent aquaculture sites,...
A Hybrid Neuro-Fuzzy Model For Integrating Large Earth-Science Datasets
Porwal, A.; Carranza, J.; Hale, M.
2004-12-01
A GIS-based hybrid neuro-fuzzy approach to integration of large earth-science datasets for mineral prospectivity mapping is described. It implements a Takagi-Sugeno type fuzzy inference system in the framework of a four-layered feed-forward adaptive neural network. Each unique combination of the datasets is considered a feature vector whose components are derived by knowledge-based ordinal encoding of the constituent datasets. A subset of feature vectors with a known output target vector (i.e., unique conditions known to be associated with either a mineralized or a barren location) is used for the training of an adaptive neuro-fuzzy inference system. Training involves iterative adjustment of parameters of the adaptive neuro-fuzzy inference system using a hybrid learning procedure for mapping each training vector to its output target vector with minimum sum of squared error. The trained adaptive neuro-fuzzy inference system is used to process all feature vectors. The output for each feature vector is a value that indicates the extent to which a feature vector belongs to the mineralized class or the barren class. These values are used to generate a prospectivity map. The procedure is demonstrated by an application to regional-scale base metal prospectivity mapping in a study area located in the Aravalli metallogenic province (western India). A comparison of the hybrid neuro-fuzzy approach with pure knowledge-driven fuzzy and pure data-driven neural network approaches indicates that the former offers a superior method for integrating large earth-science datasets for predictive spatial mathematical modelling.
This data is for Figures 6 and 7 in the journal article. The data also includes the two EPANET input files used for the analysis described in the paper, one for the looped system and one for the block system. This dataset is associated with the following publication: Grayman, W., R. Murray, and D. Savic. Redesign of Water Distribution Systems for Passive Containment of Contamination. JOURNAL OF THE AMERICAN WATER WORKS ASSOCIATION. American Water Works Association, Denver, CO, USA, 108(7): 381-391, (2016).
Quantifying uncertainty in observational rainfall datasets
Lennard, Chris; Dosio, Alessandro; Nikulin, Grigory; Pinto, Izidine; Seid, Hussen
2015-04-01
The CO-ordinated Regional Downscaling Experiment (CORDEX) has to date seen the publication of at least ten journal papers that examine the African domain during 2012 and 2013. Five of these papers consider Africa generally (Nikulin et al. 2012, Kim et al. 2013, Hernandes-Dias et al. 2013, Laprise et al. 2013, Panitz et al. 2013) and five have regional foci: Tramblay et al. (2013) on Northern Africa, Mariotti et al. (2014) and Gbobaniyi et al. (2013) on West Africa, Endris et al. (2013) on East Africa and Kalagnoumou et al. (2013) on southern Africa. There are also a further three papers that the authors know about under review. These papers all use observed rainfall and/or temperature data to evaluate/validate the regional model output and often proceed to assess projected changes in these variables due to climate change in the context of these observations. The most popular reference rainfall data used are the CRU, GPCP, GPCC, TRMM and UDEL datasets. However, as Kalagnoumou et al. (2013) point out, there are many other rainfall datasets available for consideration, for example, CMORPH, FEWS, TAMSAT & RIANNAA, TAMORA and the WATCH & WATCH-DEI data. They, with others (Nikulin et al. 2012, Sylla et al. 2012), show that the observed datasets can have a very wide spread at a particular space-time coordinate. As more ground-, space- and reanalysis-based rainfall products become available, all of which use different methods to produce precipitation data, the selection of reference data is becoming an important factor in model evaluation. A number of factors can contribute to uncertainty in terms of the reliability and validity of the datasets, such as radiance conversion algorithms, the quantity and quality of available station data, interpolation techniques, and the blending methods used to combine satellite and gauge based products. However, to date no comprehensive study has been performed to evaluate the uncertainty in these observational datasets. We assess 18 gridded
Hawaii ESI: NESTS (Nest Points)
National Oceanic and Atmospheric Administration, Department of Commerce — This data set contains sensitive biological resource data for seabird nesting colonies in coastal Hawaii. Vector points in this data set represent locations of...
Virginia ESI: REPTPT (Reptile Points)
National Oceanic and Atmospheric Administration, Department of Commerce — This data set contains sensitive biological resource data for sea turtles in Virginia. Vector points in this data set represent nesting sites. Species-specific...
Maryland ESI: NESTS (Nest Points)
National Oceanic and Atmospheric Administration, Department of Commerce — This data set contains sensitive biological resource data for raptors in Maryland. Vector points in this data set represent bird nesting sites. Species-specific...
Louisiana ESI: NESTS (Nest Points)
National Oceanic and Atmospheric Administration, Department of Commerce — This data set contains sensitive biological resource data for seabird and wading bird nesting colonies in coastal Louisiana. Vector points in this data set represent...
The CMS dataset bookkeeping service
Afaq, A.; Dolgert, A.; Guo, Y.; Jones, C.; Kosyakov, S.; Kuznetsov, V.; Lueking, L.; Riley, D.; Sekhri, V.
2008-07-01
The CMS Dataset Bookkeeping Service (DBS) has been developed to catalog all CMS event data from Monte Carlo and detector sources. It provides the ability to identify MC or trigger source, track data provenance, construct datasets for analysis, and discover interesting data. CMS requires processing and analysis activities at various service levels, and the DBS system provides support for localized processing or private analysis, as well as global access for CMS users at large. Catalog entries can be moved among the various service levels with a simple set of migration tools, thus forming a loose federation of databases. DBS is available to CMS users via Python API, command line, and Discovery web page interfaces. The system is built as a multi-tier web application with Java servlets running under Tomcat, with connections via JDBC to Oracle or MySQL database backends. Clients connect to the service through HTTP or HTTPS with authentication provided by GRID certificates and authorization through VOMS. DBS is an integral part of the overall CMS Data Management and Workflow Management systems.
The CMS dataset bookkeeping service
Energy Technology Data Exchange (ETDEWEB)
Afaq, A; Guo, Y; Kosyakov, S; Lueking, L; Sekhri, V [Fermilab, Batavia, Illinois 60510 (United States); Dolgert, A; Jones, C; Kuznetsov, V; Riley, D [Cornell University, Ithaca, New York 14850 (United States)
2008-07-15
The CMS Dataset Bookkeeping Service (DBS) has been developed to catalog all CMS event data from Monte Carlo and detector sources. It provides the ability to identify MC or trigger source, track data provenance, construct datasets for analysis, and discover interesting data. CMS requires processing and analysis activities at various service levels, and the DBS system provides support for localized processing or private analysis, as well as global access for CMS users at large. Catalog entries can be moved among the various service levels with a simple set of migration tools, thus forming a loose federation of databases. DBS is available to CMS users via Python API, command line, and Discovery web page interfaces. The system is built as a multi-tier web application with Java servlets running under Tomcat, with connections via JDBC to Oracle or MySQL database backends. Clients connect to the service through HTTP or HTTPS with authentication provided by GRID certificates and authorization through VOMS. DBS is an integral part of the overall CMS Data Management and Workflow Management systems.
The CMS dataset bookkeeping service
International Nuclear Information System (INIS)
Afaq, A; Guo, Y; Kosyakov, S; Lueking, L; Sekhri, V; Dolgert, A; Jones, C; Kuznetsov, V; Riley, D
2008-01-01
The CMS Dataset Bookkeeping Service (DBS) has been developed to catalog all CMS event data from Monte Carlo and detector sources. It provides the ability to identify MC or trigger source, track data provenance, construct datasets for analysis, and discover interesting data. CMS requires processing and analysis activities at various service levels, and the DBS system provides support for localized processing or private analysis, as well as global access for CMS users at large. Catalog entries can be moved among the various service levels with a simple set of migration tools, thus forming a loose federation of databases. DBS is available to CMS users via Python API, command line, and Discovery web page interfaces. The system is built as a multi-tier web application with Java servlets running under Tomcat, with connections via JDBC to Oracle or MySQL database backends. Clients connect to the service through HTTP or HTTPS with authentication provided by GRID certificates and authorization through VOMS. DBS is an integral part of the overall CMS Data Management and Workflow Management systems.
The CMS dataset bookkeeping service
International Nuclear Information System (INIS)
Afaq, Anzar; Dolgert, Andrew; Guo, Yuyi; Jones, Chris; Kosyakov, Sergey; Kuznetsov, Valentin; Lueking, Lee; Riley, Dan; Sekhri, Vijay
2007-01-01
The CMS Dataset Bookkeeping Service (DBS) has been developed to catalog all CMS event data from Monte Carlo and detector sources. It provides the ability to identify MC or trigger source, track data provenance, construct datasets for analysis, and discover interesting data. CMS requires processing and analysis activities at various service levels, and the DBS system provides support for localized processing or private analysis, as well as global access for CMS users at large. Catalog entries can be moved among the various service levels with a simple set of migration tools, thus forming a loose federation of databases. DBS is available to CMS users via Python API, command line, and Discovery web page interfaces. The system is built as a multi-tier web application with Java servlets running under Tomcat, with connections via JDBC to Oracle or MySQL database backends. Clients connect to the service through HTTP or HTTPS with authentication provided by GRID certificates and authorization through VOMS. DBS is an integral part of the overall CMS Data Management and Workflow Management systems.
Raster images vectorization system
Genytė, Jurgita
2006-01-01
The problem of raster image vectorization was analyzed and researched in this work. Existing vectorization systems are quite expensive, the results are inaccurate, and the manual vectorization of a large number of drafts is impossible. That's why our goal was to design and develop a new raster image vectorization system using our suggested automatic vectorization algorithm and a new universal vector file format for recording the results. The work consists of these main parts: analysis...
International Nuclear Information System (INIS)
Pavicic, Mladen; Merlet, Jean-Pierre; McKay, Brendan; Megill, Norman D
2005-01-01
We give a constructive and exhaustive definition of Kochen-Specker (KS) vectors in a Hilbert space of any dimension as well as of all the remaining vectors of the space. KS vectors are elements of any set of orthonormal states, i.e., vectors in an n-dimensional Hilbert space H_n, n ≥ 3, to which it is impossible to assign 1s and 0s in such a way that no two mutually orthogonal vectors from the set are both assigned 1 and that not all mutually orthogonal vectors are assigned 0. Our constructive definition of such KS vectors is based on algorithms that generate MMP diagrams corresponding to blocks of orthogonal vectors in R^n, on algorithms that single out those diagrams on which algebraic (0)-(1) states cannot be defined, and on algorithms that solve nonlinear equations describing the orthogonalities of the vectors by means of statistically polynomially complex interval analysis and self-teaching programs. The algorithms are limited neither by the number of dimensions nor by the number of vectors. To demonstrate the power of the algorithms, all four-dimensional KS vector systems containing up to 24 vectors were generated and described, all three-dimensional vector systems containing up to 30 vectors were scanned, and several general properties of KS vectors were found.
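The value-assignment condition itself is easy to state in code. The toy checker below brute-forces 0/1 assignments over hypothetical blocks of mutually orthogonal vectors; the paper's actual machinery (MMP diagram generation, interval analysis) is far more sophisticated and is not reproduced.

```python
# Toy brute-force check of the Kochen-Specker value-assignment condition:
# within every complete block of mutually orthogonal vectors, exactly one
# vector must be assigned 1. If no assignment exists, the set is a KS set.
from itertools import product

def admits_assignment(n_vectors, blocks):
    for values in product((0, 1), repeat=n_vectors):
        if all(sum(values[i] for i in block) == 1 for block in blocks):
            return True
    return False

# Hypothetical example: 5 vectors forming two triads sharing one vector.
print(admits_assignment(5, [(0, 1, 2), (2, 3, 4)]))  # True: not a KS set
```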
Visualization of conserved structures by fusing highly variable datasets.
Silverstein, Jonathan C; Chhadia, Ankur; Dech, Fred
2002-01-01
Skill, effort, and time are required to identify and visualize anatomic structures in three dimensions from radiological data. Fundamentally, automating these processes requires a technique that uses symbolic information not in the dynamic range of the voxel data. We were developing such a technique based on mutual information for automatic multi-modality image fusion (MIAMI Fuse, University of Michigan). This system previously demonstrated facility at fusing one voxel dataset with integrated symbolic structure information to a CT dataset (different scale and resolution) from the same person. The next step of development of our technique was aimed at accommodating the variability of anatomy from patient to patient by using warping to fuse our standard dataset to arbitrary patient CT datasets. A standard symbolic information dataset was created from the full-color Visible Human Female by segmenting the liver parenchyma, portal veins, and hepatic veins and overwriting each set of voxels with a fixed color. Two arbitrarily selected patient CT scans of the abdomen were used as reference datasets. We used the warping functions in MIAMI Fuse to align the standard structure data to each patient scan. The key to successful fusion was the focused use of multiple warping control points that place themselves around the structure of interest automatically. The user assigns only a few initial control points to align the scans. Fusions 1 and 2 transformed the atlas with 27 points around the liver to CT1 and CT2, respectively. Fusion 3 transformed the atlas with 45 control points around the liver to CT1, and Fusion 4 transformed the atlas with 5 control points around the portal vein. The CT dataset is augmented with the transformed standard structure dataset, such that the warped structure masks are visualized in combination with the original patient dataset. This combined volume visualization is then rendered interactively in stereo on the ImmersaDesk in an immersive Virtual
2008 TIGER/Line Nationwide Dataset
California Natural Resource Agency — This dataset contains a nationwide build of the 2008 TIGER/Line datasets from the US Census Bureau downloaded in April 2009. The TIGER/Line Shapefiles are an extract...
Satellite-Based Precipitation Datasets
Munchak, S. J.; Huffman, G. J.
2017-12-01
Of the possible sources of precipitation data, those based on satellites provide the greatest spatial coverage. There is a wide selection of datasets, algorithms, and versions from which to choose, which can be confusing to non-specialists wishing to use the data. The International Precipitation Working Group (IPWG) maintains tables of the major publicly available, long-term, quasi-global precipitation data sets (http://www.isac.cnr.it/ipwg/data/datasets.html), and this talk briefly reviews the various categories. As examples, NASA provides two sets of quasi-global precipitation data sets: the older Tropical Rainfall Measuring Mission (TRMM) Multi-satellite Precipitation Analysis (TMPA) and the current Integrated Multi-satellitE Retrievals for Global Precipitation Measurement (GPM) mission (IMERG). Both provide near-real-time and post-real-time products that are uniformly gridded in space and time. The TMPA products are 3-hourly 0.25°x0.25° on the latitude band 50°N-S for about 16 years, while the IMERG products are half-hourly 0.1°x0.1° on 60°N-S for over 3 years (with plans to go to 16+ years in Spring 2018). In addition to the precipitation estimates, each data set provides fields of other variables, such as the satellite sensor providing estimates and estimated random error. The discussion concludes with advice about determining suitability for use, the necessity of being clear about product names and versions, and the need for continued support for satellite- and surface-based observation.
Directory of Open Access Journals (Sweden)
Mok Tik
2014-06-01
Full Text Available This study formulates regression of vector data that will enable statistical analysis of various geodetic phenomena such as polar motion, ocean currents, typhoon/hurricane tracking, crustal deformations, and precursory earthquake signals. The observed vector variable of an event (dependent vector variable) is expressed as a function of a number of hypothesized phenomena realized also as vector variables (independent vector variables) and/or scalar variables that are likely to impact the dependent vector variable. The proposed representation has the unique property of solving the coefficients of independent vector variables (explanatory variables) also as vectors, hence it supersedes multivariate multiple regression models, in which the unknown coefficients are scalar quantities. For the solution, complex numbers are used to represent vector information, and the method of least squares is deployed to estimate the vector model parameters after transforming the complex vector regression model into a real vector regression model through isomorphism. Various operational statistics for testing the predictive significance of the estimated vector parameter coefficients are also derived. A simple numerical example demonstrates the use of the proposed vector regression analysis in modeling typhoon paths.
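The complex-number representation makes the estimation step a one-liner in practice. The sketch below, on synthetic data, solves for a vector-valued (complex) coefficient by ordinary least squares; the paper's operational test statistics are not reproduced.

```python
# Sketch of vector regression via complex numbers: 2D vectors are encoded as
# complex values and coefficients are estimated by complex least squares.
import numpy as np

rng = np.random.default_rng(1)
n = 200
x1 = rng.normal(size=n) + 1j * rng.normal(size=n)    # independent vector variable
x2 = rng.normal(size=n)                              # scalar covariate
b_true = 0.8 - 0.5j                                  # vector (complex) coefficient

noise = 0.05 * (rng.normal(size=n) + 1j * rng.normal(size=n))
y = b_true * x1 + 0.3 * x2 + noise                   # dependent vector variable

A = np.column_stack([x1, x2.astype(complex)])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("estimated vector coefficient:", coef[0])      # close to 0.8 - 0.5j
```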
A Large-Scale 3D Object Recognition dataset
DEFF Research Database (Denmark)
Sølund, Thomas; Glent Buch, Anders; Krüger, Norbert
2016-01-01
geometric groups; concave, convex, cylindrical and flat 3D object models. The object models have varying amounts of local geometric features to challenge existing local shape feature descriptors in terms of descriptiveness and robustness. The dataset is validated in a benchmark which evaluates the matching...... performance of 7 different state-of-the-art local shape descriptors. Further, we validate the dataset in a 3D object recognition pipeline. Our benchmark shows, as expected, that local shape feature descriptors without any global point relation across the surface have a poor matching performance with flat...
Using the Gravity Model to Estimate the Spatial Spread of Vector-Borne Diseases
Directory of Open Access Journals (Sweden)
Jean-Marie Aerts
2012-11-01
Full Text Available The gravity models are commonly used spatial interaction models. They have been widely applied in a large set of domains dealing with interactions amongst spatial entities. The spread of vector-borne diseases is also related to the intensity of interaction between spatial entities, namely, the physical habitat of the pathogens' vectors and/or hosts, and urban areas, thus humans. This study implements the concept behind gravity models for the spatial spread of two vector-borne diseases, nephropathia epidemica and Lyme borreliosis, based on current knowledge of the transmission mechanism of these diseases. Two sources of information on vegetated systems were tested: the CORINE land cover map and MODIS NDVI. The size of vegetated areas near urban centers and a local indicator of occupation-related exposure were found to be significant predictors of disease risk. Both the land cover map and the space-borne dataset were suited, yet not equivalent, input sources for locating and measuring vegetated areas of importance for disease spread. The overall results point to the compatibility of the gravity model concept and the spatial spread of vector-borne diseases.
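A generic gravity-model score of the kind described can be sketched as follows; the exponent and all numbers are illustrative assumptions, not the study's fitted model.

```python
# Hedged sketch: interaction between urban centre i and vegetated patch j
# taken proportional to (population_i * area_j) / distance_ij**beta.
import numpy as np

population = np.array([120_000, 45_000])         # urban centres
patch_area = np.array([3.2, 1.1, 7.5])           # vegetated patches, km^2
distance = np.array([[4.0, 9.0, 2.5],            # km, centres x patches
                     [6.0, 3.0, 8.0]])
beta = 2.0                                       # assumed distance decay

risk = (population[:, None] * patch_area[None, :]) / distance**beta
print("per-centre exposure score:", risk.sum(axis=1))
```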
Dataset of transcriptional landscape of B cell early activation
Directory of Open Access Journals (Sweden)
Alexander S. Garruss
2015-09-01
Full Text Available Signaling via B cell receptors (BCR) and Toll-like receptors (TLRs) results in activation of B cells with distinct physiological outcomes, but the transcriptional regulatory mechanisms that drive activation and distinguish these pathways remain unknown. At early time points after BCR and TLR ligand exposure, 0.5 and 2 h, RNA-seq was performed, allowing observations on rapid transcriptional changes. At 2 h, ChIP-seq was performed to allow observations on important regulatory mechanisms potentially driving transcriptional change. The dataset includes RNA-seq, ChIP-seq of control (Input), RNA Pol II, H3K4me3, H3K27me3, and a separate RNA-seq for miRNA expression, which can be found at Gene Expression Omnibus Dataset GSE61608. Here, we provide details on the experimental and analysis methods used to obtain and analyze this dataset and to examine the transcriptional landscape of B cell early activation.
A cross-country Exchange Market Pressure (EMP) dataset.
Desai, Mohit; Patnaik, Ila; Felman, Joshua; Shah, Ajay
2017-06-01
The data presented in this article are related to the research article titled "An exchange market pressure measure for cross country analysis" (Patnaik et al. [1]). In this article, we present the dataset of Exchange Market Pressure (EMP) values for 139 countries along with their conversion factors, ρ (rho). Exchange Market Pressure, expressed as a percentage change in the exchange rate, measures the change in the exchange rate that would have taken place had the central bank not intervened. The conversion factor ρ can be interpreted as the change in the exchange rate associated with $1 billion of intervention. Estimates of the conversion factor ρ allow us to calculate a monthly time series of EMP for 139 countries. Additionally, the dataset contains the 68% confidence intervals (high and low values) for the point estimates of the ρ's. Using the standard errors of the estimates of the ρ's, we obtain one-sigma intervals around the mean estimates of the EMP values. These values are also reported in the dataset.
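Under the description above, the quantities combine straightforwardly; the sketch below assumes EMP is the observed percentage exchange-rate change plus ρ times the intervention in $ billion, with purely illustrative numbers.

```python
# Illustrative-only sketch of combining the dataset's quantities, assuming
# EMP = observed %-change in exchange rate + rho * intervention ($ bn).
pct_change_exchange_rate = -1.2   # observed monthly change, percent
intervention_bn_usd = 2.5         # central bank intervention, $ billion
rho = 0.4                         # % exchange-rate change per $1 bn

emp = pct_change_exchange_rate + rho * intervention_bn_usd
print(f"EMP = {emp:.2f}% (pressure had the bank not intervened)")
```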
Chirico, P.G.; Moran, T.W.
2011-01-01
This dataset contains a collection of 24 folders, each representing a specific U.S. Geological Survey area of interest (AOI; fig. 1), as well as datasets for AOI subsets. Each folder includes the extent, contours, Digital Elevation Model (DEM), and hydrography of the corresponding AOI, which are organized into feature vector and raster datasets. The dataset comprises a geographic information system (GIS), which is available upon request from the USGS Afghanistan programs Web site (http://afghanistan.cr.usgs.gov/minerals.php), and maps of the 24 USGS AOIs.
U.S. Department of Health & Human Services — VectorBase is a Bioinformatics Resource Center for invertebrate vectors. It is one of four Bioinformatics Resource Centers funded by NIAID to provide web-based...
PHYSICS PERFORMANCE AND DATASET (PPD)
L. Silvestris
2012-01-01
Introduction The first part of the year presented an important test for the new Physics Performance and Dataset (PPD) group (cf. its mandate: http://cern.ch/go/8f77). The activity was focused on the validation of the new releases meant for the Monte Carlo (MC) production and the data-processing in 2012 (CMSSW 50X and 52X), and on the preparation of the 2012 operations. In view of the Chamonix meeting, the PPD and physics groups worked to understand the impact of the higher pile-up scenario on some of the flagship Higgs analyses to better quantify the impact of the high luminosity on the CMS physics potential. A task force is working on the optimisation of the reconstruction algorithms and on the code to cope with the performance requirements imposed by the higher event occupancy as foreseen for 2012. Concerning the preparation for the analysis of the new data, a new MC production has been prepared. The new samples, simulated at 8 TeV, are already being produced and the digitisation and recons...
Pattern Analysis On Banking Dataset
Directory of Open Access Journals (Sweden)
Amritpal Singh
2015-06-01
Full Text Available Abstract Everyday refinement and development of technology has led to an increase in competition between tech companies and in attempts to crack systems and break them down. This makes data mining a strategically and security-wise important area for many business organizations, including the banking sector. It allows the analysis of important information in the data warehouse and assists banks in looking for obscure patterns and discovering unknown relationships in the data. Banking systems need to process an ample amount of data on a daily basis related to customer information, credit card details, limit and collateral details, transaction details, risk profiles, Anti-Money-Laundering-related information, and trade finance data. Thousands of decisions based on the related data are taken in a bank daily. This paper analyzes a banking dataset in the Weka environment for the detection of interesting patterns, based on its applications to customer acquisition, customer retention, management and marketing, and the management of risk and fraud detection.
PHYSICS PERFORMANCE AND DATASET (PPD)
L. Silvestris
2013-01-01
The PPD activities, in the first part of 2013, have been focused mostly on the final physics validation and preparation for the data reprocessing of the full 8 TeV datasets with the latest calibrations. These samples will be the basis for the preliminary results for summer 2013 but most importantly for the final publications on the 8 TeV Run 1 data. The reprocessing involves also the reconstruction of a significant fraction of “parked data” that will allow CMS to perform a whole new set of precision analyses and searches. In this way the CMSSW release 53X is becoming the legacy release for the 8 TeV Run 1 data. The regular operation activities have included taking care of the prolonged proton-proton data taking and the run with proton-lead collisions that ended in February. The DQM and Data Certification team has deployed a continuous effort to promptly certify the quality of the data. The luminosity-weighted certification efficiency (requiring all sub-detectors to be certified as usab...
Robustness-Based Simplification of 2D Steady and Unsteady Vector Fields.
Skraba, Primoz; Bei Wang; Guoning Chen; Rosen, Paul
2015-08-01
Vector field simplification aims to reduce the complexity of the flow by removing features in order of their relevance and importance, to reveal prominent behavior and obtain a compact representation for interpretation. Most existing simplification techniques based on the topological skeleton successively remove pairs of critical points connected by separatrices, using distance or area-based relevance measures. These methods rely on the stable extraction of the topological skeleton, which can be difficult due to instability in numerical integration, especially when processing highly rotational flows. In this paper, we propose a novel simplification scheme derived from the recently introduced topological notion of robustness which enables the pruning of sets of critical points according to a quantitative measure of their stability, that is, the minimum amount of vector field perturbation required to remove them. This leads to a hierarchical simplification scheme that encodes flow magnitude in its perturbation metric. Our novel simplification algorithm is based on degree theory and has minimal boundary restrictions. Finally, we provide an implementation under the piecewise-linear setting and apply it to both synthetic and real-world datasets. We show local and complete hierarchical simplifications for steady as well as unsteady vector fields.
Robustness-Based Simplification of 2D Steady and Unsteady Vector Fields
Skraba, Primoz
2015-08-01
© 2015 IEEE. Vector field simplification aims to reduce the complexity of the flow by removing features in order of their relevance and importance, to reveal prominent behavior and obtain a compact representation for interpretation. Most existing simplification techniques based on the topological skeleton successively remove pairs of critical points connected by separatrices, using distance or area-based relevance measures. These methods rely on the stable extraction of the topological skeleton, which can be difficult due to instability in numerical integration, especially when processing highly rotational flows. In this paper, we propose a novel simplification scheme derived from the recently introduced topological notion of robustness which enables the pruning of sets of critical points according to a quantitative measure of their stability, that is, the minimum amount of vector field perturbation required to remove them. This leads to a hierarchical simplification scheme that encodes flow magnitude in its perturbation metric. Our novel simplification algorithm is based on degree theory and has minimal boundary restrictions. Finally, we provide an implementation under the piecewise-linear setting and apply it to both synthetic and real-world datasets. We show local and complete hierarchical simplifications for steady as well as unsteady vector fields.
Robustness-Based Simplification of 2D Steady and Unsteady Vector Fields
Skraba, Primoz; Wang, Bei; Chen, Guoning; Rosen, Paul
2015-01-01
© 2015 IEEE. Vector field simplification aims to reduce the complexity of the flow by removing features in order of their relevance and importance, to reveal prominent behavior and obtain a compact representation for interpretation. Most existing simplification techniques based on the topological skeleton successively remove pairs of critical points connected by separatrices, using distance or area-based relevance measures. These methods rely on the stable extraction of the topological skeleton, which can be difficult due to instability in numerical integration, especially when processing highly rotational flows. In this paper, we propose a novel simplification scheme derived from the recently introduced topological notion of robustness which enables the pruning of sets of critical points according to a quantitative measure of their stability, that is, the minimum amount of vector field perturbation required to remove them. This leads to a hierarchical simplification scheme that encodes flow magnitude in its perturbation metric. Our novel simplification algorithm is based on degree theory and has minimal boundary restrictions. Finally, we provide an implementation under the piecewise-linear setting and apply it to both synthetic and real-world datasets. We show local and complete hierarchical simplifications for steady as well as unsteady vector fields.
Generalization of concurrence vectors
International Nuclear Information System (INIS)
Yu Changshui; Song Heshan
2004-01-01
In this Letter, based on the generalization of concurrence vectors for bipartite pure states with respect to employing tensor products of generators of the corresponding rotation groups, we generalize concurrence vectors to the case of mixed states; a new criterion of separability of multipartite pure states is given, for which we define a concurrence vector; we generalize the vector to the case of multipartite mixed states and obtain a good measure of free entanglement.
Ebrahimi, Javad; Fragouli, Christina
2010-01-01
We develop new algebraic algorithms for scalar and vector network coding. In vector network coding, the source multicasts information by transmitting vectors of length L, while intermediate nodes process and combine their incoming packets by multiplying them with L X L coding matrices that play a similar role as coding coefficients in scalar coding. Our algorithms for scalar network coding jointly optimize the employed field size while selecting the coding coefficients. Similarly, for vector co...
Vector Network Coding Algorithms
Ebrahimi, Javad; Fragouli, Christina
2010-01-01
We develop new algebraic algorithms for scalar and vector network coding. In vector network coding, the source multicasts information by transmitting vectors of length L, while intermediate nodes process and combine their incoming packets by multiplying them with L x L coding matrices that play a similar role as coding coefficients in scalar coding. Our algorithms for scalar network coding jointly optimize the employed field size while selecting the coding coefficients. Similarly, for vector coding, our algori...
Convexity and Marginal Vectors
van Velzen, S.; Hamers, H.J.M.; Norde, H.W.
2002-01-01
In this paper we construct sets of marginal vectors of a TU game with the property that if the marginal vectors from these sets are core elements, then the game is convex. This approach leads to new upper bounds on the number of marginal vectors needed to characterize convexity. Another result is that
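The marginal-vector construction is concrete enough to sketch. For a permutation, each player receives its marginal contribution to the coalition of its predecessors; the 3-player game below is a made-up convex example.

```python
# Sketch: all marginal vectors of a small TU game, v given on coalitions.
from itertools import permutations

v = {frozenset(): 0, frozenset({1}): 0, frozenset({2}): 0, frozenset({3}): 0,
     frozenset({1, 2}): 4, frozenset({1, 3}): 4, frozenset({2, 3}): 4,
     frozenset({1, 2, 3}): 9}

def marginal_vector(order):
    payoff, before = {}, frozenset()
    for player in order:
        payoff[player] = v[before | {player}] - v[before]
        before = before | {player}
    return payoff

for order in permutations((1, 2, 3)):
    print(order, marginal_vector(order))  # each is a core element here
```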
DEFF Research Database (Denmark)
Becciolini, Diego; Franzosi, Diogo Buarque; Foadi, Roshan
2015-01-01
We analyze the Large Hadron Collider (LHC) phenomenology of heavy vector resonances with a $SU(2)_L\\times SU(2)_R$ spectral global symmetry. This symmetry partially protects the electroweak S-parameter from large contributions of the vector resonances. The resulting custodial vector model spectrum...
Existence and Stability of Solutions for Implicit Multivalued Vector Equilibrium Problems
Directory of Open Access Journals (Sweden)
Li Qiuying
2011-01-01
Full Text Available A class of implicit multivalued vector equilibrium problems is studied. By using the generalized Fan-Browder fixed point theorem, some existence results of solutions for the implicit multivalued vector equilibrium problems are obtained under some suitable assumptions. Moreover, a stability result of solutions for the implicit multivalued vector equilibrium problems is derived. These results extend and unify some recent results for implicit vector equilibrium problems, multivalued vector variational inequality problems, and vector variational inequality problems.
Towards human behavior recognition based on spatio temporal features and support vector machines
Ghabri, Sawsen; Ouarda, Wael; Alimi, Adel M.
2017-03-01
Security and surveillance are vital issues in today's world. The recent acts of terrorism have highlighted the urgent need for efficient surveillance. There is indeed a need for an automated video surveillance system which can detect the identity and activity of a person. In this article, we propose a new paradigm to recognize aggressive human behavior such as a boxing action. Our proposed system for human activity detection uses a fusion between Spatio-Temporal Interest Point (STIP) and Histogram of Oriented Gradients (HoG) features; the novel feature is called Spatio-Temporal Histogram of Oriented Gradients (STHOG). To evaluate the robustness of our proposed paradigm, with a local application of the HoG technique at STIP points, we made experiments on the KTH human action dataset using multi-class Support Vector Machine classification. The proposed scheme outperforms basic descriptors like HoG and STIP, achieving a classification accuracy of 82.26%.
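A spatial-only approximation of the fusion idea is sketched below: HoG descriptors computed on patches centred at detected interest points. Real STIPs are spatio-temporal (detected in video), so this 2D version is purely illustrative.

```python
# Rough 2D sketch of local HoG at interest points (stand-in for STHOG).
import numpy as np
from skimage.feature import corner_harris, corner_peaks, hog

def frame_descriptor(frame, patch=32):
    """Mean HoG over patches centred at Harris interest points."""
    h = patch // 2
    pts = corner_peaks(corner_harris(frame), min_distance=10)
    descs = [hog(frame[r - h:r + h, c - h:c + h])
             for r, c in pts
             if h <= r < frame.shape[0] - h and h <= c < frame.shape[1] - h]
    return np.mean(descs, axis=0) if descs else None

# Video-level descriptors (e.g., averaged over frames) could then be fed to
# a multi-class sklearn.svm.SVC, as in the KTH experiments described above.
```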
The Geometry of Finite Equilibrium Datasets
DEFF Research Database (Denmark)
Balasko, Yves; Tvede, Mich
We investigate the geometry of finite datasets defined by equilibrium prices, income distributions, and total resources. We show that the equilibrium condition imposes no restrictions if total resources are collinear, a property that is robust to small perturbations. We also show that the set...... of equilibrium datasets is path-connected when the equilibrium condition does impose restrictions on datasets, as for example when total resources are widely non-collinear....
IPCC Socio-Economic Baseline Dataset
National Aeronautics and Space Administration — The Intergovernmental Panel on Climate Change (IPCC) Socio-Economic Baseline Dataset consists of population, human development, economic, water resources, land...
Veterans Affairs Suicide Prevention Synthetic Dataset
Department of Veterans Affairs — The VA's Veteran Health Administration, in support of the Open Data Initiative, is providing the Veterans Affairs Suicide Prevention Synthetic Dataset (VASPSD). The...
Nanoparticle-organic pollutant interaction dataset
U.S. Environmental Protection Agency — Dataset presents concentrations of organic pollutants, such as polyaromatic hydrocarbon compounds, in water samples. Water samples of known volume and concentration...
Maxwell's Multipole Vectors and the CMB
Weeks, Jeffrey R.
2004-01-01
The recently re-discovered multipole vector approach to understanding the harmonic decomposition of the cosmic microwave background traces its roots to Maxwell's Treatise on Electricity and Magnetism. Taking Maxwell's directional derivative approach as a starting point, the present article develops a fast algorithm for computing multipole vectors, with an exposition that is both simpler and better motivated than in the author's previous work. Tests show the resulting algorithm, coded up as a ...
DEFF Research Database (Denmark)
Elleby, Anita; Ingwersen, Peter
2010-01-01
The paper presents comparative analyses of two publication point systems, the Norwegian and the in-house system from the interdisciplinary Danish Institute of International Studies (DIIS), used as a case in the study for publications published in 2006, and compares central citation-based indicators...... with novel publication point indicators (PPIs) that are formalized and exemplified: the Cumulated Publication Point Indicator (CPPI), which graphically illustrates the cumulated gain of obtained vs. ideal points, both seen as vectors; and the normalized Cumulated Publication Point Index (nCPPI), which represents the cumulated gain of publication success as index values, either graphically...... Two diachronic citation windows are applied: 2006-07 and 2006-08. Web of Science (WoS) as well as Google Scholar (GS) are applied to observe the cite delay and citedness for the different document types published by DIIS...
Clifford Fourier transform on vector fields.
Ebling, Julia; Scheuermann, Gerik
2005-01-01
Image processing and computer vision have robust methods for feature extraction and the computation of derivatives of scalar fields, where interpolation and the effects of applying a filter can be analyzed in detail; it is therefore advantageous to apply these methods to vector fields to obtain a solid theoretical basis for feature extraction. We recently introduced the Clifford convolution, which is an extension of the classical convolution on scalar fields and provides a unified notation for the convolution of scalar and vector fields. It has attractive geometric properties that allow pattern matching on vector fields. In image processing, the convolution and the Fourier transform operators are closely related by the convolution theorem and, in this paper, we extend the Fourier transform to include general elements of Clifford algebra, called multivectors, including scalars and vectors. The resulting convolution and derivative theorems are extensions of those for convolution and the Fourier transform on scalar fields. The Clifford Fourier transform allows a frequency analysis of vector fields and of the behavior of vector-valued filters. In frequency space, vectors are transformed into general multivectors of the Clifford algebra. Many basic vector-valued patterns, such as sources, sinks, saddle points, and potential vortices, can be described by a few multivectors in frequency space.
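Up to sign and normalization conventions (an assumption here, not a quotation of the paper), the 3D Clifford Fourier transform replaces the complex unit by the pseudoscalar, which also squares to -1:

```latex
% i_3 = e_1 e_2 e_3 denotes the pseudoscalar of the Clifford algebra of R^3.
\mathcal{F}\{F\}(u) \;=\; \int_{\mathbb{R}^3} F(x)\,
    \exp\!\bigl(-2\pi\, i_3\, \langle x, u\rangle\bigr)\,\mathrm{d}x
```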
Music Signal Processing Using Vector Product Neural Networks
Fan, Z. C.; Chan, T. S.; Yang, Y. H.; Jang, J. S. R.
2017-05-01
We propose a novel neural network model for music signal processing using vector product neurons and dimensionality transformations. Here, the inputs are first mapped from real values into three-dimensional vectors then fed into a three-dimensional vector product neural network where the inputs, outputs, and weights are all three-dimensional values. Next, the final outputs are mapped back to the reals. Two methods for dimensionality transformation are proposed, one via context windows and the other via spectral coloring. Experimental results on the iKala dataset for blind singing voice separation confirm the efficacy of our model.
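A toy version of a vector product neuron is sketched below, assuming (as the abstract suggests) that inputs, weights, and outputs are 3D vectors combined by cross products; the paper's exact layer definition and nonlinearity may differ.

```python
# Hypothetical vector product neuron: 3D inputs and weights combined by
# cross products, summed, then passed through an elementwise nonlinearity.
import numpy as np

def vector_product_neuron(inputs, weights, bias):
    """inputs, weights: (n, 3) arrays; bias: (3,). Returns one 3D vector."""
    s = np.cross(weights, inputs).sum(axis=0) + bias
    return np.tanh(s)  # one plausible choice of nonlinearity

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))      # four 3D inputs (e.g., from context windows)
w = rng.normal(size=(4, 3))      # corresponding 3D weights
print(vector_product_neuron(x, w, np.zeros(3)))
```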
Shi, Yingzhong; Chung, Fu-Lai; Wang, Shitong
2015-09-01
Recently, a time-adaptive support vector machine (TA-SVM) was proposed for handling nonstationary datasets. While attractive performance has been reported, and the new classifier is distinctive in simultaneously solving several SVM subclassifiers locally and globally using an elegant SVM formulation in an alternative kernel space, the coupling of subclassifiers brings in the computation of a matrix inversion, and the method thus suffers from a high computational burden in large nonstationary dataset applications. To overcome this shortcoming, an improved TA-SVM (ITA-SVM) is proposed using a common vector shared by all the SVM subclassifiers involved. ITA-SVM not only keeps an SVM formulation, but also avoids the computation of matrix inversion. Thus, we can realize its fast version, that is, the improved time-adaptive core vector machine (ITA-CVM), for large nonstationary datasets by using the CVM technique. ITA-CVM has the merit of asymptotically linear time complexity for large nonstationary datasets and inherits the advantage of TA-SVM. The effectiveness of the proposed classifiers ITA-SVM and ITA-CVM is also experimentally confirmed.
SIMADL: Simulated Activities of Daily Living Dataset
Directory of Open Access Journals (Sweden)
Talal Alshammari
2018-04-01
Full Text Available With the realisation of the Internet of Things (IoT) paradigm, the analysis of the Activities of Daily Living (ADLs) in a smart home environment is becoming an active research domain. The existence of representative datasets is a key requirement to advance the research in smart home design. Such datasets are an integral part of the visualisation of new smart home concepts as well as the validation and evaluation of emerging machine learning models. Machine learning techniques that can learn ADLs from sensor readings are used to classify, predict and detect anomalous patterns. Such techniques require data that represent relevant smart home scenarios, for training, testing and validation. However, the development of such machine learning techniques is limited by the lack of real smart home datasets, due to the excessive cost of building real smart homes. This paper provides two datasets, for classification and anomaly detection. The datasets are generated using OpenSHS (Open Smart Home Simulator), which is simulation software for dataset generation. OpenSHS records the daily activities of a participant within a virtual environment. Seven participants simulated their ADLs for different contexts, e.g., weekdays, weekends, mornings and evenings. Eighty-four files in total were generated, representing approximately 63 days' worth of activities. Forty-two files of classified ADLs make up the classification dataset, and the other forty-two files, into which simulated anomalous patterns were injected, make up the anomaly detection dataset.
ASSISTments Dataset from Multiple Randomized Controlled Experiments
Selent, Douglas; Patikorn, Thanaporn; Heffernan, Neil
2016-01-01
In this paper, we present a dataset consisting of data generated from 22 previously and currently running randomized controlled experiments inside the ASSISTments online learning platform. This dataset provides data mining opportunities for researchers to analyze ASSISTments data in a convenient format across multiple experiments at the same time.…
Synthetic and Empirical Capsicum Annuum Image Dataset
Barth, R.
2016-01-01
This dataset consists of per-pixel annotated synthetic (10500) and empirical (50) images of Capsicum annuum, also known as sweet or bell pepper, situated in a commercial greenhouse. Furthermore, the source models used to generate the synthetic images are included. The aim of the datasets is to
Supergravity inspired vector curvaton
International Nuclear Information System (INIS)
Dimopoulos, Konstantinos
2007-01-01
It is investigated whether a massive Abelian vector field, whose gauge kinetic function is growing during inflation, can be responsible for the generation of the curvature perturbation in the Universe. Particle production is studied and it is shown that the vector field can obtain a scale-invariant superhorizon spectrum of perturbations with a reasonable choice of kinetic function. After inflation the vector field begins coherent oscillations, during which it corresponds to pressureless isotropic matter. When the vector field dominates the Universe, its perturbations give rise to the observed curvature perturbation following the curvaton scenario. It is found that this is possible if, after the end of inflation, the mass of the vector field increases at a phase transition at a temperature of order 1 TeV or lower. Inhomogeneous reheating, whereby the vector field modulates the decay rate of the inflaton, is also studied.
Design of an audio advertisement dataset
Fu, Yutao; Liu, Jihong; Zhang, Qi; Geng, Yuting
2015-12-01
Since more and more advertisements swarm into radio broadcasts, it is necessary to establish an audio advertising dataset that can be used to analyze and classify advertisements. A method for establishing a complete audio advertising dataset is presented in this paper. The dataset is divided into four kinds of advertisements. Each advertisement sample is given in *.wav format and annotated with a txt file containing its file name, sampling frequency, channel number, broadcast time and class. The soundness of the advertisement classification in this dataset is demonstrated by clustering the different advertisements with Principal Component Analysis (PCA). The experimental results show that this audio advertisement dataset offers a reliable set of samples for related audio advertisement studies.
Becciolini, Diego; Franzosi, Diogo Buarque; Foadi, Roshan; Frandsen, Mads T.; Hapola, Tuomas; Sannino, Francesco
2015-07-01
We analyze the Large Hadron Collider (LHC) phenomenology of heavy vector resonances with an SU(2)_L × SU(2)_R spectral global symmetry. This symmetry partially protects the electroweak S parameter from large contributions of the vector resonances. The resulting custodial vector model spectrum and interactions with the standard model fields lead to distinct signatures at the LHC in the diboson, dilepton, and associated Higgs channels.
HITZER, Eckhard MS
2002-01-01
This paper treats the fundamentals of the vector differential calculus part of universal geometric calculus. Geometric calculus simplifies and unifies the structure and notation of mathematics for all of science and engineering, and for technological applications. In order to make the treatment self-contained, I first compile all important geometric algebra relationships, which are necessary for vector differential calculus. Then differentiation by vectors is introduced and a host of major ve...
Directory of Open Access Journals (Sweden)
Jean-François Degbomont
2010-10-01
Full Text Available This paper addresses the symbolic representation of non-convex real polyhedra, i.e., sets of real vectors satisfying arbitrary Boolean combinations of linear constraints. We develop an original data structure for representing such sets, based on an implicit and concise encoding of a known structure, the Real Vector Automaton. The resulting formalism provides a canonical representation of polyhedra, is closed under Boolean operators, and admits an efficient decision procedure for testing the membership of a vector.
Automatic building extraction from LiDAR data fusion of point and grid-based features
Du, Shouji; Zhang, Yunsheng; Zou, Zhengrong; Xu, Shenghua; He, Xue; Chen, Siyang
2017-08-01
This paper proposes a method for extracting buildings from LiDAR point cloud data by combining point-based and grid-based features. To accurately discriminate buildings from vegetation, a point feature based on the variance of normal vectors is proposed. For robust building extraction, a graph cuts algorithm is employed to combine the features and account for neighborhood context information. As the grid feature computation and the graph cuts algorithm are performed on a grid structure, a feature-retained DSM interpolation method is also proposed in this paper. The proposed method is validated on the benchmark ISPRS Test Project on Urban Classification and 3D Building Reconstruction and compared to state-of-the-art methods. The evaluation shows that the proposed method obtains promising results at both the area level and the object level. The method is further applied to the entire ISPRS dataset and to a real dataset of Wuhan City. The results show a completeness of 94.9% and a correctness of 92.2% at the per-area level for the former dataset, and a completeness of 94.4% and a correctness of 95.8% for the latter. The proposed method has good potential for large LiDAR datasets.
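The point feature based on the variance of normal vectors can be sketched in a few lines; the k-nearest-neighbor PCA normals and the variance measure below are common choices and only an assumption about the paper's exact formulation:

```python
import numpy as np
from scipy.spatial import cKDTree

def normal_variance_feature(points, k=15):
    """For each LiDAR point, estimate a normal from its k neighbors (PCA)
    and return the spread of those normals within the neighborhood.
    Planar roofs give low variance; vegetation gives high variance."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)
    normals = np.empty_like(points)
    for i, nb in enumerate(idx):
        nbrs = points[nb] - points[nb].mean(axis=0)
        # the normal is the singular vector of the smallest singular value
        _, _, vt = np.linalg.svd(nbrs, full_matrices=False)
        normals[i] = vt[-1]
    # orient consistently, then measure per-neighborhood normal variance
    normals *= np.sign(normals[:, 2:3] + 1e-12)
    return np.array([normals[nb].var(axis=0).sum() for nb in idx])

pts = np.random.rand(1000, 3)
print(normal_variance_feature(pts)[:5])
```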
Georeferenced Population Datasets of Mexico (GEO-MEX): Urban Place GIS Coverage of Mexico
National Aeronautics and Space Administration — The Urban Place GIS Coverage of Mexico is a vector based point Geographic Information System (GIS) coverage of 696 urban places in Mexico. Each Urban Place is...
International Nuclear Information System (INIS)
Brown, F.B.
1981-01-01
Examination of the global algorithms and local kernels of conventional general-purpose Monte Carlo codes shows that multigroup Monte Carlo methods have sufficient structure to permit efficient vectorization. A structured multigroup Monte Carlo algorithm for vector computers is developed in which many particle events are treated at once on a cell-by-cell basis. Vectorization of kernels for tracking and variance reduction is described, and a new method for discrete sampling is developed to facilitate the vectorization of collision analysis. To demonstrate the potential of the new method, a vectorized Monte Carlo code for multigroup radiation transport analysis was developed. This code incorporates many features of conventional general-purpose production codes, including general geometry, splitting and Russian roulette, survival biasing, variance estimation via batching, a number of cutoffs, and generalized tallies of collision, tracklength, and surface crossing estimators with response functions. Predictions of vectorized performance characteristics for the CYBER-205 were made using emulated coding and a dynamic model of vector instruction timing. Computation rates were examined for a variety of test problems to determine sensitivities to batch size and vector lengths. Significant speedups are predicted for even a few hundred particles per batch, and asymptotic speedups of about 40 over equivalent Amdahl 470V/8 scalar codes are predicted for a few thousand particles per batch. The principal conclusion is that vectorization of a general-purpose multigroup Monte Carlo code is well worth the significant effort required for stylized coding and major algorithmic changes.
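A toy sketch of the event-based vectorization idea in modern array style: all live particles are advanced per step with whole-array operations rather than one history at a time. The one-group slab geometry and cross sections are illustrative assumptions, not the code described above:

```python
import numpy as np

rng = np.random.default_rng(1)
sigma_t, sigma_a, slab = 1.0, 0.3, 5.0   # total/absorption XS, slab width

n = 10_000
x = np.zeros(n)                  # particle positions
mu = rng.uniform(-1, 1, n)       # direction cosines
alive = np.ones(n, dtype=bool)
leaked = absorbed = 0

while alive.any():
    # sample flight distances for all live particles at once
    d = -np.log(rng.random(alive.sum())) / sigma_t
    x[alive] += mu[alive] * d
    out = alive & ((x < 0) | (x > slab))
    leaked += out.sum(); alive &= ~out
    # absorption vs. scattering decided as one vector operation
    absorb = alive & (rng.random(n) < sigma_a / sigma_t)
    absorbed += absorb.sum(); alive &= ~absorb
    mu[alive] = rng.uniform(-1, 1, alive.sum())   # isotropic re-scatter

print(f"leaked {leaked}, absorbed {absorbed}")
```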
Vectors and their applications
Pettofrezzo, Anthony J
2005-01-01
Geared toward undergraduate students, this text illustrates the use of vectors as a mathematical tool in plane synthetic geometry, plane and spherical trigonometry, and analytic geometry of two- and three-dimensional space. Its rigorous development includes a complete treatment of the algebra of vectors in the first two chapters.Among the text's outstanding features are numbered definitions and theorems in the development of vector algebra, which appear in italics for easy reference. Most of the theorems include proofs, and coordinate position vectors receive an in-depth treatment. Key concept
Symbolic computer vector analysis
Stoutemyer, D. R.
1977-01-01
A MACSYMA program is described which performs symbolic vector algebra and vector calculus. The program can combine and simplify symbolic expressions including dot products and cross products, together with the gradient, divergence, curl, and Laplacian operators. The distribution of these operators over sums or products is under user control, as are various other expansions, including expansion into components in any specific orthogonal coordinate system. There is also a capability for deriving the scalar or vector potential of a vector field. Examples include derivation of the partial differential equations describing fluid flow and magnetohydrodynamics, for 12 different classic orthogonal curvilinear coordinate systems.
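The same manipulations are straightforward to reproduce in an open-source computer algebra system today; a small sketch with SymPy standing in for MACSYMA:

```python
from sympy.vector import CoordSys3D, gradient, divergence, curl

N = CoordSys3D('N')
f = N.x**2 * N.y + N.z                          # a scalar field
F = N.x*N.y*N.i + N.y*N.z*N.j + N.z*N.x*N.k     # a vector field

print(gradient(f))           # 2*x*y i + x**2 j + k (in N coordinates)
print(divergence(F))         # x + y + z
print(curl(F))               # -y i - z j - x k
print(divergence(curl(F)))   # 0, as the identity div(curl F) = 0 requires
```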
Evaluation of Uncertainty in Precipitation Datasets for New Mexico, USA
Besha, A. A.; Steele, C. M.; Fernald, A.
2014-12-01
Climate change, population growth and other factors are endangering water availability and sustainability in semiarid/arid areas, particularly in the southwestern United States. Wide spatial and temporal coverage of precipitation measurements is key for regional water budget analysis and hydrological operations, which are themselves valuable tools for water resource planning and management. Rain gauge measurements are usually reliable and accurate at a point. They measure rainfall continuously, but spatial sampling is limited. Ground-based radar and satellite remotely sensed precipitation have wide spatial and temporal coverage. However, these measurements are indirect and subject to errors caused by equipment, meteorological variability, the heterogeneity of the land surface itself, and irregular recording. This study seeks to understand precipitation uncertainty and, in doing so, lessen uncertainty propagation into hydrological applications and operations. We reviewed, compared and evaluated the TRMM (Tropical Rainfall Measuring Mission) precipitation products, NOAA's (National Oceanic and Atmospheric Administration) Global Precipitation Climatology Centre (GPCC) monthly precipitation dataset, PRISM (Parameter elevation Regression on Independent Slopes Model) data, and data from individual climate stations including Cooperative Observer Program (COOP), Remote Automated Weather Stations (RAWS), Soil Climate Analysis Network (SCAN) and Snowpack Telemetry (SNOTEL) stations. Though not yet finalized, this study finds that the uncertainty within precipitation datasets is influenced by regional topography, season, climate and precipitation rate. Ongoing work aims to further evaluate precipitation datasets based on the relative influence of these phenomena so that we can identify the optimum datasets for input to statewide water budget analysis.
The NASA Subsonic Jet Particle Image Velocimetry (PIV) Dataset
Bridges, James; Wernet, Mark P.
2011-01-01
Many tasks in fluids engineering require prediction of turbulence in jet flows. This report documents the single-point statistics of velocity, mean and variance, for cold and hot jet flows. The jet velocities ranged from 0.5 to 1.4 times the ambient speed of sound, and temperatures ranged from unheated to a static temperature ratio of 2.7. Further, the report assesses the accuracy of the data, e.g., establishing uncertainties for the data. This paper covers the following five tasks: (1) Document the acquisition and processing procedures used to create the particle image velocimetry (PIV) datasets. (2) Compare PIV data with hotwire and laser Doppler velocimetry (LDV) data published in the open literature. (3) Compare different datasets acquired at the same flow conditions in multiple tests to establish uncertainties. (4) Create a consensus dataset for a range of hot jet flows, including uncertainty bands. (5) Analyze this consensus dataset for self-consistency and compare jet characteristics to those of the open literature. The final objective was fulfilled by using the potential core length and the spread rate of the half-velocity radius to collapse the mean and turbulent velocity fields over the first 20 jet diameters.
Predicting post-translational lysine acetylation using support vector machines
DEFF Research Database (Denmark)
Gnad, Florian; Ren, Shubin; Choudhary, Chunaram
2010-01-01
spectrometry to identify 3600 lysine acetylation sites on 1750 human proteins covering most of the previously annotated sites and providing the most comprehensive acetylome so far. This dataset should provide an excellent source to train support vector machines (SVMs) allowing the high accuracy in silico...
The Kinetics Human Action Video Dataset
Kay, Will; Carreira, Joao; Simonyan, Karen; Zhang, Brian; Hillier, Chloe; Vijayanarasimhan, Sudheendra; Viola, Fabio; Green, Tim; Back, Trevor; Natsev, Paul; Suleyman, Mustafa; Zisserman, Andrew
2017-01-01
We describe the DeepMind Kinetics human action video dataset. The dataset contains 400 human action classes, with at least 400 video clips for each action. Each clip lasts around 10s and is taken from a different YouTube video. The actions are human focussed and cover a broad range of classes including human-object interactions such as playing instruments, as well as human-human interactions such as shaking hands. We describe the statistics of the dataset, how it was collected, and give some ...
Vector-Vector Scattering on the Lattice
Romero-López, Fernando; Urbach, Carsten; Rusetsky, Akaki
2018-03-01
In this work we present an extension of the Lüscher formalism to include the interaction of particles with spin, focusing on the scattering of two vector particles. The derived formalism will be applied to scalar QED in the Higgs phase, where the U(1) gauge boson acquires mass.
Kernel-based discriminant feature extraction using a representative dataset
Li, Honglin; Sancho Gomez, Jose-Luis; Ahalt, Stanley C.
2002-07-01
Discriminant Feature Extraction (DFE) is widely recognized as an important pre-processing step in classification applications. Most DFE algorithms are linear and thus can only explore the linear discriminant information among the different classes. Recently, there have been several promising attempts to develop nonlinear DFE algorithms, among which is Kernel-based Feature Extraction (KFE). The efficacy of KFE has been experimentally verified on both synthetic data and real problems. However, KFE has some known limitations. First, KFE does not work well for strongly overlapped data. Second, KFE employs all of the training set samples during the feature extraction phase, which can result in significant computation when applied to very large datasets. Finally, KFE can result in overfitting. In this paper, we propose a substantial improvement to KFE that overcomes the above limitations by using a representative dataset, which consists of critical points generated by data-editing techniques and centroid points determined by the Frequency Sensitive Competitive Learning (FSCL) algorithm. Experiments show that this new KFE algorithm performs well on significantly overlapped datasets, and it also reduces computational complexity. Further, by controlling the number of centroids, the overfitting problem can be effectively alleviated.
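A condensed sketch of the representative-dataset idea, with KMeans standing in for the FSCL centroid learning and kernel PCA for the kernel feature extractor (both substitutions are assumptions for illustration):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import KernelPCA

X = np.random.randn(20000, 10)           # a large training set

# representative dataset: a few hundred centroids instead of 20k samples
km = KMeans(n_clusters=200, n_init=4, random_state=0).fit(X)
X_rep = km.cluster_centers_

# fit the kernel feature extractor on the small representative set only,
# then transform the full dataset cheaply
kfe = KernelPCA(n_components=5, kernel='rbf', gamma=0.1).fit(X_rep)
features = kfe.transform(X)
print(features.shape)                    # (20000, 5)
```

Controlling `n_clusters` plays the role the abstract assigns to controlling the number of centroids: fewer centroids mean less computation and less overfitting.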
Selection vector filter framework
Lukac, Rastislav; Plataniotis, Konstantinos N.; Smolka, Bogdan; Venetsanopoulos, Anastasios N.
2003-10-01
We provide a unified framework of nonlinear vector techniques that output the lowest-ranked vector. The proposed framework constitutes a generalized filter class for multichannel signal processing. A new class of nonlinear selection filters is based on robust order-statistic theory and the minimization of the weighted distance function to the other input samples. The proposed method can be designed to perform a variety of filtering operations, including previously developed techniques such as the vector median, the basic vector directional filter, the directional distance filter, weighted vector median filters and weighted directional filters. A wide range of filtering operations is guaranteed by the filter structure, with two independent weight vectors for the angular and distance domains of the vector space. In order to adapt the filter parameters to varying signal and noise statistics, we also provide generalized optimization algorithms that take advantage of weighted median filters and the relationship between the standard median filter and the vector median filter. Thus, we can deal with both statistical and deterministic aspects of the filter design process. It is shown that the proposed method has the required properties: the capability of modelling the underlying system in the application at hand, robustness with respect to errors in the model of the underlying system, the availability of a training procedure and, finally, simplicity of filter representation, analysis, design and implementation. Simulation studies also indicate that the new filters are computationally attractive and have excellent performance in environments corrupted by bit errors and impulsive noise.
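One special case the framework covers, the classic vector median filter, fits in a short sketch (3×3 window, L2 distances; a naive implementation for clarity, not an optimized one):

```python
import numpy as np

def vector_median_filter(img, radius=1):
    """Replace each pixel by the window vector minimizing the summed
    L2 distance to all other vectors in the window (the vector median)."""
    h, w, c = img.shape
    out = img.copy()
    for y in range(radius, h - radius):
        for x in range(radius, w - radius):
            win = img[y-radius:y+radius+1, x-radius:x+radius+1].reshape(-1, c)
            d = np.linalg.norm(win[:, None, :] - win[None, :, :], axis=2)
            out[y, x] = win[d.sum(axis=1).argmin()]
    return out

noisy = np.random.randint(0, 256, (32, 32, 3)).astype(float)
clean = vector_median_filter(noisy)
print(clean.shape)
```

Because the output is always one of the input vectors, such selection filters never invent colors, which is why they behave well under impulsive noise.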
International Nuclear Information System (INIS)
Clark, T.E.; Love, S.T.; Nitta, Muneto; Veldhuis, T. ter; Xiong, C.
2009-01-01
Local oscillations of the brane world are manifested as massive vector fields. Their coupling to the Standard Model can be obtained using the method of nonlinear realizations of the spontaneously broken higher-dimensional space-time symmetries and, to an extent, is model independent. Phenomenological limits on these vector field parameters are obtained using LEP collider data and dark matter constraints.
Properties of vector and axial-vector mesons from a generalized Nambu-Jona-Lasinio model
International Nuclear Information System (INIS)
Bernard, V.; Meissner, U.G.; Massachusetts Inst. of Tech., Cambridge; Massachusetts Inst. of Tech., Cambridge
1988-01-01
We construct a generalized Nambu-Jona-Lasinio lagrangian including scalar, pseudoscalar, vector and axial-vector mesons. We specialize to the two-flavor case. The properties of the structured vacuum as well as meson masses and coupling constants are calculated, giving overall agreement with the experimental data to within 20%. We investigate the meson properties at finite density. In contrast to the mass of the scalar σ-meson, which decreases sharply with increasing density, the vector meson masses are almost independent of density. Furthermore, the vector-meson-quark coupling constants are also stable against density changes. We point out that these results imply a softening of the nuclear equation of state at high densities. Furthermore, we discuss the breakdown of the KSFR relation on the quark level as well as other deviations from phenomenological concepts such as universality and vector meson dominance. (orig.)
Complex Polynomial Vector Fields
DEFF Research Database (Denmark)
Dias, Kealey
The two branches of dynamical systems, continuous and discrete, correspond to the study of differential equations (vector fields) and iteration of mappings, respectively. In holomorphic dynamics, the systems studied are restricted to those described by holomorphic (complex analytic) or meromorphic (allowing poles as singularities) functions. There already exists a well-developed theory for iterative holomorphic dynamical systems, and successful relations found between iteration theory and flows of vector fields have been one of the main motivations for the recent interest in holomorphic vector fields. Since the class of complex polynomial vector fields in the plane is natural to consider, it is remarkable that its study has only begun very recently. There are numerous fundamental questions that are still open, both in the general classification of these vector fields, the decomposition of parameter spaces into structurally stable domains, and a description of the bifurcations. For this reason, the talk will focus on these questions for complex polynomial vector fields.
BASE MAP DATASET, LOS ANGELES COUNTY, CALIFORNIA
Federal Emergency Management Agency, Department of Homeland Security — FEMA Framework Basemap datasets comprise six of the seven FGDC themes of geospatial data that are used by most GIS applications (Note: the seventh framework theme,...
BASE MAP DATASET, CHEROKEE COUNTY, SOUTH CAROLINA
Federal Emergency Management Agency, Department of Homeland Security — FEMA Framework Basemap datasets comprise six of the seven FGDC themes of geospatial data that are used by most GIS applications (Note: the seventh framework theme,...
SIAM 2007 Text Mining Competition dataset
National Aeronautics and Space Administration — Subject Area: Text Mining Description: This is the dataset used for the SIAM 2007 Text Mining competition. This competition focused on developing text mining...
Harvard Aging Brain Study : Dataset and accessibility
Dagley, Alexander; LaPoint, Molly; Huijbers, Willem; Hedden, Trey; McLaren, Donald G.; Chatwal, Jasmeer P.; Papp, Kathryn V.; Amariglio, Rebecca E.; Blacker, Deborah; Rentz, Dorene M.; Johnson, Keith A.; Sperling, Reisa A.; Schultz, Aaron P.
2017-01-01
The Harvard Aging Brain Study is sharing its data with the global research community. The longitudinal dataset consists of a 284-subject cohort with the following modalities acquired: demographics, clinical assessment, comprehensive neuropsychological testing, clinical biomarkers, and neuroimaging.
BASE MAP DATASET, HONOLULU COUNTY, HAWAII, USA
Federal Emergency Management Agency, Department of Homeland Security — FEMA Framework Basemap datasets comprise six of the seven FGDC themes of geospatial data that are used by most GIS applications (Note: the seventh framework theme,...
BASE MAP DATASET, EDGEFIELD COUNTY, SOUTH CAROLINA
Federal Emergency Management Agency, Department of Homeland Security — FEMA Framework Basemap datasets comprise six of the seven FGDC themes of geospatial data that are used by most GIS applications (Note: the seventh framework theme,...
Simulation of Smart Home Activity Datasets
Directory of Open Access Journals (Sweden)
Jonathan Synnott
2015-06-01
Full Text Available A globally ageing population is resulting in an increased prevalence of chronic conditions which affect older adults. Such conditions require long-term care and management to maximize quality of life, placing an increasing strain on healthcare resources. Intelligent environments such as smart homes facilitate long-term monitoring of activities in the home through the use of sensor technology. Access to sensor datasets is necessary for the development of novel activity monitoring and recognition approaches. Access to such datasets is limited due to issues such as sensor cost, availability and deployment time. The use of simulated environments and sensors may address these issues and facilitate the generation of comprehensive datasets. This paper provides a review of existing approaches for the generation of simulated smart home activity datasets, including model-based approaches and interactive approaches which implement virtual sensors, environments and avatars. The paper also provides recommendations for future work in intelligent environment simulation.
Simulation of Smart Home Activity Datasets.
Synnott, Jonathan; Nugent, Chris; Jeffers, Paul
2015-06-16
A globally ageing population is resulting in an increased prevalence of chronic conditions which affect older adults. Such conditions require long-term care and management to maximize quality of life, placing an increasing strain on healthcare resources. Intelligent environments such as smart homes facilitate long-term monitoring of activities in the home through the use of sensor technology. Access to sensor datasets is necessary for the development of novel activity monitoring and recognition approaches. Access to such datasets is limited due to issues such as sensor cost, availability and deployment time. The use of simulated environments and sensors may address these issues and facilitate the generation of comprehensive datasets. This paper provides a review of existing approaches for the generation of simulated smart home activity datasets, including model-based approaches and interactive approaches which implement virtual sensors, environments and avatars. The paper also provides recommendations for future work in intelligent environment simulation.
Environmental Dataset Gateway (EDG) REST Interface
U.S. Environmental Protection Agency — Use the Environmental Dataset Gateway (EDG) to find and access EPA's environmental resources. Many options are available for easily reusing EDG content in other...
BASE MAP DATASET, INYO COUNTY, CALIFORNIA
Federal Emergency Management Agency, Department of Homeland Security — FEMA Framework Basemap datasets comprise six of the seven FGDC themes of geospatial data that are used by most GIS applications (Note: the seventh framework theme,...
BASE MAP DATASET, JACKSON COUNTY, OKLAHOMA
Federal Emergency Management Agency, Department of Homeland Security — FEMA Framework Basemap datasets comprise six of the seven FGDC themes of geospatial data that are used by most GIS applications (Note: the seventh framework theme,...
BASE MAP DATASET, SANTA CRUZ COUNTY, CALIFORNIA
Federal Emergency Management Agency, Department of Homeland Security — FEMA Framework Basemap datasets comprise six of the seven FGDC themes of geospatial data that are used by most GIS applications (Note: the seventh framework theme,...
Climate Prediction Center IR 4km Dataset
National Oceanic and Atmospheric Administration, Department of Commerce — CPC IR 4km dataset was created from all available individual geostationary satellite data which have been merged to form nearly seamless global (60N-60S) IR...
BASE MAP DATASET, MAYES COUNTY, OKLAHOMA, USA
Federal Emergency Management Agency, Department of Homeland Security — FEMA Framework Basemap datasets comprise six of the seven FGDC themes of geospatial data that are used by most GIS applications: cadastral, geodetic control,...
BASE MAP DATASET, KINGFISHER COUNTY, OKLAHOMA
Federal Emergency Management Agency, Department of Homeland Security — FEMA Framework Basemap datasets comprise six of the seven FGDC themes of geospatial data that are used by most GIS applications (Note: the seventh framework theme,...
Charmless Hadronic B Decays into Vector, Axial Vector and Tensor Final States at BaBar
International Nuclear Information System (INIS)
Gandini, Paolo
2012-01-01
We present experimental measurements of branching fractions and longitudinal polarization fractions in charmless hadronic B decays into vector, axial-vector and tensor final states with the final dataset of BABAR. Measurements of such decays are a powerful tool both for testing the Standard Model and for searching for possible sources of new physics. In this document we present a short review of the latest experimental results at BABAR concerning charmless quasi-two-body decays into final states containing particles with spin 1 or spin 2 and different parities. This kind of decay has received considerable theoretical interest in the last few years, and this attention has led to interesting experimental results at the current B-factories. In fact, the study of the longitudinal polarization fraction f_L in charmless B decays to vector-vector (VV), vector axial-vector (VA) and axial-vector axial-vector (AA) mesons provides information on the underlying helicity structure of the decay mechanism. Naive helicity conservation arguments predict a dominant longitudinal polarization fraction f_L ∼ 1 for both tree- and penguin-dominated decays, and this pattern seems to be confirmed by tree-dominated B → ρρ and B+ → ωρ+ decays. Other penguin-dominated decays, instead, show a different behavior: the measured value of f_L ∼ 0.5 in B → φK* decays is in contrast with naive Standard Model (SM) calculations. Several solutions have been proposed, such as the introduction of non-factorizable terms and penguin-annihilation amplitudes, while other explanations invoke new physics. New modes have been investigated to shed more light on the problem.
A Hamilton-like vector for the special-relativistic Coulomb problem
International Nuclear Information System (INIS)
Munoz, Gerardo; Pavic, Ivana
2006-01-01
A relativistic point charge moving in a Coulomb potential does not admit a conserved Hamilton vector. Despite this fact, a Hamilton-like vector may be developed that proves useful in the derivation and analysis of the particle's orbit.
Supplier Short Term Load Forecasting Using Support Vector Regression and Exogenous Input
Matijaš, Marin; Vukićević, Milan; Krajcar, Slavko
2011-09-01
In power systems, load forecasting is important for keeping the equilibrium between production and consumption. With the liberalization of electricity markets, the task of load forecasting has changed because each market participant has to forecast its own load. The consumption of end-consumers is stochastic in nature. Due to competition, suppliers are not in a position to transfer their costs to end-consumers; it is therefore essential to keep the forecasting error as low as possible. Numerous papers investigate load forecasting from the perspective of the grid or production planning. We study forecasting models from the perspective of a supplier. In this paper, we investigate different combinations of exogenous inputs on simulated supplier loads and show that using the number of points of delivery as a feature for Support Vector Regression leads to lower forecasting error, while adding the customer number to the datasets does the opposite.
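A schematic of the setup described, with lagged load plus an exogenous feature such as the number of points of delivery feeding a support vector regressor; the synthetic series and parameter values are illustrative assumptions:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
T = 500
load = 100 + 10*np.sin(np.arange(T)/24*2*np.pi) + rng.normal(0, 2, T)
pods = np.linspace(900, 1100, T)      # points of delivery (exogenous input)

# features: previous-day load plus the exogenous input
X = np.column_stack([load[:-24], pods[24:]])
y = load[24:]

model = make_pipeline(StandardScaler(), SVR(kernel='rbf', C=10.0, epsilon=0.5))
model.fit(X[:-48], y[:-48])           # hold out the last 48 hours
mape = np.mean(np.abs(model.predict(X[-48:]) - y[-48:]) / y[-48:]) * 100
print(f"test MAPE: {mape:.2f}%")
```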
Fractal vector optical fields.
Pan, Yue; Gao, Xu-Zhen; Cai, Meng-Qiang; Zhang, Guan-Lin; Li, Yongnan; Tu, Chenghou; Wang, Hui-Tian
2016-07-15
We introduce the concept of a fractal, which provides an alternative approach for flexibly engineering optical fields and their focal fields. We propose, design, and create a new family of optical fields, fractal vector optical fields, which build a bridge between fractals and vector optical fields. The fractal vector optical fields have polarization states exhibiting fractal geometry, and may also involve the phase and/or amplitude simultaneously. The results reveal that the focal fields exhibit self-similarity, and that the hierarchy of the fractal plays the "weeding" role. The fractal can be used to engineer the focal field.
Towards extending IFC with point cloud data
Krijnen, T.F.; Beetz, J.; Ochmann, S.; Vock, R.; Wessel, R.
2015-01-01
In this paper we suggest an extension to the Industry Foundation Classes model to integrate point cloud datasets. The proposal includes a schema extension to the core model allowing the storage of points either as Cartesian coordinates or as points in the parametric space of a surface associated with a
The divergence theorem for unbounded vector fields
De Pauw, Thierry; Pfeffer, Washek F.
2007-01-01
In the context of Lebesgue integration, we derive the divergence theorem for unbounded vector fields that can have singularities at every point of a compact set whose Minkowski content of codimension greater than two is finite. The resulting integration-by-parts theorem is applied to removable sets of holomorphic and harmonic functions.
Automatic Registration of Vehicle-borne Mobile Mapping Laser Point Cloud and Sequent Panoramas
Directory of Open Access Journals (Sweden)
CHEN Chi
2018-02-01
Full Text Available An automatic registration method for mobile mapping system laser point clouds and sequences of panoramic images is proposed in this paper. First, a hierarchical object extraction method is applied to the LiDAR data to extract building façades, and outline polygons are generated to construct skyline vectors. A virtual imaging method is proposed to remove the distortion of the panoramas, and corners on the skylines are detected on the virtual images by combining segmentation and corner detection results. Second, the detected skyline vectors are taken as the registration primitives. Registration graphs are built from the extracted skyline vectors and matched under a graph edit distance minimization criterion. The matched conjugate primitives are used to solve the 2D-3D rough registration model and obtain the initial transformation between the sequence panoramic image coordinate system and the LiDAR point cloud coordinate system. Finally, to reduce the impact of primitive extraction and matching errors on the registration results, the optimal transformation between the LiDAR point cloud and the dense point cloud generated by multi-view stereo matching from the virtual images of the sequent panoramas is solved by a variant of the 3D-3D ICP registration algorithm, thereby refining the exterior orientation parameters of the panoramas indirectly. Experiments validate the proposed method, and the results show that 1.5-pixel-level registration is achieved on the experimental dataset. The registration results can be applied to point cloud and panorama fusion applications such as true-color point cloud generation.
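The final 3D-3D ICP refinement step can be sketched compactly in plain NumPy/SciPy; this is a generic point-to-point ICP under the assumption that the rough registration has already brought the clouds close, not the authors' implementation:

```python
import numpy as np
from scipy.spatial import cKDTree

def icp(src, dst, iters=30):
    """Refine a rigid alignment of src onto dst (both (n,3) arrays);
    an initial rough registration is assumed to have been applied."""
    cur = src.copy()
    tree = cKDTree(dst)
    for _ in range(iters):
        _, idx = tree.query(cur)          # closest-point correspondences
        q = dst[idx]
        pc, qc = cur.mean(axis=0), q.mean(axis=0)
        # optimal rotation from the cross-covariance (Kabsch/SVD)
        U, _, Vt = np.linalg.svd((cur - pc).T @ (q - qc))
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:          # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        cur = (cur - pc) @ R.T + qc       # apply the incremental motion
    return cur

# toy check: recover a small rotation plus translation
rng = np.random.default_rng(0)
dst = rng.random((500, 3))
a = 0.05
Rz = np.array([[np.cos(a), -np.sin(a), 0],
               [np.sin(a),  np.cos(a), 0],
               [0, 0, 1]])
src = dst @ Rz.T + 0.02
print(np.abs(icp(src, dst) - dst).max())  # small residual after refinement
```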
Density Based Support Vector Machines for Classification
Zahra Nazari; Dongshik Kang
2015-01-01
Support Vector Machines (SVM) are among the most successful algorithms for classification problems. An SVM learns the decision boundary from two classes of training points (for binary classification). However, sometimes there are less meaningful samples among the training points, corrupted by noise or misplaced on the wrong side, called outliers. These outliers affect the margin and classification performance, and the machine would do better to discard them. SVM as a popular and widely used cl...
Noncausal Bayesian Vector Autoregression
DEFF Research Database (Denmark)
Lanne, Markku; Luoto, Jani
We propose a Bayesian inferential procedure for the noncausal vector autoregressive (VAR) model that is capable of capturing nonlinearities and incorporating effects of missing variables. In particular, we devise a fast and reliable posterior simulator that yields the predictive distribution...
Curjel, C. R.
1990-01-01
Presented are activities that help students understand the idea of a vector field. Included are definitions, flow lines, tangential and normal components along curves, flux and work, field conservation, and differential equations. (KR)
Sesquilinear uniform vector integral
Indian Academy of Sciences (India)
theory, together with his integral, dominate contemporary mathematics. ... directions belonging to Bartle and Dinculeanu (see [1], [6], [7] and [2]). ... in this manner, namely he integrated vector functions with respect to measures of bounded.
Czech Academy of Sciences Publication Activity Database
Krejčí, Pavel
1991-01-01
Roč. 2, - (1991), s. 281-292 ISSN 0956-7925 Keywords : vector hysteresis operator * hysteresis potential * differential inequality Subject RIV: BA - General Mathematics http://www.math.cas.cz/~krejci/b15p.pdf
Comparison of Shallow Survey 2012 Multibeam Datasets
Ramirez, T. M.
2012-12-01
The purpose of the Shallow Survey common dataset is a comparison of the different technologies utilized for data acquisition in the shallow-water marine survey environment. The common dataset consists of a series of surveys conducted over a common area of seabed using a variety of systems. It provides equipment manufacturers the opportunity to showcase their latest systems while giving hydrographic researchers and scientists a chance to test their latest algorithms on the dataset so that rigorous comparisons can be made. Five companies collected data for the common dataset in the Wellington Harbor area in New Zealand between May 2010 and May 2011: Kongsberg, Reson, R2Sonic, GeoAcoustics, and Applied Acoustics. The Wellington harbor and surrounding coastal area was selected since it has a number of well-defined features, including the HMNZS South Seas and HMNZS Wellington wrecks, an armored seawall constructed of Tetrapods and Akmons, aquifers, wharves and marinas. The seabed inside the harbor basin is largely fine-grained sediment, with gravel and reefs around the coast. The area outside the harbor on the southern coast is an active environment, with moving sand and exposed reefs. A marine reserve is also in this area. For consistency between datasets, the coastal research vessel R/V Ikatere and crew were used for all surveys conducted for the common dataset. Using Triton's Perspective processing software, the multibeam datasets collected for the Shallow Survey were processed for detailed analysis. Datasets from each sonar manufacturer were processed using the CUBE algorithm developed by the Center for Coastal and Ocean Mapping/Joint Hydrographic Center (CCOM/JHC). Each dataset was gridded at 0.5 and 1.0 meter resolutions for cross-comparison and compliance with International Hydrographic Organization (IHO) requirements. Detailed comparisons were made of equipment specifications (transmit frequency, number of beams, beam width), data density, total uncertainty, and
Columbia River ESI: NESTS (Nest Points)
National Oceanic and Atmospheric Administration, Department of Commerce — This data set contains sensitive biological resource data for bird nesting sites in the Columbia River area. Vector points in this data set represent locations of...
Southeast Alaska ESI: FISHPT (Fish Points)
National Oceanic and Atmospheric Administration, Department of Commerce — This data set contains biological resource data for anadromous fish streams in Southeast Alaska. Vector points in this data set represent locations of fish streams....
Louisiana ESI: SOCECON (Socioeconomic Resource Points)
National Oceanic and Atmospheric Administration, Department of Commerce — This data set contains human-use resource data for airport, heliport, marina, and boat ramp locations in Louisiana. Vector points in this data set represent the...
North Slope, Alaska ESI: FACILITY (Facility Points)
National Oceanic and Atmospheric Administration, Department of Commerce — This data set contains data for oil field facilities for the North Slope of Alaska. Vector points in this data set represent oil field facility locations. This data...
Cosmological Solutions of Tensor–Vector Theories of Gravity by ...
Indian Academy of Sciences (India)
We consider tensor–vector theories by varying the space-time–matter coupling ... solutions by considering the character of critical points of the theory and their stability ... light (Magueijo 2003) that has arisen from the possibility of a varying fine-structure constant. ... Vector-like dark energy displays a series of properties that.
Support vector machines applications
Guo, Guodong
2014-01-01
Support vector machines (SVM) have both a solid mathematical background and good performance in practical applications. This book focuses on the recent advances and applications of the SVM in different areas, such as image processing, medical practice, computer vision, pattern recognition, machine learning, applied statistics, business intelligence, and artificial intelligence. The aim of this book is to create a comprehensive source on support vector machine applications, especially some recent advances.
International Nuclear Information System (INIS)
Akama, K.; Hattori, T.; Yasue, M.
1991-01-01
An exotic composite vector boson V is introduced in two dynamical models of composite quarks, leptons, W, and Z. One is based on four-Fermi interactions, in which composite vector bosons are regarded as fermion-antifermion bound states, and the other is based on the confining SU(2)_L gauge model, in which they are given by scalar-antiscalar bound states. Both approaches describe the same effective interactions for the sector of composite quarks, leptons, W, Z, γ, and V.
Melillo Fenech, Tanya
2010-01-01
A vector-borne disease is one in which the pathogenic microorganism is transmitted from an infected individual to another individual by an arthropod or other agent. The transmission depends upon the attributes and requirements of at least three different living organisms: the pathologic agent, which is a virus, protozoan, bacterium or helminth (worm); the vector, which is commonly an arthropod such as a tick or mosquito; and the human host.
International Nuclear Information System (INIS)
Yan, Zhenya
2011-01-01
The coupled nonlinear volatility and option pricing model presented recently by Ivancevic is investigated. It generates a leverage effect, i.e., stock volatility is (negatively) correlated with stock returns, and can be regarded as a coupled nonlinear wave alternative to the Black–Scholes option pricing model. In this Letter, we analytically propose vector financial rogue waves of the coupled nonlinear volatility and option pricing model without an embedded w-learning. Moreover, we exhibit their dynamical behaviors for different choices of parameters. The vector financial rogue wave (rogon) solutions may be used to describe possible physical mechanisms for rogue wave phenomena and to further stimulate research on, and potential applications of, vector rogue waves in financial markets and other related fields. Highlights: We investigate the coupled nonlinear volatility and option pricing model. We analytically present vector financial rogue waves. The vector financial rogue waves may be used to describe extreme events in financial markets. These results may stimulate related research and potential applications of vector rogue waves.
Dataset of statements on policy integration of selected intergovernmental organizations
Directory of Open Access Journals (Sweden)
Jale Tosun
2018-04-01
Full Text Available This article describes data for 78 intergovernmental organizations (IGOs) working on topics related to energy governance, environmental protection, and the economy. The sample also includes organizations active in other sectors. The point of departure for data construction was the Correlates of War dataset, from which we selected this sample of IGOs. We updated and expanded the empirical information on the selected IGOs by manual coding. Most importantly, we collected the primary law texts of the individual IGOs in order to code whether they commit themselves to environmental policy integration (EPI), climate policy integration (CPI) and/or energy policy integration (EnPI).
Multiresolution persistent homology for excessively large biomolecular datasets
Energy Technology Data Exchange (ETDEWEB)
Xia, Kelin; Zhao, Zhixiong [Department of Mathematics, Michigan State University, East Lansing, Michigan 48824 (United States); Wei, Guo-Wei, E-mail: wei@math.msu.edu [Department of Mathematics, Michigan State University, East Lansing, Michigan 48824 (United States); Department of Electrical and Computer Engineering, Michigan State University, East Lansing, Michigan 48824 (United States); Department of Biochemistry and Molecular Biology, Michigan State University, East Lansing, Michigan 48824 (United States)
2015-10-07
Although persistent homology has emerged as a promising tool for the topological simplification of complex data, it is computationally intractable for large datasets. We introduce multiresolution persistent homology to handle excessively large datasets. We match the resolution with the scale of interest so as to represent large-scale datasets with appropriate resolution. We utilize the flexibility-rigidity index to assess the topological connectivity of the data set and define a rigidity density for the filtration analysis. By appropriately tuning the resolution of the rigidity density, we are able to focus the topological lens on the scale of interest. The proposed multiresolution topological analysis is validated by a hexagonal fractal image which has three distinct scales. We further demonstrate the proposed method for extracting topological fingerprints from DNA molecules. In particular, the topological persistence of a virus capsid with 273 780 atoms is successfully analyzed; this would otherwise be inaccessible to the normal point cloud method and unreliable with coarse-grained multiscale persistent homology. The proposed method has also been successfully applied to protein domain classification, which, to our knowledge, is the first time persistent homology has been used for practical protein domain analysis. The proposed multiresolution topological method has potential applications to arbitrary data sets, such as social networks, biological networks, and graphs.
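Standard point-cloud persistent homology, the computation the method generalizes, is easy to reproduce with open tools; a minimal sketch using the ripser package (the paper's multiresolution rigidity-density filtration is not shown):

```python
import numpy as np
from ripser import ripser

# sample a noisy circle; its H1 barcode should show one long-lived loop
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2*np.pi, 200)
cloud = np.c_[np.cos(theta), np.sin(theta)] + rng.normal(0, 0.05, (200, 2))

dgms = ripser(cloud, maxdim=1)['dgms']   # persistence diagrams for H0, H1
h1 = dgms[1]
persistence = h1[:, 1] - h1[:, 0]
print("most persistent H1 feature lives for", persistence.max())
```

The intractability the abstract mentions comes from the combinatorial growth of the filtration; this plain computation is exactly what becomes infeasible at hundreds of thousands of atoms.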
A cross-country Exchange Market Pressure (EMP) dataset
Directory of Open Access Journals (Sweden)
Mohit Desai
2017-06-01
Full Text Available The data presented in this article are related to the research article titled “An exchange market pressure measure for cross country analysis” (Patnaik et al. [1]). In this article, we present the dataset of Exchange Market Pressure (EMP) values for 139 countries along with their conversion factors, ρ (rho). Exchange Market Pressure, expressed as a percentage change in the exchange rate, measures the change in the exchange rate that would have taken place had the central bank not intervened. The conversion factor ρ can be interpreted as the change in the exchange rate associated with $1 billion of intervention. Estimates of the conversion factor ρ allow us to calculate a monthly time series of EMP for 139 countries. Additionally, the dataset contains the 68% confidence intervals (high and low values) for the point estimates of the ρ's. Using the standard errors of the estimates of the ρ's, we obtain one-sigma intervals around the mean estimates of the EMP values. These values are also reported in the dataset.
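The construction implies a simple monthly computation; a sketch under the stated definitions with made-up numbers (sign conventions for intervention vary across papers, so treat the formula as illustrative):

```python
# EMP_t = %Δe_t + ρ·I_t : the exchange-rate change that would have taken
# place had the central bank not intervened (illustrative values only)
exchange_rate_change = [0.8, -1.2, 0.3]    # observed %Δe per month
intervention = [2.0, -1.0, 0.5]            # net intervention, $ billions
rho, rho_low, rho_high = 0.15, 0.10, 0.20  # point estimate and 68% band

for de, i in zip(exchange_rate_change, intervention):
    emp = de + rho * i
    lo, hi = sorted((de + rho_low * i, de + rho_high * i))
    print(f"EMP = {emp:+.2f}%  (68% band {lo:+.2f}% to {hi:+.2f}%)")
```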
Coal demand prediction based on a support vector machine model
Energy Technology Data Exchange (ETDEWEB)
Jia, Cun-liang; Wu, Hai-shan; Gong, Dun-wei [China University of Mining & Technology, Xuzhou (China). School of Information and Electronic Engineering
2007-01-15
A forecasting model for the coal demand of China using support vector regression was constructed. With the selected embedding dimension, the output vectors and input vectors were constructed from the coal demand of China from 1980 to 2002. After comparison with the linear and sigmoid kernels, a radial basis function (RBF) was adopted as the kernel function. By analyzing the relationship between the prediction error and the model parameters, proper parameters were chosen. A support vector machine (SVM) model with multiple inputs and a single output is proposed. Compared with a predictor based on RBF neural networks on test datasets, the results show that the SVM predictor has higher precision and greater generalization ability. In the end, the coal demand from 2003 to 2006 is forecasted accurately. 10 refs., 2 figs., 4 tabs.
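The construction of input and output vectors from a scalar demand series with a chosen embedding dimension is the mechanical part; a small sketch (the embedding dimension of 4 and the toy series are arbitrary illustrations):

```python
import numpy as np

def embed(series, dim):
    """Turn a scalar series into SVR training pairs: each input vector is
    `dim` consecutive values, the output is the next value."""
    X = np.array([series[i:i+dim] for i in range(len(series) - dim)])
    y = np.array(series[dim:])
    return X, y

demand = [6.2, 6.6, 7.0, 7.4, 8.0, 8.6, 9.3, 10.2, 11.1, 12.2]  # toy series
X, y = embed(demand, dim=4)
print(X.shape, y.shape)   # (6, 4) (6,) : multi-input, single output
```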
3DSEM: A 3D microscopy dataset
Directory of Open Access Journals (Sweden)
Ahmad P. Tafti
2016-03-01
Full Text Available The Scanning Electron Microscope (SEM), as a 2D imaging instrument, has been widely used in many scientific disciplines, including the biological, mechanical, and materials sciences, to determine the surface attributes of microscopic objects. However, SEM micrographs remain 2D images. To effectively measure and visualize surface properties, we need to truly restore the 3D shape model from the 2D SEM images. Having 3D surfaces would provide the anatomic shape of micro-samples, allowing quantitative measurements and informative visualization of the specimens being investigated. 3DSEM is a dataset for 3D microscopy vision which is freely available at [1] for any academic, educational, and research purposes. The dataset includes both 2D images and 3D reconstructed surfaces of several real microscopic samples. Keywords: 3D microscopy dataset, 3D microscopy vision, 3D SEM surface reconstruction, Scanning Electron Microscope (SEM)
Data Mining for Imbalanced Datasets: An Overview
Chawla, Nitesh V.
A dataset is imbalanced if the classification categories are not approximately equally represented. Recent years have brought increased interest in applying machine learning techniques to difficult "real-world" problems, many of which are characterized by imbalanced data. Additionally, the distribution of the testing data may differ from that of the training data, and the true misclassification costs may be unknown at learning time. Predictive accuracy, a popular choice for evaluating the performance of a classifier, might not be appropriate when the data is imbalanced and/or the costs of different errors vary markedly. In this chapter, we discuss some of the sampling techniques used for balancing datasets, and the performance measures more appropriate for mining imbalanced datasets.
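The simplest of the sampling techniques discussed, random minority oversampling, fits in a few lines; a sketch (real pipelines would typically use dedicated libraries and pair this with metrics such as AUC or F1 rather than plain accuracy):

```python
import numpy as np

def random_oversample(X, y, seed=0):
    """Duplicate minority-class samples until both classes are equally
    represented (binary labels assumed)."""
    rng = np.random.default_rng(seed)
    classes, counts = np.unique(y, return_counts=True)
    minority = classes[counts.argmin()]
    need = counts.max() - counts.min()
    extra = rng.choice(np.flatnonzero(y == minority), size=need, replace=True)
    return np.vstack([X, X[extra]]), np.concatenate([y, y[extra]])

X = np.random.randn(100, 5)
y = np.array([0]*95 + [1]*5)              # 95:5 imbalance
Xb, yb = random_oversample(X, y)
print(np.bincount(yb))                    # [95 95]
```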
Genomics dataset of unidentified disclosed isolates
Directory of Open Access Journals (Sweden)
Bhagwan N. Rekadwad
2016-09-01
Full Text Available Analysis of DNA sequences is necessary for the higher hierarchical classification of organisms. It gives clues about the characteristics of organisms and their taxonomic position. This dataset was chosen to find complexities in the unidentified DNA in the disclosed patents. A total of 17 unidentified DNA sequences were thoroughly analyzed. Quick response (QR) codes were generated; these are helpful for quick identification of isolates. AT/GC content analysis of the DNA sequences was carried out; AT/GC content is helpful for studying their stability at different temperatures. Additionally, a dataset of cleavage codes and enzyme codes from the restriction digestion study is reported, which is helpful for performing studies using short DNA sequences. The dataset disclosed here is new revelatory data for the exploration of unique DNA sequences for evaluation, identification, comparison and analysis. Keywords: BioLABs, Blunt ends, Genomics, NEB cutter, Restriction digestion, Short DNA sequences, Sticky ends
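The AT/GC content computation mentioned is a one-liner; a sketch (ignoring ambiguous bases, which real sequences may contain):

```python
def at_gc_content(seq):
    """Return (AT%, GC%) of a DNA sequence; higher GC content generally
    means higher thermal stability."""
    seq = seq.upper()
    at = sum(seq.count(b) for b in "AT")
    gc = sum(seq.count(b) for b in "GC")
    n = at + gc
    return 100 * at / n, 100 * gc / n

print(at_gc_content("ATGCGCGTATTA"))  # (58.33..., 41.66...)
```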
Directory of Open Access Journals (Sweden)
Bisanzio, Donal
2011-12-01
Full Text Available Background: West Nile Virus (WNV) transmission in Italy was first reported in 1998 as an equine outbreak near the swamps of Padule di Fucecchio, Tuscany. No other cases were identified during the following decade, until 2008, when horse and human outbreaks were reported in Emilia Romagna, northern Italy. Since then, WNV outbreaks have occurred annually, spreading from their initial northern foci throughout the country. Following the outbreak in 1998 the Italian public health authority defined a surveillance plan to detect WNV circulation in birds, horses and mosquitoes. By applying spatial statistical analysis (spatial point pattern analysis) and models (Bayesian GLMM models) to a longitudinal dataset on the abundance of the three putative WNV vectors [Ochlerotatus caspius (Pallas, 1771), Culex pipiens (Linnaeus, 1758) and Culex modestus (Ficalbi, 1890)] in eastern Piedmont, we quantified their abundance and distribution in space and time and generated prediction maps outlining the areas with the highest vector productivity and potential for WNV introduction and amplification. Results: The highest abundance and significant spatial clusters of Oc. caspius and Cx. modestus were in proximity to rice fields, and for Cx. pipiens, in proximity to highly populated urban areas. The GLMM model showed the importance of weather conditions and environmental factors in predicting mosquito abundance. Distance from the preferential breeding sites and elevation were negatively associated with the number of collected mosquitoes. The Normalized Difference Vegetation Index (NDVI) was positively correlated with mosquito abundance in rice fields (Oc. caspius and Cx. modestus). Based on the best models, we developed prediction maps for the year 2010 outlining the areas where a high abundance of vectors could favour the introduction and amplification of WNV. Conclusions: Our findings provide useful information for surveillance activities aiming to identify locations where the
Harvard Aging Brain Study: Dataset and accessibility.
Dagley, Alexander; LaPoint, Molly; Huijbers, Willem; Hedden, Trey; McLaren, Donald G; Chatwal, Jasmeer P; Papp, Kathryn V; Amariglio, Rebecca E; Blacker, Deborah; Rentz, Dorene M; Johnson, Keith A; Sperling, Reisa A; Schultz, Aaron P
2017-01-01
The Harvard Aging Brain Study is sharing its data with the global research community. The longitudinal dataset consists of a 284-subject cohort with the following modalities acquired: demographics, clinical assessment, comprehensive neuropsychological testing, clinical biomarkers, and neuroimaging. To promote more extensive analyses, imaging data was designed to be compatible with other publicly available datasets. A cloud-based system enables access to interested researchers with blinded data available contingent upon completion of a data usage agreement and administrative approval. Data collection is ongoing and currently in its fifth year.
On sample size and different interpretations of snow stability datasets
Schirmer, M.; Mitterer, C.; Schweizer, J.
2009-04-01
aspect distributions to the large dataset. We used 100 different subsets for each sample size. Statistical variations obtained in the complete dataset were also tested on the smaller subsets using the Mann-Whitney or the Kruskal-Wallis test. For each subset size, the number of subsets in which the significance level was reached was counted. For these tests no nominal data scale was assumed. (iii) For the same subsets described above, the distribution of the aspect median was determined, and a count was made of how often this distribution differed substantially from the distribution obtained with the complete dataset. Since two valid stability interpretations were available (an objective and a subjective interpretation, as described above), the effect of this arbitrary choice of interpretation on the spatial variability results was tested. In over one third of the cases the two interpretations came to different results. The effect of these differences was studied by a method similar to that described in (iii): the distribution of the aspect median was determined for subsets of the complete dataset using both interpretations, compared against each other as well as against the results of the complete dataset. For the complete dataset the two interpretations showed mainly identical results. Therefore the subset size was determined from the point at which the results of the two interpretations converged. A universal result for the optimal subset size cannot be presented since results differed between the different situations contained in the dataset. The optimal subset size is thus dependent on the stability variation in a given situation, which is unknown initially. There are indications that for some situations even the complete dataset might not be large enough. At a subset size of approximately 25, the significant differences between aspect groups (as determined using the whole dataset) were only obtained in one out of five situations. In some situations, up to 20% of the subsets showed a
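The resampling procedure described translates directly into code; a schematic with synthetic stability scores standing in for the real dataset:

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)
# synthetic stand-in: stability scores for two aspect groups that differ
north = rng.normal(2.0, 1.0, 400)
south = rng.normal(2.4, 1.0, 400)

for n_sub in (10, 25, 50, 100):
    hits = 0
    for _ in range(100):                       # 100 subsets per sample size
        a = rng.choice(north, n_sub, replace=False)
        b = rng.choice(south, n_sub, replace=False)
        if mannwhitneyu(a, b).pvalue < 0.05:   # nonparametric, no scale assumed
            hits += 1
    print(f"n={n_sub:3d}: significant in {hits}/100 subsets")
```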
Indian Academy of Sciences (India)
Fixed Points - From Russia with Love - A Primer of Fixed Point Theory. A. K. Vijaykumar. Book Review, Resonance – Journal of Science Education, Volume 5, Issue 5, May 2000, pp. 101-102.
The Tipping Point
Full Text Available The Tipping Point, by CPSC Blogger, September 22, 2009 (CPSC OnSafety blog, 60 Seconds of Safety video). A post on furniture, appliance and TV tip-over hazards to young children; a falling TV strikes with about the same force as a child falling from the third story of a building.
Vectorization in quantum chemistry
International Nuclear Information System (INIS)
Saunders, V.R.
1987-01-01
It is argued that the optimal vectorization algorithm for many steps (and sub-steps) in a typical ab initio calculation of molecular electronic structure is quite strongly dependent on the target vector machine. Details such as the availability (or lack) of a given vector construct in the hardware, vector startup times and asymptotic rates must all be considered when selecting the optimal algorithm. Illustrations are drawn from: Gaussian integral evaluation, Fock matrix construction, 4-index transformation of molecular integrals, direct-CI methods, and the matrix multiply operation. A cross comparison of practical implementations on the CDC Cyber 205, Cray-1S and Cray X-MP machines is presented. To achieve portability while remaining optimal on a wide range of machines it is necessary to code all available algorithms in a machine-independent manner, and to select the appropriate algorithm using a procedure based on machine-dependent parameters. Most such parameters concern the timing of certain vector loop kernels, which can usually be derived from a 'bench-marking' routine executed prior to the calculation proper
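The selection procedure sketched in this abstract (time a few interchangeable kernels up front, then dispatch to the fastest) can be illustrated in a few lines. The following Python sketch is ours, not the paper's, which concerns Fortran vector machines; the two matrix-multiply variants and the probe size are arbitrary stand-ins:

```python
# Hypothetical sketch of benchmark-driven kernel selection; names illustrative.
import time

def matmul_ijk(a, b):
    """Naive triple-loop matrix multiply (column-wise access of b)."""
    n, m, p = len(a), len(b[0]), len(b)
    c = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            s = 0.0
            for k in range(p):
                s += a[i][k] * b[k][j]
            c[i][j] = s
    return c

def matmul_ikj(a, b):
    """Loop-reordered multiply; favours machines that stream whole rows."""
    n, m, p = len(a), len(b[0]), len(b)
    c = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for k in range(p):
            aik, row_b, row_c = a[i][k], b[k], c[i]
            for j in range(m):
                row_c[j] += aik * row_b[j]
    return c

def pick_kernel(candidates, probe):
    """Time each candidate on a small probe problem and keep the fastest."""
    best, best_t = None, float("inf")
    for f in candidates:
        t0 = time.perf_counter()
        f(*probe)
        dt = time.perf_counter() - t0
        if dt < best_t:
            best, best_t = f, dt
    return best

probe = ([[1.0] * 32 for _ in range(32)], [[1.0] * 32 for _ in range(32)])
matmul = pick_kernel([matmul_ijk, matmul_ikj], probe)  # run once, up front
```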
Random Coefficient Logit Model for Large Datasets
C. Hernández-Mireles (Carlos); D. Fok (Dennis)
2010-01-01
We present an approach for analyzing market shares and product price elasticities based on large datasets containing aggregate sales data for many products, several markets and relatively long time periods. We consider the recently proposed Bayesian approach of Jiang et al [Jiang,
Thesaurus Dataset of Educational Technology in Chinese
Wu, Linjing; Liu, Qingtang; Zhao, Gang; Huang, Huan; Huang, Tao
2015-01-01
The thesaurus dataset of educational technology is a knowledge description of educational technology in Chinese. The aims of this thesaurus were to collect the subject terms in the domain of educational technology, facilitate the standardization of terminology and promote the communication between Chinese researchers and scholars from various…
Determination of key parameters of vector multifractal vector fields
Schertzer, D. J. M.; Tchiguirinskaia, I.
2017-12-01
For too long, multifractal analyses and simulations have been restricted to scalar-valued fields (Schertzer and Tchiguirinskaia, 2017a,b). For instance, the multifractality of wind velocity has mostly been analysed in terms of scalar structure functions and of the scalar energy flux. This restriction has had the unfortunate consequence that multifractals could not be applied to their full extent in geophysics, even though geophysics inspired them. Indeed, a key question in geophysics is the complexity of the interactions between various fields or their components. Nevertheless, sophisticated methods have been developed to determine the key parameters of scalar-valued fields. In this communication, we first present the vector extensions of the universal multifractal analysis techniques to multifractals whose generator belongs to a Levy-Clifford algebra (Schertzer and Tchiguirinskaia, 2015). We point out further extensions, noting the increased complexity; for instance, the (scalar) index of multifractality becomes a matrix. Schertzer, D. and Tchiguirinskaia, I. (2015) 'Multifractal vector fields and stochastic Clifford algebra', Chaos: An Interdisciplinary Journal of Nonlinear Science, 25(12), p. 123127. doi: 10.1063/1.4937364. Schertzer, D. and Tchiguirinskaia, I. (2017a) 'An Introduction to Multifractals and Scale Symmetry Groups', in Ghanbarian, B. and Hunt, A. (eds) Fractals: Concepts and Applications in Geosciences. CRC Press, p. (in press). Schertzer, D. and Tchiguirinskaia, I. (2017b) 'Pandora Box of Multifractals: Barely Open?', in Tsonis, A. A. (ed.) 30 Years of Nonlinear Dynamics in Geophysics. Berlin: Springer, p. (in press).
Kernel method for clustering based on optimal target vector
International Nuclear Information System (INIS)
Angelini, Leonardo; Marinazzo, Daniele; Pellicoro, Mario; Stramaglia, Sebastiano
2006-01-01
We introduce Ising models, suitable for dichotomic clustering, with couplings that are (i) both ferro- and anti-ferromagnetic and (ii) dependent on the whole dataset, not only on pairs of samples. Couplings are determined by exploiting the notion of an optimal target vector, introduced here as a link between kernel supervised and unsupervised learning. The effectiveness of the method is shown in the case of the well-known iris dataset and in benchmarks of gene expression levels, where it works better than existing methods for dichotomic clustering.
Vector Fields on Product Manifolds
Kurz, Stefan
2011-01-01
This short report establishes some basic properties of smooth vector fields on product manifolds. The main results are: (i) On a product manifold there always exists a direct sum decomposition into horizontal and vertical vector fields. (ii) Horizontal and vertical vector fields are naturally isomorphic to smooth families of vector fields defined on the factors. Vector fields are regarded as derivations of the algebra of smooth functions.
Line Width Recovery after Vectorization of Engineering Drawings
Directory of Open Access Journals (Sweden)
Gramblička Matúš
2016-12-01
Full Text Available Vectorization is the conversion of a raster image representation into a vector representation. Contemporary commercial vectorization software applications do not provide sufficiently high-quality outputs for images such as mechanical engineering drawings. Line width preservation is one of the problems. Some applications need to know the line width after vectorization because this attribute carries important semantic information for subsequent 3D model generation. This article describes an algorithm that recovers the widths of individual lines in vectorized engineering drawings. Two approaches are proposed: one examines the line width at three points, whereas the second uses a variable number of points depending on the line length. The algorithm is tested on real mechanical engineering drawings.
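A minimal sketch of the three-point idea, under our reading of the abstract rather than the authors' algorithm: given the original binary raster and a vectorized line defined by its endpoints, probe the raster perpendicular to the line at its quarter points and take the median run of foreground pixels as the recovered width. All names and the probing scheme are illustrative:

```python
import numpy as np

def probe_width(img, p, normal, max_w=50):
    """Count foreground pixels along +/- normal starting from point p (x, y)."""
    w = 0
    for sign in (+1.0, -1.0):
        t = 1.0
        while t < max_w:
            x, y = p + sign * t * normal
            xi, yi = int(round(x)), int(round(y))
            if not (0 <= yi < img.shape[0] and 0 <= xi < img.shape[1]) or img[yi, xi] == 0:
                break
            w += 1
            t += 1.0
    return w + 1  # include the centre pixel itself

def line_width(img, a, b, samples=(0.25, 0.5, 0.75)):
    """Median stroke width measured at three sample points along segment a-b."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    d = b - a
    n = np.array([-d[1], d[0]]) / np.linalg.norm(d)  # unit normal to the line
    widths = [probe_width(img, a + s * d, n) for s in samples]
    return float(np.median(widths))

# Example: a horizontal bar of thickness 3 in a toy raster.
img = np.zeros((20, 40), dtype=np.uint8)
img[9:12, 5:35] = 1
print(line_width(img, (5, 10), (34, 10)))  # -> 3.0
```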
Parallel Framework for Dimensionality Reduction of Large-Scale Datasets
Directory of Open Access Journals (Sweden)
Sai Kiranmayee Samudrala
2015-01-01
Full Text Available Dimensionality reduction refers to a set of mathematical techniques used to reduce complexity of the original high-dimensional data, while preserving its selected properties. Improvements in simulation strategies and experimental data collection methods are resulting in a deluge of heterogeneous and high-dimensional data, which often makes dimensionality reduction the only viable way to gain qualitative and quantitative understanding of the data. However, existing dimensionality reduction software often does not scale to datasets arising in real-life applications, which may consist of thousands of points with millions of dimensions. In this paper, we propose a parallel framework for dimensionality reduction of large-scale data. We identify key components underlying the spectral dimensionality reduction techniques, and propose their efficient parallel implementation. We show that the resulting framework can be used to process datasets consisting of millions of points when executed on a 16,000-core cluster, which is beyond the reach of currently available methods. To further demonstrate applicability of our framework we perform dimensionality reduction of 75,000 images representing morphology evolution during manufacturing of organic solar cells in order to identify how processing parameters affect morphology evolution.
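For reference, the spectral core that such frameworks parallelize can be stated compactly. The following single-node Python sketch (ours; the paper's parallel implementation is not reproduced) embeds points with Laplacian eigenmaps, one spectral technique in this family:

```python
import numpy as np
from scipy.sparse import lil_matrix
from scipy.sparse.csgraph import laplacian
from scipy.sparse.linalg import eigsh
from scipy.spatial import cKDTree

def laplacian_eigenmaps(X, n_components=2, k=10):
    """Embed rows of X via the smallest nontrivial Laplacian eigenvectors."""
    n = len(X)
    _, idx = cKDTree(X).query(X, k + 1)          # k nearest neighbours + self
    W = lil_matrix((n, n))
    for i in range(n):
        for j in idx[i, 1:]:
            W[i, j] = W[j, i] = 1.0              # symmetric unweighted graph
    L = laplacian(W.tocsr(), normed=True)
    vals, vecs = eigsh(L, k=n_components + 1, which="SM")
    return vecs[:, 1:]                           # drop the trivial constant mode

Y = laplacian_eigenmaps(np.random.default_rng(0).standard_normal((400, 10)))
print(Y.shape)  # (400, 2)
```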
Applicability of vector processing to large-scale nuclear codes
International Nuclear Information System (INIS)
Ishiguro, Misako; Harada, Hiroo; Matsuura, Toshihiko; Okuda, Motoi; Ohta, Fumio; Umeya, Makoto.
1982-03-01
To meet the growing computational requirements in JAERI, the introduction of a high-speed computer with a vector processing facility (a vector processor) is desirable in the near future. To make effective use of a vector processor, appropriate optimization of nuclear codes for a pipelined-vector architecture is vital, which will pose new problems concerning code development and maintenance. In this report, vector processing efficiency is assessed with respect to large-scale nuclear codes by examining the following items: 1) The present feature of computational load in JAERI is analyzed by compiling the computer utilization statistics. 2) Vector processing efficiency is estimated for the ten heavily-used nuclear codes by analyzing their dynamic behaviors run on a scalar machine. 3) Vector processing efficiency is measured for five other nuclear codes by using the current vector processors, FACOM 230-75 APU and CRAY-1. 4) The effectiveness of applying a high-speed vector processor to nuclear codes is evaluated by taking account of the characteristics of JAERI jobs. Problems of vector processors are also discussed from the viewpoints of code performance and ease of use. (author)
The index of a vector field under blow ups
International Nuclear Information System (INIS)
Seade, J.
1991-08-01
A useful technique when studying the behaviour of holomorphic vector fields around their isolated singularities is that of blowing up the singular points. On the other hand, the most basic invariant of a vector field with isolated singularities is its local index, as defined by Poincare and Hopf. It is thus natural to ask how the index of a vector field behaves under blow-ups. The purpose of this work is to study and answer this question, taking a rather general point of view and bearing in mind that complex manifolds have a powerful birational invariant, the Todd genus. 20 refs
Analysis of Naïve Bayes Algorithm for Email Spam Filtering across Multiple Datasets
Fitriah Rusland, Nurul; Wahid, Norfaradilla; Kasim, Shahreen; Hafit, Hanayanti
2017-08-01
E-mail spam continues to be a problem on the Internet. Spammed e-mail may contain many copies of the same message, commercial advertisements or other irrelevant content such as pornography. In previous research, different filtering techniques have been used to detect these e-mails, such as Random Forest, Naïve Bayes, Support Vector Machine (SVM) and Neural Network classifiers. In this research, we test the Naïve Bayes algorithm for e-mail spam filtering on two datasets, the Spam Data and SPAMBASE datasets [8], and evaluate its performance. Performance on the datasets is evaluated in terms of accuracy, recall, precision and F-measure. We use the WEKA tool for the evaluation of the Naïve Bayes algorithm on both datasets. The results show that both the type of e-mail and the number of instances in the dataset influence the performance of Naïve Bayes.
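For readers who prefer a scripted baseline to the WEKA workflow described here, a scikit-learn analogue might look as follows; the file name and the column layout of the SPAMBASE copy are assumptions, not taken from the paper:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

# SPAMBASE: 57 numeric features per e-mail, last column = 1 (spam) / 0 (ham).
data = np.loadtxt("spambase.data", delimiter=",")   # assumed local copy
X, y = data[:, :-1], data[:, -1]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = GaussianNB().fit(X_tr, y_tr)                  # Gaussian NB for numeric features
pred = clf.predict(X_te)

p, r, f1, _ = precision_recall_fscore_support(y_te, pred, average="binary")
print(f"accuracy={accuracy_score(y_te, pred):.3f} "
      f"precision={p:.3f} recall={r:.3f} F-measure={f1:.3f}")
```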
Application of Vector Triggering Random Decrement
DEFF Research Database (Denmark)
Asmussen, J. C.; Ibrahim, S. R.; Brincker, Rune
1997-01-01
This paper deals with applications of the vector triggering Random Decrement technique. This technique is new and was developed with the aim of minimizing estimation time and identification errors. The theory behind the technique is discussed in an accompanying paper. The results presented in this paper should be regarded as further documentation of the technique. The key point in Random Decrement estimation is the formulation of a triggering condition. If the triggering condition is fulfilled, a time segment from each measurement is picked out and averaged with previous time segments. The final result is a Random Decrement function from each measurement. In traditional Random Decrement estimation the triggering condition is a scalar condition, which need only be fulfilled in a single measurement. In vector triggering Random Decrement the triggering condition is a vector condition...
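To make the triggering idea concrete, here is a minimal numpy sketch under our reading of the abstract: the vector condition requires all channels to satisfy their level conditions at the same instant, and a segment from every measurement is then averaged in. It is illustrative only, not the authors' implementation:

```python
import numpy as np

def vector_rd(measurements, levels, seg_len):
    """measurements: (channels, samples); levels: per-channel trigger levels."""
    x = np.asarray(measurements)
    n_ch, n = x.shape
    # vector condition: every channel exceeds its level at the same instant
    hits = np.all(x[:, : n - seg_len] >= np.asarray(levels)[:, None], axis=0)
    idx = np.flatnonzero(hits)
    if idx.size == 0:
        raise ValueError("triggering condition never fulfilled")
    segs = np.stack([x[:, i : i + seg_len] for i in idx])  # (hits, ch, seg)
    return segs.mean(axis=0)  # one Random Decrement function per channel

rng = np.random.default_rng(1)
y = rng.standard_normal((2, 20000))          # stand-in for measured responses
rd = vector_rd(y, levels=[1.0, 0.5], seg_len=256)
print(rd.shape)                              # (2, 256)
```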
Bunyavirus-Vector Interactions
Directory of Open Access Journals (Sweden)
Kate McElroy Horne
2014-11-01
Full Text Available The Bunyaviridae family is comprised of more than 350 viruses, of which many within the Hantavirus, Orthobunyavirus, Nairovirus, Tospovirus, and Phlebovirus genera are significant human or agricultural pathogens. The viruses within the Orthobunyavirus, Nairovirus, and Phlebovirus genera are transmitted by hematophagous arthropods, such as mosquitoes, midges, flies, and ticks, and their associated arthropods not only serve as vectors but also as virus reservoirs in many cases. This review presents an overview of several important emerging or re-emerging bunyaviruses and describes what is known about bunyavirus-vector interactions based on epidemiological, ultrastructural, and genetic studies of members of this virus family.
Yurinsky, Vadim Vladimirovich
1995-01-01
Surveys the methods currently applied to study sums of infinite-dimensional independent random vectors in situations where their distributions resemble Gaussian laws. Covers probabilities of large deviations, Chebyshev-type inequalities for seminorms of sums, a method of constructing Edgeworth-type expansions, estimates of characteristic functions for random vectors obtained by smooth mappings of infinite-dimensional sums to Euclidean spaces. A self-contained exposition of the modern research apparatus around CLT, the book is accessible to new graduate students, and can be a useful reference for researchers and teachers of the subject.
Duality in vector optimization
Bot, Radu Ioan
2009-01-01
This book presents fundamentals and comprehensive results regarding duality for scalar, vector and set-valued optimization problems in a general setting. After a preliminary chapter dedicated to convex analysis and minimality notions of sets with respect to partial orderings induced by convex cones, a chapter on scalar conjugate duality follows. Then investigations on vector duality based on scalar conjugacy are made. Weak, strong and converse duality statements are delivered, and connections to classical results from the literature are emphasized. One chapter is exclusively devoted to the s
Multithreading in vector processors
Evangelinos, Constantinos; Kim, Changhoan; Nair, Ravi
2018-01-16
In one embodiment, a system includes a processor having a vector processing mode and a multithreading mode. The processor is configured to operate on one thread per cycle in the multithreading mode. The processor includes a program counter register having a plurality of program counters, and the program counter register is vectorized. Each program counter in the program counter register represents a distinct corresponding thread of a plurality of threads. The processor is configured to execute the plurality of threads by activating the plurality of program counters in a round robin cycle.
Eisenman, Richard L
2005-01-01
This outstanding text and reference applies matrix ideas to vector methods, using physical ideas to illustrate and motivate mathematical concepts but employing a mathematical continuity of development rather than a physical approach. The author, who taught at the U.S. Air Force Academy, dispenses with the artificial barrier between vectors and matrices--and more generally, between pure and applied mathematics.Motivated examples introduce each idea, with interpretations of physical, algebraic, and geometric contexts, in addition to generalizations to theorems that reflect the essential structur
Free topological vector spaces
Gabriyelyan, Saak S.; Morris, Sidney A.
2016-01-01
We define and study the free topological vector space $\\mathbb{V}(X)$ over a Tychonoff space $X$. We prove that $\\mathbb{V}(X)$ is a $k_\\omega$-space if and only if $X$ is a $k_\\omega$-space. If $X$ is infinite, then $\\mathbb{V}(X)$ contains a closed vector subspace which is topologically isomorphic to $\\mathbb{V}(\\mathbb{N})$. It is proved that if $X$ is a $k$-space, then $\\mathbb{V}(X)$ is locally convex if and only if $X$ is discrete and countable. If $X$ is a metrizable space it is shown ...
Chemoselective ligation and antigen vectorization.
Gras-Masse, H
2001-01-01
The interest in cocktail-lipopeptide vaccines has now been confirmed by phase I clinical trials: highly diversified B-, T-helper or cytotoxic T-cell epitopes can be combined with a lipophilic vector for the induction of B- and T-cell responses of predetermined specificity. With the goal of producing an improved vaccine that should ideally induce a multispecific response in non-selected populations, increasing the diversity of the immunizing mixture represents one of the most obvious strategies. The selective delivery of antigens to professional antigen-presenting cells represents another promising approach for the improvement of vaccine efficacy. In this context, the mannose-receptor represents an attractive entry point for the targeting to dendritic cells of antigens linked to clustered glycosides or glycomimetics. In all cases, highly complex but fully characterized molecules must be produced. To develop a modular and flexible strategy which could be generally applicable to a large set of peptide antigens, we elected to explore the potentialities of chemoselective ligation methods. The hydrazone bond was found particularly reliable and fully compatible with sulphide ligation. Hydrazone/thioether orthogonal ligation systems could be developed to account for the nature of the antigens and the solubility of the vector systems. Copyright 2001 The International Association for Biologicals.
Goldsmith, Shelly
1999-01-01
Dew Point was a solo exhibition originating at PriceWaterhouseCoopers Headquarters Gallery, London, UK and toured to the Centre de Documentacio i Museu Textil, Terrassa, Spain and Gallery Aoyama, Tokyo, Japan.
Fuzzy support vector machine for microarray imbalanced data classification
Ladayya, Faroh; Purnami, Santi Wulan; Irhamah
2017-11-01
DNA microarrays produce gene expression data with small sample sizes and large numbers of features. Furthermore, class imbalance is a common problem in microarray data: it occurs when a dataset is dominated by a class with significantly more instances than the minority classes. A classification method is therefore needed that can handle both high-dimensional and imbalanced data. Support Vector Machine (SVM) is a classification method capable of handling large or small samples, nonlinearity, high dimensionality, over-learning and local minima. SVM has been widely applied to DNA microarray classification, and it has been shown to provide the best performance among machine learning methods. However, imbalanced data remain a problem because SVM treats all samples as equally important, so the results are biased against the minority class. To overcome this, Fuzzy SVM (FSVM) is proposed. This method assigns a fuzzy membership to each input point and reformulates the SVM so that different input points contribute differently to the classifier. The minority classes receive large fuzzy memberships, so FSVM pays more attention to the samples with larger memberships. Since DNA microarray data are high-dimensional, with a very large number of features, feature selection is first performed using the Fast Correlation-Based Filter (FCBF). In this study, the data are analyzed with SVM and FSVM, with and without FCBF, and the classification performance of each is compared. Based on the overall results, FSVM on selected features gives the best classification performance.
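One common way to approximate the fuzzy-membership idea with off-the-shelf tools is to weight samples in an ordinary SVM. The sketch below uses inverse class frequency as the membership and scikit-learn's sample_weight, which mimics, but is not identical to, the FSVM reformulation described above; the data are a synthetic stand-in, not microarray data:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=400, n_features=50, weights=[0.9, 0.1],
                           random_state=0)       # imbalanced toy stand-in
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# fuzzy membership: inverse class frequency, so minority samples count more
freq = np.bincount(y_tr) / len(y_tr)
membership = 1.0 / freq[y_tr]

plain = SVC(kernel="rbf").fit(X_tr, y_tr)
fuzzy = SVC(kernel="rbf").fit(X_tr, y_tr, sample_weight=membership)
print("SVM  F1:", f1_score(y_te, plain.predict(X_te)))
print("FSVM-like F1:", f1_score(y_te, fuzzy.predict(X_te)))
```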
Multiview vector-valued manifold regularization for multilabel image classification.
Luo, Yong; Tao, Dacheng; Xu, Chang; Xu, Chao; Liu, Hong; Wen, Yonggang
2013-05-01
In computer vision, image datasets used for classification are naturally associated with multiple labels and comprised of multiple views, because each image may contain several objects (e.g., pedestrian, bicycle, and tree) and is properly characterized by multiple visual features (e.g., color, texture, and shape). Currently available tools ignore either the label relationship or the view complementarity. Motivated by the success of the vector-valued function that constructs matrix-valued kernels to explore the multilabel structure in the output space, we introduce multiview vector-valued manifold regularization (MV(3)MR) to integrate multiple features. MV(3)MR exploits the complementary property of different features and discovers the intrinsic local geometry of the compact support shared by different features under the theme of manifold regularization. We conduct extensive experiments on two challenging, but popular, datasets, PASCAL VOC '07 and MIR Flickr, and validate the effectiveness of the proposed MV(3)MR for image classification.
Kim, Changjae; Habib, Ayman; Pyeon, Muwook; Kwon, Goo-rak; Jung, Jaehoon; Heo, Joon
2016-01-22
Diverse approaches to laser point segmentation have been proposed since the emergence of the laser scanning system. Most of these segmentation techniques, however, suffer from limitations such as sensitivity to the choice of seed points, lack of consideration of the spatial relationships among points, and inefficient performance. In an effort to overcome these drawbacks, this paper proposes a segmentation methodology that: (1) reduces the dimensions of the attribute space; (2) considers the attribute similarity and the proximity of the laser point simultaneously; and (3) works well with both airborne and terrestrial laser scanning data. A neighborhood definition based on the shape of the surface increases the homogeneity of the laser point attributes. The magnitude of the normal position vector is used as an attribute for reducing the dimension of the accumulator array. The experimental results demonstrate, through both qualitative and quantitative evaluations, the outcomes' high level of reliability. The proposed segmentation algorithm provided 96.89% overall correctness, 95.84% completeness, a 0.25 m overall mean value of centroid difference, and less than 1° of angle difference. The performance of the proposed approach was also verified with a large dataset and compared with other approaches. Additionally, the evaluation of the sensitivity of the thresholds was carried out. In summary, this paper proposes a robust and efficient segmentation methodology for abstraction of an enormous number of laser points into plane information.
DEFF Research Database (Denmark)
2000-01-01
Using a pulsed ultrasound field, the two-dimensional velocity vector can be determined with the invention. The method uses a transversally modulated ultrasound field for probing the moving medium under investigation. A modified autocorrelation approach is used in the velocity estimation. The new...
Production of lentiviral vectors
Directory of Open Access Journals (Sweden)
Otto-Wilhelm Merten
2016-01-01
Full Text Available Lentiviral vectors (LV) have seen a considerable increase in use as gene therapy vectors for the treatment of acquired and inherited diseases. This review presents the state of the art of the production of these vectors, with particular emphasis on large-scale production for clinical purposes. In contrast to oncoretroviral vectors, which are produced using stable producer cell lines, clinical-grade LV are in most cases produced by transient transfection of 293 or 293T cells grown in cell factories. More recent developments, however, tend toward hollow-fiber reactors, suspension culture processes, and the implementation of stable producer cell lines. As is customary for the biotech industry, rather sophisticated downstream processing protocols have been established to remove any undesirable process-derived contaminant, such as plasmid or host cell DNA or host cell proteins. This review compares published large-scale production and purification processes of LV and presents their process performances. Furthermore, developments in the domain of stable cell lines and their path toward the production of clinical material are presented.
Indian Academy of Sciences (India)
The Gram-Schmidt process is one of the first things one learns in a course ... We might want to stay as close to the experimental data as possible when converting these vectors to orthonormal ones demanded by the model. The process of finding the closest orthonormal ... is obtained by writing the matrix A = [a_1, ..., a_n], then ...
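The excerpt is truncated, but the two constructions it gestures at are standard and easy to state in code. The sketch below (ours, with an arbitrary example matrix) shows classical Gram-Schmidt and, for the "closest orthonormal" problem, the SVD-based Löwdin solution:

```python
import numpy as np

def gram_schmidt(A):
    """Orthonormalize the columns of A = [a_1, ..., a_n] in order."""
    Q = np.zeros_like(A, dtype=float)
    for j in range(A.shape[1]):
        v = A[:, j] - Q[:, :j] @ (Q[:, :j].T @ A[:, j])
        Q[:, j] = v / np.linalg.norm(v)
    return Q

def closest_orthonormal(A):
    """argmin_Q ||A - Q||_F over matrices Q with orthonormal columns (Loewdin)."""
    U, _, Vt = np.linalg.svd(A, full_matrices=False)
    return U @ Vt

A = np.array([[1.0, 1.0], [0.0, 1.0], [1.0, 0.0]])
print(gram_schmidt(A))         # depends on the order of the columns
print(closest_orthonormal(A))  # order-independent, nearest in Frobenius norm
```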
Champenois, Gilles
2007-01-01
The mnesor theory is the adaptation of vectors to artificial intelligence. The scalar field is replaced by a lattice. Addition becomes idempotent and multiplication is interpreted as a selection operation. We also show that mnesors can be the foundation for a linear calculus.
Treiman, Jay S
2014-01-01
Calculus with Vectors grew out of a strong need for a beginning calculus textbook for undergraduates who intend to pursue careers in STEM fields. The approach introduces vector-valued functions from the start, emphasizing the connections between one-variable and multi-variable calculus. The text includes early vectors and early transcendentals, and takes a rigorous but informal approach to vectors. Examples and focused applications are well presented along with an abundance of motivating exercises. All three-dimensional graphs have rotatable versions included as extra source materials and may be freely downloaded and manipulated with Maple Player; a free Maple Player App is available for the iPad on iTunes. The approaches taken to topics such as the derivation of the derivatives of sine and cosine, the approach to limits, and the use of "tables" of integration have been modified from the standards seen in other textbooks in order to maximize the ease with which students may comprehend the material. Additio...
Indian Academy of Sciences (India)
[G] Giannessi F, Theorems of alternative, quadratic programs and complementarity problems, in: Variational Inequalities and Complementarity Problems (eds) R W Cottle, F Giannessi and J L Lions (New York: Wiley) (1980) pp. 151-186. [K1] Kazmi K R, Existence of solutions for vector optimization, Appl. Math. Lett. 9 (1996).
Centers for Disease Control (CDC) Podcasts
2011-04-18
This podcast discusses emerging vector-borne pathogens, their role as prominent contributors to emerging infectious diseases, how they're spread, and the ineffectiveness of mosquito control methods. Created: 4/18/2011 by National Center for Emerging Zoonotic and Infectious Diseases (NCEZID). Date Released: 4/27/2011.
Sharing Video Datasets in Design Research
DEFF Research Database (Denmark)
Christensen, Bo; Abildgaard, Sille Julie Jøhnk
2017-01-01
This paper examines how design researchers, design practitioners and design education can benefit from sharing a dataset. We present the Design Thinking Research Symposium 11 (DTRS11) as an exemplary project that implied sharing video data of design processes and design activity in natural settings with a large group of fellow academics from the international community of Design Thinking Research, for the purpose of facilitating research collaboration and communication within the field of Design and Design Thinking. This approach emphasizes the social and collaborative aspects of design research, where a multitude of appropriate perspectives and methods may be utilized in analyzing and discussing the singular dataset. The shared data is, from this perspective, understood as a design object in itself, which facilitates new ways of working, collaborating, studying, learning and educating within the expanding...
Automatic processing of multimodal tomography datasets.
Parsons, Aaron D; Price, Stephen W T; Wadeson, Nicola; Basham, Mark; Beale, Andrew M; Ashton, Alun W; Mosselmans, J Frederick W; Quinn, Paul D
2017-01-01
With the development of fourth-generation high-brightness synchrotrons on the horizon, the already large volume of data collected on imaging and mapping beamlines is set to increase by orders of magnitude. As such, an easy and accessible way of dealing with such large datasets as quickly as possible is required in order to be able to address the core scientific problems during the experimental data collection. Savu is an accessible and flexible big data processing framework that is able to deal with both the variety and the volume of multimodal and multidimensional scientific dataset outputs, such as those from chemical tomography experiments on the I18 microfocus scanning beamline at Diamond Light Source.
Interpolation of diffusion weighted imaging datasets
DEFF Research Database (Denmark)
Dyrby, Tim B; Lundell, Henrik; Burke, Mark W
2014-01-01
Diffusion weighted imaging (DWI) is used to study white-matter fibre organisation, orientation and structural connectivity by means of fibre reconstruction algorithms and tractography. For clinical settings, limited scan time compromises the possibility of achieving the high image resolution needed for finer anatomical details and the signal-to-noise ratio needed for reliable fibre reconstruction. We assessed the potential benefits of interpolating DWI datasets to a higher image resolution before fibre reconstruction using a diffusion tensor model. Simulations of straight and curved crossing tracts smaller than or equal... interpolation methods fail to disentangle fine anatomical details if PVE is too pronounced in the original data. For validation we used ex-vivo DWI datasets acquired at various image resolutions as well as Nissl-stained sections. Increasing the image resolution by a factor of eight yielded finer geometrical...
Automated Coarse Registration of Point Clouds in 3d Urban Scenes Using Voxel Based Plane Constraint
Xu, Y.; Boerner, R.; Yao, W.; Hoegner, L.; Stilla, U.
2017-09-01
For obtaining full coverage of 3D scans in a large-scale urban area, registration between point clouds acquired via terrestrial laser scanning (TLS) is normally mandatory. However, due to the complex urban environment, the automatic registration of different scans is still a challenging problem. In this work, we propose an automatic, marker-free method for fast, coarse registration between point clouds using the geometric constraints of planar patches under a voxel structure. Our proposed method consists of four major steps: voxelization of the point cloud, approximation of planar patches, matching of corresponding patches, and estimation of the transformation parameters. In the voxelization step, the point cloud of each scan is organized into a 3D voxel structure, by which the entire point cloud is partitioned into small individual patches. In the following step, we represent the points of each voxel by an approximated plane function and select those patches resembling planar surfaces. Afterwards, a RANSAC-based strategy is applied for matching the corresponding patches. Among all the planar patches of a scan, we randomly select a set of three planar surfaces in order to build a coordinate frame from their normal vectors and their intersection point. The transformation parameters between scans are calculated from these two coordinate frames. The set of planar patches whose transformation parameters yield the largest number of coplanar patches is identified as the optimal candidate set for estimating the correct transformation parameters. Experimental results using TLS datasets of different scenes reveal that our proposed method is both effective and efficient for the coarse registration task. In particular, for fast orientation between scans, our proposed method achieves a registration error of less than about 2 degrees on the testing datasets and is much more efficient than classical baseline methods.
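The frame-from-three-planes step can be made concrete. The numpy sketch below (our illustration, not the authors' code) recovers R and t from the unit normals of three matched, non-parallel patches and their common intersection point in each scan; variable names are ours:

```python
import numpy as np

def orthonormal_frame(n1, n2, n3):
    """Stack three normals and project onto the nearest orthonormal frame."""
    M = np.column_stack([n1, n2, n3]).astype(float)
    U, _, Vt = np.linalg.svd(M)
    return U @ Vt  # nearest orthogonal matrix to the stacked normals

def transform_from_planes(src_normals, src_point, dst_normals, dst_point):
    Fs = orthonormal_frame(*src_normals)
    Fd = orthonormal_frame(*dst_normals)
    R = Fd @ Fs.T                       # rotate source frame onto target frame
    t = np.asarray(dst_point) - R @ np.asarray(src_point)
    return R, t

# Toy check: rotate three normals and the intersection point by a known R.
Rz = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
src_n = [np.array([1.0, 0, 0]), np.array([0, 1.0, 0]), np.array([0, 0, 1.0])]
dst_n = [Rz @ n for n in src_n]
R, t = transform_from_planes(src_n, [1, 2, 3], dst_n, Rz @ np.array([1, 2, 3]))
print(np.allclose(R, Rz), np.allclose(t, 0))  # True True
```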
Directory of Open Access Journals (Sweden)
Rostami Hamidey
2015-01-01
Full Text Available In many modern manufacturing industries, data that characterize the manufacturing process are electronically collected and stored in databases. Due to advances in data collection systems and analysis tools, data mining (DM) has been widely applied for quality assessment (QA) in manufacturing industries. In DM, the choice of technique to be used in analyzing a dataset and assessing quality depends on the understanding of the analyst. On the other hand, with the advent of improved and efficient prediction techniques, an analyst needs to know which tool performs better for a particular type of dataset. Although a few review papers have recently been published discussing DM applications in manufacturing for QA, this paper provides an extensive review of the application of one particular DM technique, the support vector machine (SVM), to QA problems. The review analyzes the literature from various points of view: DM concepts, data preprocessing, DM applications for each quality task, SVM preliminaries, and application results. Summary tables and figures are provided alongside the analyses. Finally, conclusions and future research directions are given.
Data assimilation and model evaluation experiment datasets
Lai, Chung-Cheng A.; Qian, Wen; Glenn, Scott M.
1994-01-01
The Institute for Naval Oceanography, in cooperation with Naval Research Laboratories and universities, executed the Data Assimilation and Model Evaluation Experiment (DAMEE) for the Gulf Stream region during fiscal years 1991-1993. Enormous effort has gone into the preparation of several high-quality and consistent datasets for model initialization and verification. This paper describes the preparation process, the temporal and spatial scopes, the contents, the structure, etc., of these datasets. The goal of DAMEE and the need for data in the four phases of the experiment are briefly stated. The preparation of DAMEE datasets consisted of a series of processes: (1) collection of observational data; (2) analysis and interpretation; (3) interpolation using the Optimum Thermal Interpolation System package; (4) quality control and re-analysis; and (5) data archiving and software documentation. The data products from these processes included a time series of 3D fields of temperature and salinity, 2D fields of surface dynamic height and mixed-layer depth, analysis of the Gulf Stream and rings system, and bathythermograph profiles. To date, these are the most detailed and high-quality data for mesoscale ocean modeling, data assimilation, and forecasting research. Feedback from ocean modeling groups who tested this data was incorporated into its refinement. Suggested uses of the DAMEE data include (1) ocean modeling and data assimilation studies, (2) diagnostic and theoretical studies, and (3) comparisons with locally detailed observations.
A hybrid organic-inorganic perovskite dataset
Kim, Chiho; Huan, Tran Doan; Krishnan, Sridevi; Ramprasad, Rampi
2017-05-01
Hybrid organic-inorganic perovskites (HOIPs) have been attracting a great deal of attention due to their versatility of electronic properties and fabrication methods. We prepare a dataset of 1,346 HOIPs, which features 16 organic cations, 3 group-IV cations and 4 halide anions. Using a combination of an atomic structure search method and density functional theory calculations, the optimized structures, the bandgap, the dielectric constant, and the relative energies of the HOIPs are uniformly prepared and validated by comparing with relevant experimental and/or theoretical data. We make the dataset available at Dryad Digital Repository, NoMaD Repository, and Khazana Repository (http://khazana.uconn.edu/), hoping that it could be useful for future data-mining efforts that can explore possible structure-property relationships and phenomenological models. Progressive extension of the dataset is expected as new organic cations become appropriate within the HOIP framework, and as additional properties are calculated for the new compounds found.
Hairy Slices: Evaluating the Perceptual Effectiveness of Cutting Plane Glyphs for 3D Vector Fields.
Stevens, Andrew H; Butkiewicz, Thomas; Ware, Colin
2017-01-01
Three-dimensional vector fields are common datasets throughout the sciences. Visualizing these fields is inherently difficult due to issues such as visual clutter and self-occlusion. Cutting planes are often used to overcome these issues by presenting more manageable slices of data. The existing literature provides many techniques for visualizing the flow through these cutting planes; however, there is a lack of empirical studies focused on the underlying perceptual cues that make popular techniques successful. This paper presents a quantitative human factors study that evaluates static monoscopic depth and orientation cues in the context of cutting plane glyph designs for exploring and analyzing 3D flow fields. The goal of the study was to ascertain the relative effectiveness of various techniques for portraying the direction of flow through a cutting plane at a given point, and to identify the visual cues and combinations of cues involved, and how they contribute to accurate performance. It was found that increasing the dimensionality of line-based glyphs into tubular structures enhances their ability to convey orientation through shading, and that increasing their diameter intensifies this effect. These tube-based glyphs were also less sensitive to visual clutter issues at higher densities. Adding shadows to lines was also found to increase perception of flow direction. Implications of the experimental results are discussed and extrapolated into a number of guidelines for designing more perceptually effective glyphs for 3D vector field visualizations.
Directory of Open Access Journals (Sweden)
D. Patella
2008-06-01
Full Text Available We expand the theory of probability tomography to the integration of different geophysical datasets. The aim of the new method is to improve the information quality using a conjoint occurrence probability function addressed to highlight the existence of common sources of anomalies. The new method is tested on gravity, magnetic and self-potential datasets collected in the volcanic area of Mt. Vesuvius (Naples), and on gravity and dipole geoelectrical datasets collected in the volcanic area of Mt. Etna (Sicily). The application demonstrates that, from a probabilistic point of view, the integrated analysis can delineate the signature of some important volcanic targets better than the analysis of the tomographic image of each dataset considered separately.
Directory of Open Access Journals (Sweden)
Amir Ahmad
2016-01-01
Full Text Available The early diagnosis of breast cancer is an important step in the fight against the disease. Machine learning techniques have shown promise in improving our understanding of the disease. As medical datasets consist of data points which cannot be precisely assigned to a class, fuzzy methods have been useful for studying these datasets. Breast cancer datasets are sometimes described by categorical features, and many fuzzy clustering algorithms have been developed for categorical datasets. However, in most of these methods the Hamming distance is used to define the distance between two categorical feature values. In this paper, we use a probabilistic distance measure to compute the distance between a pair of categorical feature values. Experiments demonstrate that this distance measure performs better than the Hamming distance on the Wisconsin breast cancer data.
Ding, Zi'ang
2016-01-01
Both vector and tensor fields are important mathematical tools used to describe the physics of many phenomena in science and engineering. Effective vector and tensor field visualization techniques are therefore needed to interpret and analyze the corresponding data and achieve new insight into the considered problem. This dissertation is concerned with the extraction of important structural properties from vector and tensor datasets. Specifically, we present a unified approach for the charact...
Directory of Open Access Journals (Sweden)
Ravindra Kumar
2017-09-01
Full Text Available Background The endoplasmic reticulum plays an important role in many cellular processes, including protein synthesis, folding and post-translational processing of newly synthesized proteins. It is also the site of quality control for misfolded proteins and the entry point of extracellular proteins into the secretory pathway. Hence, at any given point in time, the endoplasmic reticulum contains two different cohorts of proteins: (i) proteins involved in endoplasmic reticulum-specific functions, which reside in its lumen and are called endoplasmic reticulum resident proteins, and (ii) proteins in the process of moving to the extracellular space. Thus, endoplasmic reticulum resident proteins must somehow be distinguished from newly synthesized secretory proteins, which pass through the endoplasmic reticulum on their way out of the cell. Only about 50% of the proteins used as training data in this study had an endoplasmic reticulum retention signal, which shows that these signals are not necessarily present in all endoplasmic reticulum resident proteins. This also strongly indicates the role of additional factors in the retention of endoplasmic reticulum-specific proteins inside the endoplasmic reticulum. Methods This is a support vector machine based method, in which different forms of protein features were used as inputs to the support vector machine to develop the prediction models. During training, the leave-one-out approach to cross-validation was used. Maximum performance was obtained with a combination of amino acid compositions of different parts of the proteins. Results In this study, we report a novel support vector machine based method for predicting endoplasmic reticulum resident proteins, named ERPred. During training we achieved a maximum accuracy of 81.42% with the leave-one-out approach to cross-validation. When evaluated on an independent dataset, ERPred achieved a sensitivity of 72.31% and a specificity of 83
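As an illustration of the overall recipe the abstract describes (amino acid composition features, an SVM, leave-one-out cross-validation), here is a toy sketch; the sequences and labels are fabricated stand-ins, and the kernel settings are not ERPred's:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneOut

AA = "ACDEFGHIKLMNPQRSTVWY"

def aac(seq):
    """20-dimensional amino acid composition of a protein sequence."""
    seq = seq.upper()
    return np.array([seq.count(a) / len(seq) for a in AA])

seqs = ["KDELKDELAAAGGG", "MMMKKKLLLVVVAA", "KDELSSTTKDELGG", "PPPQQQRRRSSSTT"]
y = np.array([1, 0, 1, 0])            # 1 = ER-resident (toy labels)
X = np.vstack([aac(s) for s in seqs])

correct = 0
for tr, te in LeaveOneOut().split(X):
    clf = SVC(kernel="rbf", C=10.0).fit(X[tr], y[tr])
    correct += int(clf.predict(X[te])[0] == y[te][0])
print("LOO accuracy:", correct / len(y))
```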
Leaking Underground Storage Tank Points, Region 9 Indian Country, 2017, US EPA Region 9
U.S. Environmental Protection Agency — This GIS dataset contains point features that represent Leaking Underground Storage Tanks in US EPA Region 9 Indian Country. This dataset contains facility name and...
Indian Country Leaking Underground Storage Tank (LUST) Points, Region 9, 2016, US EPA Region 9
U.S. Environmental Protection Agency — This GIS dataset contains point features that represent Leaking Underground Storage Tanks in US EPA Region 9 Indian Country. This dataset contains facility name and...
International Nuclear Information System (INIS)
Huang, Qiu; Peng, Qiyu; Huang, Bin; Cheryauka, Arvi; Gullberg, Grant T.
2008-01-01
The measurement of flow obtained using continuous wave Doppler ultrasound is formulated as a directional projection of a flow vector field. When a continuous ultrasound wave bounces against a flowing particle, a signal is backscattered. This signal obtains a Doppler frequency shift proportional to the speed of the particle along the ultrasound beam. This occurs for each particle along the beam, giving rise to a Doppler velocity spectrum. The first moment of the spectrum provides the directional projection of the flow along the ultrasound beam. Signals reflected from points further away from the detector will have lower amplitude than signals reflected from points closer to the detector. The effect is very much akin to that modeled by the attenuated Radon transform in emission computed tomography. A least-squares method was adopted to reconstruct a 2D vector field from directional projection measurements. Attenuated projections of only the longitudinal projections of the vector field were simulated. The components of the vector field were reconstructed using the gradient algorithm to minimize a least-squares criterion. This result was compared with the reconstruction of longitudinal projections of the vector field without attenuation. If attenuation is known, the algorithm was able to accurately reconstruct both components of the full vector field from only one set of directional projection measurements. A better reconstruction was obtained with attenuation than without attenuation, implying that attenuation provides important information for the reconstruction of flow vector fields. This confirms previous work where we showed that knowledge of the attenuation distribution helps in the reconstruction of MRI diffusion tensor fields from fewer than the required measurements. In the application of ultrasound the attenuation distribution is obtained with pulse wave transmission computed tomography and flow information is obtained with continuous wave Doppler
Zero-point energy in spheroidal geometries
Kitson, A. R.; Signal, A. I.
2005-01-01
We study the zero-point energy of a massless scalar field subject to spheroidal boundary conditions. Using the zeta-function method, the zero-point energy is evaluated for small ellipticity. Axially symmetric vector fields are also considered. The results are interpreted within the context of QCD flux tubes and the MIT bag model.
Impedance analysis of acupuncture points and pathways
International Nuclear Information System (INIS)
Teplan, Michal; Kukucka, Marek; Ondrejkovicová, Alena
2011-01-01
The investigation of impedance characteristics of acupuncture points from the acoustic to the radio frequency range is addressed. In an initial single-subject study, discernment and localization of acupuncture points by the impedance-map technique was attempted without success. Vector impedance analyses determined possible resonant zones in the MHz region.
Application of Bred Vectors To Data Assimilation
Corazza, M.; Kalnay, E.; Patil, Dj
We introduced a statistic, the BV-dimension, to measure the effective local finite-time dimensionality of the atmosphere. We show that this dimension is often quite low, and suggest that this finding has important implications for data assimilation and the accuracy of weather forecasting (Patil et al, 2001). The original database for this study was the forecasts of the NCEP global ensemble forecasting system. The initial differences between the control forecast and the perturbed forecasts are called bred vectors. The control and perturbed initial conditions valid at time t = nΔt are evolved using the forecast model until time t = (n+1)Δt. The differences between the perturbed and the control forecasts are scaled down to their initial amplitude, and constitute the bred vectors valid at (n+1)Δt. Their growth rate is typically about 1.5/day. The bred vectors are similar by construction to leading Lyapunov vectors except that they have small but finite amplitude, and they are valid at finite times. The original NCEP ensemble data set has 5 independent bred vectors. We define a local bred vector at each grid point by choosing the 5 by 5 grid points centered at the grid point (a region of about 1100 km by 1100 km), and using the north-south and east-west velocity components at the 500 mb pressure level to form a 50-dimensional column vector. Since we have k=5 global bred vectors, we also have k local bred vectors at each grid point. We estimate the effective dimensionality of the subspace spanned by the local bred vectors by performing a singular value decomposition (EOF analysis). The k local bred vector columns form a 50xk matrix M. The singular values s(i) of M measure the extent to which the k column unit vectors making up the matrix M point in the direction of v(i). We define the bred vector dimension as BVDIM = [Sum_i s(i)]^2 / Sum_i s(i)^2. For example, if 4 out of the 5 vectors lie along v, and one lies along v, the BV-dimension would be BVDIM[sqrt(4), 1, 0
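The BV-dimension statistic translates directly into code. The following numpy lines are a direct transcription of the definition above, with synthetic bred vectors standing in for the NCEP data:

```python
import numpy as np

def bv_dimension(M):
    """BV-dimension of the column space of M: (sum s_i)^2 / sum s_i^2."""
    s = np.linalg.svd(M, compute_uv=False)
    return s.sum() ** 2 / (s ** 2).sum()

rng = np.random.default_rng(0)
v = rng.standard_normal(50)
# five local bred vectors that nearly share one direction -> dimension near 1
M = np.column_stack([v + 0.05 * rng.standard_normal(50) for _ in range(5)])
print(bv_dimension(M))                              # close to 1
print(bv_dimension(rng.standard_normal((50, 5))))   # close to 5
```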
FASTQSim: platform-independent data characterization and in silico read generation for NGS datasets.
Shcherbina, Anna
2014-08-15
High-throughput next generation sequencing technologies have enabled rapid characterization of clinical and environmental samples. Consequently, the largest bottleneck to actionable data has become sample processing and bioinformatics analysis, creating a need for accurate and rapid algorithms to process genetic data. Perfectly characterized in silico datasets are a useful tool for evaluating the performance of such algorithms. Background contaminating organisms are observed in sequenced mixtures of organisms. In silico samples provide exact truth. To create the best value for evaluating algorithms, in silico data should mimic actual sequencer data as closely as possible. FASTQSim is a tool that provides the dual functionality of NGS dataset characterization and metagenomic data generation. FASTQSim is sequencing platform-independent, and computes distributions of read length, quality scores, indel rates, single point mutation rates, indel size, and similar statistics for any sequencing platform. To create training or testing datasets, FASTQSim has the ability to convert target sequences into in silico reads with specific error profiles obtained in the characterization step. FASTQSim enables users to assess the quality of NGS datasets. The tool provides information about read length, read quality, repetitive and non-repetitive indel profiles, and single base pair substitutions. FASTQSim allows the user to simulate individual read datasets that can be used as standardized test scenarios for planning sequencing projects or for benchmarking metagenomic software. In this regard, in silico datasets generated with the FASTQsim tool hold several advantages over natural datasets: they are sequencing platform independent, extremely well characterized, and less expensive to generate. Such datasets are valuable in a number of applications, including the training of assemblers for multiple platforms, benchmarking bioinformatics algorithm performance, and creating challenge
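A toy version of the generation step, given substitution and indel rates, might look as follows; the rates, read length and uniform error model are placeholders rather than FASTQSim's characterized, position-dependent profiles:

```python
import random

BASES = "ACGT"

def simulate_read(ref, length=50, sub=0.01, ins=0.002, dele=0.002, rng=random):
    """Draw one read from a random position in ref, applying simple errors."""
    start = rng.randrange(len(ref) - length)
    out = []
    i = start
    while len(out) < length and i < len(ref):
        r = rng.random()
        if r < dele:                 # deletion: skip a reference base
            i += 1
        elif r < dele + ins:         # insertion: emit a random base
            out.append(rng.choice(BASES))
        elif r < dele + ins + sub:   # substitution: emit a different base
            out.append(rng.choice([b for b in BASES if b != ref[i]]))
            i += 1
        else:                        # match: copy the reference base
            out.append(ref[i])
            i += 1
    return "".join(out)

random.seed(0)
reference = "".join(random.choice(BASES) for _ in range(10000))
reads = [simulate_read(reference) for _ in range(5)]
print(reads[0])
```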
Geometrical Modification of Learning Vector Quantization Method for Solving Classification Problems
Directory of Open Access Journals (Sweden)
Korhan GÜNEL
2016-09-01
Full Text Available In this paper, a geometrical scheme is presented to show how to overcome a problem arising from the use of the generalized delta learning rule within the competitive learning model. A theoretical methodology is introduced for describing the quantization of data via rotating prototype vectors on hyper-spheres. The proposed learning algorithm is tested and verified on different multidimensional datasets, including a binary-class dataset and two multiclass datasets from the UCI repository, as well as a multiclass dataset constructed by us. The proposed method is compared with several baseline learning vector quantization variants from the literature on all domains. A large number of experiments verify the performance of the proposed algorithm, with acceptable accuracy and macro F1 scores.
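As a point of comparison for such prototype-based schemes, a plain LVQ1 baseline can be written compactly; this generic sketch is ours and does not reproduce the paper's rotation-based update:

```python
import numpy as np

def train_lvq1(X, y, n_proto_per_class=1, lr=0.1, epochs=30, seed=0):
    """LVQ1: move the winning prototype toward same-class points, away otherwise."""
    rng = np.random.default_rng(seed)
    protos, labels = [], []
    for c in np.unique(y):
        idx = rng.choice(np.flatnonzero(y == c), n_proto_per_class, replace=False)
        protos.append(X[idx]); labels.append(np.full(n_proto_per_class, c))
    W, wl = np.vstack(protos), np.concatenate(labels)
    for e in range(epochs):
        a = lr * (1 - e / epochs)                  # decaying learning rate
        for i in rng.permutation(len(X)):
            j = np.argmin(np.linalg.norm(W - X[i], axis=1))
            W[j] += a * (X[i] - W[j]) if wl[j] == y[i] else -a * (X[i] - W[j])
    return W, wl

def predict(W, wl, X):
    return wl[np.argmin(np.linalg.norm(W[None] - X[:, None], axis=2), axis=1)]

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(4, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
W, wl = train_lvq1(X, y, n_proto_per_class=2)
print((predict(W, wl, X) == y).mean())
```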
Vector mass in curved space-times
International Nuclear Information System (INIS)
Maia, M.D.
The use of Poincaré symmetry appears to be incompatible with the presence of the gravitational field. The consequent problem of the definition of the mass operator is analysed, and an alternative definition based on constant-curvature tangent spaces is proposed. In the case where the space-time has no Killing vector fields, four independent mass operators can be defined at each point. (Author) [pt
Gschwind, Michael K [Chappaqua, NY
2011-03-01
Mechanisms for implementing a floating point only single instruction multiple data instruction set architecture are provided. A processor is provided that comprises an issue unit, an execution unit coupled to the issue unit, and a vector register file coupled to the execution unit. The execution unit has logic that implements a floating point (FP) only single instruction multiple data (SIMD) instruction set architecture (ISA). The floating point vector registers of the vector register file store both scalar and floating point values as vectors having a plurality of vector elements. The processor may be part of a data processing system.
Predicting weather regime transitions in Northern Hemisphere datasets
Energy Technology Data Exchange (ETDEWEB)
Kondrashov, D. [University of California, Department of Atmospheric and Oceanic Sciences and Institute of Geophysics and Planetary Physics, Los Angeles, CA (United States); Shen, J. [UCLA, Department of Statistics, Los Angeles, CA (United States); Berk, R. [UCLA, Department of Statistics, Los Angeles, CA (United States); University of Pennsylvania, Department of Criminology, Philadelphia, PA (United States); D' Andrea, F.; Ghil, M. [Ecole Normale Superieure, Departement Terre-Atmosphere-Ocean and Laboratoire de Meteorologie Dynamique (CNRS and IPSL), Paris Cedex 05 (France)
2007-10-15
A statistical learning method called random forests is applied to the prediction of transitions between weather regimes of wintertime Northern Hemisphere (NH) atmospheric low-frequency variability. A dataset composed of 55 winters of NH 700-mb geopotential height anomalies is used in the present study. A mixture model finds that the three Gaussian components that were statistically significant in earlier work are robust; they are the Pacific-North American (PNA) regime, its approximate reverse (the reverse PNA, or RNA), and the blocked phase of the North Atlantic Oscillation (BNAO). The most significant and robust transitions in the Markov chain generated by these regimes are PNA → BNAO, PNA → RNA and BNAO → PNA. The break of a regime and subsequent onset of another one is forecast for these three transitions. Taking the relative costs of false positives and false negatives into account, the random-forests method shows useful forecasting skill. The calculations are carried out in the phase space spanned by a few leading empirical orthogonal functions of dataset variability. Plots of estimated response functions to a given predictor confirm the crucial influence of the exit angle on a preferred transition path. This result points to the dynamic origin of the transitions. (orig.)
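A minimal sketch of the forecasting setup is given below, using scikit-learn's random forest on synthetic stand-in data; the feature dimensionality, labels and class-weight values are assumptions for illustration, not the study's actual configuration:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Illustrative setup (not the paper's data): rows are days described by the
# leading EOF phase-space coordinates; labels mark whether a regime
# transition (e.g. PNA -> BNAO) occurs within the following days.
rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 10))          # 10 leading EOF coefficients (assumed)
y = rng.integers(0, 2, size=5000)        # 1 = transition follows (assumed)

# Class weights stand in for unequal costs of false negatives vs. positives.
clf = RandomForestClassifier(n_estimators=500, class_weight={0: 1, 1: 5})
clf.fit(X[:4000], y[:4000])
proba = clf.predict_proba(X[4000:])[:, 1]   # transition probability per day
```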
Development of a SPARK Training Dataset
Energy Technology Data Exchange (ETDEWEB)
Sayre, Amanda M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Olson, Jarrod R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)
2015-03-01
In its first five years, the National Nuclear Security Administration’s (NNSA) Next Generation Safeguards Initiative (NGSI) sponsored more than 400 undergraduate, graduate, and post-doctoral students in internships and research positions (Wyse 2012). In the past seven years, the NGSI program has produced, and continues to produce, a large body of scientific, technical, and policy work in targeted core safeguards capabilities and human capital development activities. Not only does the NGSI program carry out activities across multiple disciplines, but also across all U.S. Department of Energy (DOE)/NNSA locations in the United States. However, products are not readily shared among disciplines and across locations, nor are they archived in a comprehensive library. Rather, knowledge of NGSI-produced literature is localized to the researchers, clients, and internal laboratory/facility publication systems such as the Electronic Records and Information Capture Architecture (ERICA) at the Pacific Northwest National Laboratory (PNNL). There is also no integrated way of analyzing existing NGSI literature to determine whether the larger NGSI program is achieving its core safeguards capabilities and activities. A complete library of NGSI literature could prove beneficial to a cohesive, sustainable, and more economical NGSI program. The Safeguards Platform for Automated Retrieval of Knowledge (SPARK) has been developed as a knowledge storage, retrieval, and analysis capability, capturing safeguards knowledge so that it exists beyond the lifespan of NGSI. During the development process, it was necessary to build a SPARK training dataset (a corpus of documents) for initial entry into the system and for demonstration purposes. We manipulated these data to gain new information about the breadth of NGSI publications, and evaluated the science-policy interface at PNNL as a practical demonstration of SPARK’s intended analysis capability. The analysis demonstration sought to answer the
Development of a SPARK Training Dataset
International Nuclear Information System (INIS)
Sayre, Amanda M.; Olson, Jarrod R.
2015-01-01
In its first five years, the National Nuclear Security Administration's (NNSA) Next Generation Safeguards Initiative (NGSI) sponsored more than 400 undergraduate, graduate, and post-doctoral students in internships and research positions (Wyse 2012). In the past seven years, the NGSI program has produced, and continues to produce, a large body of scientific, technical, and policy work in targeted core safeguards capabilities and human capital development activities. Not only does the NGSI program carry out activities across multiple disciplines, but also across all U.S. Department of Energy (DOE)/NNSA locations in the United States. However, products are not readily shared among disciplines and across locations, nor are they archived in a comprehensive library. Rather, knowledge of NGSI-produced literature is localized to the researchers, clients, and internal laboratory/facility publication systems such as the Electronic Records and Information Capture Architecture (ERICA) at the Pacific Northwest National Laboratory (PNNL). There is also no integrated way of analyzing existing NGSI literature to determine whether the larger NGSI program is achieving its core safeguards capabilities and activities. A complete library of NGSI literature could prove beneficial to a cohesive, sustainable, and more economical NGSI program. The Safeguards Platform for Automated Retrieval of Knowledge (SPARK) has been developed as a knowledge storage, retrieval, and analysis capability, capturing safeguards knowledge so that it exists beyond the lifespan of NGSI. During the development process, it was necessary to build a SPARK training dataset (a corpus of documents) for initial entry into the system and for demonstration purposes. We manipulated these data to gain new information about the breadth of NGSI publications, and evaluated the science-policy interface at PNNL as a practical demonstration of SPARK's intended analysis capability. The analysis demonstration sought to answer
2015-09-28
A buoyant unmanned underwater vehicle with an interior space, in which the length of said underwater vehicle is equal to one tenth of the acoustic wavelength, that can function as an acoustic vector sensor. It is known that a propagating
Reciprocity in Vector Acoustics
2017-03-01
Applying Green’s Theorem to the left-hand side of Equation (3.2) converts it to a surface integral that vanishes for the impedance boundary conditions one... There are situations where this assumption does not hold, such as at boundaries between layers or in an inhomogeneous layer, because the density gradient... instead of requiring one model run for each source location. Application of the vector-scalar reciprocity principle is demonstrated with analytic
Support vector machine for diagnosis cancer disease: A comparative study
Directory of Open Access Journals (Sweden)
Nasser H. Sweilam
2010-12-01
Support vector machines (SVMs) have become an increasingly popular tool for machine learning tasks involving classification, regression or novelty detection. Training a support vector machine requires the solution of a very large quadratic programming problem, and traditional optimization methods cannot be applied directly due to memory restrictions. Several approaches exist for circumventing these shortcomings and work well. This paper considers another learning algorithm, particle swarm optimization, in its quantum-behaved variant, for training SVMs. A further approach, the least squares support vector machine (LSSVM) with an active set strategy, is also introduced. The results obtained by these methods are tested on a breast cancer dataset and compared with the exact solution of the model problem.
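For reference, a baseline SVM on the breast cancer dataset bundled with scikit-learn (a stand-in for the dataset used in the study) can be set up as follows; the kernel and C value are illustrative defaults, not the paper's settings:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Baseline SVM classifier on the Wisconsin breast cancer data shipped with
# scikit-learn; scaling first, since SVMs are sensitive to feature ranges.
X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_tr, y_tr)
print(f"test accuracy: {clf.score(X_te, y_te):.3f}")
```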
Tensor Calculus: Unlearning Vector Calculus
Lee, Wha-Suck; Engelbrecht, Johann; Moller, Rita
2018-01-01
Tensor calculus is critical in the study of the vector calculus of the surface of a body. Indeed, tensor calculus is a natural step-up for vector calculus. This paper presents some pitfalls of a traditional course in vector calculus in transitioning to tensor calculus. We show how a deeper emphasis on traditional topics such as the Jacobian can…
From racks to pointed Hopf algebras
Andruskiewitsch, Nicolás; Graña, Matías
2003-01-01
A fundamental step in the classification of finite-dimensional complex pointed Hopf algebras is the determination of all finite-dimensional Nichols algebras of braided vector spaces arising from groups. The most important class of braided vector spaces arising from groups is the class of braided vector spaces (CX, c^q), where C is the field of complex numbers, X is a rack and q is a 2-cocycle on X with values in C^*. Racks and cohomology of racks appeared also in the work of topologists. This...
USING LEARNING VECTOR QUANTIZATION METHOD FOR AUTOMATED IDENTIFICATION OF MYCOBACTERIUM TUBERCULOSIS
Directory of Open Access Journals (Sweden)
Endah Purwanti
2012-01-01
In this paper, we develop an automated method for the detection of tubercle bacilli in clinical specimens, principally sputum. This investigation is the first attempt to automatically identify TB bacilli in sputum using image processing and learning vector quantization (LVQ) techniques. The evaluation of LVQ carried out on a tuberculosis dataset shows an average accuracy of 91.33%.
Two datasets of defect reports labeled by a crowd of annotators of unknown reliability
Directory of Open Access Journals (Sweden)
Jerónimo Hernández-González
2018-06-01
Classifying software defects according to any defined taxonomy is not straightforward. In order to be used for automating the classification of software defects, two sets of defect reports were collected from public issue tracking systems in two different real domains. Due to the lack of a domain expert, the collected defects were categorized by a set of annotators of unknown reliability according to their impact, following IBM's orthogonal defect classification taxonomy. Both datasets are prepared for solving the defect classification problem by means of techniques of the learning-from-crowds paradigm (Hernández-González et al. [1]). Two versions of both datasets are publicly shared. In the first version, the raw data are given: the text description of defects together with the category assigned by each annotator. In the second version, the text of each defect has been transformed into a descriptive vector using text-mining techniques.
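As a rough illustration of the second version's preprocessing, the sketch below turns free-text defect descriptions into descriptive vectors with TF-IDF; the example reports and vectorizer settings are hypothetical, since the dataset's exact text-mining pipeline is not specified here:

```python
from sklearn.feature_extraction.text import TfidfVectorizer

# Turn raw defect descriptions into descriptive numeric vectors (TF-IDF).
defect_reports = [
    "app crashes when saving a file with unicode name",   # hypothetical examples
    "button label truncated on small screens",
]
vectorizer = TfidfVectorizer(lowercase=True, stop_words="english")
X = vectorizer.fit_transform(defect_reports)   # sparse document-term matrix
print(X.shape, vectorizer.get_feature_names_out()[:5])
```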
Vehicle Classification Using an Imbalanced Dataset Based on a Single Magnetic Sensor
Directory of Open Access Journals (Sweden)
Chang Xu
2018-05-01
This paper aims to improve the accuracy of automatic vehicle classifiers for imbalanced datasets. Classification is made using a single anisotropic magnetoresistive sensor, with the vehicles involved being classified into hatchbacks, sedans, buses, and multi-purpose vehicles (MPVs). Using time-domain and frequency-domain features in combination with three common classification algorithms in pattern recognition, we develop a novel feature extraction method for vehicle classification. The three classification algorithms are the k-nearest neighbor, the support vector machine, and the back-propagation neural network. However, the original vehicle magnetic dataset collected is imbalanced, which may lead to inaccurate classification results. With this in mind, we apply the synthetic minority oversampling technique (SMOTE), which can further boost the performance of the classifiers. Experimental results show that the k-nearest neighbor (KNN) classifier with the SMOTE algorithm can reach a classification accuracy of 95.46%, thus minimizing the effect of the imbalance.
Vehicle Classification Using an Imbalanced Dataset Based on a Single Magnetic Sensor.
Xu, Chang; Wang, Yingguan; Bao, Xinghe; Li, Fengrong
2018-05-24
This paper aims to improve the accuracy of automatic vehicle classifiers for imbalanced datasets. Classification is made using a single anisotropic magnetoresistive sensor, with the vehicles involved being classified into hatchbacks, sedans, buses, and multi-purpose vehicles (MPVs). Using time-domain and frequency-domain features in combination with three common classification algorithms in pattern recognition, we develop a novel feature extraction method for vehicle classification. The three classification algorithms are the k-nearest neighbor, the support vector machine, and the back-propagation neural network. However, the original vehicle magnetic dataset collected is imbalanced, which may lead to inaccurate classification results. With this in mind, we apply the synthetic minority oversampling technique (SMOTE), which can further boost the performance of the classifiers. Experimental results show that the k-nearest neighbor (KNN) classifier with the SMOTE algorithm can reach a classification accuracy of 95.46%, thus minimizing the effect of the imbalance.
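The SMOTE-then-classify pattern described above can be sketched with the imbalanced-learn package as follows; the synthetic feature matrix and the 9:1 class ratio are assumptions for illustration, not the actual magnetic-sensor data:

```python
import numpy as np
from imblearn.over_sampling import SMOTE
from sklearn.neighbors import KNeighborsClassifier

# Oversample the minority class with SMOTE before fitting KNN.
rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 16))                               # assumed features
y = np.r_[np.zeros(900, dtype=int), np.ones(100, dtype=int)]  # 9:1 imbalance

X_res, y_res = SMOTE(random_state=0).fit_resample(X, y)
clf = KNeighborsClassifier(n_neighbors=5).fit(X_res, y_res)
print(np.bincount(y_res))   # classes are balanced after resampling
```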
Polarization speckles and generalized Stokes vector wave: a review [invited
DEFF Research Database (Denmark)
Takeda, Mitsuo; Wang, Wei; Hanson, Steen Grüner
2010-01-01
We review some of the statistical properties of polarization-related speckle phenomena, with an introduction of the less known concept of polarization speckles and their spatial degree of polarization. As a useful means to characterize two-point vector field correlations, we review the generalized Stokes parameters proposed by Korotkova and Wolf, and introduce their time-domain representation to describe the space-time evolution of the correlation between random electric vector fields at two different space-time points. This time-domain generalized Stokes vector, with components similar to those of the beam coherence polarization matrix proposed by Gori, is shown to obey the wave equation in exact analogy to a coherence function of scalar fields. Because of this wave nature, the time-domain generalized Stokes vector is referred to as a generalized Stokes vector wave in this paper.
International Nuclear Information System (INIS)
Yang, Jing; Li, Yuan-Yuan; Li, Yi-Xue; Ye, Zhi-Qiang
2012-01-01
Highlights: • Proper dataset partition can improve the prediction of deleterious nsSNPs. • Partition according to the original residue type at the nsSNP site is a good criterion. • A similar strategy is expected to be promising in other machine learning problems. -- Abstract: Many non-synonymous SNPs (nsSNPs) are associated with diseases, and numerous machine learning methods have been applied to train classifiers for sorting disease-associated nsSNPs from neutral ones. The continuously accumulating nsSNP data allow us to further explore better prediction approaches. In this work, we partitioned the training data into 20 subsets according to either the original or the substituted amino acid type at the nsSNP site. Using a support vector machine (SVM), training classification models on each subset resulted in an overall accuracy of 76.3% or 74.9%, depending on the two partition criteria, while training on the whole dataset obtained an accuracy of only 72.6%. Moreover, when the dataset was instead divided randomly into 20 subsets, the corresponding accuracy was only 73.2%. Our results demonstrate that properly partitioning the whole training dataset into subsets, i.e., according to the residue type at the nsSNP site, significantly improves the performance of the trained classifiers, which should be valuable in developing better tools for predicting the disease association of nsSNPs.
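A sketch of the partitioning idea, assuming a hypothetical list of (original_residue, feature_vector, label) records, might look like this (one SVM per residue type; not the authors' code, and each subset is assumed to contain both classes):

```python
from collections import defaultdict
from sklearn.svm import SVC

# Train one SVM per original amino acid type at the nsSNP site,
# instead of one global model over the whole training set.
def train_partitioned(records):
    groups = defaultdict(list)
    for residue, features, label in records:
        groups[residue].append((features, label))
    models = {}
    for residue, rows in groups.items():
        X = [f for f, _ in rows]
        y = [l for _, l in rows]      # each subset needs both classes to fit
        models[residue] = SVC(kernel="rbf").fit(X, y)
    return models

def predict(models, residue, features):
    return models[residue].predict([features])[0]
```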
Directory of Open Access Journals (Sweden)
Jinya Su
2017-11-01
Hyperspectral images (HSI) provide rich information which may not be captured by other sensing technologies, and they have therefore gradually found a wide range of applications. However, they also generate a large amount of irrelevant or redundant data for a specific task. This causes a number of issues, including significantly increased computation time, complexity and scale of the prediction models mapping the data to semantics (e.g., classification), and the need for a large amount of labelled training data. In particular, it is generally difficult and expensive for experts to acquire sufficient training samples in many applications. This paper addresses these issues by exploring a number of classical dimension reduction algorithms from the machine learning community for HSI classification. To reduce the size of the training dataset, feature selection (e.g., mutual information, minimal-redundancy maximal-relevance) and feature extraction (e.g., principal component analysis (PCA), kernel PCA) are adopted to augment a baseline classification method, the support vector machine (SVM). The proposed algorithms are evaluated using a real HSI dataset. It is shown that PCA yields the most promising performance in reducing the number of features or spectral bands. It is observed that while significantly reducing the computational complexity, the proposed method can achieve better classification results than the classic SVM on a small training dataset, which makes it suitable for real-time applications or when only limited training data are available. Furthermore, it can also achieve performance similar to the classic SVM on large datasets but with much less computing time.
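A minimal sketch of the PCA-augmented SVM pipeline reads as follows; the component count is illustrative, and X/y are assumed to be a pixel-by-band matrix and per-pixel labels loaded elsewhere:

```python
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# PCA compresses the spectral bands before the SVM sees them, reducing
# both training time and the number of labelled samples needed.
def make_hsi_classifier(n_components=20):
    return make_pipeline(
        StandardScaler(),
        PCA(n_components=n_components),   # reduce band dimensionality
        SVC(kernel="rbf"),
    )

# Usage (X: (n_pixels, n_bands) spectra, y: per-pixel class labels):
# clf = make_hsi_classifier().fit(X_train, y_train)
# print(clf.score(X_test, y_test))
```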
Hierarchal scalar and vector tetrahedra
International Nuclear Information System (INIS)
Webb, J.P.; Forghani, B.
1993-01-01
A new set of scalar and vector tetrahedral finite elements is presented. The elements are hierarchal, allowing mixing of polynomial orders; scalar orders up to 3 and vector orders up to 2 are defined. The vector elements impose tangential continuity on the field but not normal continuity, making them suitable for representing the vector electric or magnetic field. Further, the scalar and vector elements are such that they can easily be used in the same mesh, a requirement of many quasi-static formulations. Results are presented for two 50 Hz problems: the Bath Cube and TEAM Problem 7.
Leishmaniasis vector behaviour in Kenya
International Nuclear Information System (INIS)
Mutinga, M.J.
1980-01-01
Leishmaniasis in Kenya exists in two forms: cutaneous and visceral. The vectors of visceral leishmaniasis have been the subject of investigation by various researchers since World War II, when the outbreak of the disease was first noticed. The vectors of cutaneous leishmaniasis were first worked on only a decade ago after the discovery of the disease focus in Mt. Elgon. The vector behaviour of these diseases, namely Phlebotomus pedifer, the vector of cutaneous leishmaniasis, and Phlebotomus martini, the vector of visceral leishmaniasis, are discussed in detail. P. pedifer has been found to breed and bite inside caves, whereas P. martini mainly bites inside houses. (author)
Quality Controlling CMIP datasets at GFDL
Horowitz, L. W.; Radhakrishnan, A.; Balaji, V.; Adcroft, A.; Krasting, J. P.; Nikonov, S.; Mason, E. E.; Schweitzer, R.; Nadeau, D.
2017-12-01
As GFDL makes the switch from model development to production in light of the Climate Model Intercomparison Project (CMIP), GFDL's efforts have shifted to testing and, more importantly, to establishing guidelines and protocols for quality control and semi-automated data publishing. Every CMIP cycle introduces key challenges, and the upcoming CMIP6 is no exception. The new CMIP experimental design comprises multiple MIPs facilitating research in different focus areas. This paradigm has implications not only for the groups that develop the models and conduct the runs, but also for the groups that monitor, analyze and quality control the datasets before data publishing and before the results make their way into reports like the IPCC (Intergovernmental Panel on Climate Change) Assessment Reports. In this talk, we discuss some of the paths taken at GFDL to quality control the CMIP-ready datasets, including: Jupyter notebooks; PrePARE; and a LAMP (Linux, Apache, MySQL, PHP/Python/Perl) technology-driven tracker system to monitor the status of experiments qualitatively and quantitatively, and to provide additional metadata and analysis services along with some built-in controlled-vocabulary validations in the workflow. In addition, we discuss the integration of community-based model evaluation software (ESMValTool, PCMDI Metrics Package, and ILAMB) as part of our CMIP6 workflow.
Integrated remotely sensed datasets for disaster management
McCarthy, Timothy; Farrell, Ronan; Curtis, Andrew; Fotheringham, A. Stewart
2008-10-01
Video imagery can be acquired from aerial, terrestrial and marine based platforms and has been exploited for a range of remote sensing applications over the past two decades. Examples include coastal surveys using aerial video, route-corridor infrastructure surveys using vehicle-mounted video cameras, aerial surveys over forestry and agriculture, underwater habitat mapping and disaster management. Many of these video systems are based on interlaced television standards such as North America's NTSC and the European SECAM and PAL systems, which are then recorded using various video formats. This technology has recently been employed as a front-line remote sensing technology for damage assessment post-disaster. This paper traces the development of spatial video as a remote sensing tool from the early 1980s to the present day. The background to a new spatial-video research initiative based at the National University of Ireland, Maynooth (NUIM), is described. New improvements are proposed, covering low-cost encoders, easy-to-use software decoders, timing issues and interoperability. These developments will enable specialists and non-specialists to collect, process and integrate these datasets with minimal support. This integrated approach will enable decision makers to access relevant remotely sensed datasets quickly and so carry out rapid damage assessment during and post-disaster.
International Nuclear Information System (INIS)
Hu, Nan; Cerviño, Laura; Segars, Paul; Lewis, John; Shan, Jinlu; Jiang, Steve; Zheng, Xiaolin; Wang, Ge
2014-01-01
With the rapidly increasing application of adaptive radiotherapy, large datasets of organ geometries based on the patient’s anatomy are desired to support clinical applications and research work such as image segmentation, re-planning, and organ deformation analysis. Sometimes only limited datasets are available in clinical practice. In this study, we propose a new method to generate large datasets of organ geometries to be utilized in adaptive radiotherapy. Given a training dataset of organ shapes derived from daily cone-beam CT, we align them into a common coordinate frame and select one of the training surfaces as the reference surface. A statistical shape model of the organs is constructed, based on the establishment of point correspondence between surfaces and a non-uniform rational B-spline (NURBS) representation. A principal component analysis is performed on the sampled surface points to capture the major variation modes of each organ. A set of principal components and their respective coefficients, which represent organ surface deformation, are obtained, and a statistical analysis of the coefficients is performed. New sets of statistically equivalent coefficients can be constructed and assigned to the principal components, resulting in a larger geometry dataset for the patient’s organs. These generated organ geometries are realistic and statistically representative.
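The generation step can be sketched as follows, assuming a hypothetical matrix of aligned, correspondence-established surface points (a simplified stand-in for the NURBS-based pipeline described above):

```python
import numpy as np

# Generate new organ shapes from a PCA shape model. `shapes` is a
# hypothetical (n_samples, n_points * 3) matrix of aligned surface points.
def sample_new_shapes(shapes, n_new=100, n_modes=5, rng=None):
    rng = np.random.default_rng(rng)
    mean = shapes.mean(axis=0)
    centered = shapes - mean
    # Principal modes of variation via SVD of the centered data matrix.
    _, s, vt = np.linalg.svd(centered, full_matrices=False)
    std = s[:n_modes] / np.sqrt(len(shapes) - 1)   # per-mode standard deviation
    # Draw statistically equivalent coefficients and reconstruct surfaces.
    coeffs = rng.normal(scale=std, size=(n_new, n_modes))
    return mean + coeffs @ vt[:n_modes]
```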
DEFF Research Database (Denmark)
More, Simon J.; Bicout, Dominique; Bøtner, Anette
2017-01-01
After a request from the European Commission, EFSA's Panel on Animal Health and Welfare summarised the main characteristics of 36 vector-borne diseases (VBDs) in 36 web-based storymaps. The risk of introduction into the EU through movement of livestock or pets was assessed for each of the 36 VBDs... For disease agents for which the rate of introduction was estimated to be very low, no further assessments were made. Due to the uncertainty related to some parameters used for the risk assessment, or the unstable or unpredictable disease situation in some of the source regions, it is recommended to update the assessment when...
International Nuclear Information System (INIS)
Rodríguez, Yeinzon; Navarro, Andrés A.
2017-01-01
An alternative for the construction of fundamental theories is the introduction of Galileons. These are fields whose action leads to equations of motion of no higher than second order. As this is a necessary but not sufficient condition for the Hamiltonian to be bounded from below, as long as the action is not degenerate, the Galileon construction is a way to avoid pathologies both at the classical and quantum levels. Galileon actions are, therefore, of great interest in many branches of physics, especially in high energy physics and cosmology. This proceedings contribution presents the generalities of the construction of both scalar and vector Galileons, following two different but complementary routes. (paper)
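As a concrete textbook example of the scalar construction (not one taken from this contribution), the cubic Galileon term contains second derivatives in the Lagrangian yet still yields second-order field equations:

```latex
% Cubic scalar Galileon: despite the \Box\phi factor, the Euler-Lagrange
% equation is purely second order, schematically
% (\Box\phi)^2 - (\partial_\mu\partial_\nu\phi)(\partial^\mu\partial^\nu\phi) = 0.
\mathcal{L}_3 = \partial_\mu\phi \,\partial^\mu\phi \,\Box\phi
```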
International Nuclear Information System (INIS)
Otsason, J.
1998-01-01
The Vector Pipeline project linking the Chicago supply hub to markets in eastern Canada, the northeastern U.S. and the Mid-Atlantic states, is described. Subsidiary objectives of the promoters are to match market timing to upstream pipelines and market requirements, and to provide low cost expandability to complement upstream expandability. The presentation includes description of the project, costs, leased facilities, rates and tariffs, right of way considerations, storage facilities and a project schedule. Construction is to begin in March 1999 and the line should be in service in November 1999
Strontium removal jar test dataset for all figures and tables.
U.S. Environmental Protection Agency — The datasets were used to generate data demonstrating strontium removal under various water quality and treatment conditions. This dataset is associated with the...
To discover novel PPI signaling hubs for lung cancer, the CTD2 Center at Emory utilized large-scale genomics datasets and the literature to compile a set of lung cancer-associated genes. A library of expression vectors was generated for these genes and utilized for detecting pairwise PPIs with cell lysate-based TR-FRET assays in a high-throughput screening format.
Predicting dataset popularity for the CMS experiment
INSPIRE-00005122; Li, Ting; Giommi, Luca; Bonacorsi, Daniele; Wildish, Tony
2016-01-01
The CMS experiment at the LHC accelerator at CERN relies on its computing infrastructure to stay at the frontier of High Energy Physics, searching for new phenomena and making discoveries. Even though computing plays a significant role in physics analysis, we rarely use its data to predict the behavior of the system itself. Basic information about computing resources, user activities and site utilization can be very useful for improving the throughput of the system and its management. In this paper, we discuss a first CMS analysis of dataset popularity based on CMS meta-data, which can be used as a model for dynamic data placement and provides the foundation of a data-driven approach to the CMS computing infrastructure.
Predicting dataset popularity for the CMS experiment
International Nuclear Information System (INIS)
Kuznetsov, V.; Li, T.; Giommi, L.; Bonacorsi, D.; Wildish, T.
2016-01-01
The CMS experiment at the LHC accelerator at CERN relies on its computing infrastructure to stay at the frontier of High Energy Physics, searching for new phenomena and making discoveries. Even though computing plays a significant role in physics analysis, we rarely use its data to predict the behavior of the system itself. Basic information about computing resources, user activities and site utilization can be very useful for improving the throughput of the system and its management. In this paper, we discuss a first CMS analysis of dataset popularity based on CMS meta-data, which can be used as a model for dynamic data placement and provides the foundation of a data-driven approach to the CMS computing infrastructure. (paper)
Internationally coordinated glacier monitoring: strategy and datasets
Hoelzle, Martin; Armstrong, Richard; Fetterer, Florence; Gärtner-Roer, Isabelle; Haeberli, Wilfried; Kääb, Andreas; Kargel, Jeff; Nussbaumer, Samuel; Paul, Frank; Raup, Bruce; Zemp, Michael
2014-05-01
(c) the Randolph Glacier Inventory (RGI), a new and globally complete digital dataset of outlines from about 180,000 glaciers with some meta-information, which has been used for many applications relating to the IPCC AR5 report. Concerning glacier changes, a database (Fluctuations of Glaciers) exists containing information about mass balance, front variations including past reconstructed time series, geodetic changes and special events. Annual mass balance reporting contains information for about 125 glaciers, with a subset of 37 glaciers with continuous observational series since 1980 or earlier. Front variation observations of around 1800 glaciers are available from most of the mountain ranges world-wide. This database was recently updated with 26 glaciers having an unprecedented dataset of length changes from reconstructions of well-dated historical evidence going back as far as the 16th century. Geodetic observations of about 430 glaciers are available. The database is completed by a dataset containing information on special events, including glacier surges, glacier lake outbursts, ice avalanches, eruptions of ice-clad volcanoes, etc., related to about 200 glaciers. A special database of glacier photographs contains 13,000 pictures from around 500 glaciers, some of them dating back to the 19th century. A key challenge is to combine and extend the traditional observations with fast evolving datasets from new technologies.
MIPS bacterial genomes functional annotation benchmark dataset.
Tetko, Igor V; Brauner, Barbara; Dunger-Kaltenbach, Irmtraud; Frishman, Goar; Montrone, Corinna; Fobo, Gisela; Ruepp, Andreas; Antonov, Alexey V; Surmeli, Dimitrij; Mewes, Hans-Werner
2005-05-15
Any development of new methods for automatic functional annotation of proteins according to their sequences requires high-quality data (as benchmark) as well as tedious preparatory work to generate sequence parameters required as input data for the machine learning methods. Different program settings and incompatible protocols make a comparison of the analyzed methods difficult. The MIPS Bacterial Functional Annotation Benchmark dataset (MIPS-BFAB) is a new, high-quality resource comprising four bacterial genomes manually annotated according to the MIPS functional catalogue (FunCat). These resources include precalculated sequence parameters, such as sequence similarity scores, InterPro domain composition and other parameters that could be used to develop and benchmark methods for functional annotation of bacterial protein sequences. These data are provided in XML format and can be used by scientists who are not necessarily experts in genome annotation. BFAB is available at http://mips.gsf.de/proj/bfab
2006 Fynmeet sea clutter measurement trial: Datasets
CSIR Research Space (South Africa)
Herselman, PLR
2007-09-06
[Figures: RCS [dBm²] vs. time and range for f1 = 9.000 GHz, datasets CAD14-001 and CAD14-002; axes show range gate number, time [s] and absolute range [m].]
A new bed elevation dataset for Greenland
Directory of Open Access Journals (Sweden)
J. L. Bamber
2013-03-01
We present a new bed elevation dataset for Greenland derived from a combination of multiple airborne ice thickness surveys undertaken between the 1970s and 2012. Around 420 000 line kilometres of airborne data were used, with roughly 70% of this having been collected since the year 2000, when the last comprehensive compilation was undertaken. The airborne data were combined with satellite-derived elevations for non-glaciated terrain to produce a consistent bed digital elevation model (DEM) over the entire island, including across the glaciated–ice free boundary. The DEM was extended to the continental margin with the aid of bathymetric data, primarily from a compilation for the Arctic. Ice thickness was determined where an ice shelf exists from a combination of surface elevation and radar soundings. The across-track spacing between flight lines warranted interpolation at 1 km postings for significant sectors of the ice sheet. Grids of ice surface elevation, error estimates for the DEM, ice thickness and data sampling density were also produced alongside a mask of land/ocean/grounded ice/floating ice. Errors in bed elevation range from a minimum of ±10 m to about ±300 m, as a function of distance from an observation and local topographic variability. A comparison with the compilation published in 2001 highlights the improvement in resolution afforded by the new datasets, particularly along the ice sheet margin, where ice velocity is highest and changes in ice dynamics most marked. We estimate that the volume of ice included in our land-ice mask would raise mean sea level by 7.36 m, excluding any solid earth effects that would take place during ice sheet decay.
Vector-vector production in photon-photon interactions
International Nuclear Information System (INIS)
Ronan, M.T.
1988-01-01
Measurements of exclusive untagged ρ⁰ρ⁰, ρφ, K*K̄*, and ρω production and tagged ρ⁰ρ⁰ production in photon-photon interactions by the TPC/Two-Gamma experiment are reviewed. Comparisons to the results of other experiments and to models of vector-vector production are made. Fits to the data following a four-quark model prescription for vector meson pair production are also presented. 10 refs., 9 figs
Wind Integration National Dataset Toolkit | Grid Modernization | NREL
The Wind Integration National Dataset (WIND) Toolkit is an update and expansion of the Eastern Wind Integration Data Set and Western Wind Integration Data Set. It supports the next generation of wind integration studies.
Solar Integration National Dataset Toolkit | Grid Modernization | NREL
NREL is working on a Solar Integration National Dataset (SIND) Toolkit to enable researchers to perform U.S. regional solar generation integration studies. It will provide modeled, coherent subhourly solar power data
Technical note: An inorganic water chemistry dataset (1972–2011 ...
African Journals Online (AJOL)
A national dataset of inorganic chemical data of surface waters (rivers, lakes, and dams) in South Africa is presented and made freely available. The dataset comprises more than 500 000 complete water analyses from 1972 up to 2011, collected from more than 2 000 sample monitoring stations in South Africa. The dataset ...
QSAR ligand dataset for modelling mutagenicity, genotoxicity, and rodent carcinogenicity
Directory of Open Access Journals (Sweden)
Davy Guan
2018-04-01
Five datasets were constructed from ligand and bioassay result data in the literature. These datasets include bioassay results from the Ames mutagenicity assay, the GreenScreen GADD45a-GFP assay, the Syrian Hamster Embryo (SHE) assay, and 2-year rat carcinogenicity assays. These datasets provide information about chemical mutagenicity, genotoxicity and carcinogenicity.
Somoano, Brian; Chan, Joanna; Morganroth, Greg
2011-01-01
Facial rejuvenation using local anesthesia has evolved in the past decade as a safer option for patients seeking fewer complications and minimal downtime. Mini- and short-scar face lifts using more conservative incision lengths and extent of undermining can be effective in the younger patient with lower face laxity and minimal loose, elastotic neck skin. By incorporating both an anterior and posterior approach and using an incision length between the mini and more traditional face lift, the Vertical Vector Face Lift can achieve longer-lasting and natural results with lesser cost and risk. Submentoplasty and liposuction of the neck and jawline, fundamental components of the vertical vector face lift, act synergistically with superficial musculoaponeurotic system plication to reestablish a more youthful, sculpted cervicomental angle, even in patients with prominent jowls. Dramatic results can be achieved in the right patient by combining with other procedures such as injectable fillers, chin implants, laser resurfacing, or upper and lower blepharoplasties. © 2011 Wiley Periodicals, Inc.
Vector control in leishmaniasis.
Kishore, K; Kumar, V; Kesari, S; Dinesh, D S; Kumar, A J; Das, P; Bhattacharya, S K
2006-03-01
Indoor residual spraying is a simple and cost-effective method of controlling endophilic vectors, and DDT remains the insecticide of choice for the control of leishmaniasis. However, resistance to insecticide is likely to become more widespread in the population, especially in those areas in which insecticide has been used for years. In this context, the use of slow-release emulsified suspension (SRES) may be the best substitute. In this review, spraying frequencies of DDT and a new spray schedule are discussed. The role of biological control and environmental management in the control of leishmaniasis is emphasized. Allethrin (coil) at 0.1 per cent and prallethrin (liquid) at 1.6 per cent have been found to be effective repellents against Phlebotomus argentipes, the vector of Indian kala-azar. Insecticide-impregnated bednets are another area which requires further research on a priority basis for the control of leishmaniasis. The role of satellite remote sensing for early prediction of disease by identifying sandfly-genic conditions cannot be overlooked. In future, synthetic pheromones could be exploited in the control of leishmaniasis.
Progressive Classification Using Support Vector Machines
Wagstaff, Kiri; Kocurek, Michael
2009-01-01
An algorithm for progressive classification of data, analogous to progressive rendering of images, makes it possible to trade off speed against accuracy. This algorithm uses support vector machines (SVMs) to classify data. An SVM is a machine learning algorithm that builds a mathematical model of the desired classification concept by identifying the critical data points, called support vectors. Coarse approximations to the concept require only a few support vectors, while precise, highly accurate models require far more. Once the model has been constructed, the SVM can be applied to new observations. The cost of classifying a new observation is proportional to the number of support vectors in the model. When computational resources are limited, an SVM of the appropriate complexity can be produced. However, if the constraints are not known when the model is constructed, or if they can change over time, a method for adaptively responding to the current resource constraints is required. This capability is particularly relevant for spacecraft (or any other real-time systems) that perform onboard data analysis. The new algorithm enables the fast, interactive application of an SVM classifier to a new set of data. The classification process achieved by this algorithm is characterized as progressive because a coarse approximation to the true classification is generated rapidly and thereafter iteratively refined. The algorithm uses two SVMs: (1) a fast, approximate one and (2) a slow, highly accurate one. New data are initially classified by the fast SVM, producing a baseline approximate classification. For each classified data point, the algorithm calculates a confidence index that indicates the likelihood that it was classified correctly in the first pass. Next, the data points are sorted by their confidence indices and progressively reclassified by the slower, more accurate SVM, starting with the items most likely to be incorrectly classified. The user
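A compact sketch of the two-SVM progressive scheme, on synthetic stand-in data with illustrative model choices (not the flight implementation), is:

```python
import numpy as np
from sklearn.svm import LinearSVC, SVC

# Fast approximate pass first, then refinement of the least-confident items
# by a slower, more accurate model; data and models here are illustrative.
rng = np.random.default_rng(0)
X_tr, y_tr = rng.normal(size=(2000, 8)), rng.integers(0, 2, 2000)
X_new = rng.normal(size=(500, 8))

fast = LinearSVC().fit(X_tr, y_tr)          # cheap baseline classifier
slow = SVC(kernel="rbf").fit(X_tr, y_tr)    # more support vectors, accurate

labels = fast.predict(X_new)                             # coarse first pass
confidence = np.abs(fast.decision_function(X_new))       # margin as confidence
for i in np.argsort(confidence):             # least confident items first
    labels[i] = slow.predict(X_new[i:i + 1])[0]
    # In practice the loop stops when the time budget is exhausted.
```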
The Lunar Source Disk: Old Lunar Datasets on a New CD-ROM
Hiesinger, H.
1998-01-01
datasets to a selected standard geometry in order to create an "image-cube"-like data pool for further interpretation. The starting point was a number of datasets on a CD-ROM published by the Lunar Consortium. The task of creating a uniform data pool was further complicated by some missing or wrong references and keys on the Lunar Consortium CD, as well as erroneous reproduction of some datasets in the literature.
Oil palm mapping for Malaysia using PALSAR-2 dataset
Gong, P.; Qi, C. Y.; Yu, L.; Cracknell, A.
2016-12-01
Oil palm is one of the most productive vegetable oil crops in the world. The main oil palm producing areas are distributed in humid tropical areas such as Malaysia, Indonesia, Thailand, western and central Africa, northern South America, and Central America. Increasing market demand, high yields and low production costs of palm oil are the primary factors driving large-scale commercial cultivation of oil palm, especially in Malaysia and Indonesia. Global demand for palm oil has grown exponentially during the last 50 years, and the expansion of oil palm plantations is linked directly to the deforestation of natural forests. Satellite remote sensing plays an important role in monitoring the expansion of oil palm. However, optical remote sensing images are difficult to acquire in the Tropics because of the frequent occurrence of thick cloud cover. This problem has led to the use of data obtained by synthetic aperture radar (SAR), a sensor capable of all-day/all-weather observation, for studies in the Tropics. In this study, the ALOS-2 (Advanced Land Observing Satellite) PALSAR-2 (Phased Array type L-band SAR) datasets for the year 2015 were used as input to a support vector machine (SVM) based machine learning algorithm. Oil palm/non-oil palm samples were collected using a hexagonal equal-area sampling design. High-resolution images in Google Earth and PALSAR-2 imagery were used in human photo-interpretation to separate oil palm from other cover types (i.e. cropland, forest, grassland, shrubland, water, hard surface and bareland). Using this sample set, the characteristics of oil palm, including PALSAR-2 backscattering coefficients (HH, HV), terrain and climate, were further explored to post-process the SVM output. The average accuracy of the oil palm class is better than 80% in the final oil palm map for Malaysia.
Experimental demonstration of E × B plasma divertor
International Nuclear Information System (INIS)
Strait, E.J.; Kerst, D.W.; Sprott, J.C.
1977-01-01
The E × B drift due to an applied radial electric field in a tokamak with poloidal divertor can speed the flow of plasma out of the scrape-off region, and provide a means of externally controlling the flow rate and thus the width of the density fall-off. An experiment in the Wisconsin levitated toroidal octupole, using E × B drifts alone, demonstrates divertor-like behavior, including 70% reduction of plasma density near the wall and 40% reduction of plasma flux to the wall, with no adverse effects on confinement of the main plasma.
Planar simplification and texturing of dense point cloud maps
Ma, L.; Whelan, T.; Bondarau, Y.; With, de P.H.N.; McDonald, J.
2013-01-01
Dense RGB-D based SLAM techniques and high-fidelity LIDAR scanners are examples from an abundant set of systems capable of providing multi-million point datasets. These large datasets quickly become difficult to process and work with due to the sheer volume of data, which typically contains
Incremental and batch planar simplification of dense point cloud maps
Whelan, T.; Ma, L.; Bondarev, E.; With, de P.H.N.; McDonald, J.
2015-01-01
Dense RGB-D SLAM techniques and high-fidelity LIDAR scanners are examples from an abundant set of systems capable of providing multi-million point datasets. These datasets quickly become difficult to process due to the sheer volume of data, typically containing significant redundant information,
Spacelike conformal Killing vectors and spacelike congruences
International Nuclear Information System (INIS)
Mason, D.P.; Tsamparlis, M.
1985-01-01
Necessary and sufficient conditions are derived for space-time to admit a spacelike conformal motion with symmetry vector parallel to a unit spacelike vector field n^a. These conditions are expressed in terms of the shear and expansion of the spacelike congruence generated by n^a and in terms of the four-velocity of the observer employed at any given point of the congruence. It is shown that either the expansion or the rotation of this spacelike congruence must vanish if Dn^a/dp = 0, where p denotes arc length measured along the integral curves of n^a, and also that there exist no proper spacelike homothetic motions with constant expansion. Propagation equations for the projection tensor and the rotation tensor are derived and it is proved that every isometric spacelike congruence is rigid. Fluid space-times are studied in detail. A relation is established between spacelike conformal motions and material curves in the fluid: if a fluid space-time admits a spacelike conformal Killing vector parallel to n^a and n_a u^a = 0, where u^a is the fluid four-velocity, then the integral curves of n^a are material curves in an irrotational fluid, while if the fluid vorticity is nonzero, then the integral curves of n^a are material curves if and only if they are vortex lines. An alternative derivation, based on the theory of spacelike congruences, of some of the results of Collins [J. Math. Phys. 25, 995 (1984)] on conformal Killing vectors parallel to the local vorticity vector in shear-free perfect fluids with zero magnetic Weyl tensor is given.
Video Vectorization via Tetrahedral Remeshing.
Wang, Chuan; Zhu, Jie; Guo, Yanwen; Wang, Wenping
2017-02-09
We present a video vectorization method that generates a video in vector representation from an input video in raster representation. A vector-based video representation offers the benefits of vector graphics, such as compactness and scalability. The vector video we generate is represented by a simplified tetrahedral control mesh over the spatial-temporal video volume, with color attributes defined at the mesh vertices. We present novel techniques for simplification and subdivision of a tetrahedral mesh to achieve high simplification ratio while preserving features and ensuring color fidelity. From an input raster video, our method is capable of generating a compact video in vector representation that allows a faithful reconstruction with low reconstruction errors.
Extended vector-tensor theories
Energy Technology Data Exchange (ETDEWEB)
Kimura, Rampei; Naruko, Atsushi; Yoshida, Daisuke, E-mail: rampei@th.phys.titech.ac.jp, E-mail: naruko@th.phys.titech.ac.jp, E-mail: yoshida@th.phys.titech.ac.jp [Department of Physics, Tokyo Institute of Technology, 2-12-1 Ookayama, Meguro-ku, Tokyo 152-8551 (Japan)
2017-01-01
Recently, several extensions of massive vector theory in curved space-time have been proposed in the literature. In this paper, we consider the most general vector-tensor theories that contain up to two derivatives with respect to the metric and vector field. By imposing a degeneracy condition on the Lagrangian in the context of an ADM decomposition of space-time to eliminate an unwanted mode, we construct a new class of massive vector theories in which five degrees of freedom can propagate, corresponding to three massive vector modes and two massless tensor modes. We find that the generalized Proca and the beyond-generalized-Proca theories up to the quartic Lagrangian, which should be included in this formulation, are degenerate theories even in curved space-time. Finally, introducing new metric and vector field transformations, we investigate the properties of the resulting theories under such transformations.
Improving the lattice axial vector current
International Nuclear Information System (INIS)
Horsley, R.; Perlt, H.; Schiller, A.; Zanotti, J.M.
2015-11-01
For Wilson and clover fermions, traditional formulations of the axial vector current do not respect the continuum Ward identity which relates the divergence of that current to the pseudoscalar density. Here we propose to use a point-split or one-link axial vector current whose divergence exactly satisfies a lattice Ward identity involving the pseudoscalar density and a number of irrelevant operators. We check in one-loop lattice perturbation theory with the SLiNC fermion and gauge plaquette action that this is indeed the case, including O(a) effects. Including these operators, the axial Ward identity remains renormalisation invariant. First preliminary results of a nonperturbative check of the Ward identity are also presented.
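For orientation, the continuum relation referred to here is the standard axial (PCAC) Ward identity; the following snippet states its textbook form for a quark field ψ of mass m (it is not a formula quoted from this paper):

```latex
% Continuum axial Ward identity (PCAC): the divergence of the axial vector
% current equals twice the quark mass times the pseudoscalar density.
\partial_\mu A_\mu(x) = 2m\,P(x),
\qquad A_\mu = \bar{\psi}\gamma_\mu\gamma_5\psi,
\qquad P = \bar{\psi}\gamma_5\psi
```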
The significance of vector magnetic field measurements
Hagyard, M. J.
1990-01-01
Observations of four flaring solar active regions, obtained during 1980-1986 with the NASA Marshall vector magnetograph (Hagyard et al., 1982 and 1985), are presented graphically and characterized in detail, with reference to nearly simultaneous Big Bear Solar Observatory and USAF ASW H-alpha images. It is shown that the flares occurred where local photospheric magnetic fields differed most from the potential field, with initial brightening on either side of a magnetic-neutral line near the point of maximum angular shear (rather than that of maximum magnetic-field strength, typically 1 kG or greater). Particular emphasis is placed on the fact that these significant nonpotential features were detected only by measuring all three components of the vector magnetic field.
Generalized Toeplitz operators and cyclic vectors
International Nuclear Information System (INIS)
Gassier, G.; Mahzouli, H.; Zerouali, E.H.
2003-04-01
We give in this paper some asymptotic von Neumann inequalities for power-bounded operators in the class C_ρ ∩ C_{1.}, and some special von Neumann inequalities associated with nonzero elements of the point spectrum, when it is non-void, of generalized Toeplitz operators. Introducing perturbed kernels, we consider classes C_R which extend the classical classes C_ρ. We give results about absolute continuity with respect to the Haar measure for operators in the class C_R ∩ C_{1.}. This allows us to give new results on cyclic vectors for such operators and provides invariant subspaces for their powers. Relationships between cyclic vectors for T and T* involving generalized Toeplitz operators are given, and the commutativity of {T}', the commutant of T, is discussed. (author)
Dual Vector Spaces and Physical Singularities
Rowlands, Peter
Though we often refer to 3-D vector space as constructed from points, there is no mechanism from within its definition for doing this. In particular, space, on its own, cannot accommodate the singularities that we call fundamental particles. This requires a commutative combination of space as we know it with another 3-D vector space, which is dual to the first (in a physical sense). The combination of the two spaces generates a nilpotent quantum mechanics/quantum field theory, which incorporates exact supersymmetry and ultimately removes the anomalies due to self-interaction. Among the many natural consequences of the dual space formalism are half-integral spin for fermions, zitterbewegung, Berry phase and a zero norm Berwald-Moor metric for fermionic states.
Vector-Quantization using Information Theoretic Concepts
DEFF Research Database (Denmark)
Lehn-Schiøler, Tue; Hegde, Anant; Erdogmus, Deniz
2005-01-01
The process of representing a large data set with a smaller number of vectors in the best possible way, also known as vector quantization, has been intensively studied in recent years. Very efficient algorithms like the Kohonen Self Organizing Map (SOM) and the Linde Buzo Gray (LBG) algorithm have been devised. In this paper a physical approach to the problem is taken, and it is shown that by considering the processing elements as points moving in a potential field an algorithm equally efficient as those mentioned before can be derived. Unlike SOM and LBG, this algorithm has a clear physical interpretation and relies on minimization of a well-defined cost function. It is also shown how the potential field approach can be linked to information theory by use of the Parzen density estimator. In the light of information theory it becomes clear that minimizing the free energy of the system is in fact...
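For reference, the LBG baseline mentioned above amounts to alternating nearest-codeword assignment and centroid updates; a minimal numpy sketch (not the paper's potential-field algorithm) is:

```python
import numpy as np

# LBG-style vector quantization: assign each vector to its nearest codeword,
# then move each codeword to the mean of its assigned vectors (k-means).
def lbg_quantize(data, n_codewords=16, iters=50, rng=None):
    rng = np.random.default_rng(rng)
    codebook = data[rng.choice(len(data), n_codewords, replace=False)].copy()
    for _ in range(iters):
        d = np.linalg.norm(data[:, None, :] - codebook[None, :, :], axis=2)
        nearest = d.argmin(axis=1)
        for k in range(n_codewords):
            members = data[nearest == k]
            if len(members):
                codebook[k] = members.mean(axis=0)
    return codebook

codebook = lbg_quantize(np.random.default_rng(0).normal(size=(2000, 2)))
```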
Vector and axial-vector charmoniumlike states
Chen, Wei; Zhu, Shi-Lin
2011-02-01
After constructing all the tetraquark interpolating currents with J^PC = 1^-+, 1^--, 1^++ and 1^+- in a systematic way, we investigate the two-point correlation functions to extract the masses of the charmoniumlike states with QCD sum rules. For the 1^-- qcq̄c̄ charmoniumlike state, m_X = 4.6~4.7 GeV, which implies a possible tetraquark interpretation for the state Y(4660). The masses for both the 1^++ qcq̄c̄ and scs̄c̄ charmoniumlike states are around 4.0~4.2 GeV, which are slightly above the mass of X(3872). For the 1^-+ and 1^+- qcq̄c̄ charmoniumlike states, the extracted masses are around 4.5~4.7 GeV and 4.0~4.2 GeV, respectively. As a by-product, the bottomoniumlike states are also studied. We also discuss the possible decay modes and experimental searches for the charmoniumlike states.
Effective population sizes of a major vector of human diseases, Aedes aegypti.
Saarman, Norah P; Gloria-Soria, Andrea; Anderson, Eric C; Evans, Benjamin R; Pless, Evlyn; Cosme, Luciano V; Gonzalez-Acosta, Cassandra; Kamgang, Basile; Wesson, Dawn M; Powell, Jeffrey R
2017-12-01
The effective population size (Ne) is a fundamental parameter in population genetics that determines the relative strength of selection and random genetic drift, the effect of migration, levels of inbreeding, and linkage disequilibrium. In many cases where it has been estimated in animals, Ne is on the order of 10%-20% of the census size. In this study, we use 12 microsatellite markers and 14,888 single nucleotide polymorphisms (SNPs) to empirically estimate Ne in Aedes aegypti, the major vector of yellow fever, dengue, chikungunya, and Zika viruses. We used the method of temporal sampling to estimate Ne on a global dataset made up of 46 samples of Ae. aegypti that included multiple time points from 17 widely distributed geographic localities. Our Ne estimates for Ae. aegypti fell within a broad range (~25-3,000) and averaged between 400 and 600 across all localities and time points sampled. Adult census size (Nc) estimates for this species range between one and five thousand, so the Ne/Nc ratio is about the same as for most animals. These Ne values are lower than estimates available for other insects and have important implications for the design of genetic control strategies to reduce the impact of this species of mosquito on human health.
3D Model Retrieval Based on Vector Quantisation Index Histograms
International Nuclear Information System (INIS)
Lu, Z M; Luo, H; Pan, J S
2006-01-01
This paper proposes a novel technique for retrieving 3D mesh models using vector quantisation index histograms. Firstly, points are sampled uniformly on the mesh surface. Secondly, five features representing global and local properties are extracted for each point, yielding feature vectors of the points. Thirdly, we select several models from each class and employ their feature vectors as a training set. After training with the LBG algorithm, a public codebook is constructed. Next, codeword index histograms of the query model and of those in the database are computed. The last step is to compute the distance between the histogram of the query and those of the models in the database. Experimental results show the effectiveness of our method.
Principal-vector-directed fringe-tracking technique.
Zhang, Zhihui; Guo, Hongwei
2014-11-01
Fringe tracking is one of the most straightforward techniques for analyzing a single fringe pattern. This work presents a principal-vector-directed fringe-tracking technique. It uses Gaussian derivatives for estimating fringe gradients and uses hysteresis thresholding for segmenting singular points, thus improving the principal component analysis method. Using it allows us to estimate the principal vectors of fringes from a pattern with high noise. The fringe-tracking procedure is directed by these principal vectors, so that erroneous results induced by noise and other error-inducing factors are avoided. At the same time, the singular point regions of the fringe pattern are identified automatically. Using them allows us to determine paths through which the "seed" point for each fringe skeleton is easy to find, thus alleviating the computational burden in processing the fringe pattern. The results of a numerical simulation and experiment demonstrate this method to be valid.
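One common way to realize the gradient-estimation step, sketched below under assumed smoothing scales, is to build gradients from Gaussian derivatives and take the dominant eigenvector of a smoothed structure tensor as the local principal vector (a simplified stand-in for the paper's method, not its actual implementation):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Estimate local fringe principal vectors: Gaussian-derivative gradients feed
# a smoothed structure tensor whose dominant eigenvector gives the fringe
# orientation. The sigma values are illustrative, not the paper's settings.
def principal_vectors(img, sigma_d=2.0, sigma_t=5.0):
    gx = gaussian_filter(img, sigma_d, order=(0, 1))   # d/dx of Gaussian
    gy = gaussian_filter(img, sigma_d, order=(1, 0))   # d/dy of Gaussian
    # Structure tensor components, smoothed over a neighbourhood.
    jxx = gaussian_filter(gx * gx, sigma_t)
    jxy = gaussian_filter(gx * gy, sigma_t)
    jyy = gaussian_filter(gy * gy, sigma_t)
    # Orientation of the dominant eigenvector (double-angle formula).
    theta = 0.5 * np.arctan2(2 * jxy, jxx - jyy)
    return np.cos(theta), np.sin(theta)
```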
Optimality Conditions in Vector Optimization
Jiménez, Manuel Arana; Lizana, Antonio Rufián
2011-01-01
Vector optimization is continuously needed in several scientific fields, particularly in economics, business, engineering, physics and mathematics. The evolution of these fields depends, in part, on improvements in vector optimization in mathematical programming. The aim of this Ebook is to present the latest developments in vector optimization. The contributions have been written by some of the most eminent researchers in this field of mathematical programming. The Ebook is considered essential for researchers and students in this field.
Symmetric vectors and algebraic classification
International Nuclear Information System (INIS)
Leibowitz, E.
1980-01-01
The concept of symmetric vector field in Riemannian manifolds, which arises in the study of relativistic cosmological models, is analyzed. Symmetric vectors are tied up with the algebraic properties of the manifold curvature. A procedure for generating a congruence of symmetric fields out of a given pair is outlined. The case of a three-dimensional manifold of constant curvature (''isotropic universe'') is studied in detail, with all its symmetric vector fields being explicitly constructed
Vector continued fractions using a generalized inverse
International Nuclear Information System (INIS)
Haydock, Roger; Nex, C M M; Wexler, Geoffrey
2004-01-01
A real vector space combined with an inverse (involution) for vectors is sufficient to define a vector continued fraction whose parameters consist of vector shifts and changes of scale. The choice of sign for different components of the vector inverse permits construction of vector analogues of the Jacobi continued fraction. These vector Jacobi fractions are related to vector and scalar-valued polynomial functions of the vectors, which satisfy recurrence relations similar to those of orthogonal polynomials. The vector Jacobi fraction has strong convergence properties which are demonstrated analytically, and illustrated numerically
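The construction can be made concrete with a small sketch. Assuming the generalized inverse is the involution v ↦ Jv/(vᵀJv), with J a diagonal matrix of per-component signs, a finite vector Jacobi fraction can be evaluated from the tail inward; the parameter names and the truncation are illustrative, not the paper's notation.

    import numpy as np

    def vec_inverse(v, signs):
        # Generalised vector inverse: v -> J v / (v . J v), an involution,
        # with J = diag(signs) encoding the sign choice per component.
        jv = signs * v
        return jv / v.dot(jv)

    def vector_jacobi_fraction(shifts, scales, signs):
        # Evaluate b0 + a1*inv(b1 + a2*inv(b2 + ...)) from the innermost
        # level outward; shifts are vectors, scales are scalars.
        x = np.zeros_like(shifts[0])
        for b, a in zip(reversed(shifts[1:]), reversed(scales)):
            x = a * vec_inverse(b + x, signs)
        return shifts[0] + x

Applying vec_inverse twice returns the original vector, which is the involution property the construction relies on.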
Directory of Open Access Journals (Sweden)
Nathan J Kenny
2016-10-01
Abstract Background The gastropod mollusc Biomphalaria glabrata is well known as a vector for the tropical disease schistosomiasis, which affects nearly 200 million people worldwide. Despite intensive study, our understanding of the genetic basis of B. glabrata development, growth and disease resistance is constrained by limited genetic resources, constraints for which next-generation sequencing methods provide a ready solution. Methods Illumina sequencing and de novo assembly using the Trinity program was used to generate a high-quality transcriptomic dataset spanning the entirety of in ovo development in schistosomiasis-free B. glabrata. This was subjected to automated (KEGG, BLAST2GO) and manual annotation efforts, allowing insight into the gene complements of this species in a number of contexts. Results Excellent dataset recovery was observed, with 133,084 contigs produced of mean size 2219.48 bp. 80,952 (60.8%) returned a BLASTx hit with an E value of less than 10⁻³, and 74,492 (55.97%) were either mapped or assigned a GO identity using the BLAST2GO program. The CEGMA set of core eukaryotic genes was found to be 99.6% present, indicating exceptional transcriptome completeness. We were able to identify a wealth of disease-pathway-related genes within our dataset, including the Wnt, apoptosis and Notch pathways. This provides an invaluable reference point for further work into molluscan development and evolution, for studying the impact of schistosomiasis in this species, and perhaps for providing targets for the treatment of this widespread disease. Conclusions Here we present a deep transcriptome of an embryonic sample of schistosomiasis-free B. glabrata, presenting a comprehensive dataset for comparison to disease-affected specimens and from which conclusions can be drawn about the genetics of this widespread medical model. Furthermore, the dataset provided by this sequencing provides a useful reference point for comparison to other mollusc
NERIES: Seismic Data Gateways and User Composed Datasets Metadata Management
Spinuso, Alessandro; Trani, Luca; Kamb, Linus; Frobert, Laurent
2010-05-01
One of the main objectives of the NERIES EC project is to establish and improve the networking of seismic waveform data exchange and access among four main data centers in Europe: INGV, GFZ, ORFEUS and IPGP. Besides the implementation of the data backbone, several investigations and developments have been conducted in order to offer users the data available from this network, either programmatically or interactively. One of the challenges is to understand how to enable users' activities such as discovering, aggregating, describing and sharing datasets, so as to reduce the replication of similar data queries towards the network and spare the data centers from having to guess at and create useful pre-packed products. We've started to transfer this task more and more towards the users community, where the users' composed data products can be extensively re-used. The main link to the data is represented by a centralized webservice (SeismoLink) acting like a single access point to the whole data network. Users can download either waveform data or seismic station inventories directly from their own software routines by connecting to this webservice, which routes the request to the data centers. The provenance of the data is maintained and transferred to the users in the form of URIs, which identify the dataset and implicitly refer to the data provider. SeismoLink, combined with other webservices (e.g. the EMSC QuakeML earthquake catalog service), is used from a community gateway such as the NERIES web portal (http://www.seismicportal.eu). Here the user interacts with a map-based portlet which allows the dynamic composition of a data product, binding seismic event's parameters with a set of seismic stations. The requested data is collected by the back-end processes of the portal, preserved and offered to the user in a personal data cart, where metadata can be generated interactively on demand. The metadata, expressed in RDF, can also be remotely ingested. They offer rating
System for Automated Calibration of Vector Modulators
Lux, James; Boas, Amy; Li, Samuel
2009-01-01
Vector modulators are used to impose baseband modulation on RF signals, but non-ideal behavior limits the overall performance. The non-ideal behavior of the vector modulator is compensated using data collected with an automated test system driven by a LabVIEW program that systematically applies thousands of control-signal values to the device under test and collects RF measurement data. The technology innovation automates several steps in the process. First, an automated test system, using computer-controlled digital-to-analog converters (DACs) and a computer-controlled vector network analyzer (VNA), systematically applies different I and Q signals (which represent the complex number by which the RF signal is multiplied) to the vector modulator under test (VMUT), while measuring the RF performance, specifically gain and phase. The automated test system uses the LabVIEW software to control the test equipment, collect the data, and write it to a file. The input to the LabVIEW program is either user input for systematic variation, or a file containing the specific test values that should be fed to the VMUT. The output file contains both the control signals and the measured data. The second step is to post-process the file to determine the correction functions as needed. The result of the entire process is a tabular representation, which allows translation of a desired I/Q value to the required analog control signals to produce a particular RF behavior. In some applications, corrected performance is needed only for a limited range. If the vector modulator is being used as a phase shifter, there is only a need to correct I and Q values that represent points on a circle, not the entire plane. This innovation has been used to calibrate 2-GHz MMIC (monolithic microwave integrated circuit) vector modulators in the High EIRP Cluster Array project (EIRP is high effective isotropic radiated power). These calibrations were then used to create
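A minimal sketch of the resulting lookup step, assuming the LabVIEW output file has already been parsed into arrays of control values and measured complex gains; load_measured_gain and the sweep ranges are hypothetical placeholders, not the project's actual interfaces.

    import numpy as np

    # Swept control grid matching the automated test run.
    i_ctrl, q_ctrl = np.meshgrid(np.linspace(-1, 1, 64),
                                 np.linspace(-1, 1, 64))
    i_ctrl, q_ctrl = i_ctrl.ravel(), q_ctrl.ravel()
    measured = load_measured_gain()  # stand-in: complex gain per control pair

    def controls_for(target_gain):
        # Nearest-neighbour inversion of the calibration table: find the
        # (I, Q) controls whose measured response best matches the target.
        best = np.argmin(np.abs(measured - target_gain))
        return i_ctrl[best], q_ctrl[best]

    # Phase-shifter use: only points on a circle of constant amplitude
    # need correction, not the entire I/Q plane.
    targets = 0.5 * np.exp(1j * np.deg2rad(np.arange(360)))
    lut = [controls_for(g) for g in targets]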
U.S. Environmental Protection Agency — Point geospatial dataset representing locations of NPDES facilities, outfalls/dischargers, waste water treatment plant facilities and waste water treatment plants...
U.S. Environmental Protection Agency — Point geospatial dataset representing locations of NPDES Facilities. NPDES (National Pollution Discharge Elimination System) is an EPA permit program that regulates...
Superfund Removal Site Points, Region 9, 2012, US EPA Region 9
U.S. Environmental Protection Agency — Point geospatial dataset representing locations of CERCLA (Superfund) Removal sites. CERCLA (Comprehensive Environmental Response, Compensation, and Liability Act)...
International Nuclear Information System (INIS)
Nelson, Ann E.; Walsh, Jonathan
2008-01-01
We show that for a force mediated by a vector particle coupled to a conserved U(1) charge, the apparent range and strength can depend on the size and density of the source, and the proximity to other sources. This chameleon effect is due to screening from a light charged scalar. Such screening can weaken astrophysical constraints on new gauge bosons. As an example we consider the constraints on chameleonic gauged B-L. We show that although Casimir measurements greatly constrain any B-L force much stronger than gravity with range longer than 0.1 μm, there remains an experimental window for a long-range chameleonic B-L force. Such a force could be much stronger than gravity, and long or infinite range in vacuum, but have an effective range near the surface of the earth which is less than a micron.
Architecture and Vector Control
DEFF Research Database (Denmark)
von Seidlein, Lorenz; Knols, Bart GJ; Kirby, Matthew
2012-01-01
…closing of eaves and insecticide-treated bednets. All of these interventions have an effect on the indoor climate. Temperature, humidity and airflow are critical for a comfortable climate. Air-conditioning and fans allow us to control indoor climate, but many people in Africa and Asia who carry the brunt … of vector-borne diseases have no access to electricity. Many houses in the hot, humid regions of Asia have adapted to the environment: they are built of porous materials and are elevated on stilts, features which allow a comfortable climate even in the presence of bednets and screens. In contrast, many … buildings in Africa and Asia in respect of their indoor climate characteristics and, finally, show how state-of-the-art 3D modelling can predict climate characteristics and help to optimize buildings…
International Nuclear Information System (INIS)
Ginelli, Francesco; Politi, Antonio; Chaté, Hugues; Livi, Roberto
2013-01-01
Recent years have witnessed a growing interest in covariant Lyapunov vectors (CLVs) which span local intrinsic directions in the phase space of chaotic systems. Here, we review the basic results of ergodic theory, with a specific reference to the implications of Oseledets’ theorem for the properties of the CLVs. We then present a detailed description of a ‘dynamical’ algorithm to compute the CLVs and show that it generically converges exponentially in time. We also discuss its numerical performance and compare it with other algorithms presented in the literature. We finally illustrate how CLVs can be used to quantify deviations from hyperbolicity with reference to a dissipative system (a chain of Hénon maps) and a Hamiltonian model (a Fermi–Pasta–Ulam chain). This article is part of a special issue of Journal of Physics A: Mathematical and Theoretical devoted to ‘Lyapunov analysis: from dynamical systems theory to applications’. (paper)
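A compact sketch of the dynamical algorithm described above, assuming the tangent maps (Jacobians) along a trajectory are available as a list of matrices; the transient-discarding bookkeeping used in practice is omitted for brevity.

    import numpy as np

    def covariant_lyapunov_vectors(jacobians, k):
        # Forward phase: push an orthonormal frame with repeated QR,
        # storing the R factors (Ginelli et al. style).
        n = jacobians[0].shape[0]
        q = np.linalg.qr(np.random.randn(n, k))[0]
        q0, rs = q.copy(), []
        for j in jacobians:
            q, r = np.linalg.qr(j @ q)
            rs.append(r)
        # Backward phase: iterate upper-triangular coefficient matrices
        # with R^{-1}; this converges exponentially to the CLV coefficients.
        c = np.triu(np.random.rand(k, k))
        for r in reversed(rs):
            c = np.linalg.solve(r, c)
            c /= np.linalg.norm(c, axis=0)       # renormalise columns
        return q0 @ c                            # CLVs at the initial time

The columns of the result span the first k covariant Lyapunov vectors at the start of the trajectory, e.g. for a chain of Hénon maps whose tangent maps fill the jacobians list.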
Full-Scale Approximations of Spatio-Temporal Covariance Models for Large Datasets
Zhang, Bohai
2014-01-01
Various continuously-indexed spatio-temporal process models have been constructed to characterize spatio-temporal dependence structures, but the computational complexity of model fitting and prediction grows cubically with the size of the dataset, so the application of such models is not feasible for large datasets. This article extends the full-scale approximation (FSA) approach of Sang and Huang (2012) to the spatio-temporal context to reduce the computational complexity. A reversible jump Markov chain Monte Carlo (RJMCMC) algorithm is proposed to select knots automatically from a discrete set of spatio-temporal points. Our approach is applicable to nonseparable and nonstationary spatio-temporal covariance models. We illustrate the effectiveness of our method through simulation experiments and an application to an ozone measurement dataset.
Chord Recognition Based on Temporal Correlation Support Vector Machine
Directory of Open Access Journals (Sweden)
Zhongyang Rao
2016-05-01
In this paper, we propose a method called temporal correlation support vector machine (TCSVM) for automatic major-minor chord recognition in audio music. We first use robust principal component analysis to separate the singing voice from the music, to reduce the influence of the singing voice, and consider the temporal correlations of the chord features. Using robust principal component analysis, we expect the low-rank component of the spectrogram matrix to contain the musical accompaniment and the sparse component to contain the vocal signals. Then, we extract a new logarithmic pitch class profile (LPCP) feature, called enhanced LPCP, from the low-rank part. To exploit the temporal correlation among the LPCP features of chords, we propose an improved support vector machine algorithm called TCSVM. We perform this study using the MIREX’09 (Music Information Retrieval Evaluation eXchange) Audio Chord Estimation dataset. Furthermore, we conduct comprehensive experiments using different pitch class profile feature vectors to examine the performance of TCSVM. The results of our method are comparable to those of the state-of-the-art methods that entered MIREX in 2013 and 2014 on the MIREX’09 Audio Chord Estimation task dataset.
Statistical segmentation of multidimensional brain datasets
Desco, Manuel; Gispert, Juan D.; Reig, Santiago; Santos, Andres; Pascau, Javier; Malpica, Norberto; Garcia-Barreno, Pedro
2001-07-01
This paper presents an automatic segmentation procedure for MRI neuroimages that overcomes part of the problems involved in multidimensional clustering techniques, such as partial volume effects (PVE), processing speed and the difficulty of incorporating a priori knowledge. The method is a three-stage procedure: 1) Exclusion of background and skull voxels using threshold-based region-growing techniques with fully automated seed selection. 2) Expectation-maximization algorithms are used to estimate the probability density function (PDF) of the remaining pixels, which are assumed to be mixtures of Gaussians. These pixels can then be classified into cerebrospinal fluid (CSF), white matter and grey matter. Using this procedure, our method takes advantage of the full covariance matrix (instead of the diagonal) for the joint PDF estimation. On the other hand, logistic discrimination techniques are more robust against violation of multi-Gaussian assumptions. 3) A priori knowledge is added using Markov random field techniques. The algorithm has been tested with a dataset of 30 brain MRI studies (co-registered T1 and T2 MRI). Our method was compared with clustering techniques and with template-based statistical segmentation, using manual segmentation as a gold standard. Our results were more robust and closer to the gold standard.
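A minimal sketch of the second stage under the stated Gaussian-mixture assumption, using scikit-learn's EM implementation with a full covariance matrix; the three components stand for CSF, grey matter and white matter, and the placeholder intensities replace real co-registered T1/T2 voxel data.

    import numpy as np
    from sklearn.mixture import GaussianMixture

    voxels = np.random.rand(10000, 2)   # placeholder (T1, T2) intensities
    gmm = GaussianMixture(n_components=3, covariance_type='full',
                          random_state=0)
    labels = gmm.fit_predict(voxels)        # EM fit + hard classification
    posteriors = gmm.predict_proba(voxels)  # soft memberships, useful where
                                            # partial volume effects blur
                                            # the class boundaries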
ASSESSING SMALL SAMPLE WAR-GAMING DATASETS
Directory of Open Access Journals (Sweden)
W. J. HURLEY
2013-10-01
One of the fundamental problems faced by military planners is the assessment of changes to force structure. An example is whether to replace an existing capability with an enhanced system. This can be done directly with a comparison of measures such as accuracy, lethality, survivability, etc. However, this approach does not allow an assessment of the force-multiplier effects of the proposed change. To gauge these effects, planners often turn to war-gaming. For many war-gaming experiments it is expensive, both in terms of time and dollars, to generate a large number of sample observations. This puts a premium on the statistical methodology used to examine these small datasets. In this paper we compare the power of three tests to assess population differences: the Wald-Wolfowitz test, the Mann-Whitney U test, and re-sampling. We employ a series of Monte Carlo simulation experiments. Not unexpectedly, we find that the Mann-Whitney test performs better than the Wald-Wolfowitz test. Re-sampling is judged to perform slightly better than the Mann-Whitney test.
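A Monte Carlo power comparison of this kind can be sketched in a few lines; the sample size, shift and normal populations below are illustrative choices, not the paper's experimental design.

    import numpy as np
    from scipy.stats import mannwhitneyu

    rng = np.random.default_rng(0)

    def power(pvalue_fn, n=8, shift=1.0, trials=500, alpha=0.05):
        # Fraction of small-sample experiments in which a true shift
        # between the two populations is detected at level alpha.
        hits = sum(pvalue_fn(rng.normal(0, 1, n),
                             rng.normal(shift, 1, n)) < alpha
                   for _ in range(trials))
        return hits / trials

    def mw_pvalue(a, b):
        return mannwhitneyu(a, b, alternative='two-sided').pvalue

    def resampling_pvalue(a, b, n_perm=999):
        # Permutation test on the absolute difference of means.
        observed = abs(a.mean() - b.mean())
        pooled = np.concatenate([a, b])
        count = 0
        for _ in range(n_perm):
            rng.shuffle(pooled)
            count += abs(pooled[:len(a)].mean()
                         - pooled[len(a):].mean()) >= observed
        return (count + 1) / (n_perm + 1)

    print(power(mw_pvalue), power(resampling_pvalue))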
Dynamical analysis for a vector-like dark energy
Energy Technology Data Exchange (ETDEWEB)
Landim, Ricardo C.G. [Instituto de Fisica, Universidade de Sao Paulo, Departamento de Fisica-Matematica, Sao Paulo, SP (Brazil)
2016-09-15
In this paper we perform a dynamical analysis for a vector field as a candidate for dark energy, in the presence of a barotropic fluid. The vector is one component of the so-called cosmic triad, which is a set of three identical copies of an Abelian field pointing mutually in orthogonal directions. In order to generalize the analysis, we also assume an interaction between dark energy and the barotropic fluid, with a phenomenological coupling. Both the matter and dark energy eras can be successfully described by the critical points, indicating that dynamical system theory is a viable tool to analyze asymptotic states of such cosmological models. (orig.)
UniMiB SHAR: A Dataset for Human Activity Recognition Using Acceleration Data from Smartphones
Directory of Open Access Journals (Sweden)
Daniela Micucci
2017-10-01
Smartphones, smartwatches, fitness trackers, and ad-hoc wearable devices are being increasingly used to monitor human activities. Data acquired by the hosted sensors are usually processed by machine-learning-based algorithms to classify human activities. The success of those algorithms mostly depends on the availability of training (labeled) data that, if made publicly available, would allow researchers to make objective comparisons between techniques. Nowadays, there are only a few publicly available datasets, which often contain samples from subjects with too-similar characteristics, and very often lack the specific information needed to select subsets of samples according to specific criteria. In this article, we present a new dataset of acceleration samples acquired with an Android smartphone designed for human activity recognition and fall detection. The dataset includes 11,771 samples of both human activities and falls performed by 30 subjects of ages ranging from 18 to 60 years. Samples are divided into 17 fine-grained classes grouped into two coarse-grained classes: one containing samples of 9 types of activities of daily living (ADL) and the other containing samples of 8 types of falls. The dataset has been stored so as to include all the information useful to select samples according to different criteria, such as the type of ADL performed, the age, the gender, and so on. Finally, the dataset has been benchmarked with four different classifiers and with two different feature vectors. We evaluated four different classification tasks: fall vs. no fall, 9 activities, 8 falls, 17 activities and falls. For each classification task, we performed a 5-fold cross-validation (i.e., including samples from all the subjects in both the training and the test dataset) and a leave-one-subject-out cross-validation (i.e., the test data include the samples of a subject only, and the training data, the samples of all the other subjects). Regarding the
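The two evaluation protocols are easy to reproduce with scikit-learn; the sketch below shows the leave-one-subject-out variant with placeholder features and an arbitrary classifier (the paper's four classifiers and feature vectors are not assumed here).

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

    X = np.random.rand(11771, 64)             # placeholder feature vectors
    y = np.random.randint(0, 2, 11771)        # e.g. fall vs. no fall
    groups = np.random.randint(0, 30, 11771)  # subject identity per sample

    clf = RandomForestClassifier(n_estimators=50, random_state=0)
    loso = LeaveOneGroupOut()  # test on one subject, train on the other 29
    scores = cross_val_score(clf, X, y, cv=loso, groups=groups)
    print(scores.mean())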
[Research on developing the spectral dataset for Dunhuang typical colors based on color constancy].
Liu, Qiang; Wan, Xiao-Xia; Liu, Zhen; Li, Chan; Liang, Jin-Xing
2013-11-01
The present paper aims at developing a method to reasonably set up a typical spectral color dataset for different kinds of Chinese cultural heritage in the color rendering process. The world-famous wall paintings dating from more than 1700 years ago in the Dunhuang Mogao Grottoes were taken as the typical case in this research. In order to maintain color constancy during the color rendering workflow for Dunhuang cultural relics, a chromatic-adaptation-based method for developing the spectral dataset of typical colors for those wall paintings was proposed from the viewpoint of human visual perception. With the help and guidance of researchers at the art-research and protection-research institutions of the Dunhuang Academy, and according to the existing research achievements of Dunhuang studies over the past years, 48 typical known Dunhuang pigments were chosen and 240 representative color samples were made, whose reflective spectra ranging from 360 to 750 nm were acquired with a spectrometer. In order to find the typical colors among the above-mentioned color samples, the original dataset was divided into several subgroups by clustering analysis. The number of groups, together with the most typical samples for each subgroup which made up the first version of the typical color dataset, was determined by the Wilcoxon signed rank test according to the color inconstancy index comprehensively calculated under 6 typical illuminating conditions. Considering the completeness of the gamut of the Dunhuang wall paintings, 8 complementary colors were determined, and finally the typical spectral color dataset was built up, containing 100 representative spectral colors. The analytical results show that the median color inconstancy index of the built dataset at the 99% confidence level by the Wilcoxon signed rank test was 3.28, and the 100 colors are distributed uniformly across the whole gamut, which ensures that this dataset can provide a reasonable reference for choosing the color with highest
International Nuclear Information System (INIS)
Shaw, L; Mehari, F; Weckenmann, A; Ettl, S; Häusler, G
2013-01-01
Multisensor systems with optical 3D sensors are frequently employed to capture complete surface information by measuring workpieces from different views. During coarse and fine registration the resulting datasets are afterward transformed into one common coordinate system. Automatic fine registration methods are well established in dimensional metrology, whereas there is a deficit in automatic coarse registration methods. The advantage of a fully automatic registration procedure is twofold: it enables a fast and contact-free alignment and further a flexible application to datasets of any kind of optical 3D sensor. In this paper, an algorithm adapted for a robust automatic coarse registration is presented. The method was originally developed for the field of object reconstruction or localization. It is based on a segmentation of planes in the datasets to calculate the transformation parameters. The rotation is defined by the normals of three corresponding segmented planes of two overlapping datasets, while the translation is calculated via the intersection point of the segmented planes. First results have shown that the translation is strongly shape dependent: 3D data of objects with non-orthogonal planar flanks cannot be registered with the current method. In the novel supplement for the algorithm, the translation is additionally calculated via the distance between centroids of corresponding segmented planes, which results in more than one option for the transformation. A newly introduced measure considering the distance between the datasets after coarse registration evaluates the best possible transformation. Results of the robust automatic registration method are presented on the example of datasets taken from a cutting tool with a fringe-projection system and a focus-variation system. The successful application in dimensional metrology is proven with evaluations of shape parameters based on the registered datasets of a calibrated workpiece. (paper)
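A condensed sketch of the transformation estimate, assuming three corresponding plane normals and their centroids have already been segmented from the two overlapping datasets; the SVD (Kabsch-style) solution is one way to realize the normal-based rotation, and the centroid-difference translation follows the supplement described above.

    import numpy as np

    def coarse_registration(normals_a, normals_b, cents_a, cents_b):
        # Rotation aligning the segmented-plane normals of dataset A
        # to those of dataset B (rows are corresponding unit normals).
        h = normals_a.T @ normals_b
        u, _, vt = np.linalg.svd(h)
        d = np.sign(np.linalg.det(vt.T @ u.T))
        rot = vt.T @ np.diag([1.0, 1.0, d]) @ u.T   # proper rotation
        # Translation from corresponding plane centroids (the variant
        # needed for objects with non-orthogonal planar flanks).
        t = cents_b.mean(axis=0) - rot @ cents_a.mean(axis=0)
        return rot, t

Candidate transformations can then be ranked by the residual distance between the datasets after applying p' = R p + t, in the spirit of the newly introduced evaluation measure.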
The standardised freight container: vector of vectors and vector-borne diseases.
Reiter, P
2010-04-01
The standardised freight container was one of the most important innovations of the 20th Century. Containerised cargoes travel from their point of origin to their destination by ship, road and rail as part of a single journey, without unpacking. This simple concept is the key element in cheap, rapid transport by land and sea, and has led to a phenomenal growth in global trade. Likewise, containerised air cargo has led to a remarkable increase in the inter-continental transportation of goods, particularly perishable items such as flowers, fresh vegetables and live animals. In both cases, containerisation offers great advantages in speed and security, but reduces the opportunity to inspect cargoes in transit. An inevitable consequence is the globalisation of undesirable species of animals, plants and pathogens. Moreover, cheap passenger flights offer worldwide travel for viral and parasitic pathogens in infected humans. The continued emergence of exotic pests, vectors and pathogens throughout the world is an unavoidable consequence of these advances in transportation technology.
Simplified Representation of Vector Fields
Telea, Alexandru; Wijk, Jarke J. van
1999-01-01
Vector field visualization remains a difficult task. Although many local and global visualization methods for vector fields such as flow data exist, they usually require extensive user experience in setting the visualization parameters in order to produce images communicating the desired insight. We
Estimation of Motion Vector Fields
DEFF Research Database (Denmark)
Larsen, Rasmus
1993-01-01
This paper presents an approach to the estimation of 2-D motion vector fields from time-varying image sequences. We use a piecewise-smooth model based on coupled vector/binary Markov random fields. We find the maximum a posteriori solution by simulated annealing. The algorithm generates sample … fields by means of stochastic relaxation implemented via the Gibbs sampler.
GPU Accelerated Vector Median Filter
Aras, Rifat; Shen, Yuzhong
2011-01-01
Noise reduction is an important step for most image processing tasks. For three-channel color images, a widely used technique is the vector median filter, in which the color values of pixels are treated as 3-component vectors. Vector median filters are computationally expensive; for a window size of n x n, each of the n² vectors has to be compared with the other n² - 1 vectors in terms of distances. General-purpose computation on graphics processing units (GPUs) is the paradigm of utilizing high-performance many-core GPU architectures for computation tasks that are normally handled by CPUs. In this work, NVIDIA's Compute Unified Device Architecture (CUDA) paradigm is used to accelerate vector median filtering, which to the best of our knowledge has never been done before. The performance of the GPU-accelerated vector median filter is compared to that of the CPU and MPI-based versions for different image and window sizes. Initial findings of the study showed a 100x improvement in performance of the vector median filter implementation on GPUs over CPU implementations, and further speed-up is expected after more extensive optimizations of the GPU algorithm.
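A CPU reference version of the filter takes only a few lines of NumPy; the GPU version parallelises the same per-pixel work across CUDA threads. The window size and Euclidean vector distance below are the usual choices, not specifics from the paper.

    import numpy as np

    def vector_median_filter(img, n=3):
        # For each n x n window, output the colour vector whose summed
        # distance to the other n^2 - 1 vectors in the window is smallest.
        h, w, _ = img.shape
        r = n // 2
        out = img.copy()
        for y in range(r, h - r):
            for x in range(r, w - r):
                win = img[y-r:y+r+1, x-r:x+r+1].reshape(-1, 3).astype(float)
                d = np.linalg.norm(win[:, None, :] - win[None, :, :], axis=2)
                out[y, x] = win[d.sum(axis=1).argmin()]
        return out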
Archimedeanization of ordered vector spaces
Emelyanov, Eduard Yu.
2014-01-01
In the case of an ordered vector space with an order unit, the Archimedeanization method has been developed recently by V. I. Paulsen and M. Tomforde. We present a general version of the Archimedeanization which covers arbitrary ordered vector spaces.
Geochemical Fingerprinting of Coltan Ores by Machine Learning on Uneven Datasets
International Nuclear Information System (INIS)
Savu-Krohn, Christian; Rantitsch, Gerd; Auer, Peter; Melcher, Frank; Graupner, Torsten
2011-01-01
Two modern machine learning techniques, Linear Programming Boosting (LPBoost) and Support Vector Machines (SVMs), are introduced and applied to a geochemical dataset of niobium–tantalum (“coltan”) ores from Central Africa to demonstrate how such information may be used to distinguish ore provenance, i.e., place of origin. The compositional data used include uni- and multivariate outliers and elemental distributions are not described by parametric frequency distribution functions. The “soft margin” techniques of LPBoost and SVMs can be applied to such data. Optimization of their learning parameters results in an average accuracy of up to c. 92%, if spot measurements are assessed to estimate the provenance of ore samples originating from two geographically defined source areas. A parameterized performance measure, together with common methods for its optimization, was evaluated to account for the presence of uneven datasets. Optimization of the classification function threshold improves the performance, as class importance is shifted towards one of those classes. For this dataset, the average performance of the SVMs is significantly better compared to that of LPBoost.
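A minimal scikit-learn sketch of the two ingredients named above: a soft-margin SVM on an uneven two-class dataset and a tuned classification-function threshold. The synthetic data, kernel settings and balanced-accuracy criterion are illustrative assumptions, not the study's parameterized performance measure.

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.metrics import balanced_accuracy_score

    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(200, 8)); y_train = rng.integers(0, 2, 200)
    X_val = rng.normal(size=(80, 8));    y_val = rng.integers(0, 2, 80)

    # class_weight='balanced' counters the uneven class sizes.
    svm = SVC(kernel='rbf', C=10.0, class_weight='balanced')
    svm.fit(X_train, y_train)

    # Shift class importance by optimising the decision threshold.
    scores = svm.decision_function(X_val)
    grid = np.linspace(scores.min(), scores.max(), 101)
    best = max(grid, key=lambda t: balanced_accuracy_score(
        y_val, (scores > t).astype(int)))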
A Bayesian spatio-temporal geostatistical model with an auxiliary lattice for large datasets
Xu, Ganggang
2015-01-01
When spatio-temporal datasets are large, the computational burden can lead to failures in the implementation of traditional geostatistical tools. In this paper, we propose a computationally efficient Bayesian hierarchical spatio-temporal model in which the spatial dependence is approximated by a Gaussian Markov random field (GMRF) while the temporal correlation is described using a vector autoregressive model. By introducing an auxiliary lattice on the spatial region of interest, the proposed method is not only able to handle irregularly spaced observations in the spatial domain, but it is also able to bypass the missing data problem in a spatio-temporal process. Because the computational complexity of the proposed Markov chain Monte Carlo algorithm is of the order O(n) with n the total number of observations in space and time, our method can be used to handle very large spatio-temporal datasets with reasonable CPU times. The performance of the proposed model is illustrated using simulation studies and a dataset of precipitation data from the coterminous United States.
Standardization of GIS datasets for emergency preparedness of NPPs
International Nuclear Information System (INIS)
Saindane, Shashank S.; Suri, M.M.K.; Otari, Anil; Pradeepkumar, K.S.
2012-01-01
The probability of a major nuclear accident that could lead to a large-scale release of radioactivity into the environment is made extremely small by the incorporation of safety systems and the defence-in-depth philosophy. Nevertheless, emergency preparedness for the implementation of countermeasures to reduce the consequences is required for all major nuclear facilities. Iodine prophylaxis, sheltering, evacuation, etc. are protective measures to be implemented for members of the public in the unlikely event of any significant release from nuclear facilities. Bhabha Atomic Research Centre has developed a GIS-supported nuclear emergency preparedness program. Preparedness for response to nuclear emergencies needs geographical details of the affected locations, especially nuclear power plant sites and the nearby public domain. The geographical information system datasets that planners are looking for must have appropriate details in order to take decisions, mobilize resources in time and follow the standard operating procedures. Maps are 2-dimensional representations of our real world, and GIS makes it possible to manipulate large amounts of geo-spatially referenced data and convert them into information. This has become an integral part of nuclear emergency preparedness and response planning. The GIS dataset, consisting of layers such as village settlements, roads, hospitals, police stations, shelters etc., is standardized and effectively used during an emergency. The paper focuses on the need for standardization of GIS datasets, which in turn can be used as a tool to display and evaluate the impact of standoff distances and selected zones in community planning. It will also highlight the database specifications which will help in fast processing of data and analysis to derive useful and helpful information. GIS has the capability to store, manipulate, analyze and display the large amount of required spatial and tabular data. This study intends to carry out a proper response and preparedness
Current status of Plasmodium knowlesi vectors: a public health concern?
Vythilingam, I; Wong, M L; Wan-Yussof, W S
2018-01-01
Plasmodium knowlesi, a simian malaria parasite, is currently affecting humans in Southeast Asia. Malaysia has reported the largest number of cases, and P. knowlesi is the predominant species occurring in humans. The vectors of P. knowlesi belong to the Leucosphyrus group of Anopheles mosquitoes. These are generally described as forest-dwelling mosquitoes. With deforestation and changes in land use, some species have become predominant in farms and villages. However, knowledge of the distribution of these vectors in the country is sparse. From a public health point of view it is important to know the vectors, so that risk factors for knowlesi malaria can be identified and control measures instituted where possible. Here, we review what is known about the knowlesi malaria vectors and identify the gaps in knowledge, so that future studies can concentrate on this paucity of data in order to address this zoonotic problem.
Vector superconductivity in cosmic strings
International Nuclear Information System (INIS)
Dvali, G.R.; Mahajan, S.M.
1992-03-01
We argue that in most realistic cases, the usual Witten-type bosonic superconductivity of the cosmic string is automatically (independent of the existence of superconducting currents) accompanied by the condensation of charged gauge vector bosons in the core giving rise to a new vector type superconductivity. The value of the charged vector condensate is related with the charged scalar expectation value, and vanishes only if the latter goes to zero. The mechanism for the proposed vector superconductivity, differing fundamentally from those in the literature, is delineated using the simplest realistic example of the two Higgs doublet standard model interacting with the extra cosmic string. It is shown that for a wide range of parameters, for which the string becomes scalarly superconducting, W boson condensates (the sources of vector superconductivity) are necessarily excited. (author). 14 refs
Verification of target motion effects on SAR imagery using the Gotcha GMTI challenge dataset
Hack, Dan E.; Saville, Michael A.
2010-04-01
This paper investigates the relationship between a ground moving target's kinematic state and its SAR image. While effects such as cross-range offset, defocus, and smearing appear well understood, their derivations in the literature typically employ simplifications of the radar/target geometry and assume point scattering targets. This study adopts a geometrical model for understanding target motion effects in SAR imagery, termed the target migration path, and focuses on experimental verification of predicted motion effects using both simulated and empirical datasets based on the Gotcha GMTI challenge dataset. Specifically, moving target imagery is generated from three data sources: first, simulated phase history for a moving point target; second, simulated phase history for a moving vehicle derived from a simulated Mazda MPV X-band signature; and third, empirical phase history from the Gotcha GMTI challenge dataset. Both simulated target trajectories match the truth GPS target position history from the Gotcha GMTI challenge dataset, allowing direct comparison between all three imagery sets and the predicted target migration path. This paper concludes with a discussion of the parallels between the target migration path and the measurement model within a Kalman filtering framework, followed by conclusions.
Hawaii ESI: M_MAMPT (Marine Mammal Points)
National Oceanic and Atmospheric Administration, Department of Commerce — This data set contains sensitive biological resource data for endangered Hawaiian monk seal pupping and haul-out sites. Vector points in this data set represent...
American Samoa ESI: T_MAMPT (Terrestrial Mammal Points)
National Oceanic and Atmospheric Administration, Department of Commerce — This data set contains sensitive biological resource data for bats in American Samoa. Vector points in this data set represent bat roosts and caves. Species-specific...
Cook Inlet and Kenai Peninsula, Alaska ESI: VOLCANOS (Volcano Points)
National Oceanic and Atmospheric Administration, Department of Commerce — This data set contains the locations of volcanos in Cook Inlet and Kenai Peninsula, Alaska. Vector points in the data set represent the location of the volcanos....
Western Alaska ESI: SOCECON (Socioeconomic Resource Points and Lines)
National Oceanic and Atmospheric Administration, Department of Commerce — This data set contains human-use resource data for airports, mining sites, area boundaries, and scenic rivers in Western Alaska. Vector points and lines in this data...
Emerging vector borne diseases – incidence through vectors
Directory of Open Access Journals (Sweden)
Sara eSavic
2014-12-01
Vector-borne diseases used to be a major public health concern only in tropical and subtropical areas, but today they are an emerging threat for continental and developed countries as well. Nowadays, intercontinental countries struggle with emerging diseases that have found their way to appear through vectors. Vector-borne zoonotic diseases occur when vectors, animal hosts, climate conditions, pathogens and a susceptible human population exist at the same time, at the same place. Global climate change is predicted to lead to an increase in vector-borne infectious diseases and disease outbreaks. It could affect the range and population of pathogens, hosts and vectors, the transmission season, etc. Reliable surveillance for the diseases that are most likely to emerge is required. Canine vector-borne diseases represent a complex group of diseases including anaplasmosis, babesiosis, bartonellosis, borreliosis, dirofilariosis, ehrlichiosis and leishmaniosis. Some of these diseases cause serious clinical symptoms in dogs and some of them have zoonotic potential with an effect on public health. Veterinarians, in coordination with medical doctors, are expected to play a fundamental role first in the prevention and then in the treatment of vector-borne diseases in dogs. The One Health concept has to be integrated into the struggle against emerging diseases. During a four-year period, from 2009 to 2013, a total of 551 dog samples were analysed for vector-borne diseases (borreliosis, babesiosis, ehrlichiosis, anaplasmosis, dirofilariosis and leishmaniosis) in routine laboratory work. The analyses were done by serological tests: ELISA for borreliosis, dirofilariosis and leishmaniosis, the modified Knott test for dirofilariosis, and blood smears for babesiosis, ehrlichiosis and anaplasmosis. This number of samples represented 75% of the total number of samples sent for analysis for different diseases in dogs. Annually, on average, more than half of the samples
“Massless” vector field in de Sitter universe
Garidi, T.; Gazeau, J.-P.; Rouhani, S.; Takook, M. V.
2008-03-01
We proceed to the quantization of the massless vector field in the de Sitter (dS) space. This work is the natural continuation of a previous article devoted to the quantization of the dS massive vector field [J. P. Gazeau and M. V. Takook, J. Math. Phys. 41, 5920 (2000); T. Garidi et al., ibid. 43, 6379 (2002).] The term “massless” is used by reference to conformal invariance and propagation on the dS lightcone whereas “massive” refers to those dS fields which unambiguously contract to Minkowskian massive fields at zero curvature. Due to the combined occurrences of gauge invariance and indefinite metric, the covariant quantization of the massless vector field requires an indecomposable representation of the de Sitter group. We work with the gauge fixing corresponding to the simplest Gupta-Bleuler structure. The field operator is defined with the help of coordinate-independent de Sitter waves (the modes). The latter are simple to manipulate and most adapted to group theoretical approaches. The physical states characterized by the divergencelessness condition are, for instance, easy to identify. The whole construction is based on analyticity requirements in the complexified pseudo-Riemannian manifold for the modes and the two-point function.
“Massless” vector field in de Sitter universe
International Nuclear Information System (INIS)
Garidi, T.; Gazeau, J.-P.; Rouhani, S.; Takook, M. V.
2008-01-01
We proceed to the quantization of the massless vector field in the de Sitter (dS) space. This work is the natural continuation of a previous article devoted to the quantization of the dS massive vector field [J. P. Gazeau and M. V. Takook, J. Math. Phys. 41, 5920 (2000); T. Garidi et al., ibid. 43, 6379 (2002).] The term “massless” is used by reference to conformal invariance and propagation on the dS lightcone whereas “massive” refers to those dS fields which unambiguously contract to Minkowskian massive fields at zero curvature. Due to the combined occurrences of gauge invariance and indefinite metric, the covariant quantization of the massless vector field requires an indecomposable representation of the de Sitter group. We work with the gauge fixing corresponding to the simplest Gupta-Bleuler structure. The field operator is defined with the help of coordinate-independent de Sitter waves (the modes). The latter are simple to manipulate and most adapted to group theoretical approaches. The physical states characterized by the divergencelessness condition are, for instance, easy to identify. The whole construction is based on analyticity requirements in the complexified pseudo-Riemannian manifold for the modes and the two-point function
Relative Error Evaluation to Typical Open Global DEM Datasets in Shanxi Plateau of China
Zhao, S.; Zhang, S.; Cheng, W.
2018-04-01
Produced from radar data or stereo remote sensing image pairs, global DEM datasets are one of the most important types of DEM data. Relative error relates to the surface quality represented by DEM data, and hence to geomorphological and hydrological applications using DEM data. Taking the Shanxi Plateau of China as the study area, this research evaluated the relative error of typical open global DEM datasets, including Shuttle Radar Topography Mission (SRTM) data with 1 arc-second resolution (SRTM1), SRTM data with 3 arc-second resolution (SRTM3), ASTER global DEM data in its second version (GDEM-v2) and ALOS World 3D-30m (AW3D) data. Through processing and selection, more than 300,000 ICESat/GLA14 points were used as the GCP data, and the vertical error was computed and compared among the four typical global DEM datasets. Then, more than 2,600,000 ICESat/GLA14 point pairs were acquired using a distance threshold between 100 m and 500 m. Meanwhile, the horizontal distance between every point pair was computed, so the relative error was obtained as a slope value based on the vertical error difference and the horizontal distance of each point pair. Finally, a false slope ratio (FSR) index was computed by analyzing the difference between DEM and ICESat/GLA14 values for every point pair. Both the relative error and the FSR index were compared across the four DEM datasets under different slope classes. Research results show: overall, AW3D has the lowest relative error values in mean error, mean absolute error, root mean square error and standard deviation error; next comes the SRTM1 data, whose values are a little higher than those of AW3D; the SRTM3 and GDEM-v2 data have the highest relative error values, and the values for these two datasets are similar. Considering different slope conditions, all four DEM datasets perform better in flat areas and worse in sloping regions; AW3D has the best performance in all slope classes, a little better than SRTM1; with slope increasing
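The pairing and slope computation can be sketched directly; the coordinates and errors below are placeholders, with dz standing for the vertical error (DEM minus ICESat/GLA14 height) at each GCP.

    import numpy as np
    from scipy.spatial import cKDTree

    xy = np.random.rand(1000, 2) * 5000   # planimetric coordinates (m)
    dz = np.random.randn(1000)            # vertical error per point

    tree = cKDTree(xy)
    pairs = np.array(sorted(tree.query_pairs(r=500.0)))  # <= 500 m apart
    dist = np.linalg.norm(xy[pairs[:, 0]] - xy[pairs[:, 1]], axis=1)
    keep = dist >= 100.0                  # enforce the 100-500 m window
    # Relative error: vertical-error difference over horizontal distance.
    rel_err = (dz[pairs[keep, 0]] - dz[pairs[keep, 1]]) / dist[keep]
    print(np.mean(np.abs(rel_err)), np.sqrt(np.mean(rel_err**2)))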
The charge form factor of the neutron from ²H⃗(e⃗,e′n)p
Passchier, I; Szczerba, D; Alarcon, R; Bauer, T S; Boersma, D J; Van der Brand, J F J; Bulten, H J; Ferro-Luzzi, M; Higinbotham, D W; Jager, C W D; Klous, S; Kolster, H; Lang, J; Nikolenko, D M; Nooren, G J; Norum, B E; Poolman, H R; Rachek, Igor A; Simani, M C; Six, E; Vries, H D; Wang, K; Zhou, Z L
2000-01-01
We report on the first measurement of spin-correlation parameters in quasifree electron scattering from vector-polarized deuterium. Polarized electrons were injected into an electron storage ring at a beam energy of 720 MeV. A Siberian snake was employed to preserve longitudinal polarization at the interaction point. Vector-polarized deuterium was produced by an atomic beam source and injected into an open-ended cylindrical cell, internal to the electron storage ring. The spin correlation parameter A^V_ed was measured for the reaction ²H⃗(e⃗,e′n)p at a four-momentum transfer squared of 0.21 (GeV/c)², from which a value for the charge form factor of the neutron was extracted.
The Dataset of Countries at Risk of Electoral Violence
Birch, Sarah; Muchlinski, David
2017-01-01
Electoral violence is increasingly affecting elections around the world, yet researchers have been limited by a paucity of granular data on this phenomenon. This paper introduces and describes a new dataset of electoral violence – the Dataset of Countries at Risk of Electoral Violence (CREV) – that provides measures of 10 different types of electoral violence across 642 elections held around the globe between 1995 and 2013. The paper provides a detailed account of how and why the dataset was ...
Norwegian Hydrological Reference Dataset for Climate Change Studies
Energy Technology Data Exchange (ETDEWEB)
Magnussen, Inger Helene; Killingland, Magnus; Spilde, Dag
2012-07-01
Based on the Norwegian hydrological measurement network, NVE has selected a Hydrological Reference Dataset for studies of hydrological change. The dataset meets international standards with high data quality. It is suitable for monitoring and studying the effects of climate change on the hydrosphere and cryosphere in Norway. The dataset includes streamflow, groundwater, snow, glacier mass balance and length change, lake ice and water temperature in rivers and lakes.(Author)
Sistem Deteksi Retinopati Diabetik Menggunakan Support Vector Machine
Directory of Open Access Journals (Sweden)
Wahyudi Setiawan
2014-02-01
Diabetic retinopathy is a complication of diabetes mellitus. It can lead to blindness if not treated as early as possible. The system created in this thesis detects the level of diabetic retinopathy from fundus photograph images. There are three main steps to resolve the problem: preprocessing, feature extraction and classification. The preprocessing methods used in this system are grayscale green channel, Gaussian filter, contrast-limited adaptive histogram equalization and masking. Two-dimensional linear discriminant analysis (2DLDA) is used for feature extraction. A support vector machine (SVM) is used for classification. Testing was performed on the MESSIDOR dataset, with varying numbers of images used for the training phase and the remainder used for the testing phase. Test results show an optimal accuracy of 84%. Keywords: Diabetic Retinopathy, Support Vector Machine, Two Dimensional Linear Discriminant Analysis, MESSIDOR
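The preprocessing chain maps naturally onto OpenCV; the kernel size, CLAHE parameters and circular field-of-view mask below are assumptions, not values from the thesis.

    import cv2
    import numpy as np

    def preprocess_fundus(path):
        img = cv2.imread(path)
        green = img[:, :, 1]                    # green channel (BGR order)
        blurred = cv2.GaussianBlur(green, (5, 5), 0)
        clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
        enhanced = clahe.apply(blurred)         # contrast-limited AHE
        mask = np.zeros_like(enhanced)          # mask off the dark border
        h, w = mask.shape
        cv2.circle(mask, (w // 2, h // 2), min(h, w) // 2, 255, -1)
        return cv2.bitwise_and(enhanced, mask)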
Support vector machine for the diagnosis of malignant mesothelioma
Ushasukhanya, S.; Nithyakalyani, A.; Sivakumar, V.
2018-04-01
Malignant mesothelioma is a disease in which malignant (cancer) cells form in the lining of the chest or abdomen. Exposure to asbestos can affect the risk of malignant mesothelioma. Signs and symptoms of malignant mesothelioma include shortness of breath and pain under the rib cage. Tests that examine the inside of the chest and abdomen are used to detect (find) and diagnose malignant mesothelioma. Certain factors affect prognosis (chance of recovery) and treatment options. In this review, support vector machine (SVM) classifiers were used for mesothelioma disease diagnosis. SVM outputs were compared, concentrating on mesothelioma disease and its findings, using the same dataset. The support vector machine algorithm gives 92.5% accuracy, obtained via 3-fold cross-validation. The mesothelioma disease dataset was taken from institutional reports from Turkey.
Modeling and prediction of flotation performance using support vector regression
Directory of Open Access Journals (Sweden)
Despotović Vladimir
2017-01-01
Continuous efforts have been made in recent years to improve the process of paper recycling, as it is of critical importance for saving wood, water and energy resources. Flotation deinking is considered to be one of the key methods for separating ink particles from cellulose fibres. Attempts to model the flotation deinking process have often resulted in complex models that are difficult to implement and use. In this paper a model for prediction of flotation performance based on support vector regression (SVR) is presented. Representative data samples were created in the laboratory under a variety of practical control variables for the flotation deinking process, including different reagents, pH values and flotation residence times. A predictive model was trained on these data samples and the flotation performance was assessed, showing that support vector regression is a promising method even when the dataset used for training the model is limited.
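A minimal SVR sketch under the stated setup, with placeholder arrays standing in for the laboratory samples (reagent dosage, pH, residence time) and the measured deinking performance; the kernel and hyperparameters are illustrative.

    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVR

    X = np.random.rand(60, 3)   # placeholder control variables per trial
    y = np.random.rand(60)      # placeholder flotation performance

    model = make_pipeline(StandardScaler(),
                          SVR(kernel='rbf', C=10.0, epsilon=0.01))
    model.fit(X, y)
    prediction = model.predict([[0.4, 0.6, 0.2]])  # new operating point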
Public Availability to ECS Collected Datasets
Henderson, J. F.; Warnken, R.; McLean, S. J.; Lim, E.; Varner, J. D.
2013-12-01
Coastal nations have spent considerable resources exploring the limits of their extended continental shelf (ECS) beyond 200 nm. Although these studies are funded to fulfill requirements of the UN Convention on the Law of the Sea, the investments are producing new datasets in frontier areas of Earth's oceans that will be used to understand, explore, and manage the seafloor and sub-seafloor for decades to come. Although many of these datasets are considered proprietary until a nation's potential ECS has become 'final and binding', an increasing amount of data is being released and utilized by the public. Datasets include multibeam, seismic reflection/refraction, bottom sampling, and geophysical data. The U.S. ECS Project, a multi-agency collaboration whose mission is to establish the full extent of the continental shelf of the United States consistent with international law, relies heavily on data and accurate, standard metadata. The United States has made it a priority to make available to the public all data collected with ECS funding as quickly as possible. The National Oceanic and Atmospheric Administration's (NOAA) National Geophysical Data Center (NGDC) supports this objective by partnering with academia and other federal government mapping agencies to archive, inventory, and deliver marine mapping data in a coordinated, consistent manner. This includes ensuring quality, standard metadata and developing and maintaining data delivery capabilities built on modern digital data archives. Other countries, such as Ireland, have submitted their ECS data for public availability and many others have made pledges to participate in the future. The data services provided by NGDC support the U.S. ECS effort as well as many developing nations' ECS efforts through the U.N. Environmental Program. Modern discovery, visualization, and delivery of scientific data and derived products that span national and international sources of data ensure the greatest re-use of data and
BIA Indian Lands Dataset (Indian Lands of the United States)
Federal Geographic Data Committee — The American Indian Reservations / Federally Recognized Tribal Entities dataset depicts feature location, selected demographics and other associated data for the 561...
Framework for Interactive Parallel Dataset Analysis on the Grid
Energy Technology Data Exchange (ETDEWEB)
Alexander, David A.; Ananthan, Balamurali; /Tech-X Corp.; Johnson, Tony; Serbo, Victor; /SLAC
2007-01-10
We present a framework for use at a typical Grid site to facilitate custom interactive parallel dataset analysis targeting terabyte-scale datasets of the type typically produced by large multi-institutional science experiments. We summarize the needs for interactive analysis and show a prototype solution that satisfies those needs. The solution consists of desktop client tool and a set of Web Services that allow scientists to sign onto a Grid site, compose analysis script code to carry out physics analysis on datasets, distribute the code and datasets to worker nodes, collect the results back to the client, and to construct professional-quality visualizations of the results.
Socioeconomic Data and Applications Center (SEDAC) Treaty Status Dataset
National Aeronautics and Space Administration — The Socioeconomic Data and Application Center (SEDAC) Treaty Status Dataset contains comprehensive treaty information for multilateral environmental agreements,...
Efficient morse decompositions of vector fields.
Chen, Guoning; Mischaikow, Konstantin; Laramee, Robert S; Zhang, Eugene
2008-01-01
Existing topology-based vector field analysis techniques rely on the ability to extract individual trajectories such as fixed points, periodic orbits, and separatrices, which are sensitive to noise and to errors introduced by simulation and interpolation. This can make such vector field analysis unsuitable for rigorous interpretations. We advocate the use of Morse decompositions, which are robust with respect to perturbations, to encode the topological structures of a vector field in the form of a directed graph, called a Morse connection graph (MCG). While an MCG exists for every vector field, it need not be unique. Previous techniques for computing MCGs, while fast, are overly conservative and usually result in MCGs that are too coarse to be useful for applications. To address this issue, we present a new technique for performing Morse decomposition based on the concept of tau-maps, which typically provides finer MCGs than existing techniques. Furthermore, the choice of tau provides a natural tradeoff between the fineness of the MCGs and the computational cost. We provide efficient implementations of Morse decomposition based on tau-maps, which include the use of forward and backward mapping techniques and an adaptive approach to constructing better approximations of the images of the triangles in the meshes used for simulation. Furthermore, we propose the use of spatial tau-maps in addition to the original temporal tau-maps. These techniques provide additional trade-offs between the quality of the MCGs and the speed of computation. We demonstrate the utility of our technique with various examples in the plane and on surfaces, including engine simulation datasets.
International Nuclear Information System (INIS)
Nonomiya, Iwao; Ishiguro, Misako; Tsutsui, Tsuneo
1990-07-01
In this report, we describe the vectorization of the two-dimensional Sn-method radiation transport code DOT3.5. The vectorized codes include not only the NEA original version developed at ORNL but also the versions improved by JAERI: the DOT3.5 FNS version for fusion neutronics analyses, the DOT3.5 FER version for fusion reactor design, and the ESPRIT module of the RADHEAT-V4 code system for radiation shielding and radiation transport analyses. In DOT3.5, input/output processing time amounts to a great part of the elapsed time when a large number of energy groups and/or a large number of spatial mesh points are used in the problem being calculated. Therefore, an improvement has been made to speed up input/output processing in the DOT3.5 FNS version and the DOT-DD (Double Differential cross section) code. The total speedup ratio of the vectorized version to the original scalar one is 1.7–1.9 for the DOT3.5 NEA version, 2.2–2.3 for the DOT3.5 FNS version, 1.7 for the DOT3.5 FER version, and 3.1–4.4 for RADHEAT-V4, respectively. The elapsed times for the improved DOT3.5 FNS version and DOT-DD are reduced to 50–65% of those of the original version by the input/output speedup. In this report, we describe a summary of the codes, the techniques used for the vectorization and input/output speedup, verification of the computed results, and the speedup effect. (author)
Enhancing poxvirus vectors vaccine immunogenicity.
García-Arriaza, Juan; Esteban, Mariano
2014-01-01
Attenuated recombinant poxvirus vectors expressing heterologous antigens from pathogens are currently at various stages in clinical trials with the aim to establish their efficacy. This is because these vectors have shown excellent safety profiles, significant immunogenicity against foreign expressed antigens, and are able to induce protective immune responses. In view of the limited efficacy triggered by some poxvirus strains used in clinical trials (i.e., ALVAC in the RV144 phase III clinical trial for HIV), and of the restrictive replication capacity of the highly attenuated vectors like MVA and NYVAC, there is a consensus that further improvements of these vectors should be pursued. In this review we consider several strategies that are currently being implemented, as well as new approaches, to improve the immunogenicity of the poxvirus vectors. This includes heterologous prime/boost protocols, use of co-stimulatory molecules, deletion of viral immunomodulatory genes still present in the poxvirus genome, enhancing virus promoter strength, enhancing vector replication capacity, optimizing expression of foreign heterologous sequences, and the combined use of adjuvants. An optimized poxvirus vector triggering long-lasting immunity with a high protective efficacy against a selected disease should be sought.
Stable piecewise polynomial vector fields
Directory of Open Access Journals (Sweden)
Claudio Pessoa
2012-09-01
Full Text Available Let $N=\{y>0\}$ and $S=\{y<0\}$ be the semi-planes of $\mathbb{R}^2$ having as common boundary the line $D=\{y=0\}$. Let $X$ and $Y$ be polynomial vector fields defined in $N$ and $S$, respectively, leading to a discontinuous piecewise polynomial vector field $Z=(X,Y)$. This work pursues the stability and the transition analysis of solutions of $Z$ between $N$ and $S$, started by Filippov (1988) and Kozlova (1984) and reformulated by Sotomayor-Teixeira (1995) in terms of the regularization method. This method consists in analyzing a one parameter family of continuous vector fields $Z_{\epsilon}$, defined by averaging $X$ and $Y$. This family approaches $Z$ when the parameter goes to zero. The results of Sotomayor-Teixeira and Sotomayor-Machado (2002) providing conditions on $(X,Y)$ for the regularized vector fields to be structurally stable on planar compact connected regions are extended to discontinuous piecewise polynomial vector fields on $\mathbb{R}^2$. Pertinent genericity results for vector fields satisfying the above stability conditions are also extended to the present case. A procedure for the study of discontinuous piecewise vector fields at infinity through a compactification is proposed here.
Chikungunya Virus–Vector Interactions
Directory of Open Access Journals (Sweden)
Lark L. Coffey
2014-11-01
Full Text Available Chikungunya virus (CHIKV) is a mosquito-borne alphavirus that causes chikungunya fever, a severe, debilitating disease that often produces chronic arthralgia. Since 2004, CHIKV has emerged in Africa, Indian Ocean islands, Asia, Europe, and the Americas, causing millions of human infections. Central to understanding CHIKV emergence is knowledge of the natural ecology of transmission and vector infection dynamics. This review presents current understanding of CHIKV infection dynamics in mosquito vectors and its relationship to human disease emergence. The following topics are reviewed: CHIKV infection and vector life history traits including transmission cycles, genetic origins, distribution, emergence and spread, dispersal, vector competence, vector immunity and microbial interactions, and co-infection by CHIKV and other arboviruses. The genetics of vector susceptibility and host range changes, population heterogeneity and selection for the fittest viral genomes, dual host cycling and its impact on CHIKV adaptation, viral bottlenecks and intrahost diversity, and adaptive constraints on CHIKV evolution are also discussed. The potential for CHIKV re-emergence and expansion into new areas and prospects for prevention via vector control are also briefly reviewed.
Covariance estimation in Terms of Stokes Parameters with Application to Vector Sensor Imaging
2016-12-15
A vector sensor (example shown in Figure 1) measures the electromagnetic field at a single point using three orthogonal dipole elements and three orthogonal loop elements with a common phase center, measuring the complete electromagnetic field as a six-element vector. (Figure 1: the Atom antenna [1], an electromagnetic vector sensor.) Sponsored by the Assistant Secretary of Defense for Research and Engineering.
Introducing a Web API for Dataset Submission into a NASA Earth Science Data Center
Moroni, D. F.; Quach, N.; Francis-Curley, W.
2016-12-01
As the landscape of data becomes increasingly more diverse in the domain of Earth Science, the challenges of managing and preserving data become more onerous and complex, particularly for data centers on fixed budgets and limited staff. Many solutions already exist to ease the cost burden for the downstream component of the data lifecycle, yet most archive centers are still racing to keep up with the influx of new data that still needs to find a quasi-permanent resting place. For instance, having well-defined metadata that is consistent across the entire data landscape provides for well-managed and preserved datasets throughout the latter end of the data lifecycle. Translators between different metadata dialects are already in operational use, and facilitate keeping older datasets relevant in today's world of rapidly evolving metadata standards. However, very little is done to address the first phase of the lifecycle, which deals with the entry of both data and the corresponding metadata into a system that is traditionally opaque and closed off to external data producers, thus resulting in a significant bottleneck to the dataset submission process. The ATRAC system was the NOAA NCEI's answer to this previously obfuscated barrier to scientists wishing to find a home for their climate data records, providing a web-based entry point to submit timely and accurate metadata and information about a very specific dataset. A couple of NASA's Distributed Active Archive Centers (DAACs) have implemented their own versions of a web-based dataset and metadata submission form including the ASDC and the ORNL DAAC. The Physical Oceanography DAAC is the most recent in the list of NASA-operated DAACs who have begun to offer their own web-based dataset and metadata submission services to data producers. What makes the PO.DAAC dataset and metadata submission service stand out from these pre-existing services is the option of utilizing both a web browser GUI and a RESTful API to
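As a purely illustrative sketch of what dataset submission through such a RESTful API might look like from the data producer's side (the endpoint URL, metadata fields, and token below are hypothetical placeholders and do not describe PO.DAAC's actual interface):

```python
import requests

SUBMIT_URL = "https://example.org/dataset-submission/api/v1/datasets"  # hypothetical

metadata = {
    "shortName": "SEA_SURFACE_TEMP_L4_EXAMPLE",  # hypothetical dataset record
    "title": "Example L4 SST Analysis",
    "processingLevel": "4",
    "spatialCoverage": {"west": -180, "east": 180, "south": -90, "north": 90},
}

# Submit the metadata record and read back the identifier assigned by the archive.
resp = requests.post(SUBMIT_URL, json=metadata,
                     headers={"Authorization": "Bearer <token>"}, timeout=30)
resp.raise_for_status()
print("submission id:", resp.json().get("id"))
```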
Structural dataset for the PPARγ V290M mutant
Directory of Open Access Journals (Sweden)
Ana C. Puhl
2016-06-01
Full Text Available Loss-of-function mutation V290M in the ligand-binding domain of the peroxisome proliferator-activated receptor γ (PPARγ) is associated with a ligand resistance syndrome (PLRS), characterized by partial lipodystrophy and severe insulin resistance. In this data article we discuss an X-ray diffraction dataset that yielded the structure of the PPARγ LBD V290M mutant refined at 2.3 Å resolution, which allowed building of a 3D model of the receptor mutant with high confidence and revealed continuous, well-defined electron density for the partial agonist diclofenac bound to the hydrophobic pocket of PPARγ. These structural data provide significant insights into the molecular basis of PLRS caused by the V290M mutation and are correlated with the receptor's impaired rosiglitazone binding and increased affinity for corepressors. Furthermore, our structural evidence helps to explain clinical observations which point to a failure to restore receptor function by treatment with rosiglitazone, a full agonist of PPARγ.
Scalable and portable visualization of large atomistic datasets
Sharma, Ashish; Kalia, Rajiv K.; Nakano, Aiichiro; Vashishta, Priya
2004-10-01
A scalable and portable code named Atomsviewer has been developed to interactively visualize a large atomistic dataset consisting of up to a billion atoms. The code uses a hierarchical view frustum-culling algorithm based on the octree data structure to efficiently remove atoms outside of the user's field-of-view. Probabilistic and depth-based occlusion-culling algorithms then select atoms which have a high probability of being visible. Finally a multiresolution algorithm is used to render the selected subset of visible atoms at varying levels of detail. Atomsviewer is written in C++ and OpenGL, and it has been tested on a number of architectures including Windows, Macintosh, and SGI. Atomsviewer has been used to visualize tens of millions of atoms on a standard desktop computer and, in its parallel version, up to a billion atoms. Program summary: Title of program: Atomsviewer. Catalogue identifier: ADUM. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADUM. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Computer for which the program is designed and others on which it has been tested: 2.4 GHz Pentium 4/Xeon processor, professional graphics card; Apple G4 (867 MHz)/G5, professional graphics card. Operating systems under which the program has been tested: Windows 2000/XP, Mac OS 10.2/10.3, SGI IRIX 6.5. Programming languages used: C++, C and OpenGL. Memory required to execute with typical data: 1 gigabyte of RAM. High speed storage required: 60 gigabytes. No. of lines in the distributed program including test data, etc.: 550 241. No. of bytes in the distributed program including test data, etc.: 6 258 245. Number of bits in a word: Arbitrary. Number of processors used: 1. Has the code been vectorized or parallelized: No. Distribution format: tar gzip file. Nature of physical problem: Scientific visualization of atomic systems. Method of solution: Rendering of atoms using computer graphic techniques, culling algorithms for data
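The first culling stage can be sketched compactly. The following is a simplified illustration of hierarchical view-frustum culling in the spirit described above (bounding spheres tested against frustum planes; not Atomsviewer's actual code, and the octree node layout is an assumption):

```python
import numpy as np

def sphere_in_frustum(center, radius, planes):
    """planes: (k, 4) array of plane equations (nx, ny, nz, d), normals
    pointing into the frustum; a sphere is culled only if it lies fully
    outside some plane."""
    return all(np.dot(p[:3], center) + p[3] >= -radius for p in planes)

def cull(node, planes, visible):
    """node: dict with 'center', 'radius', 'atoms', 'children'
    (an assumed minimal octree record)."""
    if not sphere_in_frustum(node["center"], node["radius"], planes):
        return  # the whole subtree is off-screen; skip all its atoms at once
    if not node["children"]:
        visible.extend(node["atoms"])
    else:
        for child in node["children"]:
            cull(child, planes, visible)
```

The payoff is the early return: one sphere test discards an entire subtree, so work scales with the visible portion of the dataset rather than its total size.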
Process for structural geologic analysis of topography and point data
Eliason, Jay R.; Eliason, Valerie L. C.
1987-01-01
A quantitative method of geologic structural analysis of digital terrain data is described for implementation on a computer. Assuming selected valley segments are controlled by the underlying geologic structure, topographic lows in the terrain data, defining valley bottoms, are detected, filtered and accumulated into a series of line segments defining contiguous valleys. The line segments are then vectorized to produce vector segments, defining valley segments, which may be indicative of the underlying geologic structure. Coplanar analysis is performed on vector segment pairs to determine which vectors produce planes which represent underlying geologic structure. Point data such as fracture phenomena which can be related to fracture planes in 3-dimensional space can be analyzed to define common plane orientations and locations. The vectors, points, and planes are displayed in various formats for interpretation.
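The coplanar analysis step has a compact linear-algebra core. A minimal sketch, assuming each valley segment has been reduced to a base point and a direction vector:

```python
import numpy as np

def common_plane(p1, d1, p2, d2, tol=1e-3):
    """p*, d*: 3D base points and direction vectors of two valley segments.
    Returns (unit normal, offset) of their common plane if coplanar, else None."""
    n = np.cross(d1, d2)
    if np.linalg.norm(n) < tol:        # parallel segments: no unique plane
        return None
    n = n / np.linalg.norm(n)
    if abs(np.dot(p2 - p1, n)) > tol:  # connecting vector off-plane: segments are skew
        return None
    return n, np.dot(n, p1)            # plane equation: n . x = offset
```

Strike and dip of the candidate structure follow directly from the returned normal.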
Emerging Vector-Borne Diseases - Incidence through Vectors.
Savić, Sara; Vidić, Branka; Grgić, Zivoslav; Potkonjak, Aleksandar; Spasojevic, Ljubica
2014-01-01
Vector-borne diseases used to be a major public health concern only in tropical and subtropical areas, but today they are an emerging threat for continental and developed countries as well. Nowadays these countries also struggle with emerging diseases, which have found their way in through vectors. Vector-borne zoonotic diseases occur when vectors, animal hosts, climate conditions, pathogens, and a susceptible human population exist at the same time, at the same place. Global climate change is predicted to lead to an increase in vector-borne infectious diseases and disease outbreaks. It could affect the range and population of pathogens, hosts and vectors, the transmission season, etc. Reliable surveillance for the diseases that are most likely to emerge is required. Canine vector-borne diseases represent a complex group of diseases including anaplasmosis, babesiosis, bartonellosis, borreliosis, dirofilariosis, ehrlichiosis, and leishmaniosis. Some of these diseases cause serious clinical symptoms in dogs and some of them have zoonotic potential with an effect on public health. Veterinarians, in coordination with medical doctors, are expected to play a fundamental role in first the prevention and then the treatment of vector-borne diseases in dogs. The One Health concept has to be integrated into the struggle against emerging diseases. During a 4-year period, from 2009 to 2013, a total of 551 dog samples were analyzed for vector-borne diseases (borreliosis, babesiosis, ehrlichiosis, anaplasmosis, dirofilariosis, and leishmaniasis) in routine laboratory work. The analysis was done by serological tests - ELISA for borreliosis, dirofilariosis, and leishmaniasis, the modified Knott test for dirofilariosis, and blood smears for babesiosis, ehrlichiosis, and anaplasmosis. This number of samples represented 75% of the total number of samples that were sent for analysis for different diseases in dogs. Annually, on average more than half of the samples
DEFF Research Database (Denmark)
Davis, Christopher James; Kedlaya, Kiran
2014-01-01
We study the kernel and cokernel of the Frobenius map on the p-typical Witt vectors of a commutative ring, not necessarily of characteristic p. We give many equivalent conditions to surjectivity of the Frobenius map on both finite and infinite length Witt vectors. In particular, surjectivity...... on finite Witt vectors turns out to be stable under certain integral extensions; this provides a clean formulation of a strong generalization of Faltings’s almost purity theorem from p-adic Hodge theory, incorporating recent improvements by Kedlaya–Liu and by Scholze....
Vector boson scattering at CLIC
Energy Technology Data Exchange (ETDEWEB)
Kilian, Wolfgang; Fleper, Christian [Department Physik, Universitaet Siegen, 57068 Siegen (Germany); Reuter, Juergen [DESY Theory Group, 22603 Hamburg (Germany); Sekulla, Marco [Institut fuer Theoretische Physik, Karlsruher Institut fuer Technologie, 76131 Karlsruhe (Germany)
2016-07-01
Linear colliders operating in the multi-TeV range are able to investigate the details of vector boson scattering and electroweak symmetry breaking. We calculate cross sections with the Monte Carlo generator WHIZARD for vector boson scattering processes at the future linear e⁺e⁻ collider CLIC. By finding suitable cuts, the vector boson scattering signal processes are isolated from the background. Finally, we are able to determine exclusion sensitivities on the non-Standard-Model parameters of the relevant dimension-eight operators.
Vector control of induction machines
Robyns, Benoit
2012-01-01
After a brief introduction to the main laws of physics and fundamental concepts inherent in electromechanical conversion, "Vector Control of Induction Machines" introduces the standard mathematical models for induction machines - whichever rotor technology is used - as well as several squirrel-cage induction machine vector-control strategies. The use of causal ordering graphs allows systematization of the design stage, as well as standardization of the structure of control devices. "Vector Control of Induction Machines" suggests a unique approach aimed at reducing parameter sensitivity for
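The mathematical heart of such vector-control strategies is the pair of coordinate transforms that map three phase currents into a rotating two-axis frame where flux and torque can be controlled separately. A minimal sketch, assuming the amplitude-invariant scaling convention:

```python
import numpy as np

def clarke(ia, ib, ic):
    """abc phase currents -> stationary alpha-beta frame (amplitude-invariant)."""
    alpha = (2.0 / 3.0) * (ia - 0.5 * ib - 0.5 * ic)
    beta = (2.0 / 3.0) * (np.sqrt(3) / 2.0) * (ib - ic)
    return alpha, beta

def park(alpha, beta, theta):
    """alpha-beta -> rotating d-q frame at rotor flux angle theta."""
    d = alpha * np.cos(theta) + beta * np.sin(theta)
    q = -alpha * np.sin(theta) + beta * np.cos(theta)
    return d, q

# Example: a balanced three-phase current at angle 0.3 rad maps to a constant d-q pair.
theta = 0.3
ia, ib, ic = (np.cos(theta + k * 2 * np.pi / 3) for k in (0, -1, 1))
print(park(*clarke(ia, ib, ic), theta))   # ~ (1.0, 0.0)
```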
Vectors of rickettsiae in Africa.
Bitam, Idir
2012-12-01
Vector-borne diseases are caused by parasites, bacteria, or viruses transmitted by the bites of hematophagous arthropods. In Africa, there has been a recent emergence of new diseases and the re-emergence of existing diseases, usually with changes in disease epidemiology (e.g., geographical distribution, prevalence, and pathogenicity). In Africa, rickettsioses are recognized as important emerging vector-borne infections in humans. Rickettsial diseases are transmitted by different types of arthropods, ticks, fleas, lice, and mites. This review will examine the roles of these different arthropod vectors and their geographical distributions. Copyright © 2012 Elsevier GmbH. All rights reserved.
Recommendation on vectors and vector-transmitted diseases
Netherlands Food and Consumer Product Safety Authority
2009-01-01
In view of their increasing risk of introduction and their possible implications in causing major disease outbreaks, vectors, as well as vector-transmitted diseases like dengue, West Nile disease, Lyme disease and bluetongue need to be recognised as a threat to public and animal health and to the economy, also in the Netherlands. There has been an increase in the incidence of these diseases in the past two to three decades. Climate changes and changes in the use of land, water managemen...
Vector independent transmission of the vector-borne bluetongue virus.
van der Sluijs, Mirjam Tineke Willemijn; de Smit, Abraham J; Moormann, Rob J M
2016-01-01
Bluetongue is an economically important disease of ruminants. The causative agent, Bluetongue virus (BTV), is mainly transmitted by insect vectors. This review focuses on vector-free BTV transmission, and its epizootic and economic consequences. Vector-free transmission can either be vertical, from dam to fetus, or horizontal via direct contact. For several BTV serotypes, vertical (transplacental) transmission has been described, resulting in severe congenital malformations. Transplacental transmission has been mainly associated with live vaccine strains. Yet, the European BTV-8 strain demonstrated a high incidence of transplacental transmission in natural circumstances. The relevance of transplacental transmission for the epizootiology is considered limited, especially in enzootic areas. However, transplacental transmission can have a substantial economic impact due to the loss of progeny. Inactivated vaccines have been demonstrated to prevent transplacental transmission. Vector-free horizontal transmission has also been demonstrated. Since direct horizontal transmission requires close contact between animals, it is considered relevant only for within-farm spreading of BTV. The genetic determinants which enable vector-free transmission are present in virus strains circulating in the field. More research into the genetic changes which enable vector-free transmission is essential to better evaluate the risks associated with outbreaks of new BTV serotypes and to design more appropriate control measures.
Some BMO estimates for vector-valued multilinear singular integral ...
Indian Academy of Sciences (India)
the multilinear operator related to some singular integral operators is obtained. The main purpose of this paper is to establish the BMO end-point estimates for some vector-valued multilinear operators related to certain singular integral operators. First, let us introduce some notations [10,16]. Throughout this paper, Q = Q(x,r).
Relevance Vector Machine for Prediction of Soil Properties | Samui ...
African Journals Online (AJOL)
One of the first, most important steps in geotechnical engineering is site characterization. The ultimate goal of site characterization is to predict the in-situ soil properties at any half-space point for a site based on limited number of tests and data. In the present study, relevance vector machine (RVM) has been used to develop ...
Almost purity for overconvergent Witt vectors
DEFF Research Database (Denmark)
Davis, Christopher James; Kedlaya, Kiran
2015-01-01
. Here, we use almost purity to lift the finite étale extension of R[p−1] to a finite étale extension of rings of overconvergent Witt vectors. The point is that no hypothesis of p-adic completeness is needed; this result thus points towards potential global analogues of p-adic Hodge theory....... As an illustration, we construct (φ,Γ)-modules associated with Artin Motives over Q. The (φ,Γ)-modules we construct are defined over a base ring which seems well-suited to generalization to a more global setting; we plan to pursue such generalizations in later work....
Locally analytic vectors in representations of locally p-adic analytic groups
Emerton, Matthew J
2017-01-01
The goal of this memoir is to provide the foundations for the locally analytic representation theory that is required in three of the author's other papers on this topic. In the course of writing those papers the author found it useful to adopt a particular point of view on locally analytic representation theory: namely, regarding a locally analytic representation as being the inductive limit of its subspaces of analytic vectors (of various "radii of analyticity"). The author uses the analysis of these subspaces as one of the basic tools in his study of such representations. Thus in this memoir he presents a development of locally analytic representation theory built around this point of view. The author has made a deliberate effort to keep the exposition reasonably self-contained and hopes that this will be of some benefit to the reader.
Annotating spatio-temporal datasets for meaningful analysis in the Web
Stasch, Christoph; Pebesma, Edzer; Scheider, Simon
2014-05-01
More and more environmental datasets that vary in space and time are available on the Web. This comes along with the advantage of using the data for purposes other than those originally foreseen, but also with the danger that users may apply inappropriate analysis procedures due to a lack of important assumptions made during the data collection process. In order to guide towards a meaningful (statistical) analysis of spatio-temporal datasets available on the Web, we have developed a Higher-Order-Logic formalism that captures some relevant assumptions in our previous work [1]. It allows one to prove, in a semi-automated fashion, whether spatial prediction and aggregation are meaningful. In this poster presentation, we present a concept for annotating spatio-temporal datasets available on the Web with concepts defined in our formalism. To this end, we have defined a subset of the formalism as a Web Ontology Language (OWL) pattern. It allows capturing the distinction between the different spatio-temporal variable types, i.e. point patterns, fields, lattices and trajectories, that in turn determine whether a particular dataset can be interpolated or aggregated in a meaningful way using a certain procedure. The actual annotations that link spatio-temporal datasets with the concepts in the ontology pattern are provided as Linked Data. In order to allow data producers to add the annotations to their datasets, we have implemented a Web portal that uses a triple store at the backend to store the annotations and to make them available in the Linked Data cloud. Furthermore, we have implemented functions in the statistical environment R to retrieve the RDF annotations and, based on these annotations, to support a stronger typing of spatio-temporal datatypes, guiding towards a meaningful analysis in R. [1] Stasch, C., Scheider, S., Pebesma, E., Kuhn, W. (2014): "Meaningful spatial prediction and aggregation", Environmental Modelling & Software, 51, 149-165.
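A hypothetical sketch of what such an annotation could look like with rdflib; the vocabulary namespace and dataset URI below are placeholders, not the ontology pattern's real identifiers:

```python
from rdflib import Graph, Namespace, URIRef, RDF

MEAN = Namespace("http://example.org/meaningful-analysis#")   # hypothetical vocabulary
g = Graph()
dataset = URIRef("http://example.org/datasets/pm10-stations")  # hypothetical dataset URI

# Declare the dataset to be a (continuous) field: fields may be meaningfully
# interpolated, which is exactly what this annotation is meant to license.
g.add((dataset, RDF.type, MEAN.Field))
g.add((dataset, MEAN.meaningfulProcedure, MEAN.SpatialInterpolation))

print(g.serialize(format="turtle"))
```

A point-pattern dataset would instead carry an annotation that licenses aggregation (e.g. intensity estimation) but not interpolation, which is the distinction the formalism enforces.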
Vectorization at the KENO-IV code
International Nuclear Information System (INIS)
Asai, K.; Higuchi, K.; Katakura, J.
1986-01-01
The multigroup criticality safety code KENO-IV has been vectorized and tested on the FACOM VP-100 vector processor. At first, the vectorized KENO-IV was slower than the original one on a scalar processor by a factor of 1.4 because of the overhead introduced by vectorization. After modifications of the algorithms and vectorization techniques, the vectorized version became faster than the original one by a factor of 1.4 on the vector processor. For further speedup of the code, some improvements to the compiler and hardware, especially the addition of Monte Carlo pipelines to the vector processor, are discussed
Introduction to matrices and vectors
Schwartz, Jacob T
2001-01-01
In this concise undergraduate text, the first three chapters present the basics of matrices - in later chapters the author shows how to use vectors and matrices to solve systems of linear equations. 1961 edition.
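The book's central move, writing a linear system in matrix-vector form Ax = b and solving for x, looks like this in modern numerical code (a small illustrative example):

```python
import numpy as np

# The system  2x + y = 3,  x + 3y = 5  in matrix-vector form Ax = b.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])   # coefficient matrix
b = np.array([3.0, 5.0])     # right-hand side vector

x = np.linalg.solve(A, b)    # -> [0.8, 1.4]
print(x)
```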
GRE Enzymes for Vector Analysis
U.S. Environmental Protection Agency — Microbial enzyme data that were collected during the 2004-2006 EMAP-GRE program. These data were then used by Moorhead et al (2016) in their ecoenzyme vector...
Scanning vector Hall probe microscopy
International Nuclear Information System (INIS)
Cambel, V.; Gregusova, D.; Fedor, J.; Kudela, R.; Bending, S.J.
2004-01-01
We have developed a scanning vector Hall probe microscope for mapping the magnetic field vector over magnetic samples. The microscope is based on a micromachined Hall sensor and a cryostat with a scanning system. The vector Hall sensor active area is ∼5×5 μm². It is realized by patterning three Hall probes on the tilted faces of GaAs pyramids. Data from these 'tilted' Hall probes are used to reconstruct the full magnetic field vector. The scanning area of the microscope is 5×5 mm², the spatial resolution 2.5 μm, and the field resolution ∼1 μT Hz⁻¹/² at temperatures of 10-300 K
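The reconstruction step admits a compact sketch: if each tilted probe measures the field component along its face normal, three independent normals give a 3×3 linear system for the field vector. The normals below are illustrative values, not the actual pyramid geometry:

```python
import numpy as np

# Unit normals of the three tilted pyramid faces (assumed example geometry).
normals = np.array([
    [ 0.82,  0.00, 0.57],
    [-0.41,  0.71, 0.57],
    [-0.41, -0.71, 0.57],
])
# Hall signals converted to field components along each normal (uT, made up).
signals = np.array([12.0, -3.5, 7.2])

# Each probe measures n_i . B, so stacking the normals gives  N @ B = s.
B = np.linalg.solve(normals, signals)
print("reconstructed field vector (Bx, By, Bz):", B)
```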
DEFF Research Database (Denmark)
Pihl, Michael Johannes
The main purpose of this PhD project is to develop an ultrasonic method for 3D vector flow imaging. The motivation is to advance the field of velocity estimation in ultrasound, which plays an important role in the clinic. The velocity of blood has components in all three spatial dimensions, yet...... are (vx, vy, vz) = (-0.03, 95, 1.0) ± (9, 6, 1) cm/s compared with the expected (0, 96, 0) cm/s. Afterwards, 3D vector flow images from a cross-sectional plane of the vessel are presented. The out of plane velocities exhibit the expected 2D circular-symmetric parabolic shape. The experimental results...... verify that the 3D TO method estimates the complete 3D velocity vectors, and that the method is suitable for 3D vector flow imaging....
DEFF Research Database (Denmark)
Holbek, Simon
, if this significant reduction in the element count can still provide precise and robust 3-D vector flow estimates in a plane. The study concludes that the RC array is capable of estimating precise 3-D vector flow both in a plane and in a volume, despite the low channel count. However, some inherent new challenges...... ultrasonic vector flow estimation and bring it a step closer to a clinical application. A method for high frame rate 3-D vector flow estimation in a plane using the transverse oscillation method combined with a 1024 channel 2-D matrix array is presented. The proposed method is validated both through phantom...... hampers the task of real-time processing. In a second study, some of the issue with the 2-D matrix array are solved by introducing a 2-D row-column (RC) addressing array with only 62 + 62 elements. It is investigated both through simulations and via experimental setups in various flow conditions...
Transcriptional Silencing of Retroviral Vectors
DEFF Research Database (Denmark)
Lund, Anders Henrik; Duch, M.; Pedersen, F.S.
1996-01-01
. Extinction of long-term vector expression has been observed after implantation of transduced hematopoietic cells as well as fibroblasts, myoblasts and hepatocytes. Here we review the influence of vector structure, integration site and cell type on transcriptional silencing. While down-regulation of proviral transcription is known from a number of cellular and animal models, major insight has been gained from studies in the germ line and embryonal cells of the mouse. Key elements for the transfer and expression of retroviral vectors, such as the viral transcriptional enhancer and the binding site for the tRNA primer for reverse transcription, may have a major influence on transcriptional silencing. Alterations of these elements of the vector backbone as well as the use of internal promoter elements from housekeeping genes may contribute to reduce transcriptional silencing. The use of cell culture and animal
High Accuracy Vector Helium Magnetometer
National Aeronautics and Space Administration — The proposed HAVHM instrument is a laser-pumped helium magnetometer with both triaxial vector and omnidirectional scalar measurement capabilities in a single...
NSGIC Local Govt | GIS Inventory — Address Points dataset current as of 2013. The Address Point layer contains an address point for almost every structure over 200 square feet and for some vacant...
An Analysis of the GTZAN Music Genre Dataset
DEFF Research Database (Denmark)
Sturm, Bob L.
2012-01-01
Most research in automatic music genre recognition has used the dataset assembled by Tzanetakis et al. in 2001. The composition and integrity of this dataset, however, has never been formally analyzed. For the first time, we provide an analysis of its composition, and create a machine...
Really big data: Processing and analysis of large datasets
Modern animal breeding datasets are large and getting larger, due in part to the recent availability of DNA data for many animals. Computational methods for efficiently storing and analyzing those data are under development. The amount of storage space required for such datasets is increasing rapidl...
A New Outlier Detection Method for Multidimensional Datasets
Abdel Messih, Mario A.
2012-07-01
This study develops a novel hybrid method for outlier detection (HMOD) that combines the ideas of distance-based and density-based methods. The proposed method has two main advantages over most other outlier detection methods. The first advantage is that it works well on both dense and sparse datasets. The second advantage is that, unlike most other outlier detection methods that require careful parameter setting and prior knowledge of the data, HMOD is not very sensitive to small changes in parameter values within certain parameter ranges. The only parameter that must be set is the number of nearest neighbors. In addition, we made a fully parallelized implementation of HMOD, which makes it very efficient in applications. Moreover, we proposed a new way of using outlier detection for redundancy reduction in datasets, in which users can specify a confidence level evaluating how accurately the reduced dataset represents the original one. HMOD is evaluated on synthetic datasets (dense and mixed “dense and sparse”) and on a bioinformatics problem: redundancy reduction of a dataset of position weight matrices (PWMs) of transcription factor binding sites. In addition, in the process of assessing the performance of our redundancy reduction method, we developed a simple tool that can be used to evaluate the confidence level with which a reduced dataset represents the original dataset. The evaluation of the results shows that our method can be used in a wide range of problems.
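For illustration, the following sketch combines the two ideas HMOD draws on, distance-based and density-based scoring; it is a simplified stand-in, not the authors' algorithm, and the quantile thresholds are arbitrary choices:

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def hybrid_outliers(X, k=10, dist_q=0.95, dens_q=0.05):
    """Flag points that are both far from their k-th neighbour (distance
    criterion) and in a locally sparse region (density criterion)."""
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
    dists, _ = nn.kneighbors(X)           # column 0 is the point itself
    kth = dists[:, -1]                    # distance to the k-th neighbour
    density = 1.0 / (dists[:, 1:].mean(axis=1) + 1e-12)
    far = kth > np.quantile(kth, dist_q)
    sparse = density < np.quantile(density, dens_q)
    return np.where(far & sparse)[0]

# Dense cluster plus two planted outliers; the planted points should be flagged.
X = np.vstack([np.random.randn(500, 2), [[8.0, 8.0], [9.0, -7.0]]])
print(hybrid_outliers(X, k=10))
```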
On rationality of moduli spaces of vector bundles on real Hirzebruch ...
Indian Academy of Sciences (India)
Introduction. Moduli spaces of semistable vector bundles on a smooth projective variety are studied from various points of view. One of the questions that is often addressed is the birational type of the moduli space, more precisely, the question of rationality. It is known that the moduli space of semistable vector bundles of ...
Abandoned Uranium Mine (AUM) Points, Navajo Nation, 2016, US EPA Region 9
U.S. Environmental Protection Agency — This GIS dataset contains point features of all Abandoned Uranium Mines (AUMs) on or within one mile of the Navajo Nation. Points are centroids developed from the...
An exotic composite vector boson
International Nuclear Information System (INIS)
Akama, Keiichi; Hattori, Takashi; Yasue, Masaki.
1990-08-01
An exotic composite vector boson, V, is introduced in two dynamical models of composite quarks, leptons, W and Z. One is based on four-Fermi interactions, in which composite vector bosons are regarded as fermion-antifermion bound states, and the other is based on the confining SU(2)_L gauge model, in which they are given by scalar-antiscalar bound states. Both approaches describe the same effective interactions for the sector of composite quarks, leptons, W, Z, γ and V. (author)
ATLAS File and Dataset Metadata Collection and Use
Albrand, S; The ATLAS collaboration; Lambert, F; Gallas, E J
2012-01-01
The ATLAS Metadata Interface (“AMI”) was designed as a generic cataloguing system, and as such it has found many uses in the experiment, including software release management, tracking of reconstructed event sizes and control of dataset nomenclature. The primary use of AMI is to provide a catalogue of datasets (file collections) which is searchable using physics criteria. In this paper we discuss the various mechanisms used for filling the AMI dataset and file catalogues. By correlating information from different sources we can derive aggregate information which is important for physics analysis; for example the total number of events contained in a dataset, and possible reasons for missing events such as a lost file. Finally we describe some specialized interfaces which were developed for the Data Preparation and reprocessing coordinators. These interfaces manipulate information from both the dataset domain held in AMI, and the run-indexed information held in the ATLAS COMA application (Conditions and ...
A dataset on tail risk of commodities markets.
Powell, Robert J; Vo, Duc H; Pham, Thach N; Singh, Abhay K
2017-12-01
This article contains the datasets related to the research article "The long and short of commodity tails and their relationship to Asian equity markets" (Powell et al., 2017) [1]. The datasets contain the daily prices (and price movements) of 24 different commodities decomposed from the S&P GSCI index and the daily prices (and price movements) of three share market indices covering World, Asia, and South East Asia for the period 2004-2015. The dataset is then divided into annual periods, showing the worst 5% of price movements for each year. The datasets are convenient for examining the tail risk of different commodities as measured by Conditional Value at Risk (CVaR), as well as its changes over time. The datasets can also be used to investigate the association between commodity markets and share markets.
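The tail-risk measure the datasets support is straightforward to compute: CVaR at the 95% level is the mean of the worst 5% of daily price movements. A small sketch with placeholder data standing in for a commodity series:

```python
import numpy as np

def cvar(returns, level=0.05):
    """Conditional Value at Risk: expected return in the worst `level` tail."""
    var = np.quantile(returns, level)      # Value at Risk (the 5% quantile)
    return returns[returns <= var].mean()  # average loss beyond VaR

daily_moves = np.random.normal(0.0, 0.012, 2500)  # placeholder daily price movements
print(f"VaR(95%):  {np.quantile(daily_moves, 0.05):.4f}")
print(f"CVaR(95%): {cvar(daily_moves):.4f}")
```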
Vectoring of parallel synthetic jets
Berk, Tim; Ganapathisubramani, Bharathram; Gomit, Guillaume
2015-11-01
A pair of parallel synthetic jets can be vectored by applying a phase difference between the two driving signals. The resulting jet can be merged or bifurcated and either vectored towards the actuator leading in phase or the actuator lagging in phase. In the present study, the influence of phase difference and Strouhal number on the vectoring behaviour is examined experimentally. Phase-locked vorticity fields, measured using Particle Image Velocimetry (PIV), are used to track vortex pairs. The physical mechanisms that explain the diversity in vectoring behaviour are observed based on the vortex trajectories. For a fixed phase difference, the vectoring behaviour is shown to be primarily influenced by pinch-off time of vortex rings generated by the synthetic jets. Beyond a certain formation number, the pinch-off timescale becomes invariant. In this region, the vectoring behaviour is determined by the distance between subsequent vortex rings. We acknowledge the financial support from the European Research Council (ERC grant agreement no. 277472).
RECONSTRUCTION OF 3D VECTOR MODELS OF BUILDINGS BY COMBINATION OF ALS, TLS AND VLS DATA
Directory of Open Access Journals (Sweden)
H. Boulaassal
2012-09-01
Full Text Available Airborne Laser Scanning (ALS), Terrestrial Laser Scanning (TLS) and Vehicle-based Laser Scanning (VLS) are widely used as data acquisition methods for 3D building modelling. ALS data is often used to generate, among others, roof models. TLS data has proven its effectiveness in the geometric reconstruction of building façades. Although the operating algorithms used in the processing chains of these two kinds of data are quite similar, their combination deserves further investigation. This study explores the possibility of combining ALS and TLS data for simultaneously producing 3D building models from a bird's-eye point of view and a pedestrian point of view. The geometric accuracy of roof and façade models differs due to the acquisition techniques. In order to take these differences into account, the surfaces composing roofs and façades are extracted with the same segmentation algorithm. Nevertheless, the segmentation algorithm must be adapted to the properties of the different point clouds. It is based on the RANSAC algorithm, but has been applied in a sequential way in order to extract all potential planar clusters from airborne and terrestrial datasets. Surfaces are fitted to planar clusters, allowing edge detection and reconstruction of vector polygons. Models resulting from TLS data are obviously more accurate than those generated from ALS data. Therefore, the geometry of the roofs is corrected and adapted according to the geometry of the corresponding façades. Finally, the effects of the differences between raw ALS and TLS data on the results of the modeling process are analyzed. It is shown that such a combination could be used to produce reliable 3D building models.
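The sequential RANSAC idea, fitting the best plane to the remaining points, peeling off its inliers, and repeating, can be sketched compactly (an illustrative reimplementation, not the authors' code; thresholds are arbitrary):

```python
import numpy as np

def fit_plane(p):
    """Least-squares plane through points p: centroid plus the direction of
    least variance (singular vector of the smallest singular value)."""
    c = p.mean(axis=0)
    n = np.linalg.svd(p - c)[2][-1]
    return c, n

def sequential_ransac(points, thresh=0.05, iters=500, min_pts=100):
    planes, remaining = [], points.copy()
    while len(remaining) >= min_pts:
        best = None
        for _ in range(iters):
            sample = remaining[np.random.choice(len(remaining), 3, replace=False)]
            c, n = fit_plane(sample)
            inliers = np.abs((remaining - c) @ n) < thresh
            if best is None or inliers.sum() > best.sum():
                best = inliers
        if best.sum() < min_pts:
            break                     # no sufficiently supported plane remains
        planes.append(fit_plane(remaining[best]))
        remaining = remaining[~best]  # peel off this plane's inliers and repeat
    return planes
```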
Web-based GIS: the vector-borne disease airline importation risk (VBD-AIR) tool.
Huang, Zhuojie; Das, Anirrudha; Qiu, Youliang; Tatem, Andrew J
2012-08-14
Over the past century, the size and complexity of the air travel network has increased dramatically. Nowadays, there are 29.6 million scheduled flights per year and around 2.7 billion passengers are transported annually. The rapid expansion of the network increasingly connects regions of endemic vector-borne disease with the rest of the world, resulting in challenges to health systems worldwide in terms of vector-borne pathogen importation and disease vector invasion events. Here we describe the development of a user-friendly Web-based GIS tool: the Vector-Borne Disease Airline Importation Risk Tool (VBD-AIR), to help better define the roles of airports and airlines in the transmission and spread of vector-borne diseases. Spatial datasets on modeled global disease and vector distributions, as well as climatic and air network traffic data were assembled. These were combined to derive relative risk metrics via air travel for imported infections, imported vectors and onward transmission, and incorporated into a three-tier server architecture in a Model-View-Controller framework with distributed GIS components. A user-friendly web-portal was built that enables dynamic querying of the spatial databases to provide relevant information. The VBD-AIR tool constructed enables the user to explore the interrelationships among modeled global distributions of vector-borne infectious diseases (malaria, dengue, yellow fever and chikungunya) and international air service routes to quantify seasonally changing risks of vector and vector-borne disease importation and spread by air travel, forming an evidence base to help plan mitigation strategies. The VBD-AIR tool is available at http://www.vbd-air.com. VBD-AIR supports a data flow that generates analytical results from disparate but complementary datasets into an organized cartographical presentation on a web map for the assessment of vector-borne disease movements on the air travel network. The framework built provides a flexible
Versatile generation of optical vector fields and vector beams using a non-interferometric approach.
Tripathi, Santosh; Toussaint, Kimani C
2012-05-07
We present a versatile, non-interferometric method for generating vector fields and vector beams which can produce all the states of polarization represented on a higher-order Poincaré sphere. The versatility and non-interferometric nature of this method is expected to enable exploration of various exotic properties of vector fields and vector beams. To illustrate this, we study the propagation properties of some vector fields and find that, in general, propagation alters both their intensity and polarization distribution, and more interestingly, converts some vector fields into vector beams. In the article, we also suggest a modified Jones vector formalism to represent vector fields and vector beams.
Prediction and analysis of beta-turns in proteins by support vector machine.
Pham, Tho Hoan; Satou, Kenji; Ho, Tu Bao
2003-01-01
The tight turn has long been recognized as one of the three important features of proteins, after the alpha-helix and beta-sheet. Tight turns play an important role in globular proteins from both the structural and functional points of view. More than 90% of tight turns are beta-turns. Analysis and prediction of beta-turns in particular, and tight turns in general, are very useful for the design of new molecules such as drugs, pesticides, and antigens. In this paper, we introduce a support vector machine (SVM) approach to the prediction and analysis of beta-turns. We have investigated two aspects of applying SVM to the prediction and analysis of beta-turns. First, we developed a new SVM method, called BTSVM, which predicts the beta-turns of a protein from its sequence. The prediction results on a dataset of 426 non-homologous protein chains, obtained by the sevenfold cross-validation technique, showed that our method is superior to the previous ones. Second, we analyzed how amino acid positions support (or prevent) the formation of beta-turns based on the "multivariable" classification model of a linear SVM. This model is more general than those of previous statistical methods. Our analysis results are more comprehensive and easier to use than previously published analysis results.
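A minimal sketch of the general setup, not BTSVM itself: classify the central residue of a sliding sequence window, with one-hot encoded amino acids as features. The labels here are random placeholders standing in for annotations derived from solved structures:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_windows, window, n_aa = 1000, 9, 20
residues = rng.integers(0, n_aa, (n_windows, window))   # residue indices per window
X = np.eye(n_aa)[residues].reshape(n_windows, -1)       # one-hot encode -> (1000, 180)
y = rng.integers(0, 2, n_windows)                        # placeholder turn/non-turn labels

clf = SVC(kernel="rbf", C=1.0, class_weight="balanced")
print(cross_val_score(clf, X, y, cv=7).mean())           # sevenfold cross-validation
```

With real labels, the linear-kernel variant's weight vector is what supports the per-position analysis the paper describes.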
Ultrametric distribution of culture vectors in an extended Axelrod model of cultural dissemination
Stivala, Alex; Robins, Garry; Kashima, Yoshihisa; Kirley, Michael
2014-05-01
The Axelrod model of cultural diffusion is an apparently simple model that is capable of complex behaviour. A recent work used a real-world dataset of opinions as initial conditions, demonstrating the effects of the ultrametric distribution of empirical opinion vectors in promoting cultural diversity in the model. Here we quantify the degree of ultrametricity of the initial culture vectors and investigate the effect of varying degrees of ultrametricity on the absorbing state of both a simple and an extended model. Unlike the simple model, ultrametricity alone is not sufficient to sustain long-term diversity in the extended Axelrod model; rather, the initial conditions must also have sufficiently large variance in intervector distances. Further, we find that a scheme for evolving synthetic opinion vectors from cultural “prototypes” shows the same behaviour as real opinion data in maintaining cultural diversity in the extended model, whereas neutral evolution of cultural vectors does not.
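One simple way to quantify the degree of ultrametricity (a stand-in for the paper's exact measure, which the abstract does not specify) is the fraction of sampled triples whose two largest pairwise distances nearly coincide, since in an ultrametric space every triangle is isosceles with its two largest sides equal:

```python
import numpy as np

def ultrametricity(vectors, n_triples=10000, rtol=0.05, rng=None):
    """Fraction of random triples (approximately) satisfying
    d(a, c) <= max(d(a, b), d(b, c))."""
    rng = rng or np.random.default_rng()
    n, ok = len(vectors), 0
    for _ in range(n_triples):
        i, j, k = rng.choice(n, 3, replace=False)
        d = sorted([np.linalg.norm(vectors[i] - vectors[j]),
                    np.linalg.norm(vectors[j] - vectors[k]),
                    np.linalg.norm(vectors[i] - vectors[k])])
        ok += d[2] <= d[1] * (1 + rtol)   # two largest sides nearly equal
    return ok / n_triples

culture = np.random.randint(0, 2, (200, 16)).astype(float)  # placeholder opinion vectors
print(ultrametricity(culture))
```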
Point Information Gain and Multidimensional Data Analysis
Directory of Open Access Journals (Sweden)
Renata Rychtáriková
2016-10-01
Full Text Available We generalize the point information gain (PIG) and derived quantities, i.e., the point information gain entropy (PIE) and the point information gain entropy density (PIED), for the case of the Rényi entropy and simulate the behavior of PIG for typical distributions. We also use these methods for the analysis of multidimensional datasets. We demonstrate the main properties of PIE/PIED spectra for real data with the examples of several images and discuss further possible utilizations in other fields of data processing.
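Based on the description above (not the authors' code), the quantities can be sketched numerically: the point information gain of an occurrence is the change in Rényi entropy when that single occurrence is removed from the histogram, and PIE accumulates the gains over all occurrences:

```python
import numpy as np

def renyi_entropy(counts, alpha):
    p = counts / counts.sum()
    p = p[p > 0]
    if alpha == 1.0:                          # Shannon limit of the Rényi entropy
        return -np.sum(p * np.log2(p))
    return np.log2(np.sum(p ** alpha)) / (1.0 - alpha)

def point_information_gain(counts, bin_idx, alpha=2.0):
    """Entropy change from removing one occurrence from bin `bin_idx`."""
    without = counts.copy()
    without[bin_idx] -= 1
    return renyi_entropy(without, alpha) - renyi_entropy(counts, alpha)

hist = np.array([50, 30, 15, 5])              # toy intensity histogram
pie = sum(hist[b] * point_information_gain(hist, b) for b in range(len(hist)))
print(pie)
```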
RetroTransformDB: A Dataset of Generic Transforms for Retrosynthetic Analysis
Directory of Open Access Journals (Sweden)
Svetlana Avramova
2018-04-01
Full Text Available Presently, software tools for retrosynthetic analysis are widely used by organic, medicinal, and computational chemists. Rule-based systems extensively use collections of retro-reactions (transforms). While there are many public datasets with reactions in the synthetic direction (usually non-generic reactions), there are no publicly-available databases with generic reactions in computer-readable format which can be used for the purposes of retrosynthetic analysis. Here we present RetroTransformDB—a dataset of transforms, compiled and coded in SMIRKS line notation by us. The collection is comprised of more than 100 records, with each one including the reaction name, SMIRKS linear notation, the functional group to be obtained, and the transform type classification. All SMIRKS transforms were tested syntactically, semantically, and from a chemical point of view in different software platforms. The overall dataset design and the retrosynthetic fitness were analyzed and curated by organic chemistry experts. The RetroTransformDB dataset may be used by open-source and commercial software packages, as well as chemoinformatics tools.
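To illustrate how a SMIRKS-coded transform is applied in practice, here is a small RDKit sketch; the retro-amide disconnection below is an illustrative transform written for this example, not a record quoted from RetroTransformDB:

```python
from rdkit import Chem
from rdkit.Chem import AllChem

# Retro-amide: cut the C(=O)-N bond back to a carboxylic acid and an amine.
retro_amide = "[C:1](=[O:2])[N:3]>>[C:1](=[O:2])[OH].[N:3]"
rxn = AllChem.ReactionFromSmarts(retro_amide)

target = Chem.MolFromSmiles("CC(=O)NCc1ccccc1")  # an amide target molecule
for products in rxn.RunReactants((target,)):
    print(".".join(Chem.MolToSmiles(m) for m in products))
```

Running the generic transform over a target enumerates candidate precursors, which is exactly the role such a dataset plays inside a rule-based retrosynthesis engine.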
Ebolavirus Classification Based on Natural Vectors
Zheng, Hui; Yin, Changchuan; Hoang, Tung; He, Rong Lucy; Yang, Jie
2015-01-01
According to the WHO, ebolaviruses have resulted in 8818 human deaths in West Africa as of January 2015. To better understand the evolutionary relationship of the ebolaviruses and infer virulence from the relationship, we applied the alignment-free natural vector method to classify the newest ebolaviruses. The dataset includes three new Guinea viruses as well as 99 viruses from Sierra Leone. For the viruses of the family of Filoviridae, both genus label classification and species label classification achieve an accuracy rate of 100%. We represented the relationships among Filoviridae viruses by Unweighted Pair Group Method with Arithmetic Mean (UPGMA) phylogenetic trees and found that the filoviruses can be separated well by three genera. We performed the phylogenetic analysis on the relationship among different species of Ebolavirus by their coding-complete genomes and seven viral protein genes (glycoprotein [GP], nucleoprotein [NP], VP24, VP30, VP35, VP40, and RNA polymerase [L]). The topology of the phylogenetic tree by the viral protein VP24 shows consistency with the variations of virulence of ebolaviruses. The result suggests that VP24 be a pharmaceutical target for treating or preventing ebolaviruses. PMID:25803489
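The natural vector construction itself is short. A hedged sketch following the published alignment-free recipe (per nucleotide: the count, the mean position, and a normalized second central moment of the positions, giving a 12-dimensional vector per genome):

```python
import numpy as np

def natural_vector(seq):
    """12-dimensional natural vector of a DNA sequence."""
    seq = seq.upper()
    N, out = len(seq), []
    for base in "ACGT":
        pos = np.array([i for i, s in enumerate(seq) if s == base], dtype=float)
        n = len(pos)
        if n == 0:
            out += [0.0, 0.0, 0.0]
            continue
        mu = pos.mean()
        d2 = ((pos - mu) ** 2).sum() / (n * N)  # scaled second central moment
        out += [n, mu, d2]
    return np.array(out)

print(natural_vector("ATGCGATTACAGGT"))
```

Classification then reduces to nearest-neighbour comparisons between these fixed-length vectors, with no sequence alignment required.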
Gulf of Aqaba Field Trip - Datasets
Hanafy, Sherif M.
2013-11-01
OBJECTIVE: In this work we use geophysical methods to locate and characterize active faults in alluvial sediments. INTRODUCTION: Since only subtle material and velocity contrasts are expected across the faults, we used seismic refraction tomography and 2D resistivity imaging to locate the fault. One seismic profile and one 2D resistivity profile were collected at an alluvial fan on the Gulf of Aqaba coast in Saudi Arabia. The collected data are inverted to generate the traveltime tomogram and the electric resistivity tomogram (ERT). A low-velocity anomaly shown on the traveltime tomogram indicates the colluvial wedge associated with the fault. The location of the fault is shown on the ERT as a vertical high-resistivity anomaly. Two data sets were collected at the study site to map the subsurface structure along a profile across the known normal fault described above. The first data set is a seismic refraction data set and the second is a 2D resistivity imaging data set. A total of 120 common shot gathers were collected (MatLab and DPik format). Each shot gather has 120 traces at equal shot and receiver intervals of 2.5 m. The total length of the profile is 297.5 m. Data were recorded using a 1 ms sampling interval for a total recording time of 0.3 s. A 200 lb weight drop was used as the seismic source, with 10 to 15 stacks at each shot location. One 2D resistivity profile was acquired at the same location, parallel to the seismic profile. The acquisition parameters of the resistivity profile are: number of nodes: 64; node interval: 5 m; configuration array: Schlumberger-Wenner; total profile length: 315 m. Both seismic and resistivity profiles share the same starting point at the western end of the profile.
On the non-Gaussian correlation of the primordial curvature perturbation with vector fields
DEFF Research Database (Denmark)
Kumar Jain, Rajeev; Sloth, Martin Snoager
2013-01-01
We compute the three-point cross-correlation function of the primordial curvature perturbation generated during inflation with two powers of a vector field in a model where conformal invariance is broken by a direct coupling of the vector field with the inflaton. If the vector field is identified...... with the electromagnetic field, this correlation would be a non-Gaussian signature of primordial magnetic fields generated during inflation. We find that the signal is maximized for the flattened configuration where the wave number of the curvature perturbation is twice that of the vector field and in this limit...
Representation and display of vector field topology in fluid flow data sets
Helman, James; Hesselink, Lambertus
1989-01-01
The visualization of physical processes in general and of vector fields in particular is discussed. An approach to visualizing flow topology that is based on the physics and mathematics underlying the physical phenomenon is presented. It involves determining critical points in the flow where the velocity vector vanishes. The critical points, connected by principal lines or planes, determine the topology of the flow. The complexity of the data is reduced without sacrificing the quantitative nature of the data set. By reducing the original vector field to a set of critical points and their connections, a representation of the topology of a two-dimensional vector field that is much smaller than the original data set but retains with full precision the information pertinent to the flow topology is obtained. This representation can be displayed as a set of points and tangent curves or as a graph. Analysis (including algorithms), display, interaction, and implementation aspects are discussed.
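The classification step admits a compact sketch: once a critical point is located, the eigenvalues of the velocity Jacobian there determine its type (real parts give attraction or repulsion, imaginary parts indicate rotation):

```python
import numpy as np

def classify_critical_point(jacobian):
    """Classify a 2D critical point from the velocity Jacobian's eigenvalues."""
    eig = np.linalg.eigvals(jacobian)
    re, im = eig.real, eig.imag
    if np.all(re < 0):
        return "attracting focus" if np.any(im) else "attracting node"
    if np.all(re > 0):
        return "repelling focus" if np.any(im) else "repelling node"
    if np.any(re > 0) and np.any(re < 0):
        return "saddle"
    return "center or degenerate"

# Example: v(x, y) = (x, -y) has Jacobian diag(1, -1) at the origin -> saddle.
print(classify_critical_point(np.array([[1.0, 0.0], [0.0, -1.0]])))
```

Connecting the saddles to the attracting and repelling points along separatrices yields exactly the graph representation of the topology described above.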
Giacosa, Francesco; Sammet, Julia; Janowski, Stanislaus
2017-06-01
We calculate two- and three-body decays of the (lightest) vector glueball into (pseudo)scalar, (axial-)vector, as well as pseudovector and excited vector mesons in the framework of a model of QCD. While absolute values of widths cannot be predicted because the corresponding coupling constants are unknown, some interesting branching ratios can be evaluated by setting the mass of the yet hypothetical vector glueball to 3.8 GeV as predicted by quenched lattice QCD. We find that the decay mode ωππ should be one of the largest (both through the decay chain O → b₁π → ωππ and through the direct coupling O → ωππ). Similarly, the (direct and indirect) decay into πKK*(892) is sizable. Moreover, the decays into ρπ and K*(892)K are, although subleading, possible and could play a role in explaining the ρπ puzzle of the charmonium state ψ(2S) thanks to a (small) mixing with the vector glueball. The vector glueball can be directly formed at the ongoing BESIII experiment as well as at the future PANDA experiment at the FAIR facility. If the width is sufficiently small (≲ 100 MeV) it should not escape future detection. It should be stressed that the employed model is based on some inputs and simplifying assumptions: the value of the glueball mass (at present, the quenched lattice value is used), the lack of mixing of the glueball with other quarkonium states, and the use of few interaction terms. It then represents a first step toward the identification of the main decay channels of the vector glueball, but shall be improved when corresponding experimental candidates and/or new lattice results become available.
Discovery and Reuse of Open Datasets: An Exploratory Study
Directory of Open Access Journals (Sweden)
Sara
2016-07-01
Full Text Available Objective: This article analyzes twenty cited or downloaded datasets and the repositories that house them, in order to produce insights that can be used by academic libraries to encourage discovery and reuse of research data in institutional repositories. Methods: Using Thomson Reuters’ Data Citation Index and repository download statistics, we identified twenty cited/downloaded datasets. We documented the characteristics of the cited/downloaded datasets and their corresponding repositories in a self-designed rubric. The rubric includes six major categories: basic information; funding agency and journal information; linking and sharing; factors to encourage reuse; repository characteristics; and data description. Results: Our small-scale study suggests that cited/downloaded datasets generally comply with basic recommendations for facilitating reuse: data are documented well; formatted for use with a variety of software; and shared in established, open access repositories. Three significant factors also appear to contribute to dataset discovery: publishing in discipline-specific repositories; indexing in more than one location on the web; and using persistent identifiers. The cited/downloaded datasets in our analysis came from a few specific disciplines, and tended to be funded by agencies with data publication mandates. Conclusions: The results of this exploratory research provide insights that can inform academic librarians as they work to encourage discovery and reuse of institutional datasets. Our analysis also suggests areas in which academic librarians can target open data advocacy in their communities in order to begin to build open data success stories that will fuel future advocacy efforts.
Viability of Controlling Prosthetic Hand Utilizing Electroencephalograph (EEG) Dataset Signal
Miskon, Azizi; A/L Thanakodi, Suresh; Raihan Mazlan, Mohd; Mohd Haziq Azhar, Satria; Nooraya Mohd Tawil, Siti
2016-11-01
This project presents the development of an artificial hand controlled by electroencephalograph (EEG) signal datasets for prosthetic applications. The EEG signal datasets were used to improve the way the prosthetic hand is controlled compared to using the electromyograph (EMG). The EMG has disadvantages for a person who has not used the muscles for a long time, and also for persons with degenerative issues due to the age factor. Thus, the EEG datasets were found to be an alternative to EMG. The datasets used in this work were taken from the Brain Computer Interface (BCI) Project. The datasets were already classified for open, close, and combined movement operations. They served as the input to control the prosthetic hand through an interface system between Microsoft Visual Studio and Arduino. The obtained results reveal the prosthetic hand to be more efficient and faster in response to the EEG datasets when an additional LiPo (lithium polymer) battery is attached to the prosthetic. Some limitations were also identified in terms of the hand movements and the weight of the prosthetic, and suggestions for improvement are given in this paper. Overall, the objective of this paper was achieved as the prosthetic hand was found to be feasible in operation utilizing the EEG datasets.
Sparse Group Penalized Integrative Analysis of Multiple Cancer Prognosis Datasets
Liu, Jin; Huang, Jian; Xie, Yang; Ma, Shuangge
2014-01-01
SUMMARY In cancer research, high-throughput profiling studies have been extensively conducted, searching for markers associated with prognosis. Because of the “large d, small n” characteristic, results generated from the analysis of a single dataset can be unsatisfactory. Recent studies have shown that integrative analysis, which simultaneously analyzes multiple datasets, can be more effective than single-dataset analysis and classic meta-analysis. In most existing integrative analyses, the homogeneity model has been assumed, which postulates that different datasets share the same set of markers. Several approaches have been designed to reinforce this assumption. In practice, different datasets may differ in terms of patient selection criteria, profiling techniques, and many other aspects. Such differences may make the homogeneity model too restrictive. In this study, we assume the heterogeneity model, under which different datasets are allowed to have different sets of markers. With multiple cancer prognosis datasets, we adopt the AFT (accelerated failure time) model to describe survival. This model may have the lowest computational cost among popular semiparametric survival models. For marker selection, we adopt a sparse group MCP (minimax concave penalty) approach. This approach has an intuitive formulation and can be computed using an effective group coordinate descent algorithm. A simulation study shows that it outperforms existing approaches under both the homogeneity and heterogeneity models. Data analysis further demonstrates the merit of the heterogeneity model and the proposed approach. PMID:23938111
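For reference, the MCP on which the group penalty is built has a standard closed form (quoted from the penalized-regression literature rather than from the paper itself); for a coefficient t, regularization parameter λ and concavity parameter γ > 1:

```latex
\rho(t;\lambda,\gamma)=
\begin{cases}
\lambda|t|-\dfrac{t^{2}}{2\gamma}, & |t|\le\gamma\lambda,\\[4pt]
\dfrac{\gamma\lambda^{2}}{2}, & |t|>\gamma\lambda.
\end{cases}
```

The sparse group version applies this penalty at both the group level and the individual-coefficient level, which is what lets a dataset drop a whole marker group or only some of its members, as the heterogeneity model requires.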
The local structure of a Liouville vector field
International Nuclear Information System (INIS)
Ciriza, E.
1990-05-01
In this work we investigate the local structure of a Liouville vector field ξ of a Kaehler manifold (P,Ω) which vanishes on an isotropic submanifold Q of P. Some of the eigenvalues of its linear part at the singular points are zero and the remaining ones are in resonance. We show that there is a C¹-smooth linearizing conjugation between the Liouville vector field ξ and its linear part. To do this we construct Darboux coordinates adapted to the unstable foliation which is provided by the Centre Manifold Theorem. We then apply recent linearization results due to G. Sell. (author). 11 refs
PROVIDING GEOGRAPHIC DATASETS AS LINKED DATA IN SDI
Directory of Open Access Journals (Sweden)
E. Hietanen
2016-06-01
In this study, a prototype service to provide data from a Web Feature Service (WFS) as linked data is implemented. First, persistent and unique Uniform Resource Identifiers (URIs) are created for all spatial objects in the dataset. The objects are available from those URIs in the Resource Description Framework (RDF) data format. Next, a Web Ontology Language (OWL) ontology is created to describe the dataset's information content using the Open Geospatial Consortium's (OGC) GeoSPARQL vocabulary. The existing data model is modified to take the linked data principles into account. The implemented service produces an HTTP response dynamically: the data for the response is first fetched from the existing WFS, and the Geographic Markup Language (GML) output of the WFS is then transformed on the fly to RDF. Content negotiation is used to serve the data in different RDF serialization formats. This solution facilitates the use of a dataset in different applications without replicating the whole dataset, and individual spatial objects in the dataset can be referenced by their URIs. Furthermore, the needed information content of the objects can easily be extracted from the RDF serializations available from those URIs. A solution for linking data objects to the dataset URI is also introduced using the Vocabulary of Interlinked Datasets (VoID): the dataset is divided into subsets, and each subset is given its own persistent and unique URI. This enables the whole dataset to be explored with a web browser and all individual objects to be indexed by search engines.
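A minimal sketch of the content-negotiation step, assuming a Flask front end and rdflib for serialization; the URI pattern, the graph-loading helper, and the supported formats are illustrative, not the prototype's actual code.

    from flask import Flask, Response, request
    from rdflib import Graph

    app = Flask(__name__)

    # Accept-header media types mapped to rdflib serialization formats.
    FORMATS = {"text/turtle": "turtle",
               "application/rdf+xml": "xml",
               "application/ld+json": "json-ld"}

    def fetch_feature_as_rdf(feature_id: str) -> Graph:
        """Placeholder: fetch GML from the WFS and transform it to RDF."""
        g = Graph()
        # ... on-the-fly GML -> RDF transformation would populate g here ...
        return g

    @app.route("/feature/<feature_id>")
    def feature(feature_id):
        # Pick the best RDF serialization the client accepts; default to Turtle.
        mime = request.accept_mimetypes.best_match(list(FORMATS)) or "text/turtle"
        data = fetch_feature_as_rdf(feature_id).serialize(format=FORMATS[mime])
        return Response(data, mimetype=mime)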
Homogenised Australian climate datasets used for climate change monitoring
International Nuclear Information System (INIS)
Trewin, Blair; Jones, David; Collins, Dean; Jovanovic, Branislava; Braganza, Karl
2007-01-01
The Australian Bureau of Meteorology has developed a number of datasets for use in climate change monitoring. These datasets typically cover 50-200 stations distributed as evenly as possible over the Australian continent, and have been subject to detailed quality control and homogenisation. The time period over which data are available for each element is largely determined by the availability of data in digital form. Whilst nearly all Australian monthly and daily precipitation data have been digitised, a significant quantity of pre-1957 data (for temperature and evaporation) or pre-1987 data (for some other elements) remains to be digitised, and is not currently available for use in the climate change monitoring datasets. In the case of temperature and evaporation, the start date of the datasets is also determined by major changes in instruments or observing practices for which no adjustment is feasible at the present time. The datasets currently available cover: monthly and daily precipitation (most stations commence 1915 or earlier, with many extending back to the late 19th century, and a few to the mid-19th century); annual temperature (commences 1910); daily temperature (commences 1910, with limited station coverage pre-1957); twice-daily dewpoint/relative humidity (commences 1957); monthly pan evaporation (commences 1970); and cloud amount (commences 1957) (Jovanovic et al. 2007). As well as the station-based datasets listed above, an additional dataset being developed for use in climate change monitoring (and other applications) covers tropical cyclones in the Australian region; this is described in more detail in Trewin (2007). The datasets already developed are used in analyses of observed climate change, which are available through the Australian Bureau of Meteorology website (http://www.bom.gov.au/silo/products/cli_chg/). They are also used as a basis for routine climate monitoring, and in the datasets used for the development of seasonal
Learning with LOGO: Logo and Vectors.
Lough, Tom; Tipps, Steve
1986-01-01
This is the first of a two-part series on the general concept of vector space. Provides tool procedures to allow investigation of vector properties, vector addition and subtraction, and X and Y components. Lists several sources of additional vector ideas. (JM)
Representative Vector Machines: A Unified Framework for Classical Classifiers.
Gui, Jie; Liu, Tongliang; Tao, Dacheng; Sun, Zhenan; Tan, Tieniu
2016-08-01
Classifier design is a fundamental problem in pattern recognition. A variety of pattern classification methods such as the nearest neighbor (NN) classifier, support vector machine (SVM), and sparse representation-based classification (SRC) have been proposed in the literature. These typical and widely used classifiers were originally developed from different theoretical or application motivations, and they are conventionally treated as independent and specific solutions for pattern classification. This paper proposes a novel pattern classification framework, namely, representative vector machines (or RVMs for short). The basic idea of RVMs is to assign the class label of a test example according to its nearest representative vector. The contributions of RVMs are twofold. On one hand, the proposed RVMs establish a unified framework of classical classifiers because NN, SVM, and SRC can be interpreted as special cases of RVMs with different definitions of representative vectors. Thus, the underlying relationship among a number of classical classifiers is revealed for a better understanding of pattern classification. On the other hand, novel and advanced classifiers are inspired by the framework of RVMs. For example, a robust pattern classification method called discriminant vector machine (DVM) is motivated by RVMs. Given a test example, DVM first finds its k-NNs and then performs classification based on the robust M-estimator and manifold regularization. Extensive experimental evaluations on a variety of visual recognition tasks such as face recognition (Yale and face recognition grand challenge databases), object categorization (Caltech-101 dataset), and action recognition (Action Similarity LAbeliNg) demonstrate the advantages of DVM over other classifiers.
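The decision rule is easy to state concretely. The following sketch (illustrative only, not the paper's code) classifies a test example by its nearest representative vector, using the class mean as the representative; NN, SVM, and SRC correspond to other choices of representatives in the RVM framework.

    import numpy as np

    def fit_representatives(X, y):
        """One representative vector per class: here, simply the class mean."""
        classes = np.unique(y)
        reps = np.stack([X[y == c].mean(axis=0) for c in classes])
        return classes, reps

    def predict(X, classes, reps):
        """Assign each test example the label of its nearest representative."""
        d = np.linalg.norm(X[:, None, :] - reps[None, :, :], axis=2)
        return classes[d.argmin(axis=1)]

    # Toy usage with two 2-D classes.
    X = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [0.9, 1.1]])
    y = np.array([0, 0, 1, 1])
    classes, reps = fit_representatives(X, y)
    print(predict(np.array([[0.1, 0.0], [1.0, 0.9]]), classes, reps))  # -> [0 1]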
McAllister, Patrick; Zheng, Huiru; Bond, Raymond; Moorhead, Anne
2018-04-01
Obesity is increasing worldwide and can cause many chronic conditions such as type-2 diabetes, heart disease, sleep apnea, and some cancers. Monitoring dietary intake through food logging is a key method of maintaining a healthy lifestyle to prevent and manage obesity. Computer vision methods have been applied to food logging to automate image classification for monitoring dietary intake. In this work we applied pretrained ResNet-152 and GoogleNet convolutional neural networks (CNNs), initially trained on the ImageNet Large Scale Visual Recognition Challenge (ILSVRC) dataset with the MatConvNet package, to extract features from food image datasets: Food-5K, Food-11, RawFooT-DB, and Food-101. Deep features were extracted from the CNNs and used to train machine learning classifiers including an artificial neural network (ANN), support vector machine (SVM), Random Forest, and Naive Bayes. Results show that ResNet-152 deep features with an RBF-kernel SVM can detect food items with 99.4% accuracy on the Food-5K validation dataset, and with 98.8% accuracy on the Food-5K evaluation dataset using ANN, SVM-RBF, and Random Forest classifiers. Trained with ResNet-152 features, an ANN achieves 91.34% and 99.28% accuracy on the Food-11 and RawFooT-DB datasets respectively, and an RBF-kernel SVM achieves 64.98% on the Food-101 dataset. This research shows that deep CNN features can be used efficiently for diverse food item image classification, and that pretrained ResNet-152 features provide sufficient generalisation power when applied to a range of food image classification tasks. Copyright © 2018 Elsevier Ltd. All rights reserved.
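The feature-extraction-plus-classifier recipe reported here follows a common pattern; below is a hedged PyTorch/scikit-learn sketch (the paper itself used MatConvNet, and dataset loading is elided).

    import torch
    import torch.nn as nn
    from torchvision import models, transforms
    from sklearn.svm import SVC

    # Pretrained ResNet-152 with the classification head removed, so the
    # network yields 2048-dimensional penultimate-layer features.
    backbone = models.resnet152(weights=models.ResNet152_Weights.IMAGENET1K_V1)
    extractor = nn.Sequential(*list(backbone.children())[:-1]).eval()

    preprocess = transforms.Compose([
        transforms.Resize(256), transforms.CenterCrop(224), transforms.ToTensor(),
        transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
    ])

    @torch.no_grad()
    def features(images):
        """images: a list of PIL images -> (N, 2048) feature matrix."""
        batch = torch.stack([preprocess(im) for im in images])
        return extractor(batch).flatten(1).numpy()

    # X_train/y_train would come from Food-5K, Food-11, etc. (loading elided):
    # clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(features(train_imgs), y_train)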
Tension in the recent Type Ia supernovae datasets
International Nuclear Information System (INIS)
Wei, Hao
2010-01-01
In the present work, we investigate the tension in the recent Type Ia supernovae (SNIa) datasets Constitution and Union. We show that they are in tension not only with the observations of the cosmic microwave background (CMB) anisotropy and the baryon acoustic oscillations (BAO), but also with other SNIa datasets such as Davis and SNLS. Then, we find the main sources responsible for the tension. Further, we make this more robust by employing the method of random truncation. Based on the results of this work, we suggest two truncated versions of the Union and Constitution datasets, namely the UnionT and ConstitutionT SNIa samples, whose behaviors are more regular.
On some orthogonality properties of Maxwell's multipole vectors
International Nuclear Information System (INIS)
Gramada, Apostol
2007-01-01
We determine the location of the expansion points with respect to which the two Maxwell's multipole vectors of the quadrupole moment and the dipole vector of a distribution of charge form an orthogonal trihedron. We find that with respect to these 'orthogonality centres' both the dipole and the quadrupole moments are each characterized by a single real parameter. We further show that the orthogonality centres coincide with the stationary points of the magnitude of the quadrupole moment and, therefore, they can be seen as an extension of the concept of centre of the dipole moment of a neutral system introduced previously in the literature. The nature of the stationary points then provides the means for the classification of a distribution of charge in two different categories
U.S. Environmental Protection Agency — Point geospatial dataset representing locations of NPDES Waste Water Treatment Plant Facilities. NPDES (National Pollution Discharge Elimination System) is an EPA...
Chimpanzee Adenovirus Vector Ebola Vaccine.
Ledgerwood, Julie E; DeZure, Adam D; Stanley, Daphne A; Coates, Emily E; Novik, Laura; Enama, Mary E; Berkowitz, Nina M; Hu, Zonghui; Joshi, Gyan; Ploquin, Aurélie; Sitar, Sandra; Gordon, Ingelise J; Plummer, Sarah A; Holman, LaSonji A; Hendel, Cynthia S; Yamshchikov, Galina; Roman, Francois; Nicosia, Alfredo; Colloca, Stefano; Cortese, Riccardo; Bailer, Robert T; Schwartz, Richard M; Roederer, Mario; Mascola, John R; Koup, Richard A; Sullivan, Nancy J; Graham, Barney S
2017-03-09
The unprecedented 2014 epidemic of Ebola virus disease (EVD) prompted an international response to accelerate the availability of a preventive vaccine. A replication-defective recombinant chimpanzee adenovirus type 3-vectored ebolavirus vaccine (cAd3-EBO), encoding the glycoprotein from Zaire and Sudan species, that offers protection in the nonhuman primate model, was rapidly advanced into phase 1 clinical evaluation. We conducted a phase 1, dose-escalation, open-label trial of cAd3-EBO. Twenty healthy adults, in sequentially enrolled groups of 10 each, received vaccination intramuscularly in doses of 2×10¹⁰ particle units or 2×10¹¹ particle units. Primary and secondary end points related to safety and immunogenicity were assessed throughout the first 8 weeks after vaccination; in addition, longer-term vaccine durability was assessed at 48 weeks after vaccination. In this small study, no safety concerns were identified; however, transient fever developed within 1 day after vaccination in two participants who had received the 2×10¹¹ particle-unit dose. Glycoprotein-specific antibodies were induced in all 20 participants; the titers were of greater magnitude in the group that received the 2×10¹¹ particle-unit dose than in the group that received the 2×10¹⁰ particle-unit dose (geometric mean titer against the Zaire antigen at week 4, 2037 vs. 331; P=0.001). Glycoprotein-specific T-cell responses were more frequent among those who received the 2×10¹¹ particle-unit dose than among those who received the 2×10¹⁰ particle-unit dose, with a CD4 response in 10 of 10 participants versus 3 of 10 participants (P=0.004) and a CD8 response in 7 of 10 participants versus 2 of 10 participants (P=0.07) at week 4. Assessment of the durability of the antibody response showed that titers remained high at week 48, with the highest titers in those who received the 2×10¹¹ particle-unit dose. Reactogenicity and immune responses to cAd3-EBO vaccine were dose-dependent. At
Elliptic-symmetry vector optical fields.
Pan, Yue; Li, Yongnan; Li, Si-Min; Ren, Zhi-Cheng; Kong, Ling-Jun; Tu, Chenghou; Wang, Hui-Tian
2014-08-11
We present in principle and demonstrate experimentally a new kind of vector field: elliptic-symmetry vector optical fields. This is a significant development in vector fields, as it breaks the cylindrical symmetry and enriches the family of vector fields. Due to the presence of an additional degree of freedom (the interval between the foci in the elliptic coordinate system), elliptic-symmetry vector fields are more flexible than cylindrical vector fields for controlling the spatial structure of polarization and for engineering the focusing fields. Elliptic-symmetry vector fields can find many specific applications, ranging from optical trapping to optical machining.
A generalized nonlocal vector calculus
Alali, Bacim; Liu, Kuo; Gunzburger, Max
2015-10-01
A nonlocal vector calculus was introduced in Du et al. (Math Model Meth Appl Sci 23:493-540, 2013) that has proved useful for the analysis of the peridynamics model of nonlocal mechanics and nonlocal diffusion models. A formulation is developed that provides a more general setting for the nonlocal vector calculus, independent of particular nonlocal models. It is shown that general nonlocal calculus operators are integral operators with specific integral kernels. General nonlocal calculus properties are developed, including a nonlocal integration-by-parts formula and Green's identities. The nonlocal vector calculus introduced in Du et al. (Math Model Meth Appl Sci 23:493-540, 2013) is shown to be recoverable from the general formulation as a special example. This special nonlocal vector calculus is used to reformulate the peridynamics equation of motion in terms of the nonlocal gradient operator and its adjoint. A new example of nonlocal vector calculus operators is introduced, which shows the potential use of the general formulation for general nonlocal models.
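To fix ideas, the basic operators of the Du et al. calculus take roughly the following form (a sketch only; sign and symmetry conventions vary across formulations). For a scalar field u and a two-point vector field ν on Ω, with kernel α(x, y):

    \mathcal{D}(\nu)(x) := \int_{\Omega} \bigl(\nu(x,y) + \nu(y,x)\bigr)\cdot\alpha(x,y)\,dy,
    \qquad
    \mathcal{D}^{*}(u)(x,y) = -\bigl(u(y)-u(x)\bigr)\,\alpha(x,y),

and the nonlocal integration-by-parts formula pairs them:

    \int_{\Omega} u(x)\,\mathcal{D}(\nu)(x)\,dx
      \;=\; \int_{\Omega}\!\int_{\Omega} \nu(x,y)\cdot\mathcal{D}^{*}(u)(x,y)\,dy\,dx .

The nonlocal gradient is then the negative adjoint, G = -D*, mirroring the local identity that the divergence is (minus) the adjoint of the gradient.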
Generalized Selection Weighted Vector Filters
Directory of Open Access Journals (Sweden)
Rastislav Lukac
2004-09-01
This paper introduces a class of nonlinear multichannel filters capable of removing impulsive noise in color images. The proposed generalized selection weighted vector filter class constitutes a powerful filtering framework for multichannel signal processing. Previously defined multichannel filters such as the vector median filter, basic vector directional filter, directional-distance filter, weighted vector median filters, and weighted vector directional filters are treated from a global viewpoint using the proposed framework. Robust order-statistic concepts and an increased degree of freedom in filter design make the proposed method attractive for a variety of applications. The introduced multichannel sigmoidal adaptation of the filter parameters, and its modifications, allow the filter parameters to adapt to varying signal and noise statistics. Simulation studies reported in this paper indicate that the proposed filter class is computationally attractive, yields excellent performance, and is able to preserve fine details and color information while efficiently suppressing impulsive noise. This paper is an extended version of the paper by Lukac et al. presented at the 2003 IEEE-EURASIP Workshop on Nonlinear Signal and Image Processing (NSIP '03) in Grado, Italy.
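As a concrete point of reference, the classical vector median filter, the simplest member of the family unified here, replaces each pixel by the window sample minimizing the aggregate distance to all other samples. A plain NumPy sketch (illustrative; border handling is naive and the loop is unoptimized):

    import numpy as np

    def vector_median_filter(img, k=3):
        """Classical VMF on an (H, W, 3) color image with a k x k window."""
        r = k // 2
        padded = np.pad(img.astype(float), ((r, r), (r, r), (0, 0)), mode="edge")
        out = np.empty_like(img, dtype=float)
        H, W = img.shape[:2]
        for i in range(H):
            for j in range(W):
                win = padded[i:i + k, j:j + k].reshape(-1, 3)  # k*k candidate vectors
                # Sum of L2 distances from each candidate to all others.
                cost = np.linalg.norm(win[:, None, :] - win[None, :, :], axis=2).sum(axis=1)
                out[i, j] = win[cost.argmin()]                 # the vector median
        return out.astype(img.dtype)

The weighted and directional variants treated in the paper replace the L2 distance with weighted or angular dissimilarity measures.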
Federal Emergency Management Agency, Department of Homeland Security — FEMA Framework Basemap datasets comprise six of the seven FGDC themes of geospatial data that are used by most GIS applications (Note: the seventh framework theme,...
Directory of Open Access Journals (Sweden)
Al Mehedi Hasan
2017-07-01
The prediction of the subcellular locations of proteins can provide useful hints for revealing their functions, for understanding the mechanisms of some diseases, and, finally, for developing novel drugs. As the number of newly discovered proteins has been growing exponentially, laboratory-based experiments to determine the location of an uncharacterized protein in a living cell have become both expensive and time-consuming. Consequently, computational methods are being developed as an alternative to help biologists in selecting target proteins and designing related experiments. However, protein subcellular localization prediction remains a complicated and challenging problem, particularly when query proteins have multi-label characteristics, i.e. they exist simultaneously in more than one subcellular location or move between two or more different subcellular locations. To address this problem, several types of subcellular localization prediction methods with different levels of accuracy have been proposed. The support vector machine (SVM) has been employed to provide potential solutions for problems connected with the prediction of protein subcellular localization. However, the practicability of SVM is affected by the difficulty of selecting an appropriate kernel and that kernel's parameters. The literature survey has shown that most researchers apply the radial basis function (RBF) kernel to build SVM-based subcellular localization prediction systems. Surprisingly, there are still many other kernel functions which have not yet been applied in the prediction of protein subcellular localization. However, the nature of this classification problem requires the application of different kernels for SVM to ensure an optimal result. From this viewpoint, this paper presents the work to apply different kernels for SVM in protein
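The kernel-selection question raised here is straightforward to probe empirically. A generic scikit-learn sketch (not the paper's code; feature extraction from protein sequences and the multi-label handling are elided):

    from sklearn.svm import SVC
    from sklearn.model_selection import GridSearchCV

    # X: numeric features derived from protein sequences; y: location labels.
    param_grid = {
        "kernel": ["linear", "poly", "rbf", "sigmoid"],
        "C": [0.1, 1, 10, 100],
        "gamma": ["scale", 0.01, 0.001],
    }
    search = GridSearchCV(SVC(), param_grid, cv=5, scoring="accuracy", n_jobs=-1)
    # search.fit(X, y)
    # print(search.best_params_, search.best_score_)

Cross-validated search of this kind selects both the kernel and its parameters jointly, which is the practical difficulty the abstract points to.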
Directory of Open Access Journals (Sweden)
Wang Lily
2008-07-01
Background: Cancer diagnosis and clinical outcome prediction are among the most important emerging applications of gene expression microarray technology, with several molecular signatures on their way toward clinical deployment. Use of the most accurate classification algorithms available for microarray gene expression data is a critical ingredient in developing the best possible molecular signatures for patient care. As suggested by a large body of literature to date, support vector machines can be considered "best of class" algorithms for classification of such data. Recent work, however, suggests that random forest classifiers may outperform support vector machines in this domain. Results: In the present paper we identify methodological biases of prior work comparing random forests and support vector machines, and conduct a new rigorous evaluation of the two algorithms that corrects these limitations. Our experiments use 22 diagnostic and prognostic datasets and show that support vector machines outperform random forests, often by a large margin. Our data also underline the importance of sound research design in benchmarking and comparison of bioinformatics algorithms. Conclusion: We found that both on average and in the majority of microarray datasets, random forests are outperformed by support vector machines, both when no gene selection is performed and when several popular gene selection methods are used.
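The comparison protocol can be reproduced in outline; below is a hedged scikit-learn sketch of one dataset's evaluation (cross-validated accuracy, no gene selection), with a synthetic stand-in for the "large d, small n" microarray data.

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # Synthetic placeholder: few samples, many features, as in microarray data.
    X, y = make_classification(n_samples=100, n_features=5000, n_informative=50,
                               random_state=0)

    svm = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
    rf = RandomForestClassifier(n_estimators=500, random_state=0)

    for name, model in [("SVM", svm), ("Random forest", rf)]:
        scores = cross_val_score(model, X, y, cv=5)
        print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")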
Directory of Open Access Journals (Sweden)
Shokri Saeid
2015-01-01
An accurate prediction of sulfur content is very important for proper operation and product quality control in the hydrodesulfurization (HDS) process. For this purpose, a reliable data-driven soft sensor utilizing Support Vector Regression (SVR) was developed, and the effects of integrating Vector Quantization (VQ) and Principal Component Analysis (PCA) on the performance of this soft sensor were studied. First, in a pre-processing step, the PCA and VQ techniques were used to reduce the dimensions of the original input datasets. Then, the compressed datasets were used as input variables for the SVR model. Experimental data from the HDS setup were employed to validate the proposed integrated model. The integration of the VQ/PCA techniques with the SVR model increased the prediction accuracy of SVR, and the obtained results show that the integrated VQ-SVR technique was better than PCA-SVR in prediction accuracy. VQ also decreased the combined training and test time of the SVR model in comparison with PCA. For further evaluation, the performance of the VQ-SVR model was also compared to that of plain SVR. The obtained results indicated that the VQ-SVR model delivered the most satisfactory predictive performance (AARE = 0.0668 and R² = 0.995) among the investigated models.
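The PCA-SVR branch of the study maps directly onto a standard pipeline; a hedged sketch is given below (the component count and SVR hyperparameters are placeholders, and the HDS process data are not public). A VQ-SVR variant could analogously compress the inputs with a learned codebook, e.g. k-means centroids, before regression.

    from sklearn.decomposition import PCA
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVR

    # PCA-SVR soft sensor: compress the process variables, then regress
    # sulfur content on the compressed representation.
    model = make_pipeline(StandardScaler(),
                          PCA(n_components=5),
                          SVR(kernel="rbf", C=10.0, epsilon=0.01))
    # model.fit(X_train, y_train); y_hat = model.predict(X_test)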
a Variant of Lsd-Slam Capable of Processing High-Speed Low-Framerate Monocular Datasets
Schmid, S.; Fritsch, D.
2017-11-01
We develop a new variant of LSD-SLAM, called C-LSD-SLAM, which is capable of performing monocular tracking and mapping in high-speed, low-framerate situations such as those of the KITTI datasets. The methods used here are robust against the influence of erroneously triangulated points near the epipolar direction, which otherwise cause tracking divergence.
A Core Set Based Large Vector-Angular Region and Margin Approach for Novelty Detection
Directory of Open Access Journals (Sweden)
Jiusheng Chen
2016-01-01
A large vector-angular region and margin (LARM) approach is presented for novelty detection based on imbalanced data. The key idea is to construct the largest vector-angular region in the feature space to separate normal training patterns, while maximizing the vector-angular margin between the surface of this optimal vector-angular region and the abnormal training patterns. In order to improve the generalization performance of LARM, the vector-angular distribution is optimized by maximizing the vector-angular mean and minimizing the vector-angular variance, which separates the normal and abnormal examples well. However, the inherent computation of the quadratic programming (QP) solver takes O(n³) training time and at least O(n²) space, which can be computationally prohibitive for large-scale problems. Using a (1+ε)- and (1−ε)-approximation algorithm, a core set based LARM algorithm is proposed for fast training of the LARM problem. Experimental results based on imbalanced datasets have validated the favorable efficiency of the proposed approach in novelty detection.
Garraín, Daniel; Fazio, Simone; de la Rúa, Cristina; Recchioni, Marco; Lechón, Yolanda; Mathieux, Fabrice
2015-01-01
The aim of this paper is to identify areas of potential improvement in the European Reference Life Cycle Database (ELCD) electricity datasets. The revision is based on the data quality indicators described by the International Life Cycle Data system (ILCD) Handbook, applied on a sectorial basis. These indicators evaluate the technological, geographical and time-related representativeness of a dataset and its appropriateness in terms of completeness, precision and methodology. Results show that the ELCD electricity datasets are of very good quality in general terms; nevertheless, some findings and recommendations for improving the quality of Life-Cycle Inventories have been derived. Moreover, these results confirm the quality of the electricity-related datasets to any LCA practitioner, and provide insights into the limitations and assumptions underlying the datasets' modelling. Given this information, the LCA practitioner will be able to decide whether the use of the ELCD electricity datasets is appropriate to the goal and scope of the analysis to be conducted. The methodological approach would also be useful for dataset developers and reviewers, in order to improve the overall Data Quality Requirements of databases.
Anisotropic cosmological solutions in massive vector theories
Energy Technology Data Exchange (ETDEWEB)
Heisenberg, Lavinia [Institute for Theoretical Studies, ETH Zurich, Clausiusstrasse 47, 8092 Zurich (Switzerland); Kase, Ryotaro; Tsujikawa, Shinji, E-mail: Lavinia.heisenberg@googlemail.com, E-mail: r.kase@rs.tus.ac.jp, E-mail: shinji@rs.kagu.tus.ac.jp [Department of Physics, Faculty of Science, Tokyo University of Science, 1-3, Kagurazaka, Shinjuku-ku, Tokyo 162-8601 (Japan)
2016-11-01
In beyond-generalized Proca theories, including the extension to theories higher than second order, we study the role of a spatial component v of a massive vector field on the anisotropic cosmological background. We show that, as in the case of the isotropic cosmological background, there are no additional ghostly degrees of freedom associated with the Ostrogradski instability. In second-order generalized Proca theories we find the existence of anisotropic solutions on which the ratio between the anisotropic expansion rate Σ and the isotropic expansion rate H remains nearly constant in the radiation-dominated epoch. In the regime where Σ/H is constant, the spatial vector component v works as a dark radiation with an equation of state close to 1/3. During the matter era, the ratio Σ/H decreases with the decrease of v. As long as the conditions |Σ| ≪ H and v² ≪ φ² are satisfied around the onset of late-time cosmic acceleration, where φ is the temporal vector component, we find that the solutions approach the isotropic de Sitter fixed point (Σ = 0 = v) in accordance with the cosmic no-hair conjecture. In the presence of v and Σ the early evolution of the dark energy equation of state w_DE in the radiation era is different from that in the isotropic case, but the approach to the isotropic value w_DE^(iso) typically occurs at redshifts z much larger than 1. Thus, apart from the existence of dark radiation, the anisotropic cosmological dynamics at low redshifts is similar to that in isotropic generalized Proca theories. In beyond-generalized Proca theories the only consistent solution avoiding the divergence of a determinant of the dynamical system corresponds to v = 0, so Σ always decreases in time.
Anisotropic cosmological solutions in massive vector theories
International Nuclear Information System (INIS)
Heisenberg, Lavinia; Kase, Ryotaro; Tsujikawa, Shinji
2016-01-01
In beyond-generalized Proca theories, including the extension to theories higher than second order, we study the role of a spatial component v of a massive vector field on the anisotropic cosmological background. We show that, as in the case of the isotropic cosmological background, there are no additional ghostly degrees of freedom associated with the Ostrogradski instability. In second-order generalized Proca theories we find the existence of anisotropic solutions on which the ratio between the anisotropic expansion rate Σ and the isotropic expansion rate H remains nearly constant in the radiation-dominated epoch. In the regime where Σ/H is constant, the spatial vector component v works as a dark radiation with an equation of state close to 1/3. During the matter era, the ratio Σ/H decreases with the decrease of v. As long as the conditions |Σ| ≪ H and v² ≪ φ² are satisfied around the onset of late-time cosmic acceleration, where φ is the temporal vector component, we find that the solutions approach the isotropic de Sitter fixed point (Σ = 0 = v) in accordance with the cosmic no-hair conjecture. In the presence of v and Σ the early evolution of the dark energy equation of state w_DE in the radiation era is different from that in the isotropic case, but the approach to the isotropic value w_DE^(iso) typically occurs at redshifts z much larger than 1. Thus, apart from the existence of dark radiation, the anisotropic cosmological dynamics at low redshifts is similar to that in isotropic generalized Proca theories. In beyond-generalized Proca theories the only consistent solution avoiding the divergence of a determinant of the dynamical system corresponds to v = 0, so Σ always decreases in time.
Estimated Perennial Streams of Idaho and Related Geospatial Datasets
Rea, Alan; Skinner, Kenneth D.
2009-01-01
The perennial or intermittent status of a stream has bearing on many regulatory requirements. Because of changing technologies over time, cartographic representation of perennial/intermittent status of streams on U.S. Geological Survey (USGS) topographic maps is not always accurate and (or) consistent from one map sheet to another. Idaho Administrative Code defines an intermittent stream as one having a 7-day, 2-year low flow (7Q2) less than 0.1 cubic feet per second. To establish consistency with the Idaho Administrative Code, the USGS developed regional regression equations for Idaho streams for several low-flow statistics, including 7Q2. Using these regression equations, the 7Q2 streamflow may be estimated for naturally flowing streams anywhere in Idaho to help determine perennial/intermittent status of streams. Using these equations in conjunction with a Geographic Information System (GIS) technique known as weighted flow accumulation allows for an automated and continuous estimation of 7Q2 streamflow at all points along a stream, which in turn can be used to determine if a stream is intermittent or perennial according to the Idaho Administrative Code operational definition. The selected regression equations were applied to create continuous grids of 7Q2 estimates for the eight low-flow regression regions of Idaho. By applying the 0.1 ft³/s criterion, the perennial streams have been estimated in each low-flow region. Uncertainty in the estimates is shown by identifying a 'transitional' zone, corresponding to flow estimates of 0.1 ft³/s plus and minus one standard error. Considerable additional uncertainty exists in the model of perennial streams presented in this report. The regression models provide overall estimates based on general trends within each regression region. These models do not include local factors such as a large spring or a losing reach that may greatly affect flows at any given point. Site-specific flow data, assuming a sufficient period of
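Given a grid of 7Q2 estimates (produced by the regression equations plus weighted flow accumulation), the final classification step reduces to a threshold with an uncertainty band. A schematic NumPy sketch: the 0.1 ft³/s criterion and the one-standard-error transitional band come from the report, while the input grids are placeholders.

    import numpy as np

    THRESHOLD = 0.1  # ft^3/s, the Idaho Administrative Code 7Q2 criterion

    def classify_streams(q7q2, std_err):
        """Label each grid cell perennial, intermittent, or transitional."""
        labels = np.full(q7q2.shape, "intermittent", dtype=object)
        labels[q7q2 >= THRESHOLD + std_err] = "perennial"
        # Within +/- one standard error of the threshold: transitional zone.
        labels[np.abs(q7q2 - THRESHOLD) < std_err] = "transitional"
        return labels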
Toward lattice fractional vector calculus
Tarasov, Vasily E.
2014-09-01
An analog of fractional vector calculus for physical lattice models is suggested. We use an approach based on the models of three-dimensional lattices with long-range inter-particle interactions. The lattice analogs of fractional partial derivatives are represented by kernels of lattice long-range interactions, where the Fourier series transformations of these kernels have a power-law form with respect to wave vector components. In the continuum limit, these lattice partial derivatives give derivatives of non-integer order with respect to coordinates. In the three-dimensional description of the non-local continuum, the fractional differential operators have the form of fractional partial derivatives of the Riesz type. As examples of the applications of the suggested lattice fractional vector calculus, we give lattice models with long-range interactions for the fractional Maxwell equations of non-local continuous media and for the fractional generalization of the Mindlin and Aifantis continuum models of gradient elasticity.
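Schematically, the requirement on the lattice kernels can be written as follows (a sketch of the construction's shape, not the paper's exact normalization; the second relation is the standard Fourier symbol of the Riesz fractional derivative):

    \widehat{K_{\alpha}}(k) \sim |k|^{\alpha} \quad (k \to 0),
    \qquad
    \mathcal{F}\!\left[\frac{\partial^{\alpha} u}{\partial |x|^{\alpha}}\right](k)
      = -|k|^{\alpha}\,\hat{u}(k),

so that long-range lattice interactions with such kernels reproduce Riesz-type fractional partial derivatives as the lattice spacing tends to zero.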
Gauge Theories of Vector Particles
Glashow, S. L.; Gell-Mann, M.
1961-04-24
The possibility of generalizing the Yang-Mills trick is examined. Thus we seek theories of vector bosons invariant under continuous groups of coordinate-dependent linear transformations. All such theories may be expressed as superpositions of certain "simple" theories; we show that each "simple" theory is associated with a simple Lie algebra. We may introduce mass terms for the vector bosons at the price of destroying the gauge-invariance for coordinate-dependent gauge functions. The theories corresponding to three particular simple Lie algebras - those which admit precisely two commuting quantum numbers - are examined in some detail as examples. One of them might play a role in the physics of the strong interactions if there is an underlying super-symmetry, transcending charge independence, that is badly broken. The intermediate vector boson theory of weak interactions is discussed also. The so-called "schizon" model cannot be made to conform to the requirements of partial gauge-invariance.
Search for intermediate vector bosons
International Nuclear Information System (INIS)
Cline, D.B.; Rubbia, C.; van der Meer, S.
1982-01-01
Over the past 15 years a new class of unified theories has been developed to describe the forces acting between elementary particles. The most successful of the new theories establishes a link between electromagnetism and the weak force. A crucial prediction of this unified electroweak theory is the existence of three massive particles called intermediate vector bosons. If these intermediate vector bosons exist and if they have properties attributed to them by electroweak theory, they should soon be detected, as the world's first particle accelerator with enough energy to create such particles has recently been completed at the European Organization for Nuclear Research (CERN) in Geneva. The accelerator has been converted to a colliding beam machine in which protons and antiprotons collide head on. According to electroweak theory, intermediate vector bosons can be created in proton-antiproton collisions. (SC)