Discrete ellipsoidal statistical BGK model and Burnett equations
Zhang, Yu-Dong; Xu, Ai-Guo; Zhang, Guang-Cai; Chen, Zhi-Hua; Wang, Pei
2018-06-01
A new discrete Boltzmann model, the discrete ellipsoidal statistical Bhatnagar-Gross-Krook (ES-BGK) model, is proposed to simulate nonequilibrium compressible flows. Compared with the original discrete BGK model, the discrete ES-BGK model has a flexible Prandtl number. For the discrete ES-BGK model at the Burnett level, two kinds of discrete velocity models are introduced, and the relations between the nonequilibrium quantities and the viscous stress and heat flux at the Burnett level are established. The model is verified via four benchmark tests. In addition, a new idea is introduced to recover the actual distribution function from the macroscopic quantities and their spatial derivatives. The recovery scheme works not only for discrete Boltzmann simulations but also for hydrodynamic ones, for example, those based on the Navier-Stokes or the Burnett equations.
Parametric statistical inference for discretely observed diffusion processes
DEFF Research Database (Denmark)
Pedersen, Asger Roer
Part 1: Theoretical results. Part 2: Statistical applications of Gaussian diffusion processes in freshwater ecology.
Universality of correlations of levels with discrete statistics
Brezin, Edouard; Kazakov, Vladimir
1999-01-01
We study the statistics of a system of N random levels with integer values, in the presence of a logarithmic repulsive potential of Dyson type. This problem arises in sums over representations (Young tableaux) of GL(N) in various matrix problems and in the study of statistics of partitions for the permutation group. The model is generalized to include an external source, and its correlators are found in closed form for any N. We reproduce the density of levels in the large-N and double-scaling limits.
Statistical characterization of discrete conservative systems: The web map
Ruiz, Guiomar; Tirnakli, Ugur; Borges, Ernesto P.; Tsallis, Constantino
2017-10-01
We numerically study the two-dimensional, area-preserving web map. When the map is governed by ergodic behavior it is, as expected, correctly described by Boltzmann-Gibbs statistics, based on the additive entropic functional S_BG[p(x)] = -k ∫ dx p(x) ln p(x). In contrast, possible ergodicity breakdown and transitory sticky dynamical behavior drag the map into the realm of generalized q-statistics, based on the nonadditive entropic functional S_q[p(x)] = k (1 - ∫ dx [p(x)]^q)/(q - 1) (q ∈ R; S_1 = S_BG). We statistically describe the system (probability distribution of the sum of successive iterates, sensitivity to the initial condition, and entropy production per unit time) for typical values of the parameter that controls the ergodicity of the map. For small (large) values of the external parameter K, we observe q-Gaussian distributions with q = 1.935... (Gaussian distributions), as for the standard map. In contrast, for intermediate values of K we observe a different scenario, due to the fractal structure of the trajectories embedded in the chaotic sea. Long-standing non-Gaussian distributions are characterized in terms of the kurtosis and the box-counting dimension of the chaotic sea.
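For readers who want to reproduce the flavour of this analysis, the sketch below iterates one common form of the area-preserving web map, (u, v) -> (v, -u - K sin v) (an assumed convention, since definitions vary in the literature), and computes the excess kurtosis of the coordinate distribution: zero for a Gaussian, larger in the long-tailed q-Gaussian regimes. The parameter value and initial condition are illustrative only.

```python
import numpy as np

def web_map(u, v, K):
    # One common form of the area-preserving web map (Jacobian determinant = 1);
    # conventions vary, so this is an assumed parameterization.
    return v, -u - K * np.sin(v)

K = 0.1
u, v = 0.5, 0.2
n_iter = 50000
us = np.empty(n_iter)
for i in range(n_iter):
    u, v = web_map(u, v, K)
    us[i] = u

# Excess kurtosis of the u-coordinate distribution:
# 0 for a Gaussian, > 0 for long-tailed (q-Gaussian-like) statistics.
m = us.mean()
kurt = np.mean((us - m) ** 4) / np.var(us) ** 2 - 3.0
```

The paper's actual diagnostics (sums of iterates, sensitivity to initial conditions) are more elaborate; this only shows the basic iterate-and-characterize loop.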
Discrete changes of current statistics in periodically driven stochastic systems
International Nuclear Information System (INIS)
Chernyak, Vladimir Y; Sinitsyn, N A
2010-01-01
We demonstrate that the counting statistics of currents in periodically driven ergodic stochastic systems can show sharp changes of some of its properties in response to continuous changes of the driving protocol. To describe this effect, we introduce a new topological phase factor in the evolution of the moment generating function which is akin to the topological geometric phase in the evolution of a periodically driven quantum mechanical system with time-reversal symmetry. This phase leads to the prediction of a sign change for the difference of the probabilities to find even and odd numbers of particles transferred in a stochastic system in response to cyclic evolution of control parameters. The driving protocols that lead to this sign change should enclose specific degeneracy points in the space of control parameters. The relation between the topology of the paths in the control parameter space and the sign changes can be described in terms of the first Stiefel–Whitney class of topological invariants. (letter)
Statistical inference for discrete-time samples from affine stochastic delay differential equations
DEFF Research Database (Denmark)
Küchler, Uwe; Sørensen, Michael
2013-01-01
Statistical inference for discrete-time observations of an affine stochastic delay differential equation is considered. The main focus is on maximum pseudo-likelihood estimators, which are easy to calculate in practice. A more general class of prediction-based estimating functions is investigated...
CROSAT: A digital computer program for statistical-spectral analysis of two discrete time series
International Nuclear Information System (INIS)
Antonopoulos Domis, M.
1978-03-01
The program CROSAT computes, directly from two discrete time series, auto- and cross-spectra, transfer and coherence functions, using a Fast Fourier Transform subroutine. Statistical analysis of the time series is optional. While of general use, the program is constructed to be immediately compatible with the ICL 4-70 and H316 computers at AEE Winfrith and, perhaps with minor modifications, with any other hardware system. (author)
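The core of such a program, segment-averaged auto- and cross-spectra with derived transfer and coherence functions, can be sketched with an FFT routine. This is a generic sketch of the standard estimators, not CROSAT's actual algorithm; windowing and segment overlap are omitted for brevity.

```python
import numpy as np

def spectra(x, y, nseg=8):
    """Segment-averaged auto-spectra, cross-spectrum, transfer function
    and coherence for two discrete time series (no windowing/overlap)."""
    n = len(x) // nseg
    m = n // 2 + 1
    pxx = np.zeros(m)
    pyy = np.zeros(m)
    pxy = np.zeros(m, dtype=complex)
    for k in range(nseg):
        xs = x[k * n:(k + 1) * n]
        ys = y[k * n:(k + 1) * n]
        fx = np.fft.rfft(xs - xs.mean())   # remove each segment's mean
        fy = np.fft.rfft(ys - ys.mean())
        pxx += (fx * fx.conj()).real       # auto-spectrum of x
        pyy += (fy * fy.conj()).real       # auto-spectrum of y
        pxy += fx.conj() * fy              # cross-spectrum
    transfer = pxy / pxx                   # H(f) = Pxy(f) / Pxx(f)
    coherence = np.abs(pxy) ** 2 / (pxx * pyy)
    return pxx, pyy, pxy, transfer, coherence

rng = np.random.default_rng(0)
x = rng.standard_normal(1024)
# identical series: coherence should be 1 at every frequency
pxx, pyy, pxy, transfer, coherence = spectra(x, x)
```

Segment averaging is what makes the coherence estimate meaningful; with a single segment it is identically 1 for any pair of series.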
One of the main uses of biomarker measurements is to compare different populations to each other and to assess risk in comparison to established parameters. This is most often done using summary statistics such as central tendency, variance components, confidence intervals, excee...
The Role of Preference Axioms and Respondent Behaviour in Statistical Models for Discrete Choice
DEFF Research Database (Denmark)
Hougaard, Jens Leth; Tjur, Tue; Østerdal, Lars Peter
Discrete choice experiments are widely used in relation to healthcare. A stream of recent literature therefore aims at testing the validity of the underlying preference axioms of completeness and transitivity, and at detecting other preference phenomena such as instability, learning/tiredness effects, ordering effects, dominance, etc. Unfortunately, there seems to be some confusion about what is actually being tested, and the link between the statistical tests performed and the relevant underlying model of respondent behaviour has not been explored in this literature. The present paper tries to clarify...
International Nuclear Information System (INIS)
Farrell, Patricio; Koprucki, Thomas; Fuhrmann, Jürgen
2017-01-01
We compare three thermodynamically consistent numerical fluxes known in the literature, appearing in a Voronoï finite volume discretization of the van Roosbroeck system with general charge carrier statistics. Our discussion includes an extension of the Scharfetter–Gummel scheme to non-Boltzmann (e.g. Fermi–Dirac) statistics. It is based on the analytical solution of a two-point boundary value problem obtained by projecting the continuous differential equation onto the interval between neighboring collocation points. Hence, it serves as a reference flux. The exact solution of the boundary value problem can be approximated by computationally cheaper fluxes which modify certain physical quantities. One alternative scheme averages the nonlinear diffusion (caused by the non-Boltzmann nature of the problem), another one modifies the effective density of states. To study the differences between these three schemes, we analyze the Taylor expansions, derive an error estimate, visualize the flux error and show how the schemes perform for a carefully designed p-i-n benchmark simulation. We present strong evidence that the flux discretization based on averaging the nonlinear diffusion has an edge over the scheme based on modifying the effective density of states.
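The Scharfetter-Gummel reference flux discussed above is classically written with the Bernoulli function B(x) = x/(e^x - 1). The sketch below shows the Boltzmann-statistics case only; the sign convention and normalization are illustrative assumptions, not the paper's exact formulation, and the non-Boltzmann modifications (diffusion averaging, modified density of states) are not implemented.

```python
import numpy as np

def bernoulli(x):
    # B(x) = x / (exp(x) - 1), with the removable singularity at x = 0
    # handled by its Taylor expansion B(x) ~ 1 - x/2 near zero.
    x = np.asarray(x, dtype=float)
    small = np.abs(x) < 1e-10
    safe = np.where(small, 1.0, x)            # avoid 0/0 in the generic branch
    return np.where(small, 1.0 - x / 2.0, x / np.expm1(safe))

def sg_flux(n1, n2, dpsi):
    """Classical Scharfetter-Gummel flux between two neighbouring nodes with
    carrier densities n1, n2 and potential difference dpsi (illustrative
    sign convention; scaled units)."""
    return bernoulli(dpsi) * n1 - bernoulli(-dpsi) * n2
```

For dpsi = 0 the flux reduces to the pure diffusion difference n1 - n2, and the identity B(-x) = e^x B(x) guarantees consistency with thermodynamic equilibrium.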
Sakhr, Jamal; Nieminen, John M.
2018-03-01
Two decades ago, Wang and Ong, [Phys. Rev. A 55, 1522 (1997)], 10.1103/PhysRevA.55.1522 hypothesized that the local box-counting dimension of a discrete quantum spectrum should depend exclusively on the nearest-neighbor spacing distribution (NNSD) of the spectrum. In this Rapid Communication, we validate their hypothesis by deriving an explicit formula for the local box-counting dimension of a countably-infinite discrete quantum spectrum. This formula expresses the local box-counting dimension of a spectrum in terms of single and double integrals of the NNSD of the spectrum. As applications, we derive an analytical formula for Poisson spectra and closed-form approximations to the local box-counting dimension for spectra having Gaussian orthogonal ensemble (GOE), Gaussian unitary ensemble (GUE), and Gaussian symplectic ensemble (GSE) spacing statistics. In the Poisson and GOE cases, we compare our theoretical formulas with the published numerical data of Wang and Ong and observe excellent agreement between their data and our theory. We also study numerically the local box-counting dimensions of the Riemann zeta function zeros and the alternate levels of GOE spectra, which are often used as numerical models of spectra possessing GUE and GSE spacing statistics, respectively. In each case, the corresponding theoretical formula is found to accurately describe the numerically computed local box-counting dimension.
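The local box-counting dimension discussed here can be estimated numerically by covering a spectrum with boxes of varying size and fitting the slope of log N(eps) against log(1/eps). A minimal sketch for a synthetic Poisson spectrum with unit mean spacing (the data, scales, and fitting range are illustrative, not taken from the paper):

```python
import numpy as np

def box_counts(levels, eps):
    """Count the non-empty boxes of size eps covering a discrete spectrum."""
    idx = np.floor((levels - levels.min()) / eps).astype(int)
    return len(np.unique(idx))

rng = np.random.default_rng(42)
# A Poisson spectrum: levels with i.i.d. exponential nearest-neighbour spacings.
levels = np.cumsum(rng.exponential(1.0, size=5000))

sizes = np.array([0.05, 0.1, 0.2, 0.4])
counts = np.array([box_counts(levels, e) for e in sizes])

# Local box-counting dimension estimate: slope of log N(eps) vs log(1/eps).
slope = np.polyfit(np.log(1.0 / sizes), np.log(counts), 1)[0]
```

For a discrete point set embedded in a line the slope lies between 0 and 1; the paper's contribution is the analytical formula relating this quantity to the NNSD, which the brute-force count above does not use.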
Directory of Open Access Journals (Sweden)
Dipnall, Joanna F; Pasco, Julie A; Berk, Michael; Williams, Lana J; Dodd, Seetal; Jacka, Felice N; Meyer, Denny
Full Text Available Atheoretical large-scale data mining techniques using machine learning algorithms have promise in the analysis of large epidemiological datasets. This study illustrates the use of a hybrid methodology for variable selection that took account of missing data and complex survey design to identify key biomarkers associated with depression from a large epidemiological study. The study used a three-step methodology amalgamating multiple imputation, a machine learning boosted regression algorithm and logistic regression, to identify key biomarkers associated with depression in the National Health and Nutrition Examination Study (2009-2010). Depression was measured using the Patient Health Questionnaire-9 and 67 biomarkers were analysed. Covariates in this study included gender, age, race, smoking, food security, Poverty Income Ratio, Body Mass Index, physical activity, alcohol use, medical conditions and medications. The final imputed weighted multiple logistic regression model included possible confounders and moderators. After the creation of 20 imputation data sets from multiple chained regression sequences, machine learning boosted regression initially identified 21 biomarkers associated with depression. Using traditional logistic regression methods, including controlling for possible confounders and moderators, a final set of three biomarkers were selected. The final three biomarkers from the novel hybrid variable selection methodology were red cell distribution width (OR 1.15; 95% CI 1.01, 1.30), serum glucose (OR 1.01; 95% CI 1.00, 1.01) and total bilirubin (OR 0.12; 95% CI 0.05, 0.28). Significant interactions were found between total bilirubin with the Mexican American/Hispanic group (p = 0.016) and current smokers (p < 0.001). The systematic use of a hybrid methodology for variable selection, fusing data mining techniques using a machine learning algorithm with traditional statistical modelling, accounted for missing data and complex survey sampling methodology and was demonstrated to be a useful tool for detecting biomarkers associated with depression.
The Role of Discrete Global Grid Systems in the Global Statistical Geospatial Framework
Purss, M. B. J.; Peterson, P.; Minchin, S. A.; Bermudez, L. E.
2016-12-01
The United Nations Committee of Experts on Global Geospatial Information Management (UN-GGIM) has proposed the development of a Global Statistical Geospatial Framework (GSGF) as a mechanism for the establishment of common analytical systems that enable the integration of statistical and geospatial information. Conventional coordinate reference systems address the globe with a continuous field of points suitable for repeatable navigation and analytical geometry. While this continuous field is represented on a computer in a digitized and discrete fashion by tuples of fixed-precision floating point values, it is a non-trivial exercise to relate point observations spatially referenced in this way to areal coverages on the surface of the Earth. The GSGF states the need to move to gridded data delivery and the importance of using common geographies and geocoding. The challenges associated with meeting these goals are not new, and there has been a significant effort within the geospatial community over many years to develop nested gridding standards to tackle these issues. These efforts have recently culminated in the development of a Discrete Global Grid Systems (DGGS) standard, developed under the auspices of the Open Geospatial Consortium (OGC). DGGS provide a fixed areal-based geospatial reference frame for the persistent location of measured Earth observations, feature interpretations, and modelled predictions. DGGS address the entire planet by partitioning it into a discrete hierarchical tessellation of progressively finer resolution cells, which are referenced by a unique index that facilitates rapid computation, query and analysis. The geometry and location of the cell is the principal aspect of a DGGS. Data integration, decomposition, and aggregation are optimised in the DGGS hierarchical structure and can be exploited for efficient multi-source data processing, storage, discovery, transmission, visualization, computation, analysis, and modelling.
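A DGGS addresses the planet through a hierarchical tessellation in which each cell carries a unique index and finer cells nest inside coarser ones. The toy quadtree index below illustrates only that structural idea on a plain lat/lon rectangle; real DGGS implementations (per the OGC standard) use equal-area polyhedral tessellations, so this is a structural sketch, not a conforming DGGS.

```python
def cell_index(lat, lon, depth):
    """Hierarchical quadtree-style cell identifier on a lat/lon rectangle.
    Each digit refines the previous cell into four quadrants, so the index
    of a finer cell is prefixed by the index of every coarser cell
    containing it (the key DGGS property illustrated here)."""
    lat0, lat1, lon0, lon1 = -90.0, 90.0, -180.0, 180.0
    digits = []
    for _ in range(depth):
        mid_lat = (lat0 + lat1) / 2.0
        mid_lon = (lon0 + lon1) / 2.0
        q = 0
        if lat >= mid_lat:
            q += 2
            lat0 = mid_lat
        else:
            lat1 = mid_lat
        if lon >= mid_lon:
            q += 1
            lon0 = mid_lon
        else:
            lon1 = mid_lon
        digits.append(str(q))
    return "".join(digits)
```

Because coarse indices are prefixes of fine ones, aggregation and decomposition across resolutions reduce to string-prefix operations, which is one reason hierarchical cell indices support fast query and roll-up.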
Raunig, David L; McShane, Lisa M; Pennello, Gene; Gatsonis, Constantine; Carson, Paul L; Voyvodic, James T; Wahl, Richard L; Kurland, Brenda F; Schwarz, Adam J; Gönen, Mithat; Zahlmann, Gudrun; Kondratovich, Marina V; O'Donnell, Kevin; Petrick, Nicholas; Cole, Patricia E; Garra, Brian; Sullivan, Daniel C
2015-02-01
Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers to measure changes in these features. Critical to the performance of a quantitative imaging biomarker in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in designs, analysis methods, and metrics used to assess a quantitative imaging biomarker for clinical use. It is therefore difficult, or impossible, to integrate results from different studies or to use reported results to design studies. The Radiological Society of North America and the Quantitative Imaging Biomarker Alliance, together with technical, radiological, and statistical experts, developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of quantitative imaging biomarker performance studies so that results from multiple studies can be compared, contrasted, or combined.
Carter, Jeffrey R.; Simon, Wayne E.
1990-08-01
Neural networks are trained using Recursive Error Minimization (REM) equations to perform statistical classification. Using REM equations with continuous input variables reduces the required number of training experiences by one to two orders of magnitude over standard back propagation. Replacing the continuous input variables with discrete binary representations reduces the number of connections by a factor proportional to the number of variables, reducing the required number of experiences by another order of magnitude. Undesirable effects of using recurrent experience to train neural networks for statistical classification problems are demonstrated, and nonrecurrent experience is used to avoid these undesirable effects. 1. THE 1-4I PROBLEM. The statistical classification problem which we address is that of assigning points in d-dimensional space to one of two classes. The first class has a covariance matrix of I (the identity matrix); the covariance matrix of the second class is 4I. For this reason the problem is known as the 1-4I problem. Both classes have equal probability of occurrence, and samples from both classes may appear anywhere throughout the d-dimensional space. Most samples near the origin of the coordinate system will be from the first class, while most samples away from the origin will be from the second class. Since the two classes completely overlap, it is impossible to have a classifier with zero error. The minimum possible error is known as the Bayes error.
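For the 1-4I problem described above, the Bayes rule has a closed form: with equal priors and densities N(0, I) and N(0, 4I), the likelihood ratio reduces to a threshold on the squared radius, ||x||^2 < (8/3) d ln 2 for class 1. The sketch below estimates the Bayes error by Monte Carlo; the dimension and sample sizes are arbitrary illustrative choices, and no neural network is involved.

```python
import numpy as np

d = 4  # dimension; class 1 ~ N(0, I), class 2 ~ N(0, 4I), equal priors

# Equating densities: exp(-r^2/2) = 2^-d exp(-r^2/8)  =>  r^2 = (8/3) d ln 2
threshold = (8.0 / 3.0) * d * np.log(2.0)

def classify(x):
    # Bayes-optimal rule: small radius -> class 1, large radius -> class 2
    return 1 if np.sum(x * x) < threshold else 2

rng = np.random.default_rng(0)
n = 20000
x1 = rng.standard_normal((n, d))         # samples from class 1, covariance I
x2 = 2.0 * rng.standard_normal((n, d))   # samples from class 2, covariance 4I

err1 = np.mean([classify(x) == 2 for x in x1])  # class-1 misclassification rate
err2 = np.mean([classify(x) == 1 for x in x2])  # class-2 misclassification rate
bayes_estimate = 0.5 * (err1 + err2)            # equal priors
```

This Monte Carlo estimate of the Bayes error gives the floor against which a trained classifier (such as the REM-trained networks above) can be judged.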
Distinguishability notion based on Wootters statistical distance: Application to discrete maps
Gomez, Ignacio S.; Portesi, M.; Lamberti, P. W.
2017-08-01
We study the distinguishability notion given by Wootters for states represented by probability density functions. This notion has the particularity that it can also be used to define a statistical distance in chaotic unidimensional maps. Based on that definition, we provide a metric d̄ for an arbitrary discrete map. Moreover, from d̄ we associate a metric space with each invariant density of a given map, which turns out to be the set of all distinguished points when the number of iterations of the map tends to infinity. We also give a characterization of the wandering set of a map in terms of the metric d̄, which allows us to identify the dissipative regions in the phase space. We illustrate the results numerically and analytically for the logistic and circle maps, obtaining d̄ and the wandering set for some characteristic values of their parameters. Finally, an extension of the associated metric space to arbitrary probability distributions (not necessarily invariant densities) is given, along with some consequences. The statistical properties of distributions given by histograms are characterized in terms of the cardinality of the associated metric space. For two conjugate variables, the uncertainty principle is expressed in terms of the diameters of the metric spaces associated with those variables.
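Wootters' statistical distance between two discrete distributions is d(p, q) = arccos(Σ_i sqrt(p_i q_i)). The sketch below applies it to histograms of logistic-map iterates; the parameter values and binning are illustrative choices, and the paper's map metric d̄ is a more elaborate construction than this pointwise distance.

```python
import numpy as np

def wootters_distance(p, q):
    """Wootters statistical distance between discrete distributions:
    d(p, q) = arccos( sum_i sqrt(p_i * q_i) ), ranging from 0 (identical)
    to pi/2 (disjoint supports)."""
    b = np.sum(np.sqrt(np.asarray(p) * np.asarray(q)))
    return np.arccos(np.clip(b, 0.0, 1.0))   # clip guards float round-off

def logistic_histogram(r, x0=0.3, n=20000, bins=50):
    """Normalized histogram of iterates of the logistic map x -> r x (1 - x)."""
    x = x0
    xs = np.empty(n)
    for i in range(n):
        x = r * x * (1.0 - x)
        xs[i] = x
    h, _ = np.histogram(xs, bins=bins, range=(0.0, 1.0))
    return h / h.sum()

p = logistic_histogram(4.0)   # fully chaotic regime
q = logistic_histogram(3.7)   # a different chaotic parameter
```

The distance between the two chaotic regimes is strictly between 0 and pi/2 because their invariant densities overlap on part of [0, 1] but are not identical.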
Meta-Statistics for Variable Selection: The R Package BioMark
Directory of Open Access Journals (Sweden)
Ron Wehrens
2012-11-01
Full Text Available Biomarker identification is an ever more important topic in the life sciences. With the advent of measurement methodologies based on microarrays and mass spectrometry, thousands of variables are routinely being measured on complex biological samples. Often, the question is what makes two groups of samples different. Classical hypothesis testing suffers from the multiple testing problem; however, correcting for this often leads to a lack of power. In addition, choosing α cutoff levels remains somewhat arbitrary. Also, in a regression context, a model depending on few but relevant variables will be more accurate and precise, and easier to interpret biologically. We propose an R package, BioMark, implementing two meta-statistics for variable selection. The first, higher criticism, presents a data-dependent selection threshold for significance, instead of a cookbook value of α = 0.05. It is applicable in all cases where two groups are compared. The second, stability selection, is more general and can also be applied in a regression context. This approach uses repeated subsampling of the data in order to assess the variability of the model coefficients and selects those that remain consistently important. It is shown using experimental spike-in data from the field of metabolomics that both approaches work well with real data. BioMark also contains functionality for simulating data with specific characteristics for algorithm development and testing.
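Stability selection, as described above, repeatedly subsamples the data and keeps the variables that are selected consistently across subsamples. The self-contained sketch below uses a simple correlation-based base selector in place of BioMark's actual statistics, and all data and names are synthetic, so this shows only the resampling logic, not the R package's implementation.

```python
import numpy as np

def stability_selection(X, y, n_sub=100, frac=0.5, top_k=5, seed=1):
    """Stability-selection sketch: repeatedly subsample the rows, select the
    top_k variables by absolute correlation with y in each subsample, and
    report each variable's selection frequency."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    freq = np.zeros(p)
    m = int(frac * n)
    for _ in range(n_sub):
        rows = rng.choice(n, size=m, replace=False)
        Xs, ys = X[rows], y[rows]
        Xc = Xs - Xs.mean(axis=0)
        yc = ys - ys.mean()
        corr = np.abs(Xc.T @ yc) / (
            np.linalg.norm(Xc, axis=0) * np.linalg.norm(yc) + 1e-12
        )
        freq[np.argsort(corr)[-top_k:]] += 1  # base selector: top-k by |corr|
    return freq / n_sub

rng = np.random.default_rng(0)
n, p = 200, 50
X = rng.standard_normal((n, p))
# Only variables 0 and 1 carry signal; the rest are noise.
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.standard_normal(n)
freq = stability_selection(X, y)
```

Variables that are genuinely informative keep being selected in almost every subsample, while noise variables enter the top-k only occasionally; thresholding the frequency gives the stable set.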
Quantitative Imaging Biomarkers: A Review of Statistical Methods for Computer Algorithm Comparisons
Obuchowski, Nancy A; Reeves, Anthony P; Huang, Erich P; Wang, Xiao-Feng; Buckler, Andrew J; Kim, Hyun J Grace; Barnhart, Huiman X; Jackson, Edward F; Giger, Maryellen L; Pennello, Gene; Toledano, Alicia Y; Kalpathy-Cramer, Jayashree; Apanasovich, Tatiyana V; Kinahan, Paul E; Myers, Kyle J; Goldgof, Dmitry B; Barboriak, Daniel P; Gillies, Robert J; Schwartz, Lawrence H; Sullivan, Daniel C
2014-01-01
Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research. PMID:24919829
Statistical and Probabilistic Extensions to Ground Operations' Discrete Event Simulation Modeling
Trocine, Linda; Cummings, Nicholas H.; Bazzana, Ashley M.; Rychlik, Nathan; LeCroy, Kenneth L.; Cates, Grant R.
2010-01-01
NASA's human exploration initiatives will invest in technologies, public/private partnerships, and infrastructure, paving the way for the expansion of human civilization into the solar system and beyond. As it has been for the past half century, the Kennedy Space Center will be the embarkation point for humankind's journey into the cosmos. Functioning as a next-generation space launch complex, Kennedy's launch pads, integration facilities, processing areas, and launch and recovery ranges will bustle with the activities of the world's space transportation providers. In developing this complex, KSC teams work through the potential operational scenarios: conducting trade studies, planning and budgeting for expensive and limited resources, and simulating alternative operational schemes. Numerous tools, among them discrete event simulation (DES), were matured during the Constellation Program to conduct such analyses with the purpose of optimizing the launch complex for maximum efficiency, safety, and flexibility while minimizing life cycle costs. Discrete event simulation is a computer-based modeling technique for complex and dynamic systems where the state of the system changes at discrete points in time and whose inputs may include random variables. DES is used to assess timelines and throughput, and to support operability studies and contingency analyses. It is applicable to any space launch campaign and informs decision-makers of the effects of varying numbers of expensive resources and the impact of off-nominal scenarios on measures of performance. In order to develop representative DES models, methods were adopted, exploited, or created to extend traditional uses of DES. The Delphi method was adopted and utilized for task duration estimation. DES software was exploited for probabilistic event variation. A roll-up process was developed and used to reuse models and model elements in other, less detailed models. The DES team continues to innovate and expand.
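The core DES loop, state changes at discrete points in time driven by random inputs, can be sketched in a few lines. The resource counts and triangular task durations below are illustrative stand-ins, not KSC data; in the approach described above, Delphi-estimated durations would supply the real distribution parameters.

```python
import heapq
import random

def simulate(n_jobs, n_pads, seed=7):
    """Minimal discrete event simulation: n_jobs processing tasks compete
    for n_pads resources; each task's duration is drawn from a triangular
    distribution (min, max, mode), in the spirit of the probabilistic
    extensions described above."""
    random.seed(seed)
    free_at = [0.0] * n_pads            # time at which each pad becomes free
    heapq.heapify(free_at)
    finish_times = []
    for _ in range(n_jobs):
        start = heapq.heappop(free_at)  # earliest-available pad takes the job
        duration = random.triangular(5.0, 15.0, 8.0)
        end = start + duration
        finish_times.append(end)
        heapq.heappush(free_at, end)
    return finish_times

times = simulate(n_jobs=50, n_pads=3)
makespan = max(times)                   # campaign completion time
```

Rerunning the simulation across seeds and resource counts yields the throughput and timeline distributions that inform trade studies on how many expensive resources to provision.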
Application of Multivariate Statistical Analysis to Biomarkers in Se-Turkey Crude Oils
Gürgey, K.; Canbolat, S.
2017-11-01
Twenty-four crude oil samples were collected from the 24 oil fields distributed in different districts of SE-Turkey. API and Sulphur content (%), Stable Carbon Isotope, Gas Chromatography (GC), and Gas Chromatography-Mass Spectrometry (GC-MS) data were used to construct a geochemical data matrix. The aim of this study is to examine the genetic grouping or correlations in the crude oil samples, and hence the number of source rocks present in SE-Turkey. To achieve these aims, two multivariate statistical analysis techniques (Principal Component Analysis [PCA] and Cluster Analysis) were applied to the data matrix of 24 samples and 8 source-specific biomarker variables/parameters. The results showed that there are three genetically different oil groups: Batman-Nusaybin oils, Adıyaman-Kozluk oils and Diyarbakır oils, in addition to one mixed group. These groupings imply that at least three different source rocks are present in South-Eastern (SE) Turkey. Grouping of the crude oil samples appears to be consistent with the geographic locations of the oil fields, the subsurface stratigraphy, and the geology of the area.
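PCA on a standardized samples-by-variables matrix can be sketched directly with an SVD. The data below are a synthetic stand-in for the 24 x 8 geochemical matrix; the two-group structure is invented purely to show how principal-component scores separate genetically distinct families, and none of it reflects the actual SE-Turkey measurements.

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic "geochemical data matrix": 24 samples x 8 biomarker parameters,
# with two invented genetic groups offset in the first three variables.
a = rng.standard_normal((12, 8))
b = rng.standard_normal((12, 8))
b[:, :3] += 4.0
X = np.vstack([a, b])

# Standardize each column, then take principal components via SVD.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
scores = Z @ Vt.T                       # principal-component scores
pc1 = scores[:, 0]

# Genetically distinct groups should separate along the leading component.
separation = abs(pc1[:12].mean() - pc1[12:].mean())
```

In the study's workflow, cluster analysis is then run on these scores (or on the standardized matrix) to assign each oil sample to a family.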
On the propagation of a charged particle beam in a random medium. II: Discrete binary statistics
International Nuclear Information System (INIS)
Pomraning, G.C.; Prinja, A.K.
1995-01-01
The authors consider the linear transport of energetic charged particles through a background stochastic mixture consisting of two immiscible fluids or solids. The transport model used is the continuous slowing down description in the straight-ahead approximation. Under the assumption of homogeneous Markovian mixing statistics and separable (in space and energy) stopping powers with a common energy dependence, the problem of finding the ensemble-averaged intensity and dose is reduced to simple quadrature. The use of the Liouville master equation offers an alternate approach to this problem, and leads to exact differential equations whose solutions give the ensemble-averaged intensity and dose. This master equation approach applies to inhomogeneous Markovian statistics as well as non-separable stopping powers. Both treatments can be extended, in an approximate way, to non-Markovian statistics. Typical numerical results are given, contrasting this stochastic treatment with the standard treatment, which ignores the stochastic nature of the problem. 11 refs., 9 figs., 1 tab
Statistical geological discrete fracture network model. Forsmark modelling stage 2.2
International Nuclear Information System (INIS)
Fox, Aaron; La Pointe, Paul; Simeonov, Assen; Hermanson, Jan; Oehman, Johan
2007-11-01
The Swedish Nuclear Fuel and Waste Management Company (SKB) is performing site characterization at two different locations, Forsmark and Laxemar, in order to locate a site for a final geologic repository for spent nuclear fuel. The program is built upon the development of Site Descriptive Models (SDMs) at specific timed data freezes. Each SDM is formed from discipline-specific reports from across the scientific spectrum. This report describes the methods, analyses, and conclusions of the geological modeling team with respect to a geological and statistical model of fractures and minor deformation zones (henceforth referred to as the geological DFN), version 2.2, at the Forsmark site. The geological DFN builds upon the work of other geological modelers, including the deformation zone (DZ), rock domain (RD), and fracture domain (FD) models. The geological DFN is a statistical model for stochastically simulating rock fractures and minor deformation zones at a scale of less than 1,000 m (the lower cut-off of the DZ models). The geological DFN is valid within four specific fracture domains inside the local model region, encompassing the candidate volume at Forsmark: FFM01, FFM02, FFM03, and FFM06. The models are built using data from detailed surface outcrop maps and the cored borehole record at Forsmark. The conceptual model for the Forsmark 2.2 geological DFN revolves around the concept of orientation sets; for each fracture domain, other model parameters such as size and intensity are tied to the orientation sets. Two classes of orientation sets were described: Global sets, which are encountered everywhere in the model region, and Local sets, which represent highly localized stress environments. Orientation sets were described in terms of their general cardinal direction (NE, NW, etc.). Two alternatives are presented for fracture size modeling: - the tectonic continuum approach (TCM, TCMF) described by coupled size-intensity scaling following power law distributions
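Stochastic simulation of fracture sizes under a truncated power-law distribution, as used in such DFN models, reduces to inverse-transform sampling. A minimal sketch follows; the exponent and size bounds are illustrative values, not SKB's calibrated parameters:

```python
import random

def sample_fracture_radii(n, k_r, r_min, r_max, seed=1):
    """Draw fracture radii from a truncated power law with
    survival function P(R > r) proportional to r**-k_r on
    [r_min, r_max], via inverse-transform sampling."""
    rng = random.Random(seed)
    a = r_min ** -k_r
    b = r_max ** -k_r
    out = []
    for _ in range(n):
        u = rng.random()
        # Invert the truncated CDF F(r) = (a - r**-k_r) / (a - b).
        out.append((a - u * (a - b)) ** (-1.0 / k_r))
    return out

radii = sample_fracture_radii(1000, k_r=2.6, r_min=0.5, r_max=500.0)
```

Small fractures dominate the sample, as expected for a power law: intensity is then coupled to the size distribution through the density term.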
International Nuclear Information System (INIS)
Marinescu, D.C.; Radulescu, T.G.
1977-06-01
The Integral Fourier Transform has a large range of applications in areas such as communication theory, circuit theory, and physics. In order to perform a discrete Fourier Transform, the Finite Fourier Transform is defined; it operates upon N samples of a uniformly sampled continuous function. All the properties known in the continuous case can be found in the discrete case as well. The first part of the paper presents the relationship between the Finite Fourier Transform and the Integral one. Computing a Finite Fourier Transform is a problem in itself, since transforming a set of N data points requires N^2 'operations' if the transformation relations are used directly. An algorithm known as the Fast Fourier Transform (FFT) reduces this figure from N^2 to a more reasonable N log2 N when N is a power of two. The original Cooley and Tukey algorithm for the FFT can be further improved when higher radices are used; the price to be paid in this case is the increased complexity of such algorithms. The recurrence relations and a comparison among such algorithms are presented. The key point in understanding the application of the FFT resides in the convolution theorem, which states that the convolution (an N^2-type procedure) of the primitive functions is equivalent to the ordinary multiplication of their transforms. Since filtering is actually a convolution process, we present several procedures to perform digital filtering by means of the FFT. The best is the one using the segmentation of records and the transformation of pairs of records. In the digital processing of signals, besides digital filtering, special attention is paid to the estimation of various statistical characteristics of a signal, such as autocorrelation and correlation functions, periodograms, power spectral density, etc. We give several algorithms for the consistent and unbiased estimation of such functions by means of the FFT. (author)
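The convolution theorem at the heart of FFT-based filtering can be demonstrated directly: multiplying the transforms (after zero-padding to the full linear-convolution length, to avoid the DFT's circular wrap-around) reproduces the time-domain convolution.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=64)          # signal samples
h = np.array([0.25, 0.5, 0.25])  # simple smoothing filter

# Direct (time-domain) convolution: an N^2-type procedure.
direct = np.convolve(x, h)

# Convolution theorem: multiply transforms, O(N log N) via the FFT.
# Zero-pad both sequences to length N + M - 1 so the circular
# convolution implied by the DFT equals the linear convolution.
n = len(x) + len(h) - 1
fast = np.fft.irfft(np.fft.rfft(x, n) * np.fft.rfft(h, n), n)

assert np.allclose(direct, fast)
```

The segmented-record procedures the paper favors (overlap-add/overlap-save style filtering) apply this same identity block by block on long records.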
Dipnall, Joanna F.; Pasco, Julie A.; Berk, Michael; Williams, Lana J.; Dodd, Seetal; Jacka, Felice N.; Meyer, Denny
2016-01-01
Background Atheoretical large-scale data mining techniques using machine learning algorithms have promise in the analysis of large epidemiological datasets. This study illustrates the use of a hybrid methodology for variable selection that took account of missing data and complex survey design to identify key biomarkers associated with depression from a large epidemiological study. Methods The study used a three-step methodology amalgamating multiple imputation, a machine learning boosted reg...
Eppig, Joel S; Edmonds, Emily C; Campbell, Laura; Sanderson-Cimino, Mark; Delano-Wood, Lisa; Bondi, Mark W
2017-08-01
Research demonstrates heterogeneous neuropsychological profiles among individuals with mild cognitive impairment (MCI). However, few studies have included visuoconstructional ability or used latent mixture modeling to statistically identify MCI subtypes. Therefore, we examined whether unique neuropsychological MCI profiles could be ascertained using latent profile analysis (LPA), and subsequently investigated cerebrospinal fluid (CSF) biomarkers, genotype, and longitudinal clinical outcomes between the empirically derived classes. A total of 806 participants diagnosed by means of the Alzheimer's Disease Neuroimaging Initiative (ADNI) MCI criteria received a comprehensive neuropsychological battery assessing visuoconstructional ability, language, attention/executive function, and episodic memory. Test scores were adjusted for demographic characteristics using standardized regression coefficients based on "robust" normal control performance (n=260). Calculated Z-scores were subsequently used in the LPA, and CSF-derived biomarkers, genotype, and longitudinal clinical outcome were evaluated between the LPA-derived MCI classes. Statistical fit indices suggested a 3-class model was the optimal LPA solution. The three-class LPA consisted of a mixed impairment MCI class (n=106), an amnestic MCI class (n=455), and an LPA-derived normal class (n=245). Additionally, the amnestic and mixed classes were more likely to be apolipoprotein e4+ and have worse Alzheimer's disease CSF biomarkers than LPA-derived normal subjects. Our study supports significant heterogeneity in MCI neuropsychological profiles using LPA and extends prior work (Edmonds et al., 2015) by demonstrating a lower rate of progression in the approximately one-third of ADNI MCI individuals who may represent "false-positive" diagnoses. Our results underscore the importance of using sensitive, actuarial methods for diagnosing MCI, as current diagnostic methods may be over-inclusive. (JINS, 2017, 23, 564-576).
Karaismailoğlu, Eda; Dikmen, Zeliha Günnur; Akbıyık, Filiz; Karaağaoğlu, Ahmet Ergun
2018-04-30
Background/aim: Myoglobin, cardiac troponin T, B-type natriuretic peptide (BNP), and creatine kinase isoenzyme MB (CK-MB) are frequently used biomarkers for evaluating the risk of patients admitted to an emergency department with chest pain. Recently, time-dependent receiver operating characteristic (ROC) analysis has been used to evaluate the predictive power of biomarkers where disease status can change over time. We aimed to determine the best set of biomarkers that estimate cardiac death during follow-up time. We also obtained optimal cut-off values of these biomarkers, which differentiate between patients with and without risk of death. A web tool was developed to estimate time intervals in risk. Materials and methods: A total of 410 patients admitted to the emergency department with chest pain and shortness of breath were included. Cox regression analysis was used to determine an optimal set of biomarkers that can be used for estimating cardiac death and to combine the significant biomarkers. Time-dependent ROC analysis was performed to evaluate the performances of the significant biomarkers and of a combined biomarker during 240 h. The bootstrap method was used to compare statistical significance and the Youden index was used to determine optimal cut-off values. Results: Myoglobin and BNP were significant by multivariate Cox regression analysis. Areas under the time-dependent ROC curves of myoglobin and BNP were about 0.80 during 240 h, and that of the combined biomarker (myoglobin + BNP) increased to 0.90 during the first 180 h. Conclusion: Although myoglobin is not clinically specific to a cardiac event, in our study both myoglobin and BNP were found to be statistically significant for estimating cardiac death. Using this combined biomarker may increase the power of prediction. Our web tool can be useful for evaluating the risk status of new patients and helping clinicians in making decisions.
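The Youden-index step can be sketched as a plain ROC scan over candidate thresholds. This simplified version treats the outcome as a fixed binary label; the paper's time-dependent ROC additionally accounts for censoring and changing status over follow-up, which is omitted here, and the biomarker values below are hypothetical:

```python
import numpy as np

def youden_cutoff(values, died):
    """Pick the biomarker cut-off maximizing the Youden index
    J = sensitivity + specificity - 1 over all observed thresholds."""
    values = np.asarray(values, float)
    died = np.asarray(died, bool)
    best_j, best_cut = -1.0, None
    for cut in np.unique(values):
        pred = values >= cut
        sens = np.mean(pred[died])       # true positive rate
        spec = np.mean(~pred[~died])     # true negative rate
        j = sens + spec - 1.0
        if j > best_j:
            best_j, best_cut = j, cut
    return best_cut, best_j

# Hypothetical BNP-like values: deaths tend to have higher levels.
vals = [90, 120, 150, 200, 400, 500, 800, 950]
died = [0,   0,   0,   0,   1,   1,   1,   1]
cut, j = youden_cutoff(vals, died)
print(cut, j)
```

For a combined biomarker, the same scan is applied to the Cox linear predictor instead of a single marker's raw values.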
DEFF Research Database (Denmark)
Madsen, Tobias
2017-01-01
In the present thesis I develop, implement and apply statistical methods for detecting genomic elements implicated in cancer development and progression. This is done in two separate bodies of work. The first uses the somatic mutation burden to distinguish cancer driver mutations from passenger m...
Towards the disease biomarker in an individual patient using statistical health monitoring
Engel, J.; Blanchet, L.M.; Engelke, U.F.; Wevers, R.A.; Buydens, L.M.
2014-01-01
In metabolomics, identification of complex diseases is often based on application of (multivariate) statistical techniques to the data. Commonly, each disease requires its own specific diagnostic model, separating healthy and diseased individuals, which is not very practical in a diagnostic setting.
Lind, Mads V; Savolainen, Otto I; Ross, Alastair B
2016-08-01
Data quality is critical for epidemiology, and as scientific understanding expands, the range of data available for epidemiological studies and the types of tools used for measurement have also expanded. It is essential for the epidemiologist to have a grasp of the issues involved with different measurement tools. One tool that is increasingly being used for measuring biomarkers in epidemiological cohorts is mass spectrometry (MS), because of the high specificity and sensitivity of MS-based methods and the expanding range of biomarkers that can be measured. Further, the ability of MS to quantify many biomarkers simultaneously is advantageous compared to single-biomarker methods. However, as with all methods used to measure biomarkers, there are a number of pitfalls to consider which may have an impact on results when used in epidemiology. In this review we discuss the use of MS for biomarker analyses, focusing on metabolites and their application and potential issues related to large-scale epidemiology studies, the use of MS "omics" approaches for biomarker discovery, and how MS-based results can be used to increase the biological knowledge gained from epidemiological studies. A better understanding of the possibilities and possible problems related to MS-based measurements will help the epidemiologist in their discussions with analytical chemists and lead to the use of the most appropriate statistical tools for these data.
International Nuclear Information System (INIS)
Darcel, C.; Davy, P.; Le Goc, R.; Dreuzy, J.R. de; Bour, O.
2009-11-01
Investigations conducted over several years at Laxemar and Forsmark reveal the large heterogeneity of geological formations and the associated fracturing. This project aims at reinforcing the statistical DFN modeling framework adapted to a site scale, and therefore at developing quantitative methods of characterization adapted to the nature of fracturing and to data availability. We start with the hypothesis that the maximum-likelihood DFN model is a power-law model with a density term depending on orientations. This is supported both by the literature and, specifically here, by former analyses of the SKB data. This assumption is nevertheless thoroughly tested by analyzing the fracture trace and lineament maps. Fracture traces range roughly between 0.5 m and 10 m, i.e. the usual extension of the sample outcrops. Between the raw data and the final data used to compute the fracture size distribution, from which the size distribution model will arise, several steps are necessary in order to correct the data for finite-size, topographical and sampling effects. More precisely, particular attention is paid to fracture segmentation status and fracture linkage consistent with the expected DFN model. The fracture scaling trend observed over both sites finally displays a shape parameter k_t close to 1.2 with a density term (α_2d) between 1.4 and 1.8. Only two outcrops clearly display a different trend, with k_t close to 3 and a density term (α_2d) between 2 and 3.5. The fracture lineaments spread over the range between 100 meters and a few kilometers. When compared with fracture trace maps, these datasets are already interpreted, and the linkage process developed previously does not have to be done. Except for the subregional lineament map from Forsmark, lineaments display a clear power-law trend with a shape parameter k_t equal to 3 and a density term between 2 and 4.5. The apparent variation in scaling exponent, from the outcrop scale (k_t = 1.2) on one side, to the lineament scale (k_t = 2) on
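Estimating a shape parameter such as k_t from trace-length data is typically done by maximum likelihood. A minimal sketch, assuming a pure (untruncated) power law above a lower cut-off and ignoring the finite-size and censoring corrections the report applies:

```python
import math
import random

def powerlaw_mle_exponent(traces, l_min):
    """Maximum-likelihood (Hill-type) estimate of the exponent kt of a
    power-law trace-length distribution with survival function
    P(L > l) = (l / l_min)**-kt for l >= l_min."""
    logs = [math.log(l / l_min) for l in traces if l >= l_min]
    return len(logs) / sum(logs)

# Check on synthetic data with a known exponent kt = 1.2
# (inverse-transform sampling of the Pareto distribution).
rng = random.Random(7)
kt_true = 1.2
sample = [0.5 * (1.0 - rng.random()) ** (-1.0 / kt_true)
          for _ in range(20000)]
kt_hat = powerlaw_mle_exponent(sample, l_min=0.5)
print(round(kt_hat, 2))
```

The estimator recovers the known exponent on synthetic data; on real trace maps, the segmentation and linkage corrections described above must be applied first, or the exponent is biased.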
Welzenbach, Julia; Neuhoff, Christiane; Looft, Christian; Schellander, Karl; Tholen, Ernst; Große-Brinkhaus, Christine
2016-01-01
The aim of this study was to elucidate the underlying biochemical processes and to identify potential key molecules of the meat quality traits drip loss, pH of meat 1 h post-mortem (pH1), pH of meat 24 h post-mortem (pH24), and meat color. An untargeted metabolomics approach detected the profiles of 393 annotated and 1,600 unknown metabolites in 97 Duroc × Pietrain pigs. Despite obvious differences between the statistical approaches, the four applied methods, namely correlation analysis, principal component analysis, weighted network analysis (WNA) and random forest regression (RFR), revealed mainly concordant results. Our findings lead to the conclusion that the meat quality traits pH1, pH24 and color are strongly influenced by processes of post-mortem energy metabolism like glycolysis and the pentose phosphate pathway, whereas drip loss is significantly associated with metabolites of lipid metabolism. In the case of drip loss, RFR was the most suitable method to identify reliable biomarkers and to predict the phenotype based on metabolites. On the other hand, WNA provides the best parameters to investigate the metabolite interactions and to clarify the complex molecular background of meat quality traits. In summary, it was possible to attain findings on the interaction of meat quality traits and their underlying biochemical processes. The detected key metabolites might be better indicators of meat quality, especially of drip loss, than the measured phenotype itself and might potentially be used as bioindicators. PMID:26919205
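The correlation-analysis step can be sketched on synthetic data; the metabolite values below are hypothetical constructions, not the study's measurements, built so that one lipid-related metabolite covaries with drip loss and a glycolysis metabolite does not:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 97                                   # pigs, matching the study's sample size
# Hypothetical metabolite levels (arbitrary standardized units).
lipid = rng.normal(size=n)
glyco = rng.normal(size=n)
# Drip loss constructed to depend on the lipid metabolite plus noise.
drip_loss = 2.0 + 0.8 * lipid + rng.normal(scale=0.5, size=n)

# Pearson correlation of each metabolite with the trait.
r_lipid = np.corrcoef(lipid, drip_loss)[0, 1]
r_glyco = np.corrcoef(glyco, drip_loss)[0, 1]
print(round(float(r_lipid), 2), round(float(r_glyco), 2))
```

Ranking metabolites by such trait correlations is the simplest of the four methods; WNA and RFR extend this by modeling metabolite-metabolite interactions and nonlinear effects, respectively.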
Degeling, Koen; Schivo, Stefano; Mehra, Niven; Koffijberg, Hendrik; Langerak, Rom; de Bono, Johann S; IJzerman, Maarten J
2017-12-01
With the advent of personalized medicine, the field of health economic modeling is being challenged and the use of patient-level dynamic modeling techniques might be required. To illustrate the usability of two such techniques, timed automata (TA) and discrete event simulation (DES), for modeling personalized treatment decisions. An early health technology assessment on the use of circulating tumor cells, compared with prostate-specific antigen and bone scintigraphy, to inform treatment decisions in metastatic castration-resistant prostate cancer was performed. Both modeling techniques were assessed quantitatively, in terms of intermediate outcomes (e.g., overtreatment) and health economic outcomes (e.g., net monetary benefit). Qualitatively, among others, model structure, agent interactions, data management (i.e., importing and exporting data), and model transparency were assessed. Both models yielded realistic and similar intermediate and health economic outcomes. Overtreatment was reduced by 6.99 and 7.02 weeks by applying circulating tumor cell as a response marker at a net monetary benefit of -€1033 and -€1104 for the TA model and the DES model, respectively. Software-specific differences were observed regarding data management features and the support for statistical distributions, which were considered better for the DES software. Regarding method-specific differences, interactions were modeled more straightforward using TA, benefiting from its compositional model structure. Both techniques prove suitable for modeling personalized treatment decisions, although DES would be preferred given the current software-specific limitations of TA. When these limitations are resolved, TA would be an interesting modeling alternative if interactions are key or its compositional structure is useful to manage multi-agent complex problems. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights
Hayslett, H T
1991-01-01
Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the
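The normal approximation to the binomial mentioned above underlies the textbook test of a hypothesized proportion of successes. A short worked example (standard formula, illustrative numbers):

```python
import math

def binomial_z_test(successes, n, p0):
    """Normal approximation to the binomial: z statistic for testing
    H0: true proportion = p0, with a two-sided p-value from the
    standard normal CDF."""
    p_hat = successes / n
    se = math.sqrt(p0 * (1 - p0) / n)
    z = (p_hat - p0) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# 60 successes in 100 trials against H0: p = 0.5.
z, p = binomial_z_test(successes=60, n=100, p0=0.5)
print(round(z, 2), round(p, 4))  # z = 2.0, p ≈ 0.0455
```

At the conventional 5% level this rejects H0, matching the standard-normal tail areas tabulated in such texts.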
Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.
International Nuclear Information System (INIS)
2005-01-01
For the years 2004 and 2005 the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics published in the Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004). The applied energy units and conversion coefficients are shown in the back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes, precautionary stock fees and oil pollution fees
International Nuclear Information System (INIS)
2001-01-01
For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes and precautionary stock fees on oil products
International Nuclear Information System (INIS)
2000-01-01
For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes and precautionary stock fees on oil products
International Nuclear Information System (INIS)
1999-01-01
For the years 1998 and 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes and precautionary stock fees on oil products
International Nuclear Information System (INIS)
2003-01-01
For the year 2002, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supply and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Excise taxes, precautionary stock fees and oil pollution fees on energy products
International Nuclear Information System (INIS)
2004-01-01
For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-March 2004, Energy exports by recipient country in January-March 2004, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Excise taxes, precautionary stock fees and oil pollution fees.
International Nuclear Information System (INIS)
2000-01-01
For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 2000, Energy exports by recipient country in January-June 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes and precautionary stock fees on oil products.
Hauber, A Brett; González, Juan Marcos; Groothuis-Oudshoorn, Catharina G M; Prior, Thomas; Marshall, Deborah A; Cunningham, Charles; IJzerman, Maarten J; Bridges, John F P
2016-06-01
Conjoint analysis is a stated-preference survey method that can be used to elicit responses that reveal preferences, priorities, and the relative importance of individual features associated with health care interventions or services. Conjoint analysis methods, particularly discrete choice experiments (DCEs), have been increasingly used to quantify preferences of patients, caregivers, physicians, and other stakeholders. Recent consensus-based guidance on good research practices, including two recent task force reports from the International Society for Pharmacoeconomics and Outcomes Research, has aided in improving the quality of conjoint analyses and DCEs in outcomes research. Nevertheless, uncertainty regarding good research practices for the statistical analysis of data from DCEs persists. There are multiple methods for analyzing DCE data. Understanding the characteristics and appropriate use of different analysis methods is critical to conducting a well-designed DCE study. This report will assist researchers in evaluating and selecting among alternative approaches to conducting statistical analysis of DCE data. We first present a simplistic DCE example and a simple method for using the resulting data. We then present a pedagogical example of a DCE and one of the most common approaches to analyzing data from such a question format: conditional logit. We then describe some common alternative methods for analyzing these data and the strengths and weaknesses of each alternative. We present the ESTIMATE checklist, which includes a list of questions to consider when justifying the choice of analysis method, describing the analysis, and interpreting the results. Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
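The conditional logit model named above can be sketched in a few lines: each alternative's choice probability is a softmax over linear utilities. The attribute levels and taste weights below are hypothetical illustrations, not values from the report.

```python
import math

def conditional_logit_probs(attributes, beta):
    """Choice probabilities for one choice set under conditional logit.
    attributes: one attribute vector per alternative; beta: taste weights."""
    utilities = [sum(b * x for b, x in zip(beta, alt)) for alt in attributes]
    m = max(utilities)                     # stabilise the exponentials
    expu = [math.exp(u - m) for u in utilities]
    total = sum(expu)
    return [e / total for e in expu]

# Hypothetical choice set: two treatment profiles described by
# (efficacy, risk, cost) attribute levels, with illustrative weights.
choice_set = [(0.8, 0.1, -2.0), (0.5, 0.3, -1.0)]
beta = (1.0, -2.0, 0.3)
probs = conditional_logit_probs(choice_set, beta)
```

In estimation, beta would be chosen to maximize the log-likelihood of the observed choices; here it is fixed to show the probability calculation itself.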
Directory of Open Access Journals (Sweden)
Sudeepa Bhattacharyya
2006-01-01
Full Text Available Multiple Myeloma (MM is a severely debilitating neoplastic disease of B cell origin, with the primary source of morbidity and mortality associated with unrestrained bone destruction. Surface enhanced laser desorption/ionization time-of-flight mass spectrometry (SELDI-TOF MS was used to screen for potential biomarkers indicative of skeletal involvement in patients with MM. Serum samples from 48 MM patients, 24 with more than three bone lesions and 24 with no evidence of bone lesions, were fractionated and analyzed in duplicate using copper ion loaded immobilized metal affinity SELDI chip arrays. The spectra obtained were compiled, normalized, and mass peaks with mass-to-charge ratios (m/z between 2000 and 20,000 Da identified. Peak information from all fractions was combined and analyzed using univariate statistics, as well as a linear partial least squares discriminant analysis (PLS-DA and a non-linear random forest (RF classification algorithm. The PLS-DA model resulted in prediction accuracy between 96% and 100%, while the RF model was able to achieve a specificity and sensitivity of 87.5% each. Both models, as well as multiple comparison adjusted univariate analysis, identified a set of four peaks that were the most discriminating between the two groups of patients and hold promise as potential biomarkers for future diagnostic and/or therapeutic purposes.
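The univariate screening step described above can be sketched as ranking peaks by a two-sample test statistic. The peak labels and intensities below are invented toy data, not SELDI measurements; Welch's t is used as one common choice of statistic.

```python
import math

def welch_t(xs, ys):
    """Welch's t statistic for two independent samples (unequal variances)."""
    nx, ny = len(xs), len(ys)
    mx, my = sum(xs) / nx, sum(ys) / ny
    vx = sum((x - mx) ** 2 for x in xs) / (nx - 1)
    vy = sum((y - my) ** 2 for y in ys) / (ny - 1)
    return (mx - my) / math.sqrt(vx / nx + vy / ny)

# Hypothetical peak intensities for two patient groups; peak "B" is the
# truly differential one in this toy data.
peaks = {
    "A": ([5.1, 4.9, 5.0, 5.2], [5.0, 5.1, 4.8, 5.1]),
    "B": ([9.0, 8.7, 9.2, 8.9], [4.1, 4.3, 3.9, 4.2]),
}
ranked = sorted(peaks, key=lambda k: abs(welch_t(*peaks[k])), reverse=True)
```

A real analysis would follow the ranking with multiple-comparison adjustment, as the abstract notes.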
Grigsby, Claude C.; Zmuda, Michael A.; Boone, Derek W.; Highlander, Tyler C.; Kramer, Ryan M.; Rizki, Mateen M.
2012-06-01
A growing body of discoveries in molecular signatures has revealed that volatile organic compounds (VOCs), the small molecules associated with an individual's odor and breath, can be monitored to reveal the identity and presence of a unique individual, as well as their overall physiological status. Given the analysis requirements for differential VOC profiling via gas chromatography/mass spectrometry, our group has developed a novel informatics platform, Metabolite Differentiation and Discovery Lab (MeDDL). In its current version, MeDDL is a comprehensive tool for time-series spectral registration and alignment, visualization, comparative analysis, and machine learning to facilitate the efficient analysis of multiple, large-scale biomarker discovery studies. The MeDDL toolset can therefore identify a large differential subset of registered peaks, where their corresponding intensities can be used as features for classification. This initial screening of peaks yields results sets that are typically too large for incorporation into a portable, electronic nose based system, in addition to including VOCs that are not amenable to classification; consequently, it is also important to identify an optimal subset of these peaks to increase classification accuracy and to decrease the cost of the final system. MeDDL's learning tools include a classifier similar to a K-nearest neighbor classifier used in conjunction with a genetic algorithm (GA) that simultaneously optimizes the classifier and subset of features. The GA uses ROC curves to produce classifiers having maximal area under their ROC curve. Experimental results on over a dozen recognition problems show many examples of classifiers and feature sets that produce perfect ROC curves.
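The fitness the GA maximizes, area under the ROC curve, can be computed directly from classifier scores via the Mann-Whitney statistic, with no explicit curve construction. The scores below are illustrative toy values.

```python
def roc_auc(pos_scores, neg_scores):
    """Area under the ROC curve via the Mann-Whitney U statistic: the
    probability that a positive case scores above a negative one
    (ties count one half)."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Perfectly separated toy scores give a perfect ROC curve (AUC = 1.0);
# indistinguishable scores give chance performance (AUC = 0.5).
auc_perfect = roc_auc([0.9, 0.8, 0.7], [0.3, 0.2, 0.1])
auc_chance = roc_auc([0.5, 0.5], [0.5, 0.5])
```

A GA fitness function would evaluate this quantity for each candidate feature subset.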
Directory of Open Access Journals (Sweden)
Hilko van der Voet
Full Text Available Nutrient recommendations in use today are often derived from relatively old data of few studies with few individuals. However, for many nutrients, including vitamin B-12, extensive data have now become available from both observational studies and randomized controlled trials, addressing the relation between intake and health-related status biomarkers. The purpose of this article is to provide new methodology for dietary planning based on dose-response data and meta-analysis. The methodology builds on existing work, and is consistent with current methodology and measurement error models for dietary assessment. The detailed purposes of this paper are twofold. Firstly, to define a Population Nutrient Level (PNL for dietary planning in groups. Secondly, to show how data from different sources can be combined in an extended meta-analysis of intake-status datasets for estimating PNL as well as other nutrient intake values, such as the Average Nutrient Requirement (ANR and the Individual Nutrient Level (INL. For this, a computational method is presented for comparing a bivariate lognormal distribution to a health criterion value. Procedures to meta-analyse available data in different ways are described. Example calculations on vitamin B-12 requirements were made for four models, assuming different ways of estimating the dose-response relation, and different values of the health criterion. Resulting estimates of ANRs, and to a lesser extent INLs, were found to be sensitive to model assumptions, whereas estimates of PNLs were much less sensitive to these assumptions, as they were closer to the average nutrient intake in the available data.
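A rough sketch of the kind of calculation a Population Nutrient Level involves: compare a lognormal intake-status model against a health criterion and find the lowest group mean intake at which a target fraction of the population meets the criterion. Every coefficient, the criterion value, and the 97.5% coverage target below are assumptions for illustration, not values from the article.

```python
import math
import random

random.seed(7)

def fraction_meeting_criterion(mean_log_intake, criterion, n=20000):
    """Monte Carlo sketch: individual log intakes scatter around the group
    mean; biomarker status follows an assumed log-linear dose-response with
    lognormal residual error. All coefficients are illustrative."""
    ok = 0
    for _ in range(n):
        log_intake = random.gauss(mean_log_intake, 0.25)
        log_status = 0.3 + 0.6 * log_intake + random.gauss(0.0, 0.2)
        if log_status >= math.log(criterion):
            ok += 1
    return ok / n

# A PNL-style target: the lowest candidate mean log intake at which at
# least 97.5% of the group meets the (hypothetical) status criterion.
candidates = [1.0, 1.2, 1.4, 1.6, 1.8, 2.0]
pnl = next(m for m in candidates
           if fraction_meeting_criterion(m, criterion=2.0) >= 0.975)
```

The article's method works with the bivariate lognormal distribution analytically; Monte Carlo is used here only to keep the sketch short.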
Hauber, A. Brett; Gonzalez, Juan Marcos; Groothuis-Oudshoorn, Catharina Gerarda Maria; Prior, Thomas; Marshall, Deborah A.; Cunningham, Charles; IJzerman, Maarten Joost; Bridges, John
2016-01-01
Conjoint analysis is a stated-preference survey method that can be used to elicit responses that reveal preferences, priorities, and the relative importance of individual features associated with health care interventions or services. Conjoint analysis methods, particularly discrete choice
DEFF Research Database (Denmark)
Sørensen, John Aasted
2011-01-01
The objectives of Discrete Mathematics (IDISM2) are: the introduction of the mathematics needed for analysis, design and verification of discrete systems, including the application within programming languages for computer systems. Having passed the IDISM2 course, the student will be able to accomplish the following: understand and apply formal representations in discrete mathematics; understand and apply formal representations in problems within discrete mathematics; understand methods for solving problems in discrete mathematics; apply methods for solving problems in discrete mathematics; construct a finite state machine for a given application; apply these concepts to new problems. The teaching in Discrete Mathematics is a combination of sessions with lectures and students solving problems, either manually or by using Matlab. Furthermore, a selection of projects must be solved and handed in.
Izadi, F A; Bagirov, G
2009-01-01
With its origins stretching back several centuries, discrete calculus is now an increasingly central methodology for many problems related to discrete systems and algorithms. The topics covered here usually arise in many branches of science and technology, especially in discrete mathematics, numerical analysis, statistics and probability theory as well as in electrical engineering, but our viewpoint here is that these topics belong to a much more general realm of mathematics; namely calculus and differential equations, because of the remarkable analogy of the subject to this branch of mathematics.
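The remarkable analogy the description mentions can be made concrete: the forward difference plays the role of the derivative, the finite sum that of the integral, and summing differences telescopes exactly as in the fundamental theorem of calculus.

```python
def forward_difference(f, n):
    """Discrete analogue of the derivative: (delta f)(n) = f(n+1) - f(n)."""
    return f(n + 1) - f(n)

def discrete_integral(f, a, b):
    """Discrete analogue of the definite integral: sum of f(n) for n in [a, b)."""
    return sum(f(n) for n in range(a, b))

def F(n):
    # Any sequence works; n^2 is used for illustration.
    return n * n

# Discrete fundamental theorem of calculus: summing the forward differences
# of F over [0, 10) telescopes back to F(10) - F(0).
lhs = discrete_integral(lambda n: forward_difference(F, n), 0, 10)
rhs = F(10) - F(0)
```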
Analysis of biomarker data a practical guide
Looney, Stephen W
2015-01-01
A "how to" guide for applying statistical methods to biomarker data analysis Presenting a solid foundation for the statistical methods that are used to analyze biomarker data, Analysis of Biomarker Data: A Practical Guide features preferred techniques for biomarker validation. The authors provide descriptions of select elementary statistical methods that are traditionally used to analyze biomarker data with a focus on the proper application of each method, including necessary assumptions, software recommendations, and proper interpretation of computer output. In addition, the book discusses
Bruse, Jan L; McLeod, Kristin; Biglino, Giovanni; Ntsinjana, Hopewell N; Capelli, Claudio; Hsia, Tain-Yen; Sermesant, Maxime; Pennec, Xavier; Taylor, Andrew M; Schievano, Silvia
2016-05-31
Medical image analysis in clinical practice is commonly carried out on 2D image data, without fully exploiting the detailed 3D anatomical information that is provided by modern non-invasive medical imaging techniques. In this paper, a statistical shape analysis method is presented, which enables the extraction of 3D anatomical shape features from cardiovascular magnetic resonance (CMR) image data, with no need for manual landmarking. The method was applied to repaired aortic coarctation arches that present complex shapes, with the aim of capturing shape features as biomarkers of potential functional relevance. The method is presented from the user-perspective and is evaluated by comparing results with traditional morphometric measurements. Steps required to set up the statistical shape modelling analyses, from pre-processing of the CMR images to parameter setting and strategies to account for size differences and outliers, are described in detail. The anatomical mean shape of 20 aortic arches post-aortic coarctation repair (CoA) was computed based on surface models reconstructed from CMR data. By analysing transformations that deform the mean shape towards each of the individual patient's anatomy, shape patterns related to differences in body surface area (BSA) and ejection fraction (EF) were extracted. The resulting shape vectors, describing shape features in 3D, were compared with traditionally measured 2D and 3D morphometric parameters. The computed 3D mean shape was close to population mean values of geometric shape descriptors and visually integrated characteristic shape features associated with our population of CoA shapes. After removing size effects due to differences in body surface area (BSA) between patients, distinct 3D shape features of the aortic arch correlated significantly with EF (r = 0.521, p = .022) and were well in agreement with trends as shown by traditional shape descriptors. The suggested method has the potential to discover
DEFF Research Database (Denmark)
Sørensen, John Aasted
2011-01-01
Having passed the course, the student will be able to accomplish the following: understand and apply formal representations in discrete mathematics; understand and apply formal representations in problems within discrete mathematics; understand methods for solving problems in discrete mathematics; apply methods for solving problems in discrete mathematics; construct a finite state machine for a given application; apply these concepts to new problems. Relations and functions: define a product set; define and apply equivalence relations; construct and apply functions; apply these concepts to new problems. Natural numbers and induction: define the natural numbers; apply the principle of induction to verify a selection of properties. The teaching in Discrete Mathematics is a combination of sessions with lectures and students solving problems, either manually or by using Matlab. Furthermore, a selection of projects must be solved and handed in.
DEFF Research Database (Denmark)
Busch, Peter Andre; Zinner Henriksen, Helle
2018-01-01
This study reviews 44 peer-reviewed articles on digital discretion published in the period from 1998 to January 2017. Street-level bureaucrats have traditionally had a wide ability to exercise discretion, stirring debate since they can add their personal footprint on public policies. Digital discretion is suggested to reduce this footprint by influencing or replacing their discretionary practices using ICT. What is less researched is whether digital discretion can cause changes in public policy outcomes, and under what conditions such changes can occur. Using the concept of public service values, we suggest that digital discretion can strengthen ethical and democratic values but weaken professional and relational values. Furthermore, we conclude that contextual factors, such as considerations made by policy makers at the macro level and the degree of professionalization of street-level bureaucrats, shape whether such changes occur.
DEFF Research Database (Denmark)
Sørensen, John Aasted
2010-01-01
The introduction of the mathematics needed for analysis, design and verification of discrete systems, including applications within programming languages for computer systems. Course sessions and project work. Semester: Spring 2010. Extent: 5 ECTS. Class size: 18.
DEFF Research Database (Denmark)
Sørensen, John Aasted
2010-01-01
The introduction of the mathematics needed for analysis, design and verification of discrete systems, including applications within programming languages for computer systems. Course sessions and project work. Semester: Autumn 2010. Extent: 5 ECTS. Class size: 15.
Caltagirone, Jean-Paul
2014-01-01
This book presents the fundamental principles of mechanics to re-establish the equations of Discrete Mechanics. It introduces physics and thermodynamics associated to the physical modeling. The development and the complementarity of sciences lead to review today the old concepts that were the basis for the development of continuum mechanics. The differential geometry is used to review the conservation laws of mechanics. For instance, this formalism requires a different location of vector and scalar quantities in space. The equations of Discrete Mechanics form a system of equations where the H
International Nuclear Information System (INIS)
Lee, T.D.
1985-01-01
This paper reviews the role of time throughout all phases of mechanics: classical mechanics, non-relativistic quantum mechanics, and relativistic quantum theory. As an example of the relativistic quantum field theory, the case of a massless scalar field interacting with an arbitrary external current is discussed. The comparison between the new discrete theory and the usual continuum formalism is presented. An example is given of a two-dimensional random lattice and its dual. The author notes that there is no evidence that the discrete mechanics is more appropriate than the usual continuum mechanics.
Directory of Open Access Journals (Sweden)
Huarong Xu
2016-08-01
Full Text Available Polyamines, one of the most important kinds of biomarkers in cancer research, were investigated in order to characterize different cancer types. An integrative approach which combined ultra-high performance liquid chromatography-tandem mass spectrometry detection and multiple statistical data processing strategies, including outlier elimination, binary logistic regression analysis and cluster analysis, was developed to discover the characteristic biomarkers of lung and liver cancer. The concentrations of 14 polyamine metabolites in biosamples from lung (n = 50) and liver cancer (n = 50) patients were detected by a validated UHPLC-MS/MS method. Then the concentrations were converted into independent variables to characterize patients with lung and liver cancer by binary logistic regression analysis. Significant independent variables were regarded as the potential biomarkers. Cluster analysis was used for further verification. As a result, two values were found to distinguish lung from liver cancer: the product of the plasma concentrations of putrescine and spermidine, and the ratio of the urine concentrations of S-adenosyl-l-methionine and N-acetylspermidine. Results indicated that the established method could be successfully applied to characterize lung and liver cancer, and may also enable a new way of discovering cancer biomarkers and characterizing other types of cancer.
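The binary logistic regression step can be sketched as follows. The feature values (standing in for the two derived quantities named above) and labels are invented toy data, and the plain gradient-descent fit is an illustration, not the study's procedure.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(xs, ys, lr=0.5, epochs=2000):
    """Plain stochastic gradient descent for 2-feature logistic regression."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            p = sigmoid(w[0] * x[0] + w[1] * x[1] + b)
            err = p - y                      # gradient of the log-loss
            w[0] -= lr * err * x[0]
            w[1] -= lr * err * x[1]
            b -= lr * err
    return w, b

# Hypothetical derived values per patient: (putrescine x spermidine product,
# SAM / N-acetylspermidine ratio); label 1 = lung, 0 = liver.
xs = [(2.1, 0.4), (2.3, 0.5), (1.9, 0.3), (0.6, 1.8), (0.5, 1.6), (0.7, 1.9)]
ys = [1, 1, 1, 0, 0, 0]
w, b = fit_logistic(xs, ys)
preds = [1 if sigmoid(w[0] * x[0] + w[1] * x[1] + b) > 0.5 else 0 for x in xs]
```

Significant coefficients in such a model are what the abstract treats as potential biomarkers.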
Biomarkers of Pediatric Brain Tumors
Directory of Open Access Journals (Sweden)
Mark D Russell
2013-03-01
Full Text Available Background and Need for Novel Biomarkers: Brain tumors are the leading cause of death by solid tumors in children. Although improvements have been made in their radiological detection and treatment, our capacity to promptly diagnose pediatric brain tumors in their early stages remains limited. This contrasts with several other cancers where serum biomarkers such as CA 19-9 and CA 125 facilitate early diagnosis and treatment. Aim: The aim of this article is to review the latest literature and highlight biomarkers which may be of clinical use in the common types of primary pediatric brain tumor. Methods: A PubMed search was performed to identify studies reporting biomarkers in the bodily fluids of pediatric patients with brain tumors. Details regarding the sample type (serum, cerebrospinal fluid, or urine), biomarkers analyzed, methodology, tumor type, and statistical significance were recorded. Results: A total of 12 manuscripts reporting 19 biomarkers in 367 patients vs. 397 controls were identified in the literature. Of the 19 biomarkers identified, 12 were isolated from cerebrospinal fluid, 2 from serum, 3 from urine, and 2 from multiple bodily fluids. All but one study reported statistically significant differences in biomarker expression between patient and control groups. Conclusions: This review identifies a panel of novel biomarkers for pediatric brain tumors. It provides a platform for the further studies necessary to validate these biomarkers and, in addition, highlights several techniques through which new biomarkers can be discovered.
Exarchakis, Georgios; Lücke, Jörg
2017-11-01
Sparse coding algorithms with continuous latent variables have been the subject of a large number of studies. However, discrete latent spaces for sparse coding have been largely ignored. In this work, we study sparse coding with latents described by discrete instead of continuous prior distributions. We consider the general case in which the latents (while being sparse) can take on any value of a finite set of possible values and in which we learn the prior probability of any value from data. This approach can be applied to any data generated by discrete causes, and it can be applied as an approximation of continuous causes. As the prior probabilities are learned, the approach then allows for estimating the prior shape without assuming specific functional forms. To efficiently train the parameters of our probabilistic generative model, we apply a truncated expectation-maximization approach (expectation truncation) that we modify to work with a general discrete prior. We evaluate the performance of the algorithm by applying it to a variety of tasks: (1) we use artificial data to verify that the algorithm can recover the generating parameters from a random initialization, (2) we use image patches of natural images and discuss the role of the prior for the extraction of image components, (3) we use extracellular recordings of neurons to present a novel method of analysis for spiking neurons that includes an intuitive discretization strategy, and (4) we apply the algorithm to the task of encoding audio waveforms of human speech. The diverse set of numerical experiments presented in this letter suggests that discrete sparse coding algorithms can scale efficiently to work with realistic data sets and provide novel statistical quantities to describe the structure of the data.
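For a finite latent value set, MAP inference can even be done by brute-force enumeration, which makes the discrete-prior idea concrete (the paper's expectation truncation exists precisely to avoid this exponential enumeration at scale). Dictionary, data, and prior parameters below are toy values.

```python
import itertools
import math

def map_code(x, dictionary, values, sparsity=0.7, sigma=0.1):
    """Enumerate all discrete latent vectors (each entry drawn from a finite
    set of values) and return the MAP code under a Gaussian likelihood and a
    prior that puts probability `sparsity` on zero."""
    n_latents = len(dictionary)
    nonzero_p = (1.0 - sparsity) / (len(values) - 1)
    best, best_score = None, float("inf")
    for s in itertools.product(values, repeat=n_latents):
        recon = [sum(s[k] * dictionary[k][d] for k in range(n_latents))
                 for d in range(len(x))]
        err = sum((xi - ri) ** 2 for xi, ri in zip(x, recon))
        log_prior = sum(math.log(sparsity if sk == 0 else nonzero_p) for sk in s)
        score = err / (2 * sigma ** 2) - log_prior   # negative log posterior
        if score < best_score:
            best, best_score = s, score
    return best

# Two illustrative dictionary elements in a 3-dimensional data space;
# latents take values from the finite set {0, 1, 2}.
D = [(1.0, 0.0, 1.0), (0.0, 1.0, 0.0)]
x = [2.0, 1.0, 2.0]            # generated by latents (2, 1), noise-free
code = map_code(x, D, values=(0, 1, 2))
```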
Parker, R Gary
1988-01-01
This book treats the fundamental issues and algorithmic strategies emerging as the core of the discipline of discrete optimization in a comprehensive and rigorous fashion. Following an introductory chapter on computational complexity, the basic algorithmic results for the two major models of polynomial algorithms are introduced--models using matroids and linear programming. Further chapters treat the major non-polynomial algorithms: branch-and-bound and cutting planes. The text concludes with a chapter on heuristic algorithms. Several appendixes are included which review the fundamental ideas.
Foster, Guy M.; Graham, Jennifer L.
2016-04-06
The Kansas River is a primary source of drinking water for about 800,000 people in northeastern Kansas. Source-water supplies are treated by a combination of chemical and physical processes to remove contaminants before distribution. Advanced notification of changing water-quality conditions and cyanobacteria and associated toxin and taste-and-odor compounds provides drinking-water treatment facilities time to develop and implement adequate treatment strategies. The U.S. Geological Survey (USGS), in cooperation with the Kansas Water Office (funded in part through the Kansas State Water Plan Fund), and the City of Lawrence, the City of Topeka, the City of Olathe, and Johnson County Water One, began a study in July 2012 to develop statistical models at two Kansas River sites located upstream from drinking-water intakes. Continuous water-quality monitors have been operated and discrete water-quality samples have been collected on the Kansas River at Wamego (USGS site number 06887500) and De Soto (USGS site number 06892350) since July 2012. Continuous and discrete water-quality data collected during July 2012 through June 2015 were used to develop statistical models for constituents of interest at the Wamego and De Soto sites. Logistic models to continuously estimate the probability of occurrence above selected thresholds were developed for cyanobacteria, microcystin, and geosmin. Linear regression models to continuously estimate constituent concentrations were developed for major ions, dissolved solids, alkalinity, nutrients (nitrogen and phosphorus species), suspended sediment, indicator bacteria (Escherichia coli, fecal coliform, and enterococci), and actinomycetes bacteria. These models will be used to provide real-time estimates of the probability that cyanobacteria and associated compounds exceed thresholds and of the concentrations of other water-quality constituents in the Kansas River. The models documented in this report are useful for characterizing changes
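The linear regression half of this surrogate-model approach can be sketched with a simple least-squares fit. The conductance and dissolved-solids numbers below are fabricated for illustration and lie exactly on a line; real surrogate regressions are fit to noisy paired sensor and laboratory data.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x, the shape of the surrogate
    regressions described above."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# Hypothetical readings: specific conductance (uS/cm) vs measured dissolved
# solids (mg/L), related here by y = 10 + 0.6*x exactly.
cond = [200.0, 400.0, 600.0, 800.0]
tds = [10 + 0.6 * c for c in cond]
a, b = fit_line(cond, tds)
```

The logistic exceedance models mentioned in the abstract follow the same pattern, with a logit link in place of the identity.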
Discrete gradients in discrete classical mechanics
International Nuclear Information System (INIS)
Renna, L.
1987-01-01
A simple model of discrete classical mechanics is given where, starting from the continuous Hamilton equations, discrete equations of motion are established together with a proper discrete gradient definition. The conservation laws of the total discrete momentum, angular momentum, and energy are demonstrated
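For a quadratic Hamiltonian such as the harmonic oscillator H = (p^2 + q^2)/2, the implicit midpoint rule coincides with a discrete-gradient scheme and conserves the discrete energy exactly, illustrating the kind of discrete conservation law described above. The step size and initial condition below are arbitrary.

```python
def midpoint_step(q, p, h):
    """One implicit-midpoint step for H = (p^2 + q^2)/2, solved in closed
    form; for this quadratic H the scheme is a discrete-gradient method and
    conserves H exactly (the map is an exact rotation)."""
    a = h / 2.0
    denom = 1.0 + a * a
    q_new = ((1 - a * a) * q + 2 * a * p) / denom
    p_new = ((1 - a * a) * p - 2 * a * q) / denom
    return q_new, p_new

q, p = 1.0, 0.0
E0 = 0.5 * (q * q + p * p)
for _ in range(10000):
    q, p = midpoint_step(q, p, 0.1)
drift = abs(0.5 * (q * q + p * p) - E0)   # zero up to floating-point rounding
```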
Directory of Open Access Journals (Sweden)
Antonello Sindona
2015-03-01
Full Text Available The sudden introduction of a local impurity in a Fermi sea leads to an anomalous disturbance of its quantum state that represents a local quench, leaving the system out of equilibrium and giving rise to the Anderson orthogonality catastrophe. The statistics of the work done describe the energy fluctuations produced by the quench, providing an accurate and detailed insight into the fundamental physics of the process. We present here a numerical approach to the non-equilibrium work distribution, supported by applications to phenomena occurring at very diverse energy ranges. One of them is the valence electron shake-up induced by photo-ionization of a core state in a fullerene molecule. The other is the response of an ultra-cold gas of trapped fermions to an embedded two-level atom excited by a fast pulse. Working at low thermal energies, we detect the primary role played by many-particle states of the perturbed system with one or two excited fermions. We validate our approach through the comparison with some photoemission data on fullerene films and previous analytical calculations on harmonically trapped Fermi gases.
Firth, Jean M
1992-01-01
The analysis of signals and systems using transform methods is a very important aspect of the examination of processes and problems in an increasingly wide range of applications. Whereas the initial impetus in the development of methods appropriate for handling discrete sets of data occurred mainly in an electrical engineering context (for example in the design of digital filters), the same techniques are in use in such disciplines as cardiology, optics, speech analysis and management, as well as in other branches of science and engineering. This text is aimed at a readership whose mathematical background includes some acquaintance with complex numbers, linear differential equations, matrix algebra, and series. Specifically, a familiarity with Fourier series (in trigonometric and exponential forms) is assumed, and an exposure to the concept of a continuous integral transform is desirable. Such a background can be expected, for example, on completion of the first year of a science or engineering degree course.
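The discrete transform techniques the text refers to start from the definition of the discrete Fourier transform, which can be written down directly (an O(N^2) sketch from the definition, not an FFT).

```python
import cmath

def dft(x):
    """Discrete Fourier transform, directly from the definition:
    X[k] = sum over n of x[n] * exp(-2*pi*i*k*n/N)."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N)
                for n in range(N))
            for k in range(N)]

# A unit impulse transforms to a flat spectrum: every bin equals 1.
spectrum = dft([1.0, 0.0, 0.0, 0.0])
```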
Exact analysis of discrete data
Hirji, Karim F
2005-01-01
Researchers in fields ranging from biology and medicine to the social sciences, law, and economics regularly encounter variables that are discrete or categorical in nature. While there is no dearth of books on the analysis and interpretation of such data, these generally focus on large sample methods. When sample sizes are not large or the data are otherwise sparse, exact methods--methods not based on asymptotic theory--are more accurate and therefore preferable. This book introduces the statistical theory, analysis methods, and computation techniques for exact analysis of discrete data. After reviewing the relevant discrete distributions, the author develops the exact methods from the ground up in a conceptually integrated manner. The topics covered range from univariate discrete data analysis, a single and several 2 x 2 tables, a single and several 2 x K tables, incidence density and inverse sampling designs, unmatched and matched case-control studies, paired binary and trinomial response models, and Markov...
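The hypergeometric machinery behind such exact methods fits in a few lines. The following one-sided Fisher exact test for a single 2 x 2 table is a minimal standard-library sketch of the idea, not code from the book:

```python
# Sketch of an exact (non-asymptotic) test for a single 2x2 table, the
# simplest case of exact discrete-data analysis: a one-sided Fisher exact
# p-value computed directly from the hypergeometric distribution.
from math import comb

def fisher_one_sided(a, b, c, d):
    """P(X >= a) where X is hypergeometric with the margins of [[a, b], [c, d]]."""
    row1, col1, n = a + b, a + c, a + b + c + d
    denom = comb(n, col1)
    p = 0.0
    for x in range(a, min(row1, col1) + 1):
        p += comb(row1, x) * comb(n - row1, col1 - x) / denom
    return p

# Tea-tasting style example: observed table [[3, 1], [1, 3]]
print(round(fisher_one_sided(3, 1, 1, 3), 4))  # 0.2429
```

Because the p-value is a finite sum of exact probabilities, no large-sample approximation is involved, which is precisely the advantage for sparse tables.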
Discrete Curvatures and Discrete Minimal Surfaces
Sun, Xiang
2012-01-01
This thesis presents an overview of some approaches to compute Gaussian and mean curvature on discrete surfaces and discusses discrete minimal surfaces. The variety of applications of differential geometry in visualization and shape design leads
Statistical Analysis and validation
Hoefsloot, H.C.J.; Horvatovich, P.; Bischoff, R.
2013-01-01
In this chapter guidelines are given for the selection of a few biomarker candidates from a large number of compounds with a relative low number of samples. The main concepts concerning the statistical validation of the search for biomarkers are discussed. These complicated methods and concepts are
Quantum chaos: statistical relaxation in discrete spectrum
International Nuclear Information System (INIS)
Chirikov, B.V.
1990-01-01
The controversial phenomenon of quantum chaos is discussed using the quantized standard map, or the kicked rotator, as a simple model. The relation to the classical dynamical chaos is tracked down on the basis of the correspondence principle. Several definitions of the quantum chaos are discussed. 27 refs
Quantum chaos: Statistical relaxation in discrete spectrum
International Nuclear Information System (INIS)
Chirikov, B.V.
1991-01-01
The controversial phenomenon of quantum chaos is discussed using the quantized standard map, or the kicked rotator, as a simple model. The relation to the classical dynamical chaos is tracked down on the basis of the correspondence principle. Various mechanisms of the quantum suppression of classical chaos are considered with an application to the excitation and ionization of Rydberg atoms in a microwave field. Several definitions of the quantum chaos are discussed. (author). 27 refs
Directory of Open Access Journals (Sweden)
Sandra C. Kirkwood
2002-01-01
Pharmacogenomic biomarkers hold great promise for the future of medicine and have been touted as a means to personalize prescriptions. Genetic biomarkers for disease susceptibility, covering both Mendelian and complex disease, promise to improve understanding of the pathophysiology of disease, identify new potential therapeutic targets, and improve the molecular classification of disease. However, essential to fulfilling the promise of individualized therapeutic intervention is the identification of drug activity biomarkers that stratify individuals based on likely response to a particular therapeutic: positive response (efficacy) and negative response (development of side effects or toxicity). Prior to the widespread clinical application of a genetic biomarker, multiple scientific studies must be completed to identify the genetic variants and delineate their functional significance in the pathophysiology of a carefully defined phenotype. The applicability of the genetic biomarker in the human population must then be verified through both retrospective studies utilizing stored or clinical trial samples and clinical trials prospectively stratifying patients based on the biomarker. The risk conferred by the polymorphism and its applicability in the general population must be clearly understood. Thus, the development and widespread application of a pharmacogenomic biomarker is an involved process, and for most disease states we are just at the beginning of the journey towards individualized therapy and improved clinical outcome.
Discrete Curvatures and Discrete Minimal Surfaces
Sun, Xiang
2012-06-01
This thesis presents an overview of some approaches to compute Gaussian and mean curvature on discrete surfaces and discusses discrete minimal surfaces. The variety of applications of differential geometry in visualization and shape design leads to great interest in studying discrete surfaces. With the rich smooth surface theory in hand, one would hope that this elegant theory can still be applied to the discrete counterpart. Such a generalization, however, is not always successful. While discrete surfaces have the advantage of being finite dimensional, thus easier to treat, their geometric properties such as curvatures are not well defined in the classical sense. Furthermore, the powerful calculus tool can hardly be applied. The methods in this thesis, including angular defect formula, cotangent formula, parallel meshes, relative geometry etc. are approaches based on offset meshes or generalized offset meshes. As an important application, we discuss discrete minimal surfaces and discrete Koenigs meshes.
Rohatgi, Vijay K
2003-01-01
Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth
Energy Technology Data Exchange (ETDEWEB)
Miranda, A.B. de; Delmas, A; Sacadura, J F [Institut National des Sciences Appliquees (INSA), 69 - Villeurbanne (France)
1997-12-31
A formulation based on the use of the discrete ordinate method applied to the integral form of the radiant heat transfer equation is proposed for non-grey gases. The correlations between transmittances are neglected and no explicit wall reflection is considered. The configuration analyzed consists of a flat layer of a non-isothermal steam-nitrogen mixture. Cavity walls are grey with diffuse reflection and emission. A narrow band statistical model is used to represent the radiative properties of the gas. The distribution of the radiative source term inside the cavity is calculated along two temperature profiles at a uniform steam concentration. Results obtained using this simplified approach are in good agreement with those found in the literature for the same temperature and concentration distributions. This preliminary study seems to indicate that the algorithm based on the integration of radiant heat transfer along the luminance path is less sensitive to de-correlation effects than formulations based on the differential form of the radiant heat transfer equation. Thus, a more systematic study of the influence of neglecting correlations in the integral approach is presented in this work. (J.S.) 16 refs.
Energy Technology Data Exchange (ETDEWEB)
Miranda, A.B. de; Delmas, A.; Sacadura, J.F. [Institut National des Sciences Appliquees (INSA), 69 - Villeurbanne (France)
1996-12-31
A formulation based on the use of the discrete ordinate method applied to the integral form of the radiant heat transfer equation is proposed for non-grey gases. The correlations between transmittances are neglected and no explicit wall reflection is considered. The configuration analyzed consists of a flat layer of a non-isothermal steam-nitrogen mixture. Cavity walls are grey with diffuse reflection and emission. A narrow band statistical model is used to represent the radiative properties of the gas. The distribution of the radiative source term inside the cavity is calculated along two temperature profiles at a uniform steam concentration. Results obtained using this simplified approach are in good agreement with those found in the literature for the same temperature and concentration distributions. This preliminary study seems to indicate that the algorithm based on the integration of radiant heat transfer along the luminance path is less sensitive to de-correlation effects than formulations based on the differential form of the radiant heat transfer equation. Thus, a more systematic study of the influence of neglecting correlations in the integral approach is presented in this work. (J.S.) 16 refs.
Discrete Morse functions for graph configuration spaces
International Nuclear Information System (INIS)
Sawicki, A
2012-01-01
We present an alternative application of discrete Morse theory for two-particle graph configuration spaces. In contrast to previous constructions, which are based on discrete Morse vector fields, our approach is through Morse functions, which have a nice physical interpretation as two-body potentials constructed from one-body potentials. We also give a brief introduction to discrete Morse theory. Our motivation comes from the problem of quantum statistics for particles on networks, for which generalized versions of anyon statistics can appear. (paper)
Application of an efficient Bayesian discretization method to biomedical data
Directory of Open Access Journals (Sweden)
Gopalakrishnan Vanathi
2011-07-01
Background Several data mining methods require data that are discrete, and other methods often perform better with discrete data. We introduce an efficient Bayesian discretization (EBD) method for optimal discretization of variables that runs efficiently on high-dimensional biomedical datasets. The EBD method consists of two components, namely, a Bayesian score to evaluate discretizations and a dynamic programming search procedure to efficiently search the space of possible discretizations. We compared the performance of EBD to Fayyad and Irani's (FI) discretization method, which is commonly used for discretization. Results On 24 biomedical datasets obtained from high-throughput transcriptomic and proteomic studies, the classification performances of the C4.5 classifier and the naïve Bayes classifier were statistically significantly better when the predictor variables were discretized using EBD over FI. EBD was statistically significantly more stable to the variability of the datasets than FI. However, EBD was less robust, though not statistically significantly so, than FI and produced slightly more complex discretizations than FI. Conclusions On a range of biomedical datasets, a Bayesian discretization method (EBD) yielded better classification performance and stability but was less robust than the widely used FI discretization method. The EBD discretization method is easy to implement, permits the incorporation of prior knowledge and belief, and is sufficiently fast for application to high-dimensional data.
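The two components the abstract names — a score for candidate intervals and a dynamic-programming search over all partitions of a sorted variable — can be sketched as follows. The penalized multinomial score below is a simplified stand-in, not the actual EBD Bayesian score:

```python
# Minimal sketch of score-plus-DP discretization: interval_score is a
# simplified penalized log-likelihood stand-in for the Bayesian score;
# best_discretization exhaustively searches all partitions via DP.
from math import log

def interval_score(labels, penalty=2.0):
    """Log-likelihood of class labels in one interval, minus a fixed penalty."""
    n = len(labels)
    score = 0.0
    for c in set(labels):
        k = labels.count(c)
        score += k * log(k / n)
    return score - penalty

def best_discretization(labels):
    """DP over cut positions: best[i] = best total score for the first i items."""
    n = len(labels)
    best = [0.0] + [float("-inf")] * n
    cut_before = [0] * (n + 1)
    for i in range(1, n + 1):
        for j in range(i):
            s = best[j] + interval_score(labels[j:i])
            if s > best[i]:
                best[i], cut_before[i] = s, j
    cuts, i = [], n
    while i > 0:
        cuts.append(cut_before[i])
        i = cut_before[i]
    return sorted(cuts[:-1])  # drop the leading 0

# labels ordered by the sorted predictor; a clean class change at index 4
print(best_discretization(list("AAAABBBB")))  # [4]
```

The penalty plays the role of the prior's preference for fewer intervals: without it, the DP would split every item into its own bin.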
Mimetic discretization methods
Castillo, Jose E
2013-01-01
To help solve physical and engineering problems, mimetic or compatible algebraic discretization methods employ discrete constructs to mimic the continuous identities and theorems found in vector calculus. Mimetic Discretization Methods focuses on the recent mimetic discretization method co-developed by the first author. Based on the Castillo-Grone operators, this simple mimetic discretization method is invariably valid for spatial dimensions no greater than three. The book also presents a numerical method for obtaining corresponding discrete operators that mimic the continuum differential and
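The "mimic the continuum identities" idea can be illustrated with toy 1D staggered-grid operators. These simple second-order operators are illustrative only, not the Castillo-Grone operators the book develops:

```python
# Toy mimetic discretization: discrete gradient and divergence on a 1D
# staggered grid reproduce continuum identities exactly, e.g. the discrete
# Laplacian (divergence of gradient) of a linear field vanishes.

def gradient(u, h):
    """Gradient at interior faces from cell-centred values."""
    return [(u[i + 1] - u[i]) / h for i in range(len(u) - 1)]

def divergence(v, h):
    """Divergence at interior cells from face values."""
    return [(v[j + 1] - v[j]) / h for j in range(len(v) - 1)]

h = 0.1
centres = [h * (i + 0.5) for i in range(10)]
u_lin = [2.0 * x + 1.0 for x in centres]   # linear scalar field

g = gradient(u_lin, h)                     # constant slope 2, exactly
lap = divergence(g, h)                     # Laplacian of a linear field: 0
print(all(abs(gi - 2.0) < 1e-12 for gi in g), all(abs(l) < 1e-9 for l in lap))
```

The point of the mimetic programme is that such identities (and their vector-calculus analogues like div curl = 0) hold exactly in the discrete setting, not just up to truncation error.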
Time Discretization Techniques
Gottlieb, S.; Ketcheson, David I.
2016-01-01
The time discretization of hyperbolic partial differential equations is typically the evolution of a system of ordinary differential equations obtained by spatial discretization of the original problem. Methods for this time evolution include
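The setting described here — spatial discretization first, then time evolution of the resulting ODE system — is the method of lines. A minimal sketch with forward Euler as the time-stepper (the strong-stability-preserving Runge-Kutta methods studied in this line of work follow the same pattern):

```python
# Method of lines for the advection equation u_t = -a u_x on a periodic
# grid: upwind spatial discretization gives an ODE system du/dt = rhs(u),
# which is then evolved by an explicit time-stepper (forward Euler here).
import math

def rhs(u, a, h):
    """Upwind spatial discretization of -a * u_x (a > 0), periodic boundary."""
    n = len(u)
    return [-a * (u[i] - u[i - 1]) / h for i in range(n)]

def step(u, dt, a, h):
    """One forward Euler step of the semi-discrete system."""
    k = rhs(u, a, h)
    return [u[i] + dt * k[i] for i in range(len(u))]

n, a = 64, 1.0
h = 1.0 / n
u = [math.sin(2 * math.pi * i * h) for i in range(n)]
for _ in range(100):
    u = step(u, dt=0.5 * h, a=a, h=h)   # CFL number 0.5
print(max(abs(x) for x in u) <= 1.0 + 1e-12)  # monotone scheme stays bounded
```

Swapping the Euler step for a higher-order Runge-Kutta step changes only `step`, which is exactly why time discretization can be studied as a topic of its own.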
Handbook of Spatial Statistics
Gelfand, Alan E
2010-01-01
Offers an introduction detailing the evolution of the field of spatial statistics. This title focuses on the three main branches of spatial statistics: continuous spatial variation (point referenced data); discrete spatial variation, including lattice and areal unit data; and, spatial point patterns.
Testing Preference Axioms in Discrete Choice experiments
DEFF Research Database (Denmark)
Hougaard, Jens Leth; Østerdal, Lars Peter; Tjur, Tue
Recent studies have tested the preference axioms of completeness and transitivity, and have detected other preference phenomena such as unstability, learning- and tiredness effects, ordering effects and dominance, in stated preference discrete choice experiments. However, it has not been explicitly...... of the preference axioms and other preference phenomena in the context of stated preference discrete choice experiments, and examine whether or how these can be subject to meaningful (statistical) tests...
Modeling discrete time-to-event data
Tutz, Gerhard
2016-01-01
This book focuses on statistical methods for the analysis of discrete failure times. Failure time analysis is one of the most important fields in statistical research, with applications affecting a wide range of disciplines, in particular, demography, econometrics, epidemiology and clinical research. Although there are a large variety of statistical methods for failure time analysis, many techniques are designed for failure times that are measured on a continuous scale. In empirical studies, however, failure times are often discrete, either because they have been measured in intervals (e.g., quarterly or yearly) or because they have been rounded or grouped. The book covers well-established methods like life-table analysis and discrete hazard regression models, but also introduces state-of-the art techniques for model evaluation, nonparametric estimation and variable selection. Throughout, the methods are illustrated by real life applications, and relationships to survival analysis in continuous time are expla...
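The life-table analysis the book starts from reduces to a very small computation on grouped data: the discrete hazard in interval t is (events in t) / (subjects at risk entering t), and survival is the running product of (1 - hazard). A standard-library sketch with illustrative numbers:

```python
# Life-table estimator for discrete (grouped) failure times: per-interval
# hazards and the corresponding survival function.
def life_table(events, censored):
    """events[t], censored[t] per interval; returns (hazards, survival)."""
    at_risk = sum(events) + sum(censored)
    hazards, survival, s = [], [], 1.0
    for d, c in zip(events, censored):
        h = d / at_risk if at_risk > 0 else 0.0
        s *= 1.0 - h
        hazards.append(h)
        survival.append(s)
        at_risk -= d + c
    return hazards, survival

# quarterly data: 10 subjects, events/censorings per quarter
hz, sv = life_table(events=[2, 3, 1], censored=[0, 1, 3])
print(hz, sv)  # hazards [0.2, 0.375, 0.25]; survival [0.8, 0.5, 0.375]
```

The discrete hazard regression models the book then develops replace the raw ratio d/n with a regression function of covariates, but the person-period bookkeeping is the same.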
Baecklund transformations for discrete Painleve equations: Discrete PII-PV
International Nuclear Information System (INIS)
Sakka, A.; Mugan, U.
2006-01-01
Transformation properties of discrete Painleve equations are investigated by using an algorithmic method. This method yields explicit transformations which relate the solutions of discrete Painleve equations, discrete PII-PV, with different values of parameters. The particular solutions which are expressible in terms of the discrete analogue of the classical special functions of discrete Painleve equations can also be obtained from these transformations.
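For orientation, one commonly quoted form of the discrete PII equation whose solutions such Baecklund transformations relate (parameter naming varies across the literature):

```latex
x_{n+1} + x_{n-1} = \frac{(\alpha n + \beta)\, x_n + \gamma}{1 - x_n^{2}}
```

In the continuum limit this reduces to the second Painleve equation, and the Baecklund transformations map solutions at parameter values $(\alpha,\beta,\gamma)$ to solutions at shifted parameter values.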
Discrete Gabor transform and discrete Zak transform
Bastiaans, M.J.; Namazi, N.M.; Matthews, K.
1996-01-01
Gabor's expansion of a discrete-time signal into a set of shifted and modulated versions of an elementary signal or synthesis window is introduced, along with the inverse operation, i.e. the Gabor transform, which uses an analysis window that is related to the synthesis window and with the help of
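A direct (unoptimized) sketch of the discrete Gabor analysis described here — coefficients of a finite signal against time-frequency shifts of a window — with illustrative window and parameters:

```python
# Discrete Gabor coefficients of a length-L signal x against a periodized,
# real-valued analysis window g, with M frequency bins and time step a.
import cmath

def gabor_coefficients(x, g, a, M):
    """c[m][n] = sum_k x[k] * g[(k - n*a) mod L] * exp(-2*pi*i*m*k/M)."""
    L = len(x)
    coeffs = []
    for m in range(M):
        row = []
        for n in range(L // a):
            s = 0j
            for k in range(L):
                w = g[(k - n * a) % L]     # periodized window shift
                s += x[k] * w * cmath.exp(-2j * cmath.pi * m * k / M)
            row.append(s)
        coeffs.append(row)
    return coeffs

L, a, M = 8, 2, 4
g = [1.0 if k < a else 0.0 for k in range(L)]   # rectangular window
x = [1.0] * L                                   # constant test signal
c = gabor_coefficients(x, g, a, M)
print(abs(c[0][0]))   # DC coefficient over the first window: 2.0
```

In practice the triple loop is replaced by FFTs over the modulation index, but the coefficient definition is exactly the sum above.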
Discrete Mathematics Re "Tooled."
Grassl, Richard M.; Mingus, Tabitha T. Y.
1999-01-01
Indicates the importance of teaching discrete mathematics. Describes how the use of technology can enhance the teaching and learning of discrete mathematics. Explorations using Excel, Derive, and the TI-92 showed how preservice and inservice teachers experienced a new dimension in problem solving and discovery. (ASK)
Discrete modeling considerations in multiphase fluid dynamics
International Nuclear Information System (INIS)
Ransom, V.H.; Ramshaw, J.D.
1988-01-01
The modeling of multiphase flows plays a fundamental role in light water reactor safety. The main ingredients in our discrete modeling Weltanschauung are the following considerations: (1) Any physical model must be cast into discrete form for a digital computer. (2) The usual approach of formulating models in differential form and then discretizing them is potentially hazardous. It may be preferable to formulate the model in discrete terms from the outset. (3) Computer time and storage constraints limit the resolution that can be employed in practical calculations. These limits effectively define the physical phenomena, length scales, and time scales which cannot be directly represented in the calculation and therefore must be modeled. This information should be injected into the model formulation process at an early stage. (4) Practical resolution limits are generally so coarse that traditional convergence and truncation-error analyses become irrelevant. (5) A discrete model constitutes a reduced description of a physical system, from which fine-scale details are eliminated. This elimination creates a statistical closure problem. Methods from statistical physics may therefore be useful in the formulation of discrete models. In the present paper we elaborate on these themes and illustrate them with simple examples. 48 refs
Homogenization of discrete media
International Nuclear Information System (INIS)
Pradel, F.; Sab, K.
1998-01-01
Materials such as granular media and beam assemblies are easily seen as discrete media. They look like geometrical points linked together through energetic expressions. Our purpose is to extend discrete kinematics to that of an equivalent continuous material. First we explain how we build the localisation tool for periodic materials according to the estimated continuum medium type (classical Cauchy, and Cosserat media). Once the bridge is built between discrete and continuum media, we exhibit its application to two bidimensional beam assembly structures: the honeycomb and a structurally reinforced variation. The new behavior is then applied to the simple plane shear problem in a Cosserat continuum and compared with the real discrete solution. By means of this example, we establish the agreement of our new model with real structures. The method presented extends beyond mechanics and can be applied to any discrete problem, such as electromagnetism, in which the relationships between geometrical points can be summed up by an energetic function. (orig.)
International Nuclear Information System (INIS)
Aydin, Alhun; Sisman, Altug
2016-01-01
By considering the quantum-mechanically minimum allowable energy interval, we exactly count number of states (NOS) and introduce discrete density of states (DOS) concept for a particle in a box for various dimensions. Expressions for bounded and unbounded continua are analytically recovered from discrete ones. Even though substantial fluctuations prevail in discrete DOS, they're almost completely flattened out after summation or integration operation. It's seen that relative errors of analytical expressions of bounded/unbounded continua rapidly decrease for high NOS values (weak confinement or high energy conditions), while the proposed analytical expressions based on Weyl's conjecture always preserve their lower error characteristic. - Highlights: • Discrete density of states considering minimum energy difference is proposed. • Analytical DOS and NOS formulas based on Weyl conjecture are given. • Discrete DOS and NOS functions are examined for various dimensions. • Relative errors of analytical formulas are much better than the conventional ones.
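The counting exercise behind this abstract, in its simplest setting: for a particle in a 1D box with E_n proportional to n^2 (dimensionless units here), the exact number of states below E can be compared with the unbounded-continuum (Weyl-type) estimate sqrt(E), and the relative error shrinks as E grows, as the abstract reports:

```python
# Exact number of states (NOS) for a 1D particle in a box versus the
# continuum estimate: states n >= 1 with n^2 <= E, compared with sqrt(E).
from math import isqrt, sqrt

def nos_exact(E):
    """Number of states n >= 1 with n^2 <= E (dimensionless units)."""
    return isqrt(int(E))

for E in (10, 1000, 100000):
    exact = nos_exact(E)
    weyl = sqrt(E)
    print(E, exact, round(abs(weyl - exact) / exact, 4))
```

The same comparison in two or three dimensions, with the boundary corrections of Weyl's conjecture included, is what the analytical NOS/DOS formulas in the paper quantify.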
Energy Technology Data Exchange (ETDEWEB)
Aydin, Alhun; Sisman, Altug, E-mail: sismanal@itu.edu.tr
2016-03-22
By considering the quantum-mechanically minimum allowable energy interval, we exactly count number of states (NOS) and introduce discrete density of states (DOS) concept for a particle in a box for various dimensions. Expressions for bounded and unbounded continua are analytically recovered from discrete ones. Even though substantial fluctuations prevail in discrete DOS, they're almost completely flattened out after summation or integration operation. It's seen that relative errors of analytical expressions of bounded/unbounded continua rapidly decrease for high NOS values (weak confinement or high energy conditions), while the proposed analytical expressions based on Weyl's conjecture always preserve their lower error characteristic. - Highlights: • Discrete density of states considering minimum energy difference is proposed. • Analytical DOS and NOS formulas based on Weyl conjecture are given. • Discrete DOS and NOS functions are examined for various dimensions. • Relative errors of analytical formulas are much better than the conventional ones.
Okuyama, Yoshifumi
2014-01-01
Discrete Control Systems establishes a basis for the analysis and design of discretized/quantized control systems for continuous physical systems. Beginning with the necessary mathematical foundations and system-model descriptions, the text moves on to derive a robust stability condition. To keep a practical perspective on the uncertain physical systems considered, most of the methods treated are carried out in the frequency domain. As part of the design procedure, modified Nyquist–Hall and Nichols diagrams are presented and discretized proportional–integral–derivative control schemes are reconsidered. Schemes for model-reference feedback and discrete-type observers are proposed. Although single-loop feedback systems form the core of the text, some consideration is given to multiple loops and nonlinearities. The robust control performance and stability of interval systems (with multiple uncertainties) are outlined. Finally, the monograph describes the relationship between feedback-control and discrete ev...
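A sketch of the kind of discretized PID scheme such texts reconsider: the continuous controller u = Kp e + Ki ∫e dt + Kd de/dt sampled at period T, with a backward-difference derivative. The gains and the first-order plant below are illustrative choices, not taken from the book:

```python
# Discretized PID controller (rectangular integration, backward-difference
# derivative) driving a forward-Euler-discretized first-order plant x' = -x + u.
class DiscretePID:
    def __init__(self, kp, ki, kd, T):
        self.kp, self.ki, self.kd, self.T = kp, ki, kd, T
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error):
        self.integral += error * self.T
        derivative = (error - self.prev_error) / self.T
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

pid, x, T = DiscretePID(kp=2.0, ki=1.0, kd=0.1, T=0.01), 0.0, 0.01
for _ in range(2000):
    u = pid.update(1.0 - x)          # setpoint 1.0
    x += T * (-x + u)
print(abs(1.0 - x) < 0.01)           # integral action removes the offset
```

The quantization and robustness questions the book addresses arise precisely because the sampled controller only approximates its continuous counterpart.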
Discrete repulsive oscillator wavefunctions
International Nuclear Information System (INIS)
Munoz, Carlos A; Rueda-Paz, Juvenal; Wolf, Kurt Bernardo
2009-01-01
For the study of infinite discrete systems on phase space, the three-dimensional Lorentz algebra and group, so(2,1) and SO(2,1), provide a discrete model of the repulsive oscillator. Its eigenfunctions are found in the principal irreducible representation series, where the compact generator, which we identify with the position operator, has the infinite discrete spectrum of the integers Z, while the spectrum of energies is a double continuum. The right- and left-moving wavefunctions are given by hypergeometric functions that form a Dirac basis for ℓ²(Z). Under contraction, the discrete system limits to the well-known quantum repulsive oscillator. Numerical computations of finite approximations raise further questions on the use of Dirac bases for infinite discrete systems.
Energy Technology Data Exchange (ETDEWEB)
Morris, J; Johnson, S
2007-12-03
The Distinct Element Method (also frequently referred to as the Discrete Element Method) (DEM) is a Lagrangian numerical technique where the computational domain consists of discrete solid elements which interact via compliant contacts. This can be contrasted with Finite Element Methods where the computational domain is assumed to represent a continuum (although many modern implementations of the FEM can accommodate some Distinct Element capabilities). Often the terms Discrete Element Method and Distinct Element Method are used interchangeably in the literature, although Cundall and Hart (1992) suggested that Discrete Element Methods should be a more inclusive term covering Distinct Element Methods, Displacement Discontinuity Analysis and Modal Methods. In this work, DEM specifically refers to the Distinct Element Method, where the discrete elements interact via compliant contacts, in contrast with Displacement Discontinuity Analysis where the contacts are rigid and all compliance is taken up by the adjacent intact material.
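A minimal instance of the compliant-contact idea that, per the passage above, distinguishes the Distinct Element Method from rigid-contact formulations: two discs approach, overlap, and rebound through a linear spring-dashpot normal force. All parameters are illustrative:

```python
# Two-disc DEM sketch: when the discs overlap, a compliant (spring-dashpot)
# normal contact force acts equally and oppositely; explicit time integration.
def dem_two_discs(v0=1.0, k=1e4, c=5.0, m=1.0, r=1.0, dt=1e-4, steps=40000):
    x1, x2 = 0.0, 3.0            # centre positions
    v1, v2 = v0, -v0             # approaching each other
    for _ in range(steps):
        overlap = 2 * r - (x2 - x1)
        if overlap > 0:          # compliant contact: force from overlap + rate
            f = k * overlap + c * (v1 - v2)
            v1 -= dt * f / m
            v2 += dt * f / m
        x1 += dt * v1
        x2 += dt * v2
    return v1, v2

v1, v2 = dem_two_discs()
print(v1 < 0 < v2)               # discs have rebounded, with damping loss
```

The time step must resolve the contact duration, roughly pi * sqrt(m_eff / k), which is why DEM runs are dominated by small explicit steps.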
Introduction to Bayesian statistics
Bolstad, William M
2017-01-01
There is a strong upsurge in the use of Bayesian methods in applied statistical analysis, yet most introductory statistics texts only present frequentist methods. Bayesian statistics has many important advantages that students should learn about if they are going into fields where statistics will be used. In this Third Edition, four newly-added chapters address topics that reflect the rapid advances in the field of Bayesian statistics. The author continues to provide a Bayesian treatment of introductory statistical topics, such as scientific data gathering, discrete random variables, robust Bayesian methods, and Bayesian approaches to inference for discrete random variables, binomial proportion, Poisson, normal mean, and simple linear regression. In addition, newly-developing topics in the field are presented in four new chapters: Bayesian inference with unknown mean and variance; Bayesian inference for a Multivariate Normal mean vector; Bayesian inference for the Multiple Linear Regression Model; and Computati...
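One of the introductory topics listed, Bayesian inference for a binomial proportion, fits in a few lines thanks to conjugacy: with a Beta(a, b) prior and s successes in n trials, the posterior is Beta(a + s, b + n - s). The numbers below are illustrative:

```python
# Conjugate Beta-Binomial update: prior Beta(a, b), data (successes, n),
# posterior Beta(a + successes, b + n - successes).
def beta_binomial_update(a, b, successes, n):
    return a + successes, b + (n - successes)

a, b = beta_binomial_update(1, 1, successes=7, n=10)   # uniform prior
posterior_mean = a / (a + b)
print(a, b, round(posterior_mean, 3))   # 8 4 0.667
```

The posterior mean 8/12 sits between the prior mean 1/2 and the sample proportion 7/10, the usual Bayesian shrinkage picture.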
Understanding advanced statistical methods
Westfall, Peter
2013-01-01
Contents: Introduction: Probability, Statistics, and Science; Reality, Nature, Science, and Models; Statistical Processes: Nature, Design and Measurement, and Data; Models; Deterministic Models; Variability; Parameters; Purely Probabilistic Statistical Models; Statistical Models with Both Deterministic and Probabilistic Components; Statistical Inference; Good and Bad Models; Uses of Probability Models. Random Variables and Their Probability Distributions: Introduction; Types of Random Variables: Nominal, Ordinal, and Continuous; Discrete Probability Distribution Functions; Continuous Probability Distribution Functions; Some Calculus: Derivatives and Least Squares; More Calculus: Integrals and Cumulative Distribution Functions. Probability Calculation and Simulation: Introduction; Analytic Calculations, Discrete and Continuous Cases; Simulation-Based Approximation; Generating Random Numbers. Identifying Distributions: Introduction; Identifying Distributions from Theory Alone; Using Data: Estimating Distributions via the Histogram; Quantiles: Theoretical and Data-Based Estimate...
Finite Discrete Gabor Analysis
DEFF Research Database (Denmark)
Søndergaard, Peter Lempel
2007-01-01
frequency bands at certain times. Gabor theory can be formulated for both functions on the real line and for discrete signals of finite length. The two theories are largely the same because many aspects come from the same underlying theory of locally compact Abelian groups. The two types of Gabor systems can also be related by sampling and periodization. This thesis extends this theory by showing new results for window construction. It also provides a discussion of the problems associated with discrete Gabor bases. The sampling and periodization connection is handy because it allows Gabor systems on the real line to be well approximated by finite and discrete Gabor frames. This method of approximation is especially attractive because efficient numerical methods exist for doing computations with finite, discrete Gabor systems. This thesis presents new algorithms for the efficient computation of finite...
Adaptive Discrete Hypergraph Matching.
Yan, Junchi; Li, Changsheng; Li, Yin; Cao, Guitao
2018-02-01
This paper addresses the problem of hypergraph matching using higher-order affinity information. We propose a solver that iteratively updates the solution in the discrete domain by linear assignment approximation. The proposed method is guaranteed to converge to a stationary discrete solution and avoids the annealing procedure and ad-hoc post-binarization step that are required in several previous methods. Specifically, we start with a simple iterative discrete gradient assignment solver. This solver can be trapped in a cyclic sequence under moderate conditions, with period related to the order of the graph matching problem. We then devise an adaptive relaxation mechanism to jump out of this degenerating case and show that the resulting new path will converge to a fixed solution in the discrete domain. The proposed method is tested on both synthetic and real-world benchmarks. The experimental results corroborate the efficacy of our method.
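A stripped-down sketch of the "iterative discrete gradient assignment" idea, shown here for pairwise (not higher-order) affinities with toy data; the greedy binarization stands in for a proper linear assignment solver and none of this is the paper's actual algorithm:

```python
# Iterative discrete assignment for graph matching: take the gradient M x of
# the affinity objective x^T M x, then project it to a one-to-one assignment.
def greedy_assignment(scores, n):
    """Binarize an n*n score vector to a permutation, largest scores first."""
    order = sorted(range(n * n), key=lambda k: -scores[k])
    used_r, used_c, x = set(), set(), [0.0] * (n * n)
    for k in order:
        r, c = divmod(k, n)
        if r not in used_r and c not in used_c:
            used_r.add(r)
            used_c.add(c)
            x[k] = 1.0
    return x

def match(M, n, iters=10):
    x = [1.0 / n] * (n * n)           # uniform (relaxed) starting point
    for _ in range(iters):
        grad = [sum(M[i][j] * x[j] for j in range(n * n)) for i in range(n * n)]
        x = greedy_assignment(grad, n)
    return x

# 2x2 toy affinity: assignments (0->0) and (1->1) are mutually compatible
n = 2
M = [[0.0] * 4 for _ in range(4)]
M[0][3] = M[3][0] = 1.0
M[1][2] = M[2][1] = 0.2
print(match(M, n))   # identity assignment: [1.0, 0.0, 0.0, 1.0]
```

The cycling the abstract warns about appears when the greedy projection alternates between assignments of equal score, which is what the paper's adaptive relaxation is designed to escape.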
Goodrich, Christopher
2015-01-01
This text provides the first comprehensive treatment of the discrete fractional calculus. Experienced researchers will find the text useful as a reference for discrete fractional calculus and topics of current interest. Students who are interested in learning about discrete fractional calculus will find this text to provide a useful starting point. Several exercises are offered at the end of each chapter and select answers have been provided at the end of the book. The presentation of the content is designed to give ample flexibility for potential use in a myriad of courses and for independent study. The novel approach taken by the authors includes a simultaneous treatment of the fractional- and integer-order difference calculus (on a variety of time scales, including both the usual forward and backwards difference operators). The reader will acquire a solid foundation in the classical topics of the discrete calculus while being introduced to exciting recent developments, bringing them to the frontiers of the...
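The simultaneous integer/fractional treatment the text advertises can be glimpsed in a Grunwald-Letnikov-type backward difference of order alpha, which reduces to the ordinary first difference at alpha = 1. This is a generic sketch of the construction, not code from the book:

```python
# Fractional-order backward difference via binomial-coefficient weights
# w_k = (-1)^k * C(alpha, k), computed by the recurrence
# w_k = w_{k-1} * (k - 1 - alpha) / k.
def frac_difference(f, alpha):
    """(Delta^alpha f)(n) for each n in range(len(f))."""
    n = len(f)
    w = [1.0]
    for k in range(1, n):
        w.append(w[-1] * (k - 1 - alpha) / k)
    return [sum(w[k] * f[i - k] for k in range(i + 1)) for i in range(n)]

f = [x * x for x in range(6)]              # f(n) = n^2
print(frac_difference(f, 1.0))             # ordinary first difference of n^2
```

For integer alpha the weights terminate (they vanish beyond k = alpha), recovering the classical difference operators; for non-integer alpha every past value contributes, which is the hallmark of fractional operators.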
International Nuclear Information System (INIS)
Williams, Ruth M
2006-01-01
A review is given of a number of approaches to discrete quantum gravity, with a restriction to those likely to be relevant in four dimensions. This paper is dedicated to Rafael Sorkin on the occasion of his sixtieth birthday
Understanding Statistics - Cancer Statistics
Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.
Novel ageing-biomarker discovery using data-intensive technologies
Griffiths, H.R.; Augustyniak, E.M.; Bennett, S.J.; Debacq-Chainiaux, F.; Dunston, C.R.; Kristensen, P.; Melchjorsen, C.J.; Navarrete, Santos A.; Simm, A.; Toussaint, O.
2015-01-01
Ageing is accompanied by many visible characteristics. Other biological and physiological markers are also well-described e.g. loss of circulating sex hormones and increased inflammatory cytokines. Biomarkers for healthy ageing studies are presently predicated on existing knowledge of ageing traits. The increasing availability of data-intensive methods enables deep-analysis of biological samples for novel biomarkers. We have adopted two discrete approaches in MARK-AGE Work Package 7 for bioma...
Discrete computational structures
Korfhage, Robert R
1974-01-01
Discrete Computational Structures describes discrete mathematical concepts that are important to computing, covering necessary mathematical fundamentals, computer representation of sets, graph theory, storage minimization, and bandwidth. The book also explains the conceptual framework (Gorn trees, searching, subroutines) and directed graphs (flowcharts, critical paths, information networks). The text discusses algebra, particularly as it applies to semigroups, groups, lattices, and the propositional calculus, including a new tabular method of Boolean function minimization. The text emphasizes...
Phase II cancer clinical trials for biomarker-guided treatments.
Jung, Sin-Ho
2018-01-01
The design and analysis of cancer clinical trials with biomarkers depend on various factors, such as the phase of the trial, the type of biomarker, whether the biomarker used is validated or not, and the study objectives. In this article, we demonstrate the design and analysis of two Phase II cancer clinical trials, one with a predictive biomarker and the other with an imaging prognostic biomarker. Statistical testing methods and their sample size calculation methods are presented for each trial. We assume that the primary endpoint of these trials is a time-to-event variable, but this concept can be used for any type of endpoint.
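For a time-to-event endpoint, a standard sample-size building block is Schoenfeld's formula for the number of events required by a two-sided logrank-type test. The sketch below uses that generic formula as a stand-in; the article presents its own designs and tests:

```python
# Schoenfeld's required number of events for detecting a hazard ratio with a
# two-sided logrank-type test at level alpha and the given power.
from math import ceil, log
from statistics import NormalDist

def required_events(hazard_ratio, alpha=0.05, power=0.8, alloc=0.5):
    z = NormalDist().inv_cdf
    za, zb = z(1 - alpha / 2), z(power)
    return ceil((za + zb) ** 2 / (alloc * (1 - alloc) * log(hazard_ratio) ** 2))

print(required_events(hazard_ratio=0.667))   # roughly 190-200 events
```

The total accrual is then backed out from the expected event rate over the planned follow-up, which is where the trial-specific assumptions enter.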
Homogenization of discrete media
Energy Technology Data Exchange (ETDEWEB)
Pradel, F.; Sab, K. [CERAM-ENPC, Marne-la-Vallee (France)
1998-11-01
Materials such as granular media and beam assemblies are naturally viewed as discrete media: geometrical points linked together through energetic expressions. Our purpose is to extend discrete kinematics to that of an equivalent continuous material. First we explain how to build the localisation tool for periodic materials according to the assumed continuum medium type (classical Cauchy, and Cosserat media). Once the bridge between discrete and continuum media is built, we apply it to two bidimensional beam assembly structures: the honeycomb and a structurally reinforced variant. The new behavior is then applied to the simple plane shear problem in a Cosserat continuum and compared with the exact discrete solution. By means of this example, we establish the agreement of our new model with real structures. The method presented has a longer range than mechanics and can be applied to any discrete problem, such as electromagnetism, in which the relationship between geometrical points can be summed up by an energetic function. (orig.) 7 refs.
DISCRETE MATHEMATICS/NUMBER THEORY
Mrs. Manju Devi*
2017-01-01
Discrete mathematics is the study of mathematical structures that are fundamentally discrete rather than continuous. In contrast to real numbers that have the property of varying "smoothly", the objects studied in discrete mathematics such as integers, graphs, and statements do not vary smoothly in this way, but have distinct, separated values. Discrete mathematics therefore excludes topics in "continuous mathematics" such as calculus and analysis. Discrete objects can often be enumerated by ...
Directory of Open Access Journals (Sweden)
Prateek Sharma
2015-04-01
Abstract Simulation can be regarded as the emulation of the behavior of a real-world system over an interval of time. The process of simulation relies upon generating the history of a system and then analyzing that history to predict the outcome and improve the working of real systems. Simulations come in various kinds, but the topic of interest here is one of the most important: discrete-event simulation, which models the system as a discrete sequence of events in time. This paper introduces discrete-event simulation and analyzes how it benefits real-world systems.
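The core mechanism of a discrete-event simulator, an event queue whose clock jumps from one event time to the next, can be sketched in a few lines. The names and events below are illustrative, not from the paper:

```python
import heapq

def run_simulation(events):
    """Minimal discrete-event simulation core: events are (time, name) pairs
    processed in chronological order from a priority queue."""
    queue = list(events)
    heapq.heapify(queue)          # min-heap ordered by event time
    history = []
    clock = 0.0
    while queue:
        time, name = heapq.heappop(queue)
        clock = time              # the clock jumps directly to the next event
        history.append((clock, name))
    return history

# Example: two arrivals and one departure, inserted out of order.
log = run_simulation([(2.0, "arrival"), (0.5, "arrival"), (1.0, "departure")])
print(log)  # [(0.5, 'arrival'), (1.0, 'departure'), (2.0, 'arrival')]
```

In a full simulator, processing an event would also schedule new future events onto the queue; the skeleton above shows only the time-advance loop.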
Discrete systems and integrability
Hietarinta, J; Nijhoff, F W
2016-01-01
This first introductory text to discrete integrable systems introduces key notions of integrability from the vantage point of discrete systems, also making connections with the continuous theory where relevant. While treating the material at an elementary level, the book also highlights many recent developments. Topics include: Darboux and Bäcklund transformations; difference equations and special functions; multidimensional consistency of integrable lattice equations; associated linear problems (Lax pairs); connections with Padé approximants and convergence algorithms; singularities and geometry; Hirota's bilinear formalism for lattices; intriguing properties of discrete Painlevé equations; and the novel theory of Lagrangian multiforms. The book builds the material in an organic way, emphasizing interconnections between the various approaches, while the exposition is mostly done through explicit computations on key examples. Written by respected experts in the field, the numerous exercises and the thoroug...
Introductory discrete mathematics
Balakrishnan, V K
2010-01-01
This concise text offers an introduction to discrete mathematics for undergraduate students in computer science and mathematics. Mathematics educators consider it vital that their students be exposed to a course in discrete methods that introduces them to combinatorial mathematics and to algebraic and logical structures focusing on the interplay between computer science and mathematics. The present volume emphasizes combinatorics, graph theory with applications to some standard network optimization problems, and algorithms to solve these problems. Chapters 0-3 cover fundamental operations involv...
DEFF Research Database (Denmark)
Thurfjell, Lennart; Lötjönen, Jyrki; Lundqvist, Roger
2012-01-01
The New National Institute on Aging-Alzheimer's Association diagnostic guidelines for Alzheimer's disease (AD) incorporate biomarkers in the diagnostic criteria and suggest division of biomarkers into two categories: Aβ accumulation and neuronal degeneration or injury.
Indian Academy of Sciences (India)
We also describe discrete-time systems in terms of difference equations. A more modern alternative, especially for larger systems, is the state-variable description. State-variable equations are also called state-space equations because the ...
Discrete Lorentzian quantum gravity
Loll, R.
2000-01-01
Just as for non-abelian gauge theories at strong coupling, discrete lattice methods are a natural tool in the study of non-perturbative quantum gravity. They have to reflect the fact that the geometric degrees of freedom are dynamical, and that therefore also the lattice theory must be formulated
Sharp, Karen Tobey
This paper cites information received from a number of sources, e.g., mathematics teachers in two-year colleges, publishers, and convention speakers, about the nature of discrete mathematics and about what topics a course in this subject should contain. Note is taken of the book edited by Ralston and Young which discusses the future of college…
Maulik, Ujjwal; Mallik, Saurav; Mukhopadhyay, Anirban; Bandyopadhyay, Sanghamitra
2015-01-01
Microarray and beadchip are two of the most efficient techniques for measuring gene expression and methylation data in bioinformatics. Biclustering deals with the simultaneous clustering of genes and samples. In this article, we propose a computational rule mining framework, StatBicRM (i.e., statistical biclustering-based rule mining), to identify special types of rules and potential biomarkers using integrated statistical and binary inclusion-maximal biclustering techniques on biological datasets. First, a novel statistical strategy is used to eliminate insignificant, low-significance, or redundant genes in such a way that the significance level satisfies the data distribution property (viz., either normal or non-normal distribution). The data are then discretized and post-discretized, consecutively. Thereafter, the biclustering technique is applied to identify maximal frequent closed homogeneous itemsets, and the corresponding special types of rules are extracted from the selected itemsets. Our proposed rule mining method performs better than other rule mining algorithms because it generates maximal frequent closed homogeneous itemsets instead of frequent itemsets; it thus saves elapsed time and can work on big datasets. Pathway and Gene Ontology analyses are conducted on the genes of the evolved rules using the DAVID database, and frequency analysis of the genes appearing in the evolved rules is performed to determine potential biomarkers. Furthermore, we classify the data to assess how accurately the evolved rules describe the remaining test (unknown) data, and we compare the average classification accuracy and other related factors with those of other rule-based classifiers. Statistical significance tests are performed to verify the statistical relevance of the comparative results. Here, each of the other rule mining methods or rule-based classifiers also starts with the same post-discretized data
Discrete Exterior Calculus Discretization of Incompressible Navier-Stokes Equations
Mohamed, Mamdouh S.; Hirani, Anil N.; Samtaney, Ravi
2017-01-01
A conservative discretization of incompressible Navier-Stokes equations over surface simplicial meshes is developed using discrete exterior calculus (DEC). Numerical experiments for flows over surfaces reveal a second order accuracy
Discrete mKdV and discrete sine-Gordon flows on discrete space curves
International Nuclear Information System (INIS)
Inoguchi, Jun-ichi; Kajiwara, Kenji; Matsuura, Nozomu; Ohta, Yasuhiro
2014-01-01
In this paper, we consider the discrete deformation of the discrete space curves with constant torsion described by the discrete mKdV or the discrete sine-Gordon equations, and show that it is formulated as the torsion-preserving equidistant deformation on the osculating plane which satisfies the isoperimetric condition. The curve is reconstructed from the deformation data by using the Sym–Tafel formula. The isoperimetric equidistant deformation of the space curves does not preserve the torsion in general. However, it is possible to construct the torsion-preserving deformation by tuning the deformation parameters. Further, it is also possible to make an arbitrary choice of the deformation described by the discrete mKdV equation or by the discrete sine-Gordon equation at each step. We finally show that the discrete deformation of discrete space curves yields the discrete K-surfaces. (paper)
Chronic Obstructive Pulmonary Disease Biomarkers
Directory of Open Access Journals (Sweden)
Tatsiana Beiko
2016-04-01
Despite significant decreases in the morbidity and mortality of cardiovascular diseases (CVD) and cancers, the morbidity and cost associated with chronic obstructive pulmonary disease (COPD) continue to increase. Failure to improve disease outcomes has been related to the paucity of interventions that improve survival, and the disease's insidious onset and slow progression hinder research success in developing disease-modifying therapies. In part, the difficulty in finding new therapies stems from the extreme heterogeneity within recognized COPD phenotypes. Novel biomarkers are necessary to help understand the natural history and pathogenesis of the different COPD subtypes. More accurate phenotyping and the ability to assess the therapeutic response to new interventions and pharmaceutical agents may improve the statistical power of longitudinal clinical studies. In this study, we review known candidate biomarkers for COPD, proposed pathways of pathogenesis, and future directions in the field.
Discrete Biogeography Based Optimization for Feature Selection in Molecular Signatures.
Liu, Bo; Tian, Meihong; Zhang, Chunhua; Li, Xiangtao
2015-04-01
Biomarker discovery from high-dimensional data is a complex task in the development of efficient cancer diagnosis and classification. However, these data are usually redundant and noisy, and only a subset of them presents distinct profiles for different classes of samples. Thus, selecting highly discriminative genes from gene expression data has become increasingly interesting in the field of bioinformatics. In this paper, a discrete biogeography-based optimization is proposed to select a good subset of informative genes relevant to classification. In the proposed algorithm, the Fisher-Markov selector is first used to choose a fixed number of genes. Second, to make biogeography-based optimization suitable for the feature selection problem, a discrete migration model and a discrete mutation model are proposed to balance exploration and exploitation. Discrete biogeography-based optimization, which we call DBBO, is then obtained by integrating the discrete migration and mutation models. Finally, DBBO is used for feature selection, with three classifiers evaluated under 10-fold cross-validation. To show the effectiveness and efficiency of the algorithm, it is tested on four breast cancer benchmark datasets. Compared with a genetic algorithm, particle swarm optimization, a differential evolution algorithm, and hybrid biogeography-based optimization, experimental results demonstrate that the proposed method is better than, or at least comparable to, previous methods from the literature in the quality of the solutions obtained. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
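Discrete migration and mutation operators of the kind described above can be sketched for a binary gene-selection vector. The encoding, rates, and function names here are illustrative assumptions, not the paper's exact models:

```python
import random

def discrete_migration(habitat, donor, immigration_rate):
    """Discrete migration: each bit of 'habitat' may be replaced by the
    corresponding bit of a (fitter) donor habitat, with a given probability."""
    return [d if random.random() < immigration_rate else h
            for h, d in zip(habitat, donor)]

def discrete_mutation(habitat, mutation_rate):
    """Discrete mutation: flip each bit (select/deselect a gene) with a
    small probability, preserving exploration."""
    return [1 - h if random.random() < mutation_rate else h for h in habitat]

random.seed(0)                  # reproducible illustration
current = [1, 0, 1, 0, 0, 1]    # 1 = gene kept, 0 = gene dropped
donor   = [0, 1, 1, 1, 0, 0]    # habitat sharing its features via migration
step = discrete_mutation(discrete_migration(current, donor, 0.5), 0.05)
print(step)
```

In the full algorithm these operators would be applied across a population of habitats each generation, with immigration rates tied to habitat fitness and a classifier's cross-validated accuracy serving as the fitness function.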
Discrete mathematics with applications
Koshy, Thomas
2003-01-01
This approachable text studies discrete objects and the relationships that bind them. It helps students understand and apply the power of discrete math to digital computer systems and other modern applications. It provides excellent preparation for courses in linear algebra, number theory, and modern/abstract algebra and for computer science courses in data structures, algorithms, programming languages, compilers, databases, and computation.* Covers all recommended topics in a self-contained, comprehensive, and understandable format for students and new professionals * Emphasizes problem-solving techniques, pattern recognition, conjecturing, induction, applications of varying nature, proof techniques, algorithm development and correctness, and numeric computations* Weaves numerous applications into the text* Helps students learn by doing with a wealth of examples and exercises: - 560 examples worked out in detail - More than 3,700 exercises - More than 150 computer assignments - More than 600 writing projects*...
Discrete and computational geometry
Devadoss, Satyan L
2011-01-01
Discrete geometry is a relatively new development in pure mathematics, while computational geometry is an emerging area in applications-driven computer science. Their intermingling has yielded exciting advances in recent years, yet what has been lacking until now is an undergraduate textbook that bridges the gap between the two. Discrete and Computational Geometry offers a comprehensive yet accessible introduction to this cutting-edge frontier of mathematics and computer science. This book covers traditional topics such as convex hulls, triangulations, and Voronoi diagrams, as well as more recent subjects like pseudotriangulations, curve reconstruction, and locked chains. It also touches on more advanced material, including Dehn invariants, associahedra, quasigeodesics, Morse theory, and the recent resolution of the Poincaré conjecture. Connections to real-world applications are made throughout, and algorithms are presented independently of any programming language. This richly illustrated textbook also fe...
2002-01-01
Discrete geometry investigates combinatorial properties of configurations of geometric objects. To a working mathematician or computer scientist, it offers sophisticated results and techniques of great diversity and it is a foundation for fields such as computational geometry or combinatorial optimization. This book is primarily a textbook introduction to various areas of discrete geometry. In each area, it explains several key results and methods, in an accessible and concrete manner. It also contains more advanced material in separate sections and thus it can serve as a collection of surveys in several narrower subfields. The main topics include: basics on convex sets, convex polytopes, and hyperplane arrangements; combinatorial complexity of geometric configurations; intersection patterns and transversals of convex sets; geometric Ramsey-type results; polyhedral combinatorics and high-dimensional convexity; and lastly, embeddings of finite metric spaces into normed spaces. Jiri Matousek is Professor of Com...
Time Discretization Techniques
Gottlieb, S.
2016-10-12
The time discretization of hyperbolic partial differential equations is typically the evolution of a system of ordinary differential equations obtained by spatial discretization of the original problem. Methods for this time evolution include multistep, multistage, or multiderivative methods, as well as a combination of these approaches. The time step constraint is mainly a result of the absolute stability requirement, as well as additional conditions that mimic physical properties of the solution, such as positivity or total variation stability. These conditions may be required for stability when the solution develops shocks or sharp gradients. This chapter contains a review of some of the methods historically used for the evolution of hyperbolic PDEs, as well as cutting edge methods that are now commonly used.
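A minimal method-of-lines example of the approach reviewed here, first-order upwind in space combined with a two-stage strong-stability-preserving (SSP) Runge-Kutta method in time for the linear advection equation, might look like the following sketch (illustrative code, not from the chapter):

```python
def upwind_rhs(u, c, dx):
    """First-order upwind spatial discretization (periodic) of u_t + c u_x = 0."""
    return [-c * (u[i] - u[i - 1]) / dx for i in range(len(u))]

def ssp_rk2_step(u, dt, rhs):
    """One step of the two-stage SSP Runge-Kutta method: a convex combination
    of forward-Euler steps, so bounds preserved by Euler are retained."""
    u1 = [ui + dt * fi for ui, fi in zip(u, rhs(u))]
    u2 = [vi + dt * fi for vi, fi in zip(u1, rhs(u1))]
    return [0.5 * (ui + wi) for ui, wi in zip(u, u2)]

# Advect a square pulse once around a periodic unit domain at CFL = 0.5.
n, c = 100, 1.0
dx = 1.0 / n
dt = 0.5 * dx / c               # time step limited by absolute stability (CFL)
u = [1.0 if 0.25 <= i * dx < 0.5 else 0.0 for i in range(n)]
rhs = lambda v: upwind_rhs(v, c, dx)
for _ in range(int(round(1.0 / dt))):
    u = ssp_rk2_step(u, dt, rhs)
# mass is conserved exactly; the pulse stays within [0, 1] (monotone scheme)
```

This illustrates the chapter's point that the time step constraint comes both from absolute stability and from conditions mimicking physical properties, here the boundedness of the advected pulse.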
Czech Academy of Sciences Publication Activity Database
Mesiar, Radko; Li, J.; Pap, E.
2013-01-01
Roč. 54, č. 3 (2013), s. 357-364 ISSN 0888-613X R&D Projects: GA ČR GAP402/11/0378 Institutional support: RVO:67985556 Keywords : concave integral * pseudo-addition * pseudo-multiplication Subject RIV: BA - General Mathematics Impact factor: 1.977, year: 2013 http://library.utia.cas.cz/separaty/2013/E/mesiar-discrete pseudo-integrals.pdf
Discrete variational Hamiltonian mechanics
International Nuclear Information System (INIS)
Lall, S; West, M
2006-01-01
The main contribution of this paper is to present a canonical choice of a Hamiltonian theory corresponding to the theory of discrete Lagrangian mechanics. We make use of Lagrange duality and follow a path parallel to that used for construction of the Pontryagin principle in optimal control theory. We use duality results regarding sensitivity and separability to show the relationship between generating functions and symplectic integrators. We also discuss connections to optimal control theory and numerical algorithms
International Nuclear Information System (INIS)
Jalnapurkar, Sameer M; Leok, Melvin; Marsden, Jerrold E; West, Matthew
2006-01-01
This paper develops the theory of Abelian Routh reduction for discrete mechanical systems and applies it to the variational integration of mechanical systems with Abelian symmetry. The reduction of variational Runge-Kutta discretizations is considered, as well as the extent to which symmetry reduction and discretization commute. These reduced methods allow the direct simulation of dynamical features such as relative equilibria and relative periodic orbits that can be obscured or difficult to identify in the unreduced dynamics. The methods are demonstrated for the dynamics of an Earth orbiting satellite with a non-spherical J₂ correction, as well as the double spherical pendulum. The J₂ problem is interesting because in the unreduced picture, geometric phases inherent in the model and those due to numerical discretization can be hard to distinguish, but this issue does not appear in the reduced algorithm, where one can directly observe interesting dynamical structures in the reduced phase space (the cotangent bundle of shape space), in which the geometric phases have been removed. The main feature of the double spherical pendulum example is that it has a non-trivial magnetic term in its reduced symplectic form. Our method is still efficient as it can directly handle the essential non-canonical nature of the symplectic structure. In contrast, a traditional symplectic method for canonical systems could require repeated coordinate changes if one invokes Darboux's theorem to transform the symplectic structure into canonical form, thereby incurring additional computational cost. Our method allows one to design reduced symplectic integrators in a natural way, despite the non-canonical nature of the symplectic structure
Discrete port-Hamiltonian systems
Talasila, V.; Clemente-Gallardo, J.; Schaft, A.J. van der
2006-01-01
Either from a control theoretic viewpoint or from an analysis viewpoint it is necessary to convert smooth systems to discrete systems, which can then be implemented on computers for numerical simulations. Discrete models can be obtained either by discretizing a smooth model, or by directly modeling
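As a minimal illustration of discretizing a smooth model, a scalar linear system x' = ax + bu admits an exact zero-order-hold (ZOH) discrete model. This generic sketch is not specific to the port-Hamiltonian setting discussed in the record:

```python
import math

def zoh_discretize(a, b, T):
    """Exact zero-order-hold discretization of the scalar system x' = a x + b u:
    x[k+1] = ad * x[k] + bd * u[k], with u held constant over each step (a != 0)."""
    ad = math.exp(a * T)
    bd = (ad - 1.0) / a * b   # integral of e^{a s} b ds over one step
    return ad, bd

def euler_discretize(a, b, T):
    """First-order (forward Euler) approximation of the same discretization."""
    return 1.0 + a * T, b * T

ad, bd = zoh_discretize(-2.0, 1.0, 0.1)
# march the discrete model under a constant unit input; it converges to the
# continuous steady state -b/a = 0.5
x = 0.0
for _ in range(200):
    x = ad * x + bd * 1.0
print(round(x, 6))  # → 0.5
```

The Euler model only matches exp(aT) to first order and becomes unstable once T exceeds 2/|a|, whereas the ZOH model is exact for any step size; preserving such structural properties under discretization is exactly the concern of the discrete port-Hamiltonian approach.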
A paradigm for discrete physics
International Nuclear Information System (INIS)
Noyes, H.P.; McGoveran, D.; Etter, T.; Manthey, M.J.; Gefwert, C.
1987-01-01
An example is outlined for constructing a discrete physics using as a starting point the insight from quantum physics that events are discrete, indivisible and non-local. Initial postulates are finiteness, discreteness, finite computability, absolute nonuniqueness (i.e., homogeneity in the absence of specific cause) and additivity
Two new discrete integrable systems
International Nuclear Information System (INIS)
Chen Xiao-Hong; Zhang Hong-Qing
2013-01-01
In this paper, we focus on the construction of new (1+1)-dimensional discrete integrable systems from a subalgebra of the loop algebra Ã₁. By designing two new (1+1)-dimensional discrete spectral problems, two new discrete integrable systems are obtained, namely, a 2-field lattice hierarchy and a 3-field lattice hierarchy. When deriving the two new discrete integrable systems, we find the generalized relativistic Toda lattice hierarchy and the generalized modified Toda lattice hierarchy. Moreover, we obtain the Hamiltonian structures of the two lattice hierarchies by means of the discrete trace identity
Cardiac biomarkers in Neonatology
Vijlbrief, D.C.
2015-01-01
In this thesis, the role for cardiac biomarkers in neonatology was investigated. Several clinically relevant results were reported. In term and preterm infants, hypoxia and subsequent adaptation play an important role in cardiac biomarker elevation. The elevated natriuretic peptides are indicative of abnormal function; elevated troponins are suggestive for cardiomyocyte damage. This methodology makes these biomarkers of additional value in the treatment of newborn infants, separate or as a co...
Topology and statistics in zero dimensions
International Nuclear Information System (INIS)
Aneziris, Charilaos.
1992-05-01
It has been suggested that space-time may be intrinsically discrete rather than continuous. Here we review some topological notions of discrete manifolds, in particular ones made out of a finite number of points, and discuss the possibilities for statistics in such spaces. (author)
Hirsch, M; Peinado, E; Valle, J W F
2010-01-01
We propose a new motivation for the stability of dark matter (DM). We suggest that the same non-abelian discrete flavor symmetry which accounts for the observed pattern of neutrino oscillations spontaneously breaks to a Z2 subgroup which renders DM stable. The simplest scheme leads to a scalar doublet DM potentially detectable in nuclear recoil experiments and an inverse neutrino mass hierarchy, hence a neutrinoless double beta decay rate accessible to upcoming searches, while a reactor angle equal to zero gives no CP violation in neutrino oscillations.
Wuensche, Andrew
DDLab is interactive graphics software for creating, visualizing, and analyzing many aspects of Cellular Automata, Random Boolean Networks, and Discrete Dynamical Networks in general and studying their behavior, both from the time-series perspective — space-time patterns, and from the state-space perspective — attractor basins. DDLab is relevant to research, applications, and education in the fields of complexity, self-organization, emergent phenomena, chaos, collision-based computing, neural networks, content addressable memory, genetic regulatory networks, dynamical encryption, generative art and music, and the study of the abstract mathematical/physical/dynamical phenomena in their own right.
International Nuclear Information System (INIS)
Lim, Gyeong Hui
2008-03-01
This book consists of 15 chapters: the basic concepts and meaning of statistical thermodynamics, Maxwell-Boltzmann statistics, ensembles, thermodynamic functions and fluctuations, statistical dynamics of independent particle systems, ideal molecular systems, chemical equilibrium and chemical reaction rates in ideal gas mixtures, classical statistical thermodynamics, the ideal lattice model, lattice statistics and nonideal lattice models, imperfect gas theory of liquids, the theory of solutions, the statistical thermodynamics of interfaces, the statistical thermodynamics of macromolecular systems, and quantum statistics
Mathematical statistics and stochastic processes
Bosq, Denis
2013-01-01
Generally, books on mathematical statistics are restricted to the case of independent identically distributed random variables. In this book, however, both this case and the case of dependent variables, i.e. statistics for discrete- and continuous-time processes, are studied. This second case is very important for today's practitioners. Mathematical Statistics and Stochastic Processes is based on decision theory and asymptotic statistics and contains up-to-date information on the relevant topics of the theory of probability, estimation, confidence intervals, non-parametric statistics and rob...
Scheck, Florian
2016-01-01
Scheck’s textbook starts with a concise introduction to classical thermodynamics, including geometrical aspects. Then a short introduction to probabilities and statistics lays the basis for the statistical interpretation of thermodynamics. Phase transitions, discrete models and the stability of matter are explained in great detail. Thermodynamics has a special role in theoretical physics. Due to the general approach of thermodynamics the field has a bridging function between several areas like the theory of condensed matter, elementary particle physics, astrophysics and cosmology. The classical thermodynamics describes predominantly averaged properties of matter, reaching from few particle systems and state of matter to stellar objects. Statistical Thermodynamics covers the same fields, but explores them in greater depth and unifies classical statistical mechanics with quantum theory of multiple particle systems. The content is presented as two tracks: the fast track for master students, providing the essen...
International Nuclear Information System (INIS)
Souza, Manoelito M. de
1997-01-01
We discuss the physical meaning and the geometric interpretation of causality implementation in classical field theories. The origin of infinities and other inconsistencies in field theories is traced to fields defined with support on the light cone; a finite and consistent field theory requires a light-cone generator as the field support. We then introduce a classical field theory with support on the light-cone generators. This results in a description of discrete (point-like) interactions in terms of localized particle-like fields. We find the propagators of these particle-like fields and discuss their physical meaning, properties, and consequences. They are conformally invariant, singularity-free, and describe a manifestly covariant (1 + 1)-dimensional dynamics in a (3 + 1) spacetime. Remarkably, this conformal symmetry remains even for the propagation of a massive field in four spacetime dimensions. We apply this formalism to classical electrodynamics and to the theory of general relativity. The standard formalism with its distributed fields is retrieved in terms of spacetime averages of the discrete fields; singularities are the by-products of the averaging process. This new formalism sheds light on the meaning and the problems of field theory, and may allow a smoother transition to a quantum theory. (author)
Evaluating biomarkers for prognostic enrichment of clinical trials.
Kerr, Kathleen F; Roth, Jeremy; Zhu, Kehao; Thiessen-Philbrook, Heather; Meisner, Allison; Wilson, Francis Perry; Coca, Steven; Parikh, Chirag R
2017-12-01
A potential use of biomarkers is to assist in the prognostic enrichment of clinical trials, where only patients at relatively higher risk for an outcome of interest are eligible. We investigated methods for evaluating biomarkers for prognostic enrichment and identified five key considerations when assessing a biomarker and a screening threshold: (1) clinical trial sample size, (2) calendar time to enroll the trial, (3) total patient screening costs and total per-patient trial costs, (4) generalizability of trial results, and (5) ethical evaluation of trial eligibility criteria. Items (1)-(3) are amenable to quantitative analysis. We developed the Biomarker Prognostic Enrichment Tool for evaluating biomarkers for prognostic enrichment at varying levels of screening stringency, and we demonstrate that both modestly prognostic and strongly prognostic biomarkers can improve trial metrics using this tool. The tool is available as a webtool at http://prognosticenrichment.com and as a package for the R statistical computing platform. In some clinical settings, even biomarkers with modest prognostic performance can be useful for prognostic enrichment. In addition to the quantitative analysis provided by the tool, investigators must consider the generalizability of trial results and evaluate the ethics of trial eligibility criteria.
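The quantitative trade-off behind items (1)-(3) can be sketched with simple event-driven arithmetic. This toy calculation is only illustrative; it is not the analysis implemented by the authors' webtool, and all rates below are made-up inputs:

```python
def enrichment_metrics(n_events_needed, event_rate_all, event_rate_enriched,
                       fraction_eligible):
    """Rough trial-size arithmetic for prognostic enrichment.
    Returns (patients enrolled, patients screened) without and with enrichment,
    assuming the trial must observe a fixed number of outcome events."""
    n_unenriched = n_events_needed / event_rate_all
    n_enriched = n_events_needed / event_rate_enriched       # fewer enrolled
    n_screened = n_enriched / fraction_eligible              # but more screened
    return (n_unenriched, n_unenriched), (n_enriched, n_screened)

# Hypothetical numbers: 100 events needed; 10% event rate overall, 25% among
# the biomarker-positive 30% of screened patients.
(no_enroll, no_screen), (en_enroll, en_screen) = enrichment_metrics(
    n_events_needed=100, event_rate_all=0.10,
    event_rate_enriched=0.25, fraction_eligible=0.30)
print(no_enroll, en_enroll, en_screen)  # 1000.0 400.0 1333.33...
```

Enrichment here cuts enrollment from 1000 to 400 patients at the cost of screening about 1333, which is why screening cost per patient versus per-patient trial cost (item 3) decides whether enrichment pays off.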
Discrete Exterior Calculus Discretization of Incompressible Navier-Stokes Equations
Mohamed, Mamdouh S.
2017-05-23
A conservative discretization of incompressible Navier-Stokes equations over surface simplicial meshes is developed using discrete exterior calculus (DEC). Numerical experiments for flows over surfaces reveal a second order accuracy for the developed scheme when using structured-triangular meshes, and first order accuracy otherwise. The mimetic character of many of the DEC operators provides exact conservation of both mass and vorticity, in addition to superior kinetic energy conservation. The employment of barycentric Hodge star allows the discretization to admit arbitrary simplicial meshes. The discretization scheme is presented along with various numerical test cases demonstrating its main characteristics.
Directory of Open Access Journals (Sweden)
Ji Wei
2010-10-01
Full Text Available Abstract Background Microarray data discretization is a basic preprocessing step for many gene regulatory network inference algorithms. Common discretization methods from informatics are used to discretize microarray data, but the choice of method is often arbitrary, and no systematic comparison of different discretization methods has been conducted in the context of gene regulatory network inference from time-series gene expression data. Results In this study, we propose a new discretization method, "bikmeans", and compare its performance with four other widely used discretization methods across different datasets, modeling algorithms and numbers of intervals. Sensitivities, specificities and total accuracies were calculated and statistical analysis was carried out. The bikmeans method consistently gave high total accuracies. Conclusions Our results indicate that proper discretization methods can consistently improve gene regulatory network inference independent of network modeling algorithms and datasets. Our new method, bikmeans, resulted in significantly better total accuracies than the other methods.
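The abstract does not spell out the bikmeans algorithm itself, but the general idea of k-means-based discretization is easy to sketch. The one-dimensional discretizer below is a generic stand-in with invented data, not the published method.

```python
# A minimal 1-D k-means discretizer for a gene-expression profile, in the
# spirit of (but not identical to) the paper's "bikmeans" method.

def kmeans_1d(values, k, iters=50):
    """Cluster scalar values into k groups; return sorted centroids."""
    lo, hi = min(values), max(values)
    centers = [lo + (hi - lo) * (i + 0.5) / k for i in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for v in values:
            j = min(range(k), key=lambda i: abs(v - centers[i]))
            groups[j].append(v)
        centers = [sum(g) / len(g) if g else centers[i]
                   for i, g in enumerate(groups)]
    return sorted(centers)

def discretize(values, k=3):
    """Map each expression value to the index of its nearest centroid."""
    centers = kmeans_1d(values, k)
    return [min(range(k), key=lambda i: abs(v - centers[i])) for v in values]

expr = [0.1, 0.2, 0.15, 5.0, 5.2, 9.8, 10.1, 0.3]  # hypothetical time series
print(discretize(expr, k=3))
```

Each expression value is replaced by a small integer level, which is the form most network-inference algorithms expect.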
Mass spectrometry imaging enriches biomarker discovery approaches with candidate mapping.
Scott, Alison J; Jones, Jace W; Orschell, Christie M; MacVittie, Thomas J; Kane, Maureen A; Ernst, Robert K
2014-01-01
Integral to the characterization of radiation-induced tissue damage is the identification of unique biomarkers. Biomarker discovery is a challenging and complex endeavor requiring both sophisticated experimental design and accessible technology. The resources within the National Institute of Allergy and Infectious Diseases (NIAID)-sponsored Consortium, Medical Countermeasures Against Radiological Threats (MCART), allow for leveraging robust animal models with novel molecular imaging techniques. One such imaging technique, MALDI (matrix-assisted laser desorption ionization) mass spectrometry imaging (MSI), allows for the direct spatial visualization of lipids, proteins, small molecules, and drugs/drug metabolites (that is, candidate biomarkers) in an unbiased manner. MALDI-MSI acquires mass spectra directly from an intact tissue slice in discrete locations across an x, y grid, which are then rendered into a spatial distribution map composed of ion mass and intensity. The unique mass signals can be plotted to generate a spatial map of biomarkers that reflects pathology and molecular events. The crucial unanswered questions that can be addressed with MALDI-MSI include identification of biomarkers for radiation damage that reflect the response to radiation dose over time and the efficacy of therapeutic interventions. Techniques in MALDI-MSI also enable integration of biomarker identification among diverse animal models. Analysis of early, sublethally irradiated tissue injury samples from diverse mouse tissues (lung and ileum) shows membrane phospholipid signatures correlated with histological features of these unique tissues. This paper will discuss the application of MALDI-MSI for use in a larger biomarker discovery pipeline.
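The rendering step described above (spectra acquired on an x, y grid reduced to a spatial intensity map for one target ion) can be sketched as follows. The data, the `ion_image` helper, and the mass tolerance are all hypothetical; real pipelines read vendor file formats.

```python
# Sketch of the MALDI-MSI rendering step: spectra on an (x, y) grid are
# reduced to a spatial intensity map for one target ion mass. Synthetic data.

def ion_image(spectra, target_mz, tol=0.5, shape=(2, 3)):
    """spectra: {(x, y): [(mz, intensity), ...]}. Returns a row-major grid
    of summed intensities within `tol` of `target_mz`."""
    grid = [[0.0] * shape[1] for _ in range(shape[0])]
    for (x, y), peaks in spectra.items():
        grid[y][x] = sum(i for mz, i in peaks if abs(mz - target_mz) <= tol)
    return grid

spectra = {
    (0, 0): [(760.5, 120.0), (885.5, 15.0)],
    (1, 0): [(760.6, 90.0)],
    (2, 0): [(885.5, 40.0)],
    (0, 1): [(760.4, 10.0)],
    (1, 1): [],
    (2, 1): [(760.5, 200.0), (760.9, 5.0)],
}
print(ion_image(spectra, target_mz=760.5))
```

The resulting grid is exactly the "spatial distribution map composed of ion mass and intensity" the abstract refers to, for a single chosen mass.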
Advances in discrete differential geometry
2016-01-01
This is one of the first books on a newly emerging field of discrete differential geometry and an excellent way to access this exciting area. It surveys the fascinating connections between discrete models in differential geometry and complex analysis, integrable systems and applications in computer graphics. The authors take a closer look at discrete models in differential geometry and dynamical systems. Their curves are polygonal, surfaces are made from triangles and quadrilaterals, and time is discrete. Nevertheless, the difference between the corresponding smooth curves, surfaces and classical dynamical systems with continuous time can hardly be seen. This is the paradigm of structure-preserving discretizations. Current advances in this field are stimulated to a large extent by its relevance for computer graphics and mathematical physics. This book is written by specialists working together on a common research project. It is about differential geometry and dynamical systems, smooth and discrete theories, ...
Poisson hierarchy of discrete strings
International Nuclear Information System (INIS)
Ioannidou, Theodora; Niemi, Antti J.
2016-01-01
The Poisson geometry of a discrete string in three dimensional Euclidean space is investigated. For this the Frenet frames are converted into a spinorial representation, the discrete spinor Frenet equation is interpreted in terms of a transfer matrix formalism, and Poisson brackets are introduced in terms of the spinor components. The construction is then generalised, in a self-similar manner, into an infinite hierarchy of Poisson algebras. As an example, the classical Virasoro (Witt) algebra that determines reparametrisation diffeomorphism along a continuous string, is identified as a particular sub-algebra, in the hierarchy of the discrete string Poisson algebra. - Highlights: • Witt (classical Virasoro) algebra is derived in the case of discrete string. • Infinite dimensional hierarchy of Poisson bracket algebras is constructed for discrete strings. • Spinor representation of discrete Frenet equations is developed.
Directory of Open Access Journals (Sweden)
Robert eHendren
2014-08-01
Full Text Available Autism spectrum disorders (ASD are complex, heterogeneous disorders caused by an interaction between genetic vulnerability and environmental factors. In an effort to better target the underlying roots of ASD for diagnosis and treatment, efforts to identify reliable biomarkers in genetics, neuroimaging, gene expression and measures of the body’s metabolism are growing. In this article, we review the published studies of potential biomarkers in autism and conclude that, while there is increasing promise of finding biomarkers that can help us target treatment, none has enough evidence to support routine clinical use unless medical illness is suspected. Promising biomarkers include those for mitochondrial function, oxidative stress, and immune function. Genetic clusters also suggest the potential for useful biomarkers.
Principles of discrete time mechanics
Jaroszkiewicz, George
2014-01-01
Could time be discrete on some unimaginably small scale? Exploring the idea in depth, this unique introduction to discrete time mechanics systematically builds the theory up from scratch, beginning with the historical, physical and mathematical background to the chronon hypothesis. Covering classical and quantum discrete time mechanics, this book presents all the tools needed to formulate and develop applications of discrete time mechanics in a number of areas, including spreadsheet mechanics, classical and quantum register mechanics, and classical and quantum mechanics and field theories. A consistent emphasis on contextuality and the observer-system relationship is maintained throughout.
Dark discrete gauge symmetries
International Nuclear Information System (INIS)
Batell, Brian
2011-01-01
We investigate scenarios in which dark matter is stabilized by an Abelian Z_N discrete gauge symmetry. Models are surveyed according to symmetries and matter content. Multicomponent dark matter arises when N is not prime and Z_N contains one or more nontrivial subgroups. The dark sector interacts with the visible sector through the renormalizable kinetic mixing and Higgs portal operators, and we highlight the basic phenomenology in these scenarios. In particular, multiple species of dark matter can lead to an unconventional nuclear recoil spectrum in direct detection experiments, while the presence of new light states in the dark sector can dramatically affect the decays of the Higgs at the Tevatron and LHC, thus providing a window into the gauge origin of the stability of dark matter.
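The group-theoretic point can be checked directly: Z_N has a proper nontrivial subgroup of order d for every divisor d of N with 1 < d < N, so extra stable sectors are possible exactly when N is composite. A quick enumeration (the function name is mine):

```python
# The abstract notes that multicomponent dark matter arises when N is not
# prime, since Z_N then has proper nontrivial subgroups: one of order d for
# each divisor d of N with 1 < d < N (generated by the element N/d).

def proper_subgroup_orders(n):
    """Orders of proper nontrivial subgroups of the cyclic group Z_n."""
    return [d for d in range(2, n) if n % d == 0]

for n in (5, 6, 12):
    subs = proper_subgroup_orders(n)
    label = "single-component" if not subs else "multicomponent possible"
    print(n, subs, label)
```

For prime N (e.g. 5) the list is empty and only one stable species is protected; for composite N (6, 12) each subgroup can stabilize its own dark-matter component.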
International Nuclear Information System (INIS)
Noyes, H.P.; Starson, S.
1991-03-01
Discrete physics, because it replaces time evolution generated by the energy operator with a global bit-string generator (program universe) and replaces ''fields'' with the relativistic Wheeler-Feynman ''action at a distance,'' allows the consistent formulation of the concept of signed gravitational charge for massive particles. The resulting prediction made by this version of the theory is that free anti-particles near the surface of the earth will ''fall'' up with the same acceleration that the corresponding particles fall down. So far as we can see, no current experimental information is in conflict with this prediction of our theory. The experimentum crucis will be one of the anti-proton or anti-hydrogen experiments at CERN. Our prediction should be much easier to test than the small effects which those experiments are currently designed to detect or bound. 23 refs
Prognostic biomarkers in osteoarthritis
Attur, Mukundan; Krasnokutsky-Samuels, Svetlana; Samuels, Jonathan; Abramson, Steven B.
2013-01-01
Purpose of review Identification of patients at risk for incident disease or disease progression in osteoarthritis remains challenging, as radiography is an insensitive reflection of molecular changes that presage cartilage and bone abnormalities. Thus there is a widely appreciated need for biochemical and imaging biomarkers. We describe recent developments with such biomarkers to identify osteoarthritis patients who are at risk for disease progression. Recent findings The biochemical markers currently under evaluation include anabolic, catabolic, and inflammatory molecules representing diverse biological pathways. A few promising cartilage and bone degradation and synthesis biomarkers are in various stages of development, awaiting further validation in larger populations. A number of studies have shown elevated expression levels of inflammatory biomarkers, both locally (synovial fluid) and systemically (serum and plasma). These chemical biomarkers are under evaluation in combination with imaging biomarkers to predict early onset and the burden of disease. Summary Prognostic biomarkers may be used in clinical knee osteoarthritis to identify subgroups in whom the disease progresses at different rates. This could facilitate our understanding of the pathogenesis and allow us to differentiate phenotypes within a heterogeneous knee osteoarthritis population. Ultimately, such findings may help facilitate the development of disease-modifying osteoarthritis drugs (DMOADs). PMID:23169101
Computational statistics handbook with Matlab
Martinez, Wendy L
2007-01-01
Prefaces Introduction What Is Computational Statistics? An Overview of the Book Probability Concepts Introduction Probability Conditional Probability and Independence Expectation Common Distributions Sampling Concepts Introduction Sampling Terminology and Concepts Sampling Distributions Parameter Estimation Empirical Distribution Function Generating Random Variables Introduction General Techniques for Generating Random Variables Generating Continuous Random Variables Generating Discrete Random Variables Exploratory Data Analysis Introduction Exploring Univariate Data Exploring Bivariate and Trivariate Data Exploring Multidimensional Data Finding Structure Introduction Projecting Data Principal Component Analysis Projection Pursuit EDA Independent Component Analysis Grand Tour Nonlinear Dimensionality Reduction Monte Carlo Methods for Inferential Statistics Introduction Classical Inferential Statistics Monte Carlo Methods for Inferential Statist...
Control of Discrete Event Systems
Smedinga, Rein
1989-01-01
Discrete event systems play a role in many fields. This thesis focuses on the order of events; timing aspects are left out of consideration. In that case, discrete event systems can be modelled well by making use of
Discrete Mathematics and Its Applications
Oxley, Alan
2010-01-01
The article gives ideas that lecturers of undergraduate Discrete Mathematics courses can use in order to make the subject more interesting for students and encourage them to undertake further studies in the subject. It is possible to teach Discrete Mathematics with little or no reference to computing. However, students are more likely to be…
Discrete Mathematics and Curriculum Reform.
Kenney, Margaret J.
1996-01-01
Defines discrete mathematics as the mathematics necessary to effect reasoned decision making in finite situations and explains how its use supports the current view of mathematics education. Discrete mathematics can be used by curriculum developers to improve the curriculum for students of all ages and abilities. (SLD)
Connections on discrete fibre bundles
International Nuclear Information System (INIS)
Manton, N.S.; Cambridge Univ.
1987-01-01
A new approach to gauge fields on a discrete space-time is proposed, in which the fundamental object is a discrete version of a principal fibre bundle. If the bundle is twisted, the gauge fields are topologically non-trivial automatically. (orig.)
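A standard toy model of a discrete gauge field on a lattice (not the paper's specific bundle formalism, which the abstract only outlines) makes the idea concrete: links carry group elements and plaquette holonomies play the role of curvature.

```python
# Toy illustration of gauge fields on a discrete space: on a periodic 2-D
# lattice, a Z_2 gauge field assigns +/-1 to each link, and the discrete
# "curvature" is the product of the four links around each plaquette.
# Flipping one link creates a pair of -1 plaquettes. This is a standard
# lattice-gauge construction, used here only as an analogy.

L = 4
hx = [[1] * L for _ in range(L)]  # horizontal links hx[y][x]: (x,y)->(x+1,y)
hy = [[1] * L for _ in range(L)]  # vertical links   hy[y][x]: (x,y)->(x,y+1)

def plaquette(x, y):
    return (hx[y][x] * hy[y][(x + 1) % L]
            * hx[(y + 1) % L][x] * hy[y][x])

hx[1][1] = -1  # twist a single link
frustrated = [(x, y) for y in range(L) for x in range(L)
              if plaquette(x, y) == -1]
print(frustrated)
```

The flipped link is shared by exactly two plaquettes, so the nontrivial holonomy appears in pairs, a discrete echo of the topological non-triviality the abstract associates with twisted bundles.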
Discrete dynamics versus analytic dynamics
DEFF Research Database (Denmark)
Toxværd, Søren
2014-01-01
For discrete classical molecular dynamics obtained by the “Verlet” algorithm (VA) with time increment h, there exists a shadow Hamiltonian H˜ with energy E˜(h), for which the discrete particle positions lie on the analytic trajectories for H˜. Here, we prove that there exists, independent of such an analytic analogy, an exact hidden energy invariance E* for VA dynamics. The fact that the discrete VA dynamics has the same invariances as Newtonian dynamics raises the question of which of the formulations is correct, or alternatively, which is the most appropriate formulation of classical dynamics. In this context the relation between the discrete VA dynamics and the (general) discrete dynamics investigated by Lee [Phys. Lett. B122, 217 (1983)] is presented and discussed.
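The bounded-energy behaviour behind the shadow-Hamiltonian picture is easy to observe numerically. A minimal sketch for a harmonic oscillator (unit mass and frequency, step h = 0.1 chosen arbitrarily):

```python
# Position-Verlet integration of a harmonic oscillator. The naive discrete
# energy fluctuates but does not drift, reflecting the shadow-Hamiltonian /
# hidden-invariance property of the Verlet algorithm discussed above.

h = 0.1
x_prev = 1.0 - 0.5 * h * h  # x(-h) from a Taylor start: x(0)=1, v(0)=0, a(0)=-1
x = 1.0                     # x(0)
energies = []
for _ in range(10_000):
    x_next = 2.0 * x - x_prev - h * h * x   # Verlet step with a = -x
    v = (x_next - x_prev) / (2.0 * h)       # central-difference velocity
    energies.append(0.5 * v * v + 0.5 * x * x)
    x_prev, x = x, x_next

drift = max(energies) - min(energies)
print(round(drift, 4))  # small bounded fluctuation, no secular growth
```

A non-symplectic scheme such as forward Euler would instead show the energy growing steadily; the bounded fluctuation here is the numerical face of the exact invariance the abstract proves.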
Modern approaches to discrete curvature
Romon, Pascal
2017-01-01
This book provides a valuable glimpse into discrete curvature, a rich new field of research which blends discrete mathematics, differential geometry, probability and computer graphics. It includes a vast collection of ideas and tools which will offer something new to all interested readers. Discrete geometry has arisen as much as a theoretical development as a response to unforeseen challenges coming from applications. Discrete and continuous geometries have turned out to be intimately connected. Discrete curvature is the key concept connecting them through many bridges in numerous fields: metric spaces, Riemannian and Euclidean geometries, geometric measure theory, topology, partial differential equations, calculus of variations, gradient flows, asymptotic analysis, probability, harmonic analysis, graph theory, etc. In spite of its crucial importance both in theoretical mathematics and in applications, up to now almost no books have provided a coherent outlook on this emerging field.
Directory of Open Access Journals (Sweden)
Janice M Leung
2013-01-01
Full Text Available The inherent limitations of spirometry and clinical history have prompted clinicians and scientists to search for surrogate markers of airway diseases. Although few biomarkers have been widely accepted into the clinical armamentarium, the authors explore three sources of biomarkers that have shown promise as indicators of disease severity and treatment response. In asthma, exhaled nitric oxide measurements can predict steroid responsiveness and sputum eosinophil counts have been used to titrate anti-inflammatory therapies. In chronic obstructive pulmonary disease, inflammatory plasma biomarkers, such as fibrinogen, club cell secretory protein-16 and surfactant protein D, can denote greater severity and predict the risk of exacerbations. While the multitude of disease phenotypes in respiratory medicine make biomarker development especially challenging, these three may soon play key roles in the diagnosis and management of airway diseases.
U.S. Environmental Protection Agency — Amphibian metabolite data used in Snyder, M.N., Henderson, W.M., Glinski, D.G., Purucker, S. T., 2017. Biomarker analysis of american toad (Anaxyrus americanus) and...
Validation of New Cancer Biomarkers
DEFF Research Database (Denmark)
Duffy, Michael J; Sturgeon, Catherine M; Söletormos, Georg
2015-01-01
BACKGROUND: Biomarkers are playing increasingly important roles in the detection and management of patients with cancer. Despite an enormous number of publications on cancer biomarkers, few of these biomarkers are in widespread clinical use. CONTENT: In this review, we discuss the key steps in advancing a newly discovered cancer candidate biomarker from pilot studies to clinical application. Four main steps are necessary for a biomarker to reach the clinic: analytical validation of the biomarker assay, clinical validation of the biomarker test, demonstration of clinical value from performance of the biomarker test, and regulatory approval. In addition to these 4 steps, all biomarker studies should be reported in a detailed and transparent manner, using previously published checklists and guidelines. Finally, all biomarker studies relating to demonstration of clinical value should be registered before...
Discretion and Disproportionality
Directory of Open Access Journals (Sweden)
Jason A. Grissom
2015-12-01
Full Text Available Students of color are underrepresented in gifted programs relative to White students, but the reasons for this underrepresentation are poorly understood. We investigate the predictors of gifted assignment using nationally representative, longitudinal data on elementary students. We document that even among students with high standardized test scores, Black students are less likely to be assigned to gifted services in both math and reading, a pattern that persists when controlling for other background factors, such as health and socioeconomic status, and characteristics of classrooms and schools. We then investigate the role of teacher discretion, leveraging research from political science suggesting that clients of government services from traditionally underrepresented groups benefit from diversity in the providers of those services, including teachers. Even after conditioning on test scores and other factors, Black students indeed are referred to gifted programs, particularly in reading, at significantly lower rates when taught by non-Black teachers, a concerning result given the relatively low incidence of assignment to own-race teachers among Black students.
International Nuclear Information System (INIS)
Vlad, Valentin I.; Ionescu-Pallas, Nicholas
2000-10-01
The Planck radiation spectrum of ideal cubic and spherical cavities, in the region of small adiabatic invariance, γ = TV^{1/3}, is shown to be discrete and strongly dependent on the cavity geometry and temperature. This behavior is a consequence of the random distribution of the state weights in the cubic cavity and of the random overlapping of the successive multiplet components in the spherical cavity. The total energy (obtained by summing the exact contributions of the eigenvalues and their weights, for low values of the adiabatic invariance) no longer obeys the Stefan-Boltzmann law. The new law includes a corrective factor depending on γ and imposes a faster decrease of the total energy to zero as γ → 0. We have defined the double quantized regime for both cubic and spherical cavities by the superior and inferior limits put on the principal quantum numbers or the adiabatic invariance. The total energy of the double quantized cavities shows large differences from the classical calculations over unexpectedly large intervals, which are measurable and reveal important macroscopic quantum effects. (author)
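The "random distribution of the state weights" for the cubic cavity can be seen directly: mode frequencies scale as sqrt(n1² + n2² + n3²) for positive integers n_i, and the degeneracy of each integer value m = n1² + n2² + n3² is highly irregular. A direct count for small m (a generic number-theoretic illustration, not the paper's calculation):

```python
# Degeneracies (state weights) of cubic-cavity modes: count the triples of
# positive integers (n1, n2, n3) with n1^2 + n2^2 + n3^2 = m. The counts
# jump erratically with m, which drives the discrete spectrum at small
# adiabatic invariance described in the abstract.

from collections import Counter

nmax = 12
weights = Counter(
    a * a + b * b + c * c
    for a in range(1, nmax) for b in range(1, nmax) for c in range(1, nmax)
)
for m in range(3, 15):
    print(m, weights.get(m, 0))
```

Note the gaps (some m admit no representation at all) interleaved with degeneracies of 1, 3 or 6: nothing like the smooth density of states assumed in the Stefan-Boltzmann limit.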
Novel ageing-biomarker discovery using data-intensive technologies.
Griffiths, H R; Augustyniak, E M; Bennett, S J; Debacq-Chainiaux, F; Dunston, C R; Kristensen, P; Melchjorsen, C J; Navarrete, Santos A; Simm, A; Toussaint, O
2015-11-01
Ageing is accompanied by many visible characteristics. Other biological and physiological markers are also well-described e.g. loss of circulating sex hormones and increased inflammatory cytokines. Biomarkers for healthy ageing studies are presently predicated on existing knowledge of ageing traits. The increasing availability of data-intensive methods enables deep-analysis of biological samples for novel biomarkers. We have adopted two discrete approaches in MARK-AGE Work Package 7 for biomarker discovery; (1) microarray analyses and/or proteomics in cell systems e.g. endothelial progenitor cells or T cell ageing including a stress model; and (2) investigation of cellular material and plasma directly from tightly-defined proband subsets of different ages using proteomic, transcriptomic and miR array. The first approach provided longitudinal insight into endothelial progenitor and T cell ageing. This review describes the strategy and use of hypothesis-free, data-intensive approaches to explore cellular proteins, miR, mRNA and plasma proteins as healthy ageing biomarkers, using ageing models and directly within samples from adults of different ages. It considers the challenges associated with integrating multiple models and pilot studies as rational biomarkers for a large cohort study. From this approach, a number of high-throughput methods were developed to evaluate novel, putative biomarkers of ageing in the MARK-AGE cohort. Crown Copyright © 2015. Published by Elsevier Ireland Ltd. All rights reserved.
Statistical inference based on divergence measures
Pardo, Leandro
2005-01-01
The idea of using functionals of Information Theory, such as entropies or divergences, in statistical inference is not new. However, in spite of the fact that divergence statistics have become a very good alternative to the classical likelihood ratio test and the Pearson-type statistic in discrete models, many statisticians remain unaware of this powerful approach. Statistical Inference Based on Divergence Measures explores classical problems of statistical inference, such as estimation and hypothesis testing, on the basis of measures of entropy and divergence. The first two chapters form an overview, from a statistical perspective, of the most important measures of entropy and divergence and study their properties. The author then examines the statistical analysis of discrete multivariate data, with emphasis on problems in contingency tables and loglinear models, using phi-divergence test statistics as well as minimum phi-divergence estimators. The final chapter looks at testing in general populations, prese...
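A concrete member of this family of divergence statistics is the Cressie-Read power-divergence statistic for multinomial goodness of fit, which contains the Pearson chi-square (lambda = 1) and the likelihood-ratio statistic G² (lambda → 0) as special cases. A sketch with invented counts:

```python
# Cressie-Read power-divergence statistic for goodness of fit. Pearson's
# chi-square is lambda = 1; the likelihood-ratio statistic G^2 is the
# limit lambda -> 0. Counts below are hypothetical.

import math

def power_divergence(observed, expected, lam):
    if abs(lam) < 1e-12:  # limiting case: likelihood-ratio statistic G^2
        return 2.0 * sum(o * math.log(o / e)
                         for o, e in zip(observed, expected) if o > 0)
    return (2.0 / (lam * (lam + 1.0))) * sum(
        o * ((o / e) ** lam - 1.0) for o, e in zip(observed, expected))

obs = [43, 52, 25]
exp = [40.0, 50.0, 30.0]
print(round(power_divergence(obs, exp, 1.0), 4))   # Pearson chi-square
print(round(power_divergence(obs, exp, 0.0), 4))   # G^2
```

Both values are referred to the same chi-square limiting distribution, which is why the family is a drop-in alternative to the classical tests the blurb mentions.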
Perfect discretization of path integrals
International Nuclear Information System (INIS)
Steinhaus, Sebastian
2012-01-01
In order to obtain a well-defined path integral one often employs discretizations. In the case of General Relativity these generically break diffeomorphism symmetry, which has severe consequences since these symmetries determine the dynamics of the corresponding system. In this article we consider the path integral of reparametrization invariant systems as a toy example and present an improvement procedure for the discretized propagator. Fixed points and convergence of the procedure are discussed. Furthermore we show that a reparametrization invariant path integral implies discretization independence and acts as a projector onto physical states.
The origin of discrete particles
Bastin, T
2009-01-01
This book is a unique summary of the results of a long research project undertaken by the authors on discreteness in modern physics. In contrast with the usual expectation that discreteness is the result of mathematical tools for insertion into a continuous theory, this more basic treatment builds up the world from the discrimination of discrete entities. This gives an algebraic structure in which certain fixed numbers arise. One of these agrees with the measured value of the fine-structure constant to one part in 10,000,000 (10^7).
Synchronization Techniques in Parallel Discrete Event Simulation
Lindén, Jonatan
2018-01-01
Discrete event simulation is an important tool for evaluating system models in many fields of science and engineering. To improve the performance of large-scale discrete event simulations, several techniques to parallelize discrete event simulation have been developed. In parallel discrete event simulation, the work of a single discrete event simulation is distributed over multiple processing elements. A key challenge in parallel discrete event simulation is to ensure that causally dependent ...
3-D Discrete Analytical Ridgelet Transform
Helbert, David; Carré, Philippe; Andrès, Éric
2006-01-01
In this paper, we propose an implementation of the 3-D Ridgelet transform: the 3-D discrete analytical Ridgelet transform (3-D DART). This transform uses the Fourier strategy for the computation of the associated 3-D discrete Radon transform. The innovative step is the definition of a discrete 3-D transform with the discrete analytical geometry theory by the construction of 3-D discrete analytical lines in the Fourier domain. We propose two types of 3-D discrete lines:...
Pestman, Wiebe R
2009-01-01
This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.
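Of the topics the blurb lists, Kolmogorov-Smirnov testing is the easiest to make concrete: the two-sample KS statistic is just the maximum gap between two empirical distribution functions. A minimal sketch with invented samples:

```python
# Two-sample Kolmogorov-Smirnov statistic: the largest vertical distance
# between the empirical CDFs of two samples. Data below are hypothetical;
# turning the statistic into a p-value needs the KS distribution.

def ks_statistic(xs, ys):
    points = sorted(set(xs) | set(ys))
    def ecdf(sample, t):
        return sum(1 for v in sample if v <= t) / len(sample)
    return max(abs(ecdf(xs, t) - ecdf(ys, t)) for t in points)

a = [0.1, 0.4, 0.5, 0.9]
b = [0.6, 0.7, 0.8, 1.2]
print(ks_statistic(a, b))
```

Because the statistic depends on the data only through ranks, it is distribution-free under the null, which is what makes it a staple of non-parametric statistics.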
Discrete geometric structures for architecture
Pottmann, Helmut
2010-01-01
The talk will provide an overview of recent progress in this field, with a particular focus on discrete geometric structures. Most of these result from practical requirements on segmenting a freeform shape into planar panels and on the physical realization
Causal Dynamics of Discrete Surfaces
Directory of Open Access Journals (Sweden)
Pablo Arrighi
2014-03-01
Full Text Available We formalize the intuitive idea of a labelled discrete surface which evolves in time, subject to two natural constraints: the evolution does not propagate information too fast; and it acts everywhere the same.
2013-01-01
Sepsis is an unusual systemic reaction to what is sometimes an otherwise ordinary infection, and it probably represents a pattern of response by the immune system to injury. A hyper-inflammatory response is followed by an immunosuppressive phase during which multiple organ dysfunction is present and the patient is susceptible to nosocomial infection. Biomarkers to diagnose sepsis may allow early intervention which, although primarily supportive, can reduce the risk of death. Although lactate is currently the most commonly used biomarker to identify sepsis, other biomarkers may help to enhance lactate’s effectiveness; these include markers of the hyper-inflammatory phase of sepsis, such as pro-inflammatory cytokines and chemokines; proteins such as C-reactive protein and procalcitonin which are synthesized in response to infection and inflammation; and markers of neutrophil and monocyte activation. Recently, markers of the immunosuppressive phase of sepsis, such as anti-inflammatory cytokines, and alterations of the cell surface markers of monocytes and lymphocytes have been examined. Combinations of pro- and anti-inflammatory biomarkers in a multi-marker panel may help identify patients who are developing severe sepsis before organ dysfunction has advanced too far. Combined with innovative approaches to treatment that target the immunosuppressive phase, these biomarkers may help to reduce the mortality rate associated with severe sepsis which, despite advances in supportive measures, remains high. PMID:23480440
Bayesian methods for proteomic biomarker development
Directory of Open Access Journals (Sweden)
Belinda Hernández
2015-12-01
In this review we provide an introduction to Bayesian inference and demonstrate some of the advantages of using a Bayesian framework. We summarize how Bayesian methods have been used previously in proteomics and other areas of bioinformatics. Finally, we describe some popular and emerging Bayesian models from the statistical literature and provide a worked tutorial including code snippets to show how these methods may be applied for the evaluation of proteomic biomarkers.
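One of the simplest Bayesian analyses used in biomarker work is a conjugate Beta-Binomial model for a binary biomarker's positivity rate in cases versus controls. The sketch below is a generic illustration with hypothetical counts, not an example from the review.

```python
# Conjugate Beta-Binomial comparison of a binary biomarker's positivity in
# cases vs. controls, with a Monte Carlo posterior probability that the
# case rate exceeds the control rate. All counts are hypothetical.

import random

random.seed(1)

def beta_posterior(successes, failures, a=1.0, b=1.0):
    """Posterior Beta parameters under a Beta(a, b) prior."""
    return a + successes, b + failures

# Observed positivity: 30/40 cases, 12/40 controls
a_case, b_case = beta_posterior(30, 10)
a_ctrl, b_ctrl = beta_posterior(12, 28)

# Monte Carlo estimate of P(rate_cases > rate_controls | data)
n = 20_000
wins = sum(random.betavariate(a_case, b_case) >
           random.betavariate(a_ctrl, b_ctrl) for _ in range(n))
print(round(wins / n, 3))
```

The output is a direct posterior probability of a real difference, one of the interpretability advantages of the Bayesian framework the review highlights.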
Biomarkers of tolerance: searching for the hidden phenotype.
Perucha, Esperanza; Rebollo-Mesa, Irene; Sagoo, Pervinder; Hernandez-Fuentes, Maria P
2011-08-01
Induction of transplantation tolerance remains the ideal long-term clinical and logistic solution to the current challenges facing the management of renal allograft recipients. In this review, we describe the recent studies and advances made in identifying biomarkers of renal transplant tolerance, from study inception to the lessons learned and their implications for current and future studies with the same goal. With the age of biomarker discovery entering a new dimension of high-throughput technologies, here we also review the current approaches, developments, and pitfalls faced in the subsequent statistical analysis required to identify valid biomarker candidates.
Mass spectrometry for biomarker development
Energy Technology Data Exchange (ETDEWEB)
Wu, Chaochao; Liu, Tao; Baker, Erin Shammel; Rodland, Karin D.; Smith, Richard D.
2015-06-19
Biomarkers potentially play a crucial role in early disease diagnosis, prognosis and targeted therapy. In the past decade, mass spectrometry-based proteomics has become increasingly important in biomarker development due to large advances in technology and associated methods. This chapter focuses on the application of broad (e.g. shotgun) proteomics in biomarker discovery and the utility of targeted proteomics in biomarker verification and validation. A range of mass spectrometry methodologies is discussed, emphasizing their efficacy at the different stages of biomarker development, with particular attention to blood biomarker development.
Applied Discrete-Time Queues
Alfa, Attahiru S
2016-01-01
This book introduces the theoretical fundamentals for modeling queues in discrete time, and the basic procedures for developing queuing models in discrete time. There is a focus on applications in modern telecommunication systems. It presents how most queueing models in discrete time can be set up as discrete-time Markov chains. Techniques such as matrix-analytic methods (MAM) that can be used to analyze the resulting Markov chains are included. This book covers single-node systems, tandem systems and queueing networks. It shows how queues with time-varying parameters can be analyzed, and illustrates numerical issues associated with computations for discrete-time queueing systems. Optimal control of queues is also covered. Applied Discrete-Time Queues targets researchers, advanced-level students and analysts in the field of telecommunication networks. It is suitable as a reference book and can also be used as a secondary textbook in computer engineering and computer science. Examples and exercises are includ...
Statistical data fusion for cross-tabulation
Kamakura, W.A.; Wedel, M.
The authors address the situation in which a researcher wants to cross-tabulate two sets of discrete variables collected in independent samples, but a subset of the variables is common to both samples. The authors propose a statistical data-fusion model that allows for statistical tests of
Directory of Open Access Journals (Sweden)
Mikio Shoji
2011-01-01
Recent advances in biomarker studies on dementia are summarized here. CSF Aβ40, Aβ42, total tau, and phosphorylated tau are the most sensitive biomarkers for diagnosis of Alzheimer's disease (AD) and prediction of onset of AD from mild cognitive impairment (MCI). Based on this progress, new diagnostic criteria for AD, MCI, and preclinical AD were proposed by the National Institute on Aging (NIA) and the Alzheimer's Association in August 2010. In these new criteria, progress in biomarker identification and amyloid imaging studies in the past 10 years has added critical information. Extensive contributions from basic and clinical studies have established the clinical evidence supporting these markers. Based on this progress, essential therapy for a cure of AD is urgently expected.
Inflammatory biomarkers and cancer
DEFF Research Database (Denmark)
Rasmussen, Line Jee Hartmann; Schultz, Martin; Gaardsting, Anne
2017-01-01
In Denmark, patients with serious nonspecific symptoms and signs of cancer (NSSC) are referred to the diagnostic outpatient clinics (DOCs) where an accelerated cancer diagnostic program is initiated. Various immunological and inflammatory biomarkers have been associated with cancer, including soluble urokinase plasminogen activator receptor (suPAR) and the pattern recognition receptors (PRRs) pentraxin-3, mannose-binding lectin, ficolin-1, ficolin-2 and ficolin-3. We aimed to evaluate these biomarkers and compare their diagnostic ability to classical biomarkers for diagnosing cancer... and previous cancer diagnoses compared to patients who were not diagnosed with cancer. Previous cancer, C-reactive protein (CRP) and suPAR were significantly associated with newly diagnosed cancer during follow-up in multiple logistic regression analyses adjusted for age, sex and CRP. Neither any of the PRRs...
Discrete Curvature Theories and Applications
Sun, Xiang
2016-08-25
Discrete Differential Geometry (DDG) concerns discrete counterparts of notions and methods in differential geometry. This thesis deals with a core subject in DDG, discrete curvature theories on various types of polyhedral surfaces that are practically important for free-form architecture, sunlight-redirecting shading systems, and face recognition. Modeled as polyhedral surfaces, the shapes of free-form structures may have to satisfy different geometric or physical constraints. We study a combination of geometry and physics: the discrete surfaces that can stand on their own, as well as having proper shapes for manufacture. These proper shapes, known as circular and conical meshes, are closely related to discrete principal curvatures. We study curvature theories that make such surfaces possible. Shading systems of freeform building skins are new types of energy-saving structures that can re-direct the sunlight. From these systems, discrete line congruences across polyhedral surfaces can be abstracted. We develop a new curvature theory for polyhedral surfaces equipped with normal congruences, a particular type of congruence defined by linear interpolation of vertex normals. The main results are a discussion of various definitions of normality, a detailed study of the geometry of such congruences, and a concept of curvatures and shape operators associated with the faces of a triangle mesh. These curvatures are compatible with both normal congruences and the Steiner formula. In addition to architecture, we consider the role of discrete curvatures in face recognition. We use geometric measure theory to introduce the notion of asymptotic cones associated with a singular subspace of a Riemannian manifold, which is an extension of the classical notion of asymptotic directions. We get a simple expression of these cones for polyhedral surfaces, as well as convergence and approximation theorems. We use the asymptotic cones as facial descriptors and demonstrate the
Sadovskii, Michael V
2012-01-01
This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics, all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity, and the modern theory of critical phenomena. Beyond that, attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.
Analysis of Discrete Mittag-Leffler Functions
Directory of Open Access Journals (Sweden)
N. Shobanadevi
2015-03-01
Discrete Mittag-Leffler functions play a major role in the development of the theory of discrete fractional calculus. In the present article, we analyze qualitative properties of discrete Mittag-Leffler functions and establish sufficient conditions for convergence, oscillation and summability of the infinite series associated with discrete Mittag-Leffler functions.
Goodman, Joseph W
2015-01-01
This book discusses statistical methods that are useful for treating problems in modern optics, and the application of these methods to solving a variety of such problems. This book covers a variety of statistical problems in optics, including both theory and applications. The text covers the necessary background in statistics, statistical properties of light waves of various types, the theory of partial coherence and its applications, imaging with partially coherent light, atmospheric degradations of images, and noise limitations in the detection of light. New topics have been introduced i
Energy Technology Data Exchange (ETDEWEB)
Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il
2017-05-15
The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.
Szulc, Stefan
1965-01-01
Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field.Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then
A goodness of fit statistic for the geometric distribution
J.A. Ferreira
2003-01-01
We propose a goodness of fit statistic for the geometric distribution and compare it in terms of power, via simulation, with the chi-square statistic. The statistic is based on the Lau-Rao theorem and can be seen as a discrete analogue of the total time on test statistic. The results
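The chi-square comparison mentioned above can be sketched as follows. This is a generic Pearson chi-square test against a fitted geometric distribution, not the Lau-Rao-based statistic the paper proposes; the sample size, true parameter and tail-pooling cutoff are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw a sample from a geometric distribution on {1, 2, ...}.
p_true = 0.3
x = rng.geometric(p_true, size=500)

# Maximum-likelihood estimate of p.
p_hat = 1.0 / x.mean()

# Geometric pmf: P(X = k) = p (1 - p)^(k - 1).
def geom_pmf(k, p):
    return p * (1.0 - p) ** (k - 1)

# Bin counts for k = 1..7, pooling the tail k >= 8 into one cell.
kmax = 8
observed = np.array([np.sum(x == k) for k in range(1, kmax)] + [np.sum(x >= kmax)])
probs = np.array([geom_pmf(k, p_hat) for k in range(1, kmax)]
                 + [(1.0 - p_hat) ** (kmax - 1)])   # tail mass P(X >= kmax)
expected = probs * x.size

# Pearson chi-square statistic; df = cells - 1 - (one estimated parameter).
chi2 = float(np.sum((observed - expected) ** 2 / expected))
df = observed.size - 2
print(f"chi2 = {chi2:.2f} on {df} df")   # compare with a chi-square critical value
```

Pooling the tail keeps all expected cell counts away from zero, which the chi-square approximation requires.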
Distribution for fermionic discrete lattice gas within the canonical ensemble
International Nuclear Information System (INIS)
Kutner, R.; Barszczak, T.
1991-01-01
The distinct deviations from the Fermi-Dirac statistics ascertained recently at low temperatures for a one-dimensional, spinless fermionic discrete lattice gas with conserved number of noninteracting particles hopping on the nondegenerate, well-separated single-particle energy levels are studied in numerical and theoretical terms. The generalized distribution is derived in the form n_h = {Y_h exp[(ε_h - μ)β] + 1}^(-1), valid even in the thermodynamic limit when the discreteness of the energy levels is kept. This distribution demonstrates good agreement with the data obtained numerically both by the canonical partition-function technique and by Monte Carlo simulation
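As a rough illustration of the discrete-level setting, the sketch below computes ordinary Fermi-Dirac occupations n_h = {exp[(ε_h - μ)β] + 1}^(-1) on a finite set of well-separated levels, solving numerically for the chemical potential that fixes the mean particle number. The level-dependent factors Y_h of the paper's generalized distribution are not modelled here, and the energies, β and N are illustrative.

```python
import numpy as np

def fermi_dirac(eps, mu, beta):
    """Grand-canonical Fermi-Dirac occupation of discrete levels."""
    return 1.0 / (np.exp(beta * (eps - mu)) + 1.0)

def solve_mu(eps, n_particles, beta, lo=-50.0, hi=50.0, iters=200):
    """Bisect for the chemical potential fixing the mean particle number."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if fermi_dirac(eps, mid, beta).sum() < n_particles:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Nondegenerate, well-separated single-particle levels.
eps = np.arange(10, dtype=float)    # energies 0, 1, ..., 9
N, beta = 5, 2.0                    # particle number, inverse temperature

mu = solve_mu(eps, N, beta)
n = fermi_dirac(eps, mu, beta)
print(mu, n.sum())                  # n.sum() matches N by construction
```

With levels symmetric about their midpoint, particle-hole symmetry places μ halfway between the highest filled and lowest empty level.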
Connection between Fourier coefficient and Discretized Cartesian path integration
International Nuclear Information System (INIS)
Coalson, R.D.
1986-01-01
The relationship between so-called Discretized and Fourier coefficient formulations of Cartesian path integration is examined. In particular, an intimate connection between the two is established by rewriting the Discretized formulation in a manifestly Fourier-like way. This leads to improved understanding of both the limit behavior and the convergence properties of computational prescriptions based on the two formalisms. The performance of various prescriptions is compared with regard to calculation of on-diagonal statistical density matrix elements for a number of prototypical 1-d potentials. A consistent convergence order among these prescriptions is established
Foundations of a discrete physics
International Nuclear Information System (INIS)
McGoveran, D.; Noyes, P.
1988-01-01
Starting from the principles of finiteness, discreteness, finite computability and absolute nonuniqueness, we develop the ordering operator calculus, a strictly constructive mathematical system having the empirical properties required by quantum mechanical and special relativistic phenomena. We show how to construct discrete distance functions, and both rectangular and spherical coordinate systems (with a discrete version of "π"). The richest discrete space constructible without a preferred axis and preserving translational and rotational invariance is shown to be a discrete 3-space with the usual symmetries. We introduce a local ordering parameter with local (proper) time-like properties and universal ordering parameters with global (cosmological) time-like properties. Constructed "attribute velocities" connect ensembles with attributes that are invariant as the appropriate time-like parameter increases. For each such attribute, we show how to construct attribute velocities which must satisfy the "relativistic Doppler shift" and the "relativistic velocity composition law," as well as the Lorentz transformations. By construction, these velocities have finite maximum and minimum values. In the space of all attributes, the minimum of these maximum velocities will predominate in all multiple attribute computations, and hence can be identified as a fundamental limiting velocity. General commutation relations are constructed which, under the physical interpretation, are shown to reduce to the usual quantum mechanical commutation relations. 50 refs., 18 figs
Biomarkers for anorexia nervosa
DEFF Research Database (Denmark)
Sjøgren, Jan Magnus
2017-01-01
Biomarkers for anorexia nervosa (AN) which reflect the pathophysiology and relate to the aetiology of the disease, are warranted and could bring us one step closer to targeted treatment of AN. Some leads may be found in the biochemistry which often is found disturbed in AN, although normalization...
Biomarkers of cancer cachexia.
Loumaye, Audrey; Thissen, Jean-Paul
2017-12-01
Cachexia is a complex multifactorial syndrome, characterized by loss of skeletal muscle and fat mass, which affects the majority of advanced cancer patients and is associated with poor prognosis. Interestingly, reversing muscle loss in animal models of cancer cachexia leads to prolonged survival. Therefore, detecting cachexia and maintaining muscle mass represent a major goal in the care of cancer patients. However, early diagnosis of cancer cachexia is currently limited for several reasons. Indeed, cachexia development is variable according to tumor and host characteristics. In addition, safe, accessible and non-invasive tools to detect skeletal muscle atrophy are desperately lacking in clinical practice. Finally, the precise molecular mechanisms and the key players involved in cancer cachexia remain poorly characterized. The need for an early diagnosis of cancer cachexia therefore supports the quest for a biomarker that might reflect the skeletal muscle atrophy process. Current research offers different promising ways to identify such a biomarker. Initially, the quest for a biomarker of cancer cachexia mostly focused on mediators of muscle atrophy, produced by both tumor and host, in an attempt to define new therapeutic approaches. On the other hand, molecules released by the muscle into the circulation during the atrophy process have also been considered as potential biomarkers. More recently, several "omics" studies are emerging to identify new muscular or circulating markers of cancer cachexia. Some genetic markers could also contribute to identify patients more susceptible to develop cachexia. This article reviews our current knowledge regarding potential biomarkers of cancer cachexia. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
Petocz, Peter; Sowey, Eric
2012-01-01
The term "data snooping" refers to the practice of choosing which statistical analyses to apply to a set of data after having first looked at those data. Data snooping contradicts a fundamental precept of applied statistics, that the scheme of analysis is to be planned in advance. In this column, the authors shall elucidate the…
Petocz, Peter; Sowey, Eric
2008-01-01
In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…
Glaz, Joseph
2009-01-01
Suitable for graduate students and researchers in applied probability and statistics, as well as for scientists in biology, computer science, pharmaceutical science and medicine, this title brings together a collection of chapters illustrating the depth and diversity of theory, methods and applications in the area of scan statistics.
Lyons, L.
2016-01-01
Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.
Discrete differential geometry. Consistency as integrability
Bobenko, Alexander I.; Suris, Yuri B.
2005-01-01
A new field of discrete differential geometry is presently emerging on the border between differential and discrete geometry. Whereas classical differential geometry investigates smooth geometric shapes (such as surfaces), and discrete geometry studies geometric shapes with a finite number of elements (such as polyhedra), discrete differential geometry aims at the development of discrete equivalents of the notions and methods of smooth surface theory. Current interest in this field derives not ...
Integrable structure in discrete shell membrane theory.
Schief, W K
2014-05-08
We present natural discrete analogues of two integrable classes of shell membranes. By construction, these discrete shell membranes are in equilibrium with respect to suitably chosen internal stresses and external forces. The integrability of the underlying equilibrium equations is proved by relating the geometry of the discrete shell membranes to discrete O surface theory. We establish connections with generalized barycentric coordinates and nine-point centres and identify a discrete version of the classical Gauss equation of surface theory.
Degree distribution in discrete case
International Nuclear Information System (INIS)
Wang, Li-Na; Chen, Bin; Yan, Zai-Zai
2011-01-01
The vertex degree of many network models and real-life networks is limited to non-negative integers. By means of measure and integral, the relation between the degree distribution and the cumulative degree distribution in the discrete case is analyzed. The degree distribution, obtained by differentiating its cumulative, is only suitable for the continuous case or the discrete case with constant degree change. When the degree change is not a constant but proportional to the degree itself, the power-law degree distribution and its cumulative have the same exponent, and the mean value is finite for a power-law exponent greater than 1. -- Highlights: → Degree change is the crux for using the cumulative degree distribution method. → It suits the discrete case with constant degree change. → If degree change is proportional to degree, the power-law degree distribution and its cumulative have the same exponent. → In addition, the mean value is finite for a power-law exponent greater than 1.
Nick, Todd G
2007-01-01
Statistics is defined by the Medical Subject Headings (MeSH) thesaurus as the science and art of collecting, summarizing, and analyzing data that are subject to random variation. The two broad categories of summarizing and analyzing data are referred to as descriptive and inferential statistics. This chapter considers the science and art of summarizing data where descriptive statistics and graphics are used to display data. In this chapter, we discuss the fundamentals of descriptive statistics, including describing qualitative and quantitative variables. For describing quantitative variables, measures of location and spread, for example the standard deviation, are presented along with graphical presentations. We also discuss distributions of statistics, for example the variance, as well as the use of transformations. The concepts in this chapter are useful for uncovering patterns within the data and for effectively presenting the results of a project.
On the discrete Gabor transform and the discrete Zak transform
Bastiaans, M.J.; Geilen, M.C.W.
1996-01-01
Gabor's expansion of a discrete-time signal into a set of shifted and modulated versions of an elementary signal (or synthesis window) and the inverse operation -- the Gabor transform -- with which Gabor's expansion coefficients can be determined, are introduced. It is shown how, in the case of a
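A minimal sketch of sampled coefficients on a Gabor lattice is given below. Note that it applies the elementary window itself (as in a short-time Fourier transform), whereas the Gabor transform proper determines expansion coefficients via the dual analysis window; the Gaussian window, signal and lattice parameters are illustrative.

```python
import numpy as np

def gabor_coefficients(x, g, M, K):
    """Sampled short-time Fourier coefficients on the Gabor lattice:
    c[m, k] = sum_n x[n] * conj(g[n - m*M]) * exp(-2j*pi*k*n/K).
    (True Gabor expansion coefficients would use the dual window.)"""
    N = len(x)
    n = np.arange(N)
    coeffs = np.zeros((N // M, K), dtype=complex)
    for m in range(N // M):
        shifted = np.roll(g, m * M)          # circularly shifted window
        for k in range(K):
            coeffs[m, k] = np.sum(x * np.conj(shifted)
                                  * np.exp(-2j * np.pi * k * n / K))
    return coeffs

N, M, K = 64, 8, 8                           # signal length, time step, modulations
n = np.arange(N)
g = np.exp(-0.5 * ((n - N // 2) / 6.0) ** 2) # Gaussian elementary signal
x = np.cos(2 * np.pi * 3 * n / K)            # tone aligned with modulation bin 3

c = gabor_coefficients(x, g, M, K)
print(c.shape)                               # coefficient array over the lattice
```

For a real cosine at bin 3, the coefficient magnitudes concentrate in the modulation bins 3 and 5 (the conjugate frequency).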
Discrete Choice and Rational Inattention
DEFF Research Database (Denmark)
Fosgerau, Mogens; Melo, Emerson; de Palma, André
2017-01-01
This paper establishes a general equivalence between discrete choice and rational inattention models. Matejka and McKay (2015, AER) showed that when information costs are modelled using the Shannon entropy, the resulting choice probabilities in the rational inattention model take the multinomial logit form. We show that when information costs are modelled using a class of generalized entropies, then the choice probabilities in any rational inattention model are observationally equivalent to some additive random utility discrete choice model and vice versa. This equivalence arises from convex...
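The multinomial logit form mentioned above can be illustrated directly. The softmax below assumes a uniform prior over alternatives (the full rational-inattention solution also weights by the prior); the utilities and the information-cost weight are illustrative.

```python
import numpy as np

def multinomial_logit(v, lam=1.0):
    """Choice probabilities P_i = exp(v_i / lam) / sum_j exp(v_j / lam).
    With Shannon information costs (weight lam) and a uniform prior, the
    rational inattention solution reduces to this form."""
    z = np.exp((v - v.max()) / lam)   # shift for numerical stability
    return z / z.sum()

v = np.array([1.0, 2.0, 0.5])         # utilities of three alternatives
p = multinomial_logit(v, lam=1.0)
print(p)                              # highest-utility option is most likely
```

As the cost weight lam shrinks, the probabilities concentrate on the best alternative; as it grows, they flatten toward the uniform distribution.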
Optimization of Operations Resources via Discrete Event Simulation Modeling
Joshi, B.; Morris, D.; White, N.; Unal, R.
1996-01-01
The resource levels required for operation and support of reusable launch vehicles are typically defined through discrete event simulation modeling. Minimizing these resources constitutes an optimization problem involving discrete variables and simulation. Conventional approaches to solve such optimization problems involving integer valued decision variables are the pattern search and statistical methods. However, in a simulation environment that is characterized by search spaces of unknown topology and stochastic measures, these optimization approaches often prove inadequate. In this paper, we have explored the applicability of genetic algorithms to the simulation domain. Genetic algorithms provide a robust search strategy that does not require continuity and differentiability of the problem domain. The genetic algorithm successfully minimized the operation and support activities for a space vehicle, through a discrete event simulation model. The practical issues associated with simulation optimization, such as stochastic variables and constraints, were also taken into consideration.
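A minimal genetic-algorithm sketch in the spirit described above. The cost function stands in for a discrete event simulation and is purely illustrative; the resource targets, penalty weights and GA parameters are assumptions, not values from the paper.

```python
import random

random.seed(1)

def simulate_cost(levels):
    """Toy stand-in for a discrete-event simulation: each activity needs a
    target resource level; shortfalls incur delay penalties, and every
    allocated unit carries a fixed cost."""
    targets = [4, 7, 2, 5]
    shortfall = sum(max(t - x, 0) for t, x in zip(targets, levels))
    return 10 * shortfall + sum(levels)

def genetic_search(bounds, pop_size=30, generations=60, p_mut=0.2):
    """Elitist GA over integer resource levels: truncation selection,
    one-point crossover, and uniform integer mutation."""
    dim = len(bounds)
    pop = [[random.randint(0, b) for b in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=simulate_cost)
        survivors = pop[: pop_size // 2]          # keep the best half
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, dim)
            child = a[:cut] + b[cut:]             # one-point crossover
            if random.random() < p_mut:           # integer mutation
                i = random.randrange(dim)
                child[i] = random.randint(0, bounds[i])
            children.append(child)
        pop = survivors + children
    return min(pop, key=simulate_cost)

best = genetic_search(bounds=[10, 10, 10, 10])
print(best, simulate_cost(best))
```

No continuity or differentiability of the cost is assumed, which is why this search strategy suits simulation-driven objectives over discrete variables.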
Discrete hierarchical organization of social group sizes.
Zhou, W-X; Sornette, D; Hill, R A; Dunbar, R I M
2005-02-22
The 'social brain hypothesis' for the evolution of large brains in primates has led to evidence for the coevolution of neocortical size and social group sizes, suggesting that there is a cognitive constraint on group size that depends, in some way, on the volume of neural material available for processing and synthesizing information on social relationships. More recently, work on both human and non-human primates has suggested that social groups are often hierarchically structured. We combine data on human grouping patterns in a comprehensive and systematic study. Using fractal analysis, we identify, with high statistical confidence, a discrete hierarchy of group sizes with a preferred scaling ratio close to three: rather than a single or a continuous spectrum of group sizes, humans spontaneously form groups of preferred sizes organized in a geometrical series approximating 3-5, 9-15, 30-45, etc. Such discrete scale invariance could be related to that identified in signatures of herding behaviour in financial markets and might reflect a hierarchical processing of social nearness by human brains.
Directory of Open Access Journals (Sweden)
Dey Dipak K
2008-01-01
Background: The discovery of biomarkers is an important step towards the development of criteria for early diagnosis of disease status. Recently electrospray ionization (ESI) and matrix-assisted laser desorption (MALDI) time-of-flight (TOF) mass spectrometry have been used to identify biomarkers in both proteomics and metabonomics studies. Data sets generated from such studies are generally very large in size and thus require the use of sophisticated statistical techniques to glean useful information. Most recent attempts to process these types of data model each compound's intensity either discretely, by positional (mass-to-charge ratio) clustering, or through each compound's own intensity distribution. Traditionally, data processing steps such as noise removal, background elimination and m/z alignment are generally carried out separately, resulting in unsatisfactory propagation of signals in the final model. Results: In the present study a novel semi-parametric approach has been developed to distinguish urinary metabolic profiles in a group of traumatic patients from those of a control group consisting of normal individuals. Data sets obtained from the replicates of a single subject were used to develop a functional profile through a Dirichlet mixture of beta distributions. This functional profile is flexible enough to accommodate variability of the instrument and the inherent variability of each individual, thus simultaneously addressing different sources of systematic error. To address instrument variability, all data sets were analyzed in replicate, an important issue ignored by most studies in the past. Different model comparisons were performed to select the best model for each subject. The m/z values in the window of the irregular pattern are then further recommended for possible biomarker discovery. Conclusion: To the best of our knowledge this is the very first attempt to model the physical process behind the time-of-flight mass
A GMM-IG framework for selecting genes as expression panel biomarkers.
Wang, Mingyi; Chen, Jake Y
2010-01-01
The limitation of small sample size of functional genomics experiments has made it necessary to integrate DNA microarray experimental data from different sources. However, experimentation noises and biases of different microarray platforms have made integrated data analysis challenging. In this work, we propose an integrative computational framework to identify candidate biomarker genes from publicly available functional genomics studies. We developed a new framework, Gaussian Mixture Modeling-Coupled Information Gain (GMM-IG). In this framework, we first apply a two-component Gaussian mixture model (GMM) to estimate the conditional probability distributions of gene expression data between two different types of samples, for example, normal versus cancer. An expectation-maximization algorithm is then used to estimate the maximum likelihood parameters of a mixture of two Gaussian models in the feature space and determine the underlying expression levels of genes. Gene expression results from different studies are discretized, based on GMM estimations and then unified. Significantly differentially-expressed genes are filtered and assessed with information gain (IG) measures. DNA microarray experimental data for lung cancers from three different prior studies was processed using the new GMM-IG method. Target gene markers from a gene expression panel were selected and compared with several conventional computational biomarker data analysis methods. GMM-IG showed consistently high accuracy for several classification assessments. A high reproducibility of gene selection results was also determined from statistical validations. Our study shows that the GMM-IG framework can overcome poor reliability issues from single-study DNA microarray experiment while maintaining high accuracies by combining true signals from multiple studies. We present a conceptually simple framework that enables reliable integration of true differential gene expression signals from multiple
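A toy sketch of the GMM-IG idea under simplifying assumptions (a single gene, 1-D two-component EM, synthetic data): expression values are discretized by their more likely mixture component and then scored by information gain against the class label. None of the data or parameters come from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_gmm_em(x, iters=100):
    """EM for a two-component 1-D Gaussian mixture (regularization and
    convergence checks omitted for brevity)."""
    mu = np.array([x.min(), x.max()], dtype=float)
    var = np.array([x.var(), x.var()]) + 1e-6
    w = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: posterior responsibility of each component.
        dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: update weights, means, variances.
        nk = resp.sum(axis=0)
        w = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-6
    return w, mu, var, resp

def entropy(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

# Synthetic expression values: class 1 (e.g. tumor) shifted upward.
y = np.repeat([0, 1], 100)
x = np.concatenate([rng.normal(0, 1, 100), rng.normal(3, 1, 100)])

# Discretize expression by the more likely mixture component...
_, _, _, resp = fit_gmm_em(x)
state = resp.argmax(axis=1)

# ...then score the gene by information gain about the class label.
ig = entropy(y) - sum(np.mean(state == s) * entropy(y[state == s])
                      for s in np.unique(state))
print(f"information gain = {ig:.2f} bits")
```

In the integrative framework, the discretization step is what lets gene-level signals from different microarray platforms be unified before the information-gain filter is applied.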
Blakemore, J S
1962-01-01
Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a "quasi-equilibrium" co
Wannier, Gregory Hugh
1966-01-01
Until recently, the field of statistical physics was traditionally taught as three separate subjects: thermodynamics, statistical mechanics, and kinetic theory. This text, a forerunner in its field and now a classic, was the first to recognize the outdated reasons for their separation and to combine the essentials of the three subjects into one unified presentation of thermal physics. It has been widely adopted in graduate and advanced undergraduate courses, and is recommended throughout the field as an indispensable aid to the independent study and research of statistical physics.Designed for
Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James
2014-01-01
Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.
Energy Technology Data Exchange (ETDEWEB)
Wendelberger, Laura Jean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-08-08
In large datasets, it is time consuming or even impossible to pick out interesting images. Our proposed solution is to find statistics to quantify the information in each image and use those to identify and pick out images of interest.
Department of Homeland Security — Accident statistics available on the Coast Guard’s website by state, year, and one variable to obtain tables and/or graphs. Data from reports has been loaded for...
U.S. Department of Health & Human Services — The CMS Center for Strategic Planning produces an annual CMS Statistics reference booklet that provides a quick reference for summary information about health...
Allegheny County / City of Pittsburgh / Western PA Regional Data Center — Data about the usage of the WPRDC site and its various datasets, obtained by combining Google Analytics statistics with information from the WPRDC's data portal.
Serdobolskii, Vadim Ivanovich
2007-01-01
This monograph presents a mathematical theory of statistical models described by an essentially large number of unknown parameters, comparable with the sample size or even much larger. In this sense, the proposed theory can be called "essentially multiparametric". It is developed on the basis of the Kolmogorov asymptotic approach, in which the sample size increases along with the number of unknown parameters. This theory opens a way to the solution of central problems of multivariate statistics, which up until now have not been solved. Traditional statistical methods based on the idea of infinite sampling often break down in the solution of real problems and, depending on the data, can be inefficient, unstable and even inapplicable. In this situation, practical statisticians are forced to use various heuristic methods in the hope that they will find a satisfactory solution. The mathematical theory developed in this book presents a regular technique for implementing new, more efficient versions of statistical procedures. ...
Gonorrhea Statistics (CDC, Sexually Transmitted Diseases)
DEFF Research Database (Denmark)
Tryggestad, Kjell
2004-01-01
The study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political arithmetic and more recent work... within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit... in different ways. The presence and absence of diverse materials, both natural and political, is what distinguishes them from each other. Arguments are presented for a more symmetric relation between the scientific statistical text and the reader. I will argue that a more symmetric relation can be achieved...
Discrete Hamiltonian evolution and quantum gravity
International Nuclear Information System (INIS)
Husain, Viqar; Winkler, Oliver
2004-01-01
We study constrained Hamiltonian systems by utilizing general forms of time discretization. We show that for explicit discretizations, the requirement of preserving the canonical Poisson bracket under discrete evolution imposes strong conditions on both allowable discretizations and Hamiltonians. These conditions permit time discretizations for a limited class of Hamiltonians, which does not include homogeneous cosmological models. We also present two general classes of implicit discretizations which preserve Poisson brackets for any Hamiltonian. Both types of discretizations generically do not preserve first class constraint algebras. Using this observation, we show that time discretization provides a complicated time gauge fixing for quantum gravity models, which may be compared with the alternative procedure of gauge fixing before discretization
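To make the contrast above concrete: the implicit midpoint rule is a standard example of an implicit time discretization that preserves the canonical symplectic (Poisson-bracket) structure for any Hamiltonian. For the harmonic oscillator, H = (p^2 + omega^2 q^2)/2, the midpoint equations are linear and can be solved in closed form. This toy sketch is illustrative only and is not taken from the paper.

```python
def implicit_midpoint_step(q, p, dt, omega=1.0):
    """One implicit-midpoint step for the harmonic oscillator
    H = (p**2 + omega**2 * q**2) / 2. The midpoint equations
        q1 = q0 + (dt/2) * (p0 + p1)
        p1 = p0 - (dt/2) * omega**2 * (q0 + q1)
    are linear, so the implicit solve has a closed form. The resulting
    map has unit Jacobian determinant (it preserves the canonical
    Poisson bracket) and conserves the quadratic energy H exactly."""
    a = 0.5 * dt
    b = 0.5 * dt * omega * omega
    p_new = ((1.0 - a * b) * p - 2.0 * b * q) / (1.0 + a * b)
    q_new = q + a * (p + p_new)
    return q_new, p_new
```

Iterating the step and checking that H stays constant to machine precision is a quick numerical confirmation of the bracket-preservation property discussed in the abstract.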
MacKenzie, Dana
2004-01-01
The drawbacks of using 19th-century mathematics in physics and astronomy are illustrated. To continue expanding our knowledge of the cosmos, scientists will have to come to terms with modern statistics. Some researchers have deliberately started importing techniques that are used in medical research. However, physicists need to identify the brand of statistics that will be suitable for them, and make a choice between the Bayesian and the frequentist approaches. (Edited abstract).
Mohamed, Mamdouh S.; Hirani, Anil N.; Samtaney, Ravi
2016-01-01
A conservative discretization of incompressible Navier–Stokes equations is developed based on discrete exterior calculus (DEC). A distinguishing feature of our method is the use of an algebraic discretization of the interior product operator and a
Solving discrete zero point problems
van der Laan, G.; Talman, A.J.J.; Yang, Z.F.
2004-01-01
In this paper an algorithm is proposed to find a discrete zero point of a function on the collection of integral points in the n-dimensional Euclidean space R^n. Starting with a given integral point, the algorithm generates a finite sequence of adjacent integral simplices of varying dimension and
Succinct Sampling from Discrete Distributions
DEFF Research Database (Denmark)
Bringmann, Karl; Larsen, Kasper Green
2013-01-01
We revisit the classic problem of sampling from a discrete distribution: Given n non-negative w-bit integers x_1,...,x_n, the task is to build a data structure that allows sampling i with probability proportional to x_i. The classic solution is Walker's alias method that takes, when implemented...
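For context, here is a compact Python sketch of the classic alias-method baseline mentioned above (Vose's linear-time table construction; each subsequent sample costs O(1)). This illustrates the classic solution only, not the succinct data structures developed in the paper.

```python
import random

def build_alias(weights):
    """Build Walker/Vose alias tables for O(1) sampling, where item i is
    drawn with probability weights[i] / sum(weights)."""
    n = len(weights)
    total = sum(weights)
    scaled = [w * n / total for w in weights]  # average cell mass is 1
    prob, alias = [0.0] * n, [0] * n
    small = [i for i, s in enumerate(scaled) if s < 1.0]
    large = [i for i, s in enumerate(scaled) if s >= 1.0]
    while small and large:
        s, l = small.pop(), large.pop()
        prob[s], alias[s] = scaled[s], l        # cell s: keep s, else l
        scaled[l] -= 1.0 - scaled[s]            # donate mass to fill cell s
        (small if scaled[l] < 1.0 else large).append(l)
    for i in small + large:                     # leftovers are exactly full
        prob[i] = 1.0
    return prob, alias

def sample(prob, alias, rng=random):
    """Draw one index: pick a cell uniformly, then keep it or take its alias."""
    i = rng.randrange(len(prob))
    return i if rng.random() < prob[i] else alias[i]
```

The exact marginal probability of index i can be read off the tables as (prob[i] plus the mass donated by cells aliased to i) divided by n, which recovers weights[i] / sum(weights).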
Symplectomorphisms and discrete braid invariants
Czechowski, Aleksander; Vandervorst, Robert
2017-01-01
Area and orientation preserving diffeomorphisms of the standard 2-disc, referred to as symplectomorphisms of D2, allow decompositions in terms of positive twist diffeomorphisms. Using the latter decomposition, we utilize the Conley index theory of discrete braid classes as introduced in Ghrist et
The remarkable discreteness of being
Indian Academy of Sciences (India)
Life is a discrete, stochastic phenomenon: for a biological organism, the time of the two most important events of its life (reproduction and death) is random and these events change the number of individuals of the species by single units. These facts can have surprising, counterintuitive consequences. I review here three ...
Discrete tomography in neutron radiography
International Nuclear Information System (INIS)
Kuba, Attila; Rodek, Lajos; Kiss, Zoltan; Rusko, Laszlo; Nagy, Antal; Balasko, Marton
2005-01-01
Discrete tomography (DT) is an imaging technique for reconstructing discrete images from their projections using the knowledge that the object to be reconstructed contains only a few homogeneous materials characterized by known discrete absorption values. One of the main reasons for applying DT is that we will hopefully require relatively few projections. Using discreteness and some a priori information (such as an approximate shape of the object) we can apply two DT methods in neutron imaging by reducing the problem to an optimization task. The first method is a special one because it is only suitable if the object is composed of cylinders and sphere shapes. The second method is a general one in the sense that it can be used for reconstructing objects of any shape. Software was developed and physical experiments performed in order to investigate the effects of several reconstruction parameters: the number of projections, noise levels, and complexity of the object to be reconstructed. We give a summary of the experimental results and make a comparison of the results obtained using a classical reconstruction technique (FBP). The programs we developed are available in our DT reconstruction program package DIRECT
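As a toy illustration of the DT problem statement above (not of the paper's two optimization-based methods), one can enumerate every binary image consistent with two orthogonal projections; practical DT methods replace this exponential search with an optimization task, but the feasibility condition checked below is the same.

```python
from itertools import product
import numpy as np

def reconstruct_binary(row_sums, col_sums):
    """Brute-force discrete tomography for a two-valued (binary) image:
    enumerate all m*n binary images and keep those whose row and column
    sums match the measured projections. Exponential; for illustration only."""
    m, n = len(row_sums), len(col_sums)
    solutions = []
    for bits in product((0, 1), repeat=m * n):
        img = np.array(bits).reshape(m, n)
        if (img.sum(axis=1) == row_sums).all() and (img.sum(axis=0) == col_sums).all():
            solutions.append(img)
    return solutions
```

Even this toy shows why few projections can suffice yet also why a priori shape information helps: projections [1, 1] / [1, 1] admit two distinct 2x2 solutions, so extra knowledge is needed to pick one.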
Discrete and mesoscopic regimes of finite-size wave turbulence
International Nuclear Information System (INIS)
L'vov, V. S.; Nazarenko, S.
2010-01-01
Bounding volume results in discreteness of eigenmodes in wave systems. This leads to a depletion or complete loss of wave resonances (three-wave, four-wave, etc.), which has a strong effect on wave turbulence (WT), i.e., on the statistical behavior of broadband sets of weakly nonlinear waves. This paper describes three different regimes of WT realizable for different levels of wave excitation: discrete, mesoscopic and kinetic WT. Discrete WT comprises chaotic dynamics of interacting wave 'clusters' consisting of a discrete (often finite) number of connected resonant wave triads (or quartets). Kinetic WT refers to the infinite-box theory, described by the well-known wave-kinetic equations. Mesoscopic WT is a regime in which either the discrete and the kinetic evolutions alternate, or neither of the two types is purely realized. We argue that in mesoscopic systems the wave spectrum experiences a sandpile behavior. Importantly, the mesoscopic regime is realized for a broad range of wave amplitudes which typically spans several orders of magnitude, and not just for a particular intermediate level.
DEFF Research Database (Denmark)
Larsen, Frederik Fruergaard; Petersen, J Asger
2017-01-01
BACKGROUND: Sepsis is a prevalent condition among hospitalized patients that carries a high risk of morbidity and mortality. Rapid recognition of sepsis as the cause of deterioration is desirable, so effective treatment can be initiated rapidly. Traditionally, diagnosis was based on the presence of two or more positive SIRS criteria due to infection. However, the recently published sepsis-3 criteria put more emphasis on organ dysfunction caused by infection in the definition of sepsis. Regardless of this, no gold standard for diagnosis exists, and clinicians still rely on a number of traditional and novel biomarkers to discriminate between patients with and without infection as the cause of deterioration. METHOD: Narrative review of the current literature. RESULTS: A number of the most promising biomarkers for the diagnosis and prognostication of sepsis are presented. CONCLUSION: Procalcitonin, presepsin, CD64, su...
voomDDA: discovery of diagnostic biomarkers and classification of RNA-seq data
Directory of Open Access Journals (Sweden)
Gokmen Zararsiz
2017-10-01
RNA-Seq is a recent and efficient technique that uses the capabilities of next-generation sequencing technology for characterizing and quantifying transcriptomes. One important task using gene-expression data is to identify a small subset of genes that can be used to build diagnostic classifiers particularly for cancer diseases. Microarray based classifiers are not directly applicable to RNA-Seq data due to its discrete nature. Overdispersion is another problem that requires careful modeling of mean and variance relationship of the RNA-Seq data. In this study, we present voomDDA classifiers: variance modeling at the observational level (voom) extensions of the nearest shrunken centroids (NSC) and the diagonal discriminant classifiers. VoomNSC is one of these classifiers and brings voom and NSC approaches together for the purpose of gene-expression based classification. For this purpose, we propose weighted statistics and put these weighted statistics into the NSC algorithm. The VoomNSC is a sparse classifier that models the mean-variance relationship using the voom method and incorporates voom's precision weights into the NSC classifier via weighted statistics. A comprehensive simulation study was designed and four real datasets are used for performance assessment. The overall results indicate that voomNSC performs as the sparsest classifier. It also provides the most accurate results together with power-transformed Poisson linear discriminant analysis, rlog transformed support vector machines and random forests algorithms. In addition to prediction purposes, the voomNSC classifier can be used to identify the potential diagnostic biomarkers for a condition of interest. Through this work, statistical learning methods proposed for microarrays can be reused for RNA-Seq data. An interactive web application is freely available at http://www.biosoft.hacettepe.edu.tr/voomDDA/.
[Biomarkers of Alzheimer disease].
Rachel, Wojciech; Grela, Agatha; Zyss, Tomasz; Zieba, Andrzej; Piekoszewski, Wojciech
2014-01-01
Cognitive impairment is one of the most common age-related psychiatric disorders. The outcome of cognitive impairment in Alzheimer's disease has both individual (for the patients and their families) and socio-economic effects. The prevalence of Alzheimer's disease doubles every 4.5 years after the age of 65. An etiologically heterogeneous group of disorders related to aging, as well as genetic and environmental interactions, probably underlies the impairment in Alzheimer's disease. Those factors cause the degeneration of brain tissue, which leads to significant cognitive dysfunction. There are two main hypotheses linked to the process of neurodegeneration: (i) the amyloid cascade and (ii) the role of secretases and the dysfunction of mitochondria. From the therapeutic standpoint, it is crucial to establish an early diagnosis and start adequate treatment. The undeniable progress in the field of biomarker research should lead to a better understanding of the early stages of the disorder. So far, the best recognised and described biomarkers of Alzheimer's disease, which can be detected in both cerebrospinal fluid and blood, are: beta-amyloid, tau protein and phosphorylated tau protein (phospho-tau). The article discusses the usefulness of the known biomarkers of Alzheimer's disease in early diagnosis.
Discrete elements method of neutron transport
International Nuclear Information System (INIS)
Mathews, K.A.
1988-01-01
In this paper a new neutron transport method, called discrete elements (L_N), is derived and compared to discrete ordinates methods, theoretically and by numerical experimentation. The discrete elements method is based on discretizing the Boltzmann equation over a set of elements of angle. The discrete elements method is shown to be more cost-effective than discrete ordinates, in terms of accuracy versus execution time and storage, for the cases tested. In a two-dimensional test case, a vacuum duct in a shield, the L_N method is more consistently convergent toward a Monte Carlo benchmark solution
Martinez-Torteya, Antonio; Treviño-Alvarado, Víctor; Tamez-Peña, José
2013-02-01
The accurate diagnosis of Alzheimer's disease (AD) and mild cognitive impairment (MCI) confers many clinical research and patient care benefits. Studies have shown that multimodal biomarkers provide better diagnostic accuracy for AD and MCI than unimodal biomarkers, but their construction has been based on traditional statistical approaches. The objective of this work was the creation of accurate AD and MCI diagnostic multimodal biomarkers using advanced bioinformatics tools. The biomarkers were created by exploring multimodal combinations of features using machine learning techniques. Data was obtained from the ADNI database. The baseline information (e.g. MRI analyses, PET analyses and laboratory assays) from AD, MCI and healthy control (HC) subjects with available diagnosis up to June 2012 was mined for case/control candidates. The data mining yielded 47 HC, 83 MCI and 43 AD subjects for biomarker creation. Each subject was characterized by at least 980 ADNI features. A genetic algorithm feature selection strategy was used to obtain compact and accurate cross-validated nearest centroid biomarkers. The biomarkers achieved training classification accuracies of 0.983, 0.871 and 0.917 for HC vs. AD, HC vs. MCI and MCI vs. AD respectively. The constructed biomarkers were relatively compact: from 5 to 11 features. Those multimodal biomarkers included several widely accepted univariate biomarkers and novel image and biochemical features. Multimodal biomarkers constructed from previously and non-previously AD-associated features showed improved diagnostic performance when compared to those based solely on previously AD-associated features.
A simple method to combine multiple molecular biomarkers for dichotomous diagnostic classification
Directory of Open Access Journals (Sweden)
Amin Manik A
2006-10-01
Background: In spite of the recognized diagnostic potential of biomarkers, the quest for squelching noise and wringing information from a given set of biomarkers continues. Here, we suggest a statistical algorithm that, assuming each molecular biomarker to be a diagnostic test, enriches the diagnostic performance of an optimized set of independent biomarkers employing established statistical techniques. We validated the proposed algorithm using several simulation datasets in addition to four publicly available real datasets that compared (i) subjects having cancer with those without; (ii) subjects with two different cancers; (iii) subjects with two different types of one cancer; and (iv) subjects with the same cancer resulting in differential time to metastasis. Results: Our algorithm comprises three steps: estimating the area under the receiver operating characteristic curve for each biomarker, identifying a subset of biomarkers using linear regression, and combining the chosen biomarkers using linear discriminant function analysis. Combining these established statistical methods, which are available in most statistical packages, we observed that the diagnostic accuracy of our approach was 100%, 99.94%, 96.67% and 93.92% for the real datasets used in the study. These estimates were comparable to or better than the ones previously reported using alternative methods. In a synthetic dataset, we also observed that all the biomarkers chosen by our algorithm were indeed truly differentially expressed. Conclusion: The proposed algorithm can be used for accurate diagnosis in the setting of dichotomous classification of disease states.
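The three-step scheme described above (per-biomarker ROC AUC, subset selection, linear discriminant combination) can be sketched with scikit-learn as follows. This is a hedged approximation: the selection step here uses a simple AUC cutoff (the `auc_cut` parameter is my invention) in place of the paper's linear-regression procedure.

```python
import numpy as np
from sklearn.metrics import roc_auc_score
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def combine_biomarkers(X, y, auc_cut=0.7):
    """Step 1: treat each biomarker (column of X) as a standalone
    diagnostic test and estimate its ROC AUC against labels y.
    Step 2: keep an informative subset (simplified here to an AUC
    cutoff; max(auc, 1 - auc) also admits down-regulated markers).
    Step 3: combine the kept markers with a linear discriminant."""
    aucs = np.array([roc_auc_score(y, X[:, j]) for j in range(X.shape[1])])
    keep = np.where(np.maximum(aucs, 1 - aucs) >= auc_cut)[0]
    lda = LinearDiscriminantAnalysis().fit(X[:, keep], y)
    return keep, aucs, lda
```

On synthetic data with two informative markers and one noise column, the cutoff retains only the informative pair and the combined discriminant outperforms either marker alone.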
Biomarkers in Diabetic Retinopathy
Jenkins, Alicia J.; Joglekar, Mugdha V.; Hardikar, Anandwardhan A.; Keech, Anthony C.; O'Neal, David N.; Januszewski, Andrzej S.
2015-01-01
There is a global diabetes epidemic correlating with an increase in obesity. This coincidence may lead to a rise in the prevalence of type 2 diabetes. There is also an as yet unexplained increase in the incidence of type 1 diabetes, which is not related to adiposity. Whilst improved diabetes care has substantially improved diabetes outcomes, the disease remains a common cause of working age adult-onset blindness. Diabetic retinopathy is the most frequently occurring complication of diabetes; it is greatly feared by many diabetes patients. There are multiple risk factors and markers for the onset and progression of diabetic retinopathy, yet residual risk remains. Screening for diabetic retinopathy is recommended to facilitate early detection and treatment. Common biomarkers of diabetic retinopathy and its risk in clinical practice today relate to the visualization of the retinal vasculature and measures of glycemia, lipids, blood pressure, body weight, smoking, and pregnancy status. Greater knowledge of novel biomarkers and mediators of diabetic retinopathy, such as those related to inflammation and angiogenesis, has contributed to the development of additional therapeutics, in particular for late-stage retinopathy, including intra-ocular corticosteroids and intravitreal vascular endothelial growth factor inhibitor ('anti-VEGF') agents. Unfortunately, in spite of a range of treatments (including laser photocoagulation, intraocular steroids, and anti-VEGF agents, and more recently oral fenofibrate, a PPAR-alpha agonist lipid-lowering drug), many patients with diabetic retinopathy do not respond well to current therapeutics. Therefore, more effective treatments for diabetic retinopathy are necessary. New analytical techniques, in particular those related to molecular markers, are accelerating progress in diabetic retinopathy research. Given the increasing incidence and prevalence of diabetes, and the limited capacity of healthcare systems to screen and treat
Discrete gauge symmetries in discrete MSSM-like orientifolds
International Nuclear Information System (INIS)
Ibáñez, L.E.; Schellekens, A.N.; Uranga, A.M.
2012-01-01
Motivated by the necessity of discrete Z_N symmetries in the MSSM to ensure baryon stability, we study the origin of discrete gauge symmetries from open string sector U(1)'s in orientifolds based on rational conformal field theory. By means of an explicit construction, we find an integral basis for the couplings of axions and U(1) factors for all simple current MIPFs and orientifolds of all 168 Gepner models, a total of 32 990 distinct cases. We discuss how the presence of discrete symmetries surviving as a subgroup of broken U(1)'s can be derived using this basis. We apply this procedure to models with MSSM chiral spectrum, concretely to all known U(3)×U(2)×U(1)×U(1) and U(3)×Sp(2)×U(1)×U(1) configurations with chiral bi-fundamentals, but no chiral tensors, as well as some SU(5) GUT models. We find examples of models with Z_2 (R-parity) and Z_3 symmetries that forbid certain B and/or L violating MSSM couplings. Their presence is however relatively rare, at the level of a few percent of all cases.
Goodman, J. W.
This book is based on the thesis that some training in the area of statistical optics should be included as a standard part of any advanced optics curriculum. Random variables are discussed, taking into account definitions of probability and random variables, distribution functions and density functions, an extension to two or more random variables, statistical averages, transformations of random variables, sums of real random variables, Gaussian random variables, complex-valued random variables, and random phasor sums. Other subjects examined are related to random processes, some first-order properties of light waves, the coherence of optical waves, some problems involving high-order coherence, effects of partial coherence on imaging systems, imaging in the presence of randomly inhomogeneous media, and fundamental limits in photoelectric detection of light. Attention is given to deterministic versus statistical phenomena and models, the Fourier transform, and the fourth-order moment of the spectrum of a detected speckle image.
Schwabl, Franz
2006-01-01
The completely revised new edition of the classical book on Statistical Mechanics covers the basic concepts of equilibrium and non-equilibrium statistical physics. In addition to a deductive approach to equilibrium statistics and thermodynamics based on a single hypothesis - the form of the microcanonical density matrix - this book treats the most important elements of non-equilibrium phenomena. Intermediate calculations are presented in complete detail. Problems at the end of each chapter help students to consolidate their understanding of the material. Beyond the fundamentals, this text demonstrates the breadth of the field and its great variety of applications. Modern areas such as renormalization group theory, percolation, stochastic equations of motion and their applications to critical dynamics, kinetic theories, as well as fundamental considerations of irreversibility, are discussed. The text will be useful for advanced students of physics and other natural sciences; a basic knowledge of quantum mechan...
Jana, Madhusudan
2015-01-01
This book on statistical mechanics is self-sufficient and written in a lucid manner, keeping in mind the examination system of the universities. The need to study this subject and its relation to thermodynamics is discussed in detail. Starting from the Liouville theorem, statistical mechanics is gradually developed thoroughly. All three types of statistical distribution functions are derived separately, with their range of applications and limitations. Non-interacting ideal Bose and Fermi gases are discussed thoroughly. Properties of liquid He-II and the corresponding models are depicted. White dwarfs and condensed matter physics, transport phenomena (thermal and electrical conductivity, Hall effect, magnetoresistance, viscosity, diffusion, etc.) are discussed. A basic understanding of the Ising model is given to explain phase transitions. The book ends with detailed coverage of the method of ensembles (namely microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems ar...
Guénault, Tony
2007-01-01
In this revised and enlarged second edition of an established text Tony Guénault provides a clear and refreshingly readable introduction to statistical physics, an essential component of any first degree in physics. The treatment itself is self-contained and concentrates on an understanding of the physical ideas, without requiring a high level of mathematical sophistication. A straightforward quantum approach to statistical averaging is adopted from the outset (easier, the author believes, than the classical approach). The initial part of the book is geared towards explaining the equilibrium properties of a simple isolated assembly of particles. Thus, several important topics, for example an ideal spin-½ solid, can be discussed at an early stage. The treatment of gases gives full coverage to Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein statistics. Towards the end of the book the student is introduced to a wider viewpoint and new chapters are included on chemical thermodynamics, interactions in, for exam...
Mandl, Franz
1988-01-01
The Manchester Physics Series General Editors: D. J. Sandiford; F. Mandl; A. C. Phillips Department of Physics and Astronomy, University of Manchester Properties of Matter B. H. Flowers and E. Mendoza Optics Second Edition F. G. Smith and J. H. Thomson Statistical Physics Second Edition F. Mandl Electromagnetism Second Edition I. S. Grant and W. R. Phillips Statistics R. J. Barlow Solid State Physics Second Edition J. R. Hook and H. E. Hall Quantum Mechanics F. Mandl Particle Physics Second Edition B. R. Martin and G. Shaw The Physics of Stars Second Edition A. C. Phillips Computing for Scient
Levine-Wissing, Robin
2012-01-01
All Access for the AP® Statistics Exam Book + Web + Mobile Everything you need to prepare for the Advanced Placement® exam, in a study system built around you! There are many different ways to prepare for an Advanced Placement® exam. What's best for you depends on how much time you have to study and how comfortable you are with the subject matter. To score your highest, you need a system that can be customized to fit you: your schedule, your learning style, and your current level of knowledge. This book, and the online tools that come with it, will help you personalize your AP® Statistics prep
Davidson, Norman
2003-01-01
Clear and readable, this fine text assists students in achieving a grasp of the techniques and limitations of statistical mechanics. The treatment follows a logical progression from elementary to advanced theories, with careful attention to detail and mathematical development, and is sufficiently rigorous for introductory or intermediate graduate courses. Beginning with a study of the statistical mechanics of ideal gases and other systems of non-interacting particles, the text develops the theory in detail and applies it to the study of chemical equilibrium and the calculation of the thermody
Predictive Biomarkers for Asthma Therapy.
Medrek, Sarah K; Parulekar, Amit D; Hanania, Nicola A
2017-09-19
Asthma is a heterogeneous disease characterized by multiple phenotypes. Treatment of patients with severe disease can be challenging. Predictive biomarkers are measurable characteristics that reflect the underlying pathophysiology of asthma and can identify patients that are likely to respond to a given therapy. This review discusses current knowledge regarding predictive biomarkers in asthma. Recent trials evaluating biologic therapies targeting IgE, IL-5, IL-13, and IL-4 have utilized predictive biomarkers to identify patients who might benefit from treatment. Other work has suggested that using composite biomarkers may offer enhanced predictive capabilities in tailoring asthma therapy. Multiple biomarkers including sputum eosinophil count, blood eosinophil count, fractional concentration of nitric oxide in exhaled breath (FeNO), and serum periostin have been used to identify which patients will respond to targeted asthma medications. Further work is needed to integrate predictive biomarkers into clinical practice.
Positivity for Convective Semi-discretizations
Fekete, Imre; Ketcheson, David I.; Loczi, Lajos
2017-01-01
We propose a technique for investigating stability properties like positivity and forward invariance of an interval for method-of-lines discretizations, and apply the technique to study positivity preservation for a class of TVD semi-discretizations
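A minimal illustration of the kind of positivity question studied here (a standard textbook example, not the authors' class of TVD semi-discretizations): the first-order upwind semi-discretization of linear advection, stepped with forward Euler, preserves nonnegativity under the usual CFL restriction, because each update is then a convex combination of old values.

```python
import numpy as np

def upwind_advection_step(u, c, dx, dt):
    """One forward-Euler step of the first-order upwind semi-discretization
    of u_t + c u_x = 0 (c > 0) with periodic boundaries."""
    return u - (c * dt / dx) * (u - np.roll(u, 1))

dx, c = 0.1, 1.0
# Nonnegative initial data.
u = np.maximum(0.0, np.sin(np.linspace(0, 2 * np.pi, 50)))
dt = 0.9 * dx / c  # CFL number 0.9 <= 1: each update is a convex combination
for _ in range(100):
    u = upwind_advection_step(u, c, dx, dt)
assert u.min() >= 0.0  # positivity is preserved
```

With CFL number ν = c·dt/dx ≤ 1 the update reads u_i ← (1-ν)u_i + ν·u_{i-1}, a convex combination of nonnegative values, so no CFL-respecting step can create negative values.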
2015 ICSA/Graybill Applied Statistics Symposium
Wang, Bushi; Hu, Xiaowen; Chen, Kun; Liu, Ray
2016-01-01
The papers in this volume represent a broad, applied swath of advanced contributions to the 2015 ICSA/Graybill Applied Statistics Symposium of the International Chinese Statistical Association, held at Colorado State University in Fort Collins. The contributions cover topics that range from statistical applications in business and finance to applications in clinical trials and biomarker analysis. Each paper was peer-reviewed by at least two referees and also by an editor. The conference was attended by over 400 participants from academia, industry, and government agencies around the world, including from North America, Asia, and Europe. Focuses on statistical applications from clinical trials, biomarker analysis, and personalized medicine to applications in finance and business analytics A unique selection of papers from broad and multi-disciplinary critical hot topics - from academic, government, and industry perspectives - to appeal to a wide variety of applied research interests All papers feature origina...
Indian Academy of Sciences (India)
inference and finite population sampling. Sudhakar Kunte. Elements of statistical computing are discussed in this series. ... which captain gets an option to decide whether to field first or bat first ... may of course not be fair, in the sense that the team which wins ... describe two methods of drawing a random number between 0.
Schrödinger, Erwin
1952-01-01
Nobel Laureate's brilliant attempt to develop a simple, unified standard method of dealing with all cases of statistical thermodynamics - classical, quantum, Bose-Einstein, Fermi-Dirac, and more. The work also includes discussions of the Nernst theorem, Planck's oscillator, fluctuations, the n-particle problem, the problem of radiation, and much more.
Quantum chaos on discrete graphs
International Nuclear Information System (INIS)
Smilansky, Uzy
2007-01-01
Adapting a method developed for the study of quantum chaos on quantum (metric) graphs (Kottos and Smilansky 1997 Phys. Rev. Lett. 79 4794, Kottos and Smilansky 1999 Ann. Phys., NY 274 76), spectral ζ functions and trace formulae for discrete Laplacians on graphs are derived. This is achieved by expressing the spectral secular equation in terms of the periodic orbits of the graph and obtaining functions which belong to the class of ζ functions proposed originally by Ihara (1966 J. Math. Soc. Japan 18 219) and expanded by subsequent authors (Stark and Terras 1996 Adv. Math. 121 124, Kotani and Sunada 2000 J. Math. Sci. Univ. Tokyo 7 7). Finally, a model of 'classical dynamics' on the discrete graph is proposed. It is analogous to the corresponding classical dynamics derived for quantum graphs (Kottos and Smilansky 1997 Phys. Rev. Lett. 79 4794, Kottos and Smilansky 1999 Ann. Phys., NY 274 76). (fast track communication)
Dark energy from discrete spacetime.
Directory of Open Access Journals (Sweden)
Aaron D Trout
Dark energy accounts for most of the matter-energy content of our universe, yet current theories of its origin rely on radical physical assumptions such as the holographic principle or controversial anthropic arguments. We give a better motivated explanation for dark energy, claiming that it arises from a small negative scalar-curvature present even in empty spacetime. The vacuum has this curvature because spacetime is fundamentally discrete and there are more ways for a discrete geometry to have negative curvature than positive. We explicitly compute this effect using a variant of the well-known dynamical-triangulations (DT) model for quantum gravity. Our model predicts a time-varying non-zero cosmological constant with a current value, [Formula: see text] in natural units, in agreement with observation. This calculation is made possible by a novel characterization of the possible DT action values combined with numerical evidence concerning their degeneracies.
Applied geometry and discrete mathematics
Sturm; Gritzmann, Peter; Sturmfels, Bernd
1991-01-01
This volume, published jointly with the Association for Computing Machinery, comprises a collection of research articles celebrating the occasion of Victor Klee's sixty-fifth birthday in September 1990. During his long career, Klee has made contributions to a wide variety of areas, such as discrete and computational geometry, convexity, combinatorics, graph theory, functional analysis, mathematical programming and optimization, and theoretical computer science. In addition, Klee made important contributions to mathematics education, mathematical methods in economics and the decision sciences, applications of discrete mathematics in the biological and social sciences, and the transfer of knowledge from applied mathematics to industry. In honor of Klee's achievements, this volume presents more than forty papers on topics related to Klee's research. While the majority of the papers are research articles, a number of survey articles are also included. Mirroring the breadth of Klee's mathematical contributions, th...
Emissivity of discretized diffusion problems
International Nuclear Information System (INIS)
Densmore, Jeffery D.; Davidson, Gregory; Carrington, David B.
2006-01-01
The numerical modeling of radiative transfer by the diffusion approximation can produce artificially damped radiation propagation if spatial cells are too optically thick. In this paper, we investigate this nonphysical behavior at external problem boundaries by examining the emissivity of the discretized diffusion approximation. We demonstrate that the standard cell-centered discretization produces an emissivity that is too low for optically thick cells, a situation that leads to the lack of radiation propagation. We then present a modified boundary condition that yields an accurate emissivity regardless of cell size. This modified boundary condition can be used with a deterministic calculation or as part of a hybrid transport-diffusion method for increasing the efficiency of Monte Carlo simulations. We also discuss the range of applicability, as a function of cell size and material properties, when this modified boundary condition is employed in a hybrid technique. With a set of numerical calculations, we demonstrate the accuracy and usefulness of this modified boundary condition
Discrete symmetries in the MSSM
Energy Technology Data Exchange (ETDEWEB)
Schieren, Roland
2010-12-02
The use of discrete symmetries, especially abelian ones, in physics beyond the standard model of particle physics is discussed. A method is developed for obtaining a general, abelian, discrete symmetry via spontaneous symmetry breaking. In addition, anomalies are treated in the path integral approach with special attention to anomaly cancellation via the Green-Schwarz mechanism. All this is applied to the minimal supersymmetric standard model. A unique Z^R_4 symmetry is discovered which solves the μ-problem as well as problems with proton decay and allows one to embed the standard model gauge group into a simple group, i.e. the Z^R_4 is compatible with grand unification. Also the flavor problem in the context of minimal flavor violation is addressed. Finally, a string theory model is presented which exhibits the mentioned Z^R_4 symmetry and other desirable features. (orig.)
Domain Discretization and Circle Packings
DEFF Research Database (Denmark)
Dias, Kealey
A circle packing is a configuration of circles which are tangent with one another in a prescribed pattern determined by a combinatorial triangulation, where the configuration fills a planar domain or a two-dimensional surface. The vertices in the triangulation correspond to centers of circles...... to domain discretization problems such as triangulation and unstructured mesh generation techniques. We wish to ask ourselves the question: given a cloud of points in the plane (we restrict ourselves to planar domains), is it possible to construct a circle packing preserving the positions of the vertices...... and constrained meshes having predefined vertices as constraints. A standard method of two-dimensional mesh generation involves conformal mapping of the surface or domain to standardized shapes, such as a disk. Since circle packing is a new technique for constructing discrete conformal mappings, it is possible...
Discrete Bose-Einstein spectra
International Nuclear Information System (INIS)
Vlad, Valentin I.; Ionescu-Pallas, Nicholas
2001-03-01
The Bose-Einstein energy spectrum of a quantum gas, confined in a rigid cubic box, is shown to become discrete and strongly dependent on the box geometry (size L), temperature T and atomic mass number A_at, in the region of small γ = A_at T V^{1/3}. This behavior is the consequence of the random state degeneracy in the box. Furthermore, we demonstrate that the total energy no longer obeys the conventional law, but a new law, which depends on γ and on the quantum gas fugacity. This energy law imposes a faster decrease to zero than is classically expected, for γ→0. The lighter the gas atoms, the higher the temperatures or the box size, for the same effects in the discrete Bose-Einstein regime. (author)
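The "random state degeneracy in the box" can be made concrete with a short enumeration (illustrative only): for a particle in a rigid cubic box the single-particle energy is proportional to n_x² + n_y² + n_z², and the number of triples sharing the same value fluctuates irregularly.

```python
from collections import Counter
from itertools import product

# Degeneracy of single-particle levels in a rigid cubic box:
# E(n) ∝ n_x^2 + n_y^2 + n_z^2 with n_i = 1, 2, 3, ...
nmax = 20
degeneracy = Counter(nx * nx + ny * ny + nz * nz
                     for nx, ny, nz in product(range(1, nmax + 1), repeat=3))

assert degeneracy[3] == 1    # (1,1,1): unique ground state
assert degeneracy[6] == 3    # (2,1,1) and its permutations
assert degeneracy[14] == 6   # (1,2,3) and its 6 permutations
```

The irregular jumps in these counts are what make the low-γ spectrum effectively discrete rather than quasi-continuous.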
Discrete symmetries in the MSSM
International Nuclear Information System (INIS)
Schieren, Roland
2010-01-01
The use of discrete symmetries, especially abelian ones, in physics beyond the standard model of particle physics is discussed. A method is developed for obtaining a general, abelian, discrete symmetry via spontaneous symmetry breaking. In addition, anomalies are treated in the path integral approach with special attention to anomaly cancellation via the Green-Schwarz mechanism. All this is applied to the minimal supersymmetric standard model. A unique Z^R_4 symmetry is discovered which solves the μ-problem as well as problems with proton decay and allows one to embed the standard model gauge group into a simple group, i.e. the Z^R_4 is compatible with grand unification. Also the flavor problem in the context of minimal flavor violation is addressed. Finally, a string theory model is presented which exhibits the mentioned Z^R_4 symmetry and other desirable features. (orig.)
Dark energy from discrete spacetime.
Trout, Aaron D
2013-01-01
Dark energy accounts for most of the matter-energy content of our universe, yet current theories of its origin rely on radical physical assumptions such as the holographic principle or controversial anthropic arguments. We give a better motivated explanation for dark energy, claiming that it arises from a small negative scalar-curvature present even in empty spacetime. The vacuum has this curvature because spacetime is fundamentally discrete and there are more ways for a discrete geometry to have negative curvature than positive. We explicitly compute this effect using a variant of the well known dynamical-triangulations (DT) model for quantum gravity. Our model predicts a time-varying non-zero cosmological constant with a current value, [Formula: see text] in natural units, in agreement with observation. This calculation is made possible by a novel characterization of the possible DT action values combined with numerical evidence concerning their degeneracies.
Discrete mathematics using a computer
Hall, Cordelia
2000-01-01
Several areas of mathematics find application throughout computer science, and all students of computer science need a practical working understanding of them. These core subjects are centred on logic, sets, recursion, induction, relations and functions. The material is often called discrete mathematics, to distinguish it from the traditional topics of continuous mathematics such as integration and differential equations. The central theme of this book is the connection between computing and discrete mathematics. This connection is useful in both directions: • Mathematics is used in many branches of computer science, in applications including program specification, data structures, design and analysis of algorithms, database systems, hardware design, reasoning about the correctness of implementations, and much more; • Computers can help to make the mathematics easier to learn and use, by making mathematical terms executable, making abstract concepts more concrete, and through the use of software tools su...
Duality for discrete integrable systems
International Nuclear Information System (INIS)
Quispel, G R W; Capel, H W; Roberts, J A G
2005-01-01
A new class of discrete dynamical systems is introduced via a duality relation for discrete dynamical systems with a number of explicitly known integrals. The dual equation can be defined via the difference of an arbitrary linear combination of integrals and its upshifted version. We give an example of an integrable mapping with two parameters and four integrals leading to a (four-dimensional) dual mapping with four parameters and two integrals. We also consider a more general class of higher-dimensional mappings arising via a travelling-wave reduction from the (integrable) MKdV partial-difference equation. By differencing the trace of the monodromy matrix we obtain a class of novel dual mappings which is shown to be integrable as level-set-dependent versions of the original ones
Observability of discretized partial differential equations
Cohn, Stephen E.; Dee, Dick P.
1988-01-01
It is shown that complete observability of the discrete model used to assimilate data from a linear partial differential equation (PDE) system is necessary and sufficient for asymptotic stability of the data assimilation process. The observability theory for discrete systems is reviewed and applied to obtain simple observability tests for discretized constant-coefficient PDEs. Examples are used to show how numerical dispersion can result in discrete dynamics with multiple eigenvalues, thereby detracting from observability.
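The observability tests mentioned above rest on the classical Kalman rank condition; a minimal sketch follows, using illustrative toy matrices rather than a particular PDE discretization. The second pair shows how a repeated eigenvalue (the abstract's numerical-dispersion scenario) destroys observability from a single measurement.

```python
import numpy as np

def observability_matrix(A, C):
    """Stack C, CA, ..., CA^{n-1}; the pair (A, C) is observable iff
    this matrix has full column rank n (Kalman rank condition)."""
    n = A.shape[0]
    blocks, M = [], C.copy()
    for _ in range(n):
        blocks.append(M)
        M = M @ A
    return np.vstack(blocks)

# Toy 2-state discrete dynamics observed at the first component only.
A = np.array([[0.5, 0.5],
              [0.0, 1.0]])
C = np.array([[1.0, 0.0]])
O = observability_matrix(A, C)
assert np.linalg.matrix_rank(O) == 2   # full rank: (A, C) observable

# Repeated eigenvalue (A = I has eigenvalue 1 twice): the same single
# observation can no longer distinguish the two state components.
A2 = np.eye(2)
assert np.linalg.matrix_rank(observability_matrix(A2, C)) == 1  # unobservable
```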
On the putative essential discreteness of q-generalized entropies
Plastino, A.; Rocca, M. C.
2017-12-01
It has been argued in Abe (2010), entitled Essential discreteness in generalized thermostatistics with non-logarithmic entropy, that "continuous Hamiltonian systems with long-range interactions and the so-called q-Gaussian momentum distributions are seen to be outside the scope of non-extensive statistical mechanics". The arguments are clever and appealing. We show here, however, that some mathematical subtleties render them unconvincing.
Biomarkers in Prostate Cancer Epidemiology
Directory of Open Access Journals (Sweden)
Mudit Verma
2011-09-01
Understanding the etiology of a disease such as prostate cancer may help in identifying populations at high risk, timely intervention in the disease, and proper treatment. Biomarkers, along with exposure history and clinical data, are useful tools to achieve these goals. Individual risk and population incidence of prostate cancer result from the interaction of genetic susceptibility and exposure. Biochemical, epigenetic, genetic, and imaging biomarkers are used to identify people at high risk for developing prostate cancer. In cancer epidemiology, epigenetic biomarkers offer advantages over other types of biomarkers because they are expressed against a person’s genetic background and environmental exposure, and because abnormal events occur early in cancer development, which includes several epigenetic alterations in cancer cells. This article describes different biomarkers that have potential use in studying the epidemiology of prostate cancer. We also discuss the characteristics of an ideal biomarker for prostate cancer, and technologies utilized for biomarker assays. Among epigenetic biomarkers, most reports indicate GSTP1 hypermethylation as the diagnostic marker for prostate cancer; however, NKX2-5, CLSTN1, SPOCK2, SLC16A12, DPYS, and NSE1 also have been reported to be regulated by methylation mechanisms in prostate cancer. Current challenges in the utilization of biomarkers in prostate cancer diagnosis and epidemiologic studies, and potential solutions, also are discussed.
Biomarker Identification Using Text Mining
Directory of Open Access Journals (Sweden)
Hui Li
2012-01-01
Identifying molecular biomarkers has become one of the important tasks for scientists to assess the different phenotypic states of cells or organisms correlated to the genotypes of diseases from large-scale biological data. In this paper, we propose a text-mining-based method to discover biomarkers from PubMed. First, we construct a database based on a dictionary, and then use a finite state machine to identify the biomarkers. Our text-mining method provides a highly reliable approach to discovering biomarkers in the PubMed database.
Effective lagrangian description on discrete gauge symmetries
International Nuclear Information System (INIS)
Banks, T.
1989-01-01
We exhibit a simple low-energy lagrangian which describes a system with a discrete remnant of a spontaneously broken continuous gauge symmetry. The lagrangian gives a simple description of the effects ascribed to such systems by Krauss and Wilczek: black holes carry discrete hair and interact with cosmic strings, and wormholes cannot lead to violation of discrete gauge symmetries. (orig.)
Discrete port-Hamiltonian systems : mixed interconnections
Talasila, Viswanath; Clemente-Gallardo, J.; Schaft, A.J. van der
2005-01-01
Either from a control theoretic viewpoint or from an analysis viewpoint it is necessary to convert smooth systems to discrete systems, which can then be implemented on computers for numerical simulations. Discrete models can be obtained either by discretizing a smooth model, or by directly modeling
Discrete fractional solutions of a Legendre equation
Yılmazer, Resat
2018-01-01
Fractional calculus has become one of the most popular research interests in science and engineering in recent times. Discrete fractional calculus also holds an important position within it. In this work, we obtain new discrete fractional solutions of the homogeneous and nonhomogeneous Legendre differential equation by using the discrete fractional nabla operator.
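A hedged sketch of the discrete fractional nabla (Riemann-Liouville-type) sum, one common convention in the literature; the paper's exact definitions may differ. The weight uses the rising factorial r^{ν̄} = Γ(r+ν)/Γ(r), and order ν = 1 must reduce to the ordinary nabla sum.

```python
from math import gamma

def nabla_fractional_sum(f, nu, t, a=0):
    """Nabla fractional sum of order nu > 0 (one common convention):
    (∇_a^{-nu} f)(t) = (1/Γ(nu)) Σ_{s=a+1}^{t} Γ(t-s+nu)/Γ(t-s+1) · f(s)."""
    return sum(gamma(t - s + nu) / gamma(t - s + 1) * f(s)
               for s in range(a + 1, t + 1)) / gamma(nu)

f = lambda s: float(s)  # sample sequence f(s) = s
# Order nu = 1: the weights collapse to 1, giving the plain sum 1+2+3+4+5.
assert abs(nabla_fractional_sum(f, 1.0, 5) - 15.0) < 1e-9
```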
Chiral Biomarkers in Meteorites
Hoover, Richard B.
2010-01-01
The chirality of organic molecules with the asymmetric location of group radicals was discovered in 1848 by Louis Pasteur during his investigations of the rotation of the plane of polarization of light by crystals of sodium ammonium paratartrate. It is well established that the amino acids in proteins are exclusively levorotatory (L-aminos) and the sugars in DNA and RNA are dextrorotatory (D-sugars). This phenomenon of homochirality of biological polymers is a fundamental property of all life known on Earth. Furthermore, abiotic production mechanisms typically yield racemic mixtures (i.e. equal amounts of the two enantiomers). When amino acids were first detected in carbonaceous meteorites, it was concluded that they were racemates. This conclusion was taken as evidence that they were extraterrestrial and produced abiologically. Subsequent studies by numerous researchers have revealed that many of the amino acids in carbonaceous meteorites exhibit a significant L-excess. The observed chirality is much greater than that produced by any currently known abiotic processes (e.g. linearly polarized light from neutron stars; circularly polarized ultraviolet light from faint stars; optically active quartz powders; inclusion polymerization in clay minerals; Vester-Ulbricht hypothesis of parity violations, etc.). This paper compares the measured chirality detected in the amino acids of carbonaceous meteorites with the effect of these diverse abiotic processes. It is concluded that the levels observed are inconsistent with post-arrival biological contamination or with any of the currently known abiotic production mechanisms. However, they are consistent with ancient biological processes on the meteorite parent body. This paper will consider these chiral biomarkers in view of the detection of possible microfossils found in the Orgueil and Murchison carbonaceous meteorites. Energy dispersive x-ray spectroscopy (EDS) data obtained on these morphological biomarkers will be
Hepcidin- A Burgeoning Biomarker
Directory of Open Access Journals (Sweden)
Hemkant Manikrao Deshmukh
2017-10-01
The discovery of hepcidin has ignited a wave of studies on iron metabolism and related disorders. The peptide hormone hepcidin is a key homeostatic regulator of iron metabolism. The synthesis of hepcidin is induced by systemic iron levels and by inflammatory stimuli. Several human diseases are associated with variations in hepcidin concentrations. The evaluation of hepcidin in biological fluids is therefore a promising tool in the diagnosis and management of conditions in which iron metabolism is affected. This prompted us to recapitulate the role of hepcidin as a biomarker.
Towards Improved Biomarker Research
DEFF Research Database (Denmark)
Kjeldahl, Karin
This thesis takes a look at the data analytical challenges associated with the search for biomarkers in large-scale biological data such as transcriptomics, proteomics and metabolomics data. These studies aim to identify genes, proteins or metabolites which can be associated with e.g. a diet...... with very specific competencies. In order to optimize the basis of a sound and fruitful data analysis, suggestions are given which focus on (1) collection of good data, (2) preparation of data for the data analysis and (3) a sound data analysis. If these steps are optimized, PLS is also a very good method...
Salivary pH: A diagnostic biomarker
Directory of Open Access Journals (Sweden)
Sharmila Baliga
2013-01-01
Objectives: Saliva contains a variety of host defense factors. It influences calculus formation and periodontal disease. Different studies have been done to find an exact correlation of salivary biomarkers with periodontal disease. With a multitude of biomarkers and complexities in their determination, the salivary pH may be tried as a quick chairside test. The aim of this study was to analyze the pH of saliva and determine its relevance to the severity of periodontal disease. Study Design: The study population consisted of 300 patients. They were divided into three groups of 100 patients each: Group A had clinically healthy gingiva, Group B had generalized chronic gingivitis, and Group C had generalized chronic periodontitis. Unstimulated saliva from each patient was collected and its pH was tested. Data were analyzed statistically using the analysis of variance technique. Results: The salivary pH was more alkaline for patients with generalized chronic gingivitis as compared with the control group (P = 0.001) whereas patients with generalized chronic periodontitis had a more acidic pH as compared with the control group (P = 0.001). Conclusion: These results indicate a significant change in the pH depending on the severity of the periodontal condition. The salivary pH shows significant changes and thus relevance to the severity of periodontal disease. Salivary pH may thus be used as a quick chairside diagnostic biomarker.
Salivary pH: A diagnostic biomarker.
Baliga, Sharmila; Muglikar, Sangeeta; Kale, Rahul
2013-07-01
Saliva contains a variety of host defense factors. It influences calculus formation and periodontal disease. Different studies have been done to find an exact correlation of salivary biomarkers with periodontal disease. With a multitude of biomarkers and complexities in their determination, the salivary pH may be tried as a quick chairside test. The aim of this study was to analyze the pH of saliva and determine its relevance to the severity of periodontal disease. The study population consisted of 300 patients. They were divided into three groups of 100 patients each: Group A had clinically healthy gingiva, Group B had generalized chronic gingivitis, and Group C had generalized chronic periodontitis. Unstimulated saliva from each patient was collected and its pH was tested. Data were analyzed statistically using the analysis of variance technique. The salivary pH was more alkaline for patients with generalized chronic gingivitis as compared with the control group (P = 0.001) whereas patients with generalized chronic periodontitis had a more acidic pH as compared with the control group (P = 0.001). These results indicate a significant change in the pH depending on the severity of the periodontal condition. The salivary pH shows significant changes and thus relevance to the severity of periodontal disease. Salivary pH may thus be used as a quick chairside diagnostic biomarker.
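The analysis-of-variance step used in the study above can be sketched with a hand-rolled one-way ANOVA F statistic; the pH readings below are invented illustrative numbers, not the study's data.

```python
import numpy as np

def one_way_anova_F(*groups):
    """F statistic for one-way ANOVA: between-group mean square over
    within-group mean square."""
    all_data = np.concatenate(groups)
    grand_mean = all_data.mean()
    k, N = len(groups), all_data.size
    ss_between = sum(g.size * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (N - k))

# Hypothetical salivary pH readings for the three clinical groups:
healthy = np.array([7.0, 7.1, 6.9, 7.0])
gingivitis = np.array([7.4, 7.5, 7.3, 7.4])      # more alkaline
periodontitis = np.array([6.6, 6.5, 6.7, 6.6])   # more acidic
F = one_way_anova_F(healthy, gingivitis, periodontitis)
assert F > 10  # large F: group means differ far more than within-group noise
```

A large F (here compared against the F distribution with 2 and 9 degrees of freedom) is what underlies the reported P = 0.001 comparisons.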
A goodness of fit statistic for the geometric distribution
Ferreira, J.A.
2003-01-01
We propose a goodness of fit statistic for the geometric distribution and compare it in terms of power, via simulation, with the chi-square statistic. The statistic is based on the Lau-Rao theorem and can be seen as a discrete analogue of the total time on test statistic. The results suggest that the test based on the new statistic is generally superior to the chi-square test.
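A sketch of the classical chi-square competitor discussed above (not the proposed Lau-Rao-based statistic), applied to simulated geometric data with the success probability fitted by maximum likelihood; the category boundary and tail pooling are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
p_true, n = 0.4, 2000
sample = rng.geometric(p_true, size=n)        # support 1, 2, 3, ...

# Pool the tail so every expected count is reasonably large.
kmax = 8
observed = np.array([np.sum(sample == k) for k in range(1, kmax)]
                    + [np.sum(sample >= kmax)])
p_hat = 1.0 / sample.mean()                   # MLE of the success probability
probs = np.array([p_hat * (1 - p_hat) ** (k - 1) for k in range(1, kmax)]
                 + [(1 - p_hat) ** (kmax - 1)])
expected = n * probs
chi2 = np.sum((observed - expected) ** 2 / expected)
# Under the null, chi2 is approximately chi-square with kmax - 2 = 6 degrees
# of freedom (one lost to the estimated parameter); compare against its
# 95th percentile (about 12.6) to accept or reject the geometric fit.
assert chi2 >= 0.0
```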
Continuous versus discrete structures II -- Discrete Hamiltonian systems and Helmholtz conditions
Cresson, Jacky; Pierret, Frédéric
2015-01-01
We define discrete Hamiltonian systems in the framework of discrete embeddings. An explicit comparison with previous attempts is given. We then solve the discrete Helmholtz's inverse problem for the discrete calculus of variation in the Hamiltonian setting. Several applications are discussed.
Asymptotic behavior of discrete holomorphic maps z^c, log(z) and discrete Painleve transcendents
Agafonov, S. I.
2005-01-01
It is shown that discrete analogs of z^c and log(z) have the same asymptotic behavior as their smooth counterparts. These discrete maps are described in terms of special solutions of discrete Painleve-II equations, asymptotics of these solutions providing the behaviour of discrete z^c and log(z) at infinity.
International Nuclear Information System (INIS)
Zhang Yufeng; Fan Engui; Zhang Yongqing
2006-01-01
With the help of two semi-direct sum Lie algebras, an efficient way to construct discrete integrable couplings is proposed. As applications, the discrete integrable couplings of the Toda-type lattice equations are obtained. The approach can also be used to establish other discrete integrable couplings of the discrete integrable lattice hierarchies of evolution equations.
International Nuclear Information System (INIS)
Anon.
1994-01-01
For the years 1992 and 1993, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period. The tables and figures shown in this publication are: Changes in the volume of GNP and energy consumption; Coal consumption; Natural gas consumption; Peat consumption; Domestic oil deliveries; Import prices of oil; Price development of principal oil products; Fuel prices for power production; Total energy consumption by source; Electricity supply; Energy imports by country of origin in 1993; Energy exports by recipient country in 1993; Consumer prices of liquid fuels; Consumer prices of hard coal and natural gas, prices of indigenous fuels; Average electricity price by type of consumer; Price of district heating by type of consumer and Excise taxes and turnover taxes included in consumer prices of some energy sources
Urine Metabonomics Reveals Early Biomarkers in Diabetic Cognitive Dysfunction.
Song, Lili; Zhuang, Pengwei; Lin, Mengya; Kang, Mingqin; Liu, Hongyue; Zhang, Yuping; Yang, Zhen; Chen, Yunlong; Zhang, Yanjun
2017-09-01
Recently, increasing attention has been paid to diabetic encephalopathy, which is a frequent diabetic complication and affects nearly 30% of diabetics. Because cognitive dysfunction from diabetic encephalopathy might develop into irreversible dementia, early diagnosis and detection of this disease is of great significance for its prevention and treatment. This study investigates early specific metabolite biomarkers in urine prior to the onset of diabetic cognitive dysfunction (DCD) using metabolomics technology. An ultra-high-performance liquid chromatography-quadrupole time-of-flight mass spectrometry (UPLC-Q/TOF-MS) platform was used to analyze the urine samples from diabetic mice with and without mild cognitive impairment (MCI) in the stage of diabetes (prior to the onset of DCD). We then screened and validated the early biomarkers using an OPLS-DA model and a support vector machine (SVM) method. Following multivariate statistical and integration analysis, we found that seven metabolites could be accepted as early biomarkers of DCD, and the SVM results showed that the prediction accuracy was as high as 91.66%. The identities of four biomarkers were determined by mass spectrometry. The identified biomarkers were largely involved in nicotinate and nicotinamide metabolism, glutathione metabolism, tryptophan metabolism, and sphingolipid metabolism. The present study is the first to reveal reliable biomarkers for early diagnosis of DCD. It provides new insights and strategies for the early diagnosis and treatment of DCD.
Goodman, Joseph W.
2000-07-01
The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I Richard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research
Pivato, Marcus
2013-01-01
We show that, in a sufficiently large population satisfying certain statistical regularities, it is often possible to accurately estimate the utilitarian social welfare function, even if we only have very noisy data about individual utility functions and interpersonal utility comparisons. In particular, we show that it is often possible to identify an optimal or close-to-optimal utilitarian social choice using voting rules such as the Borda rule, approval voting, relative utilitarianism, or a...
Natrella, Mary Gibbons
1963-01-01
Formulated to assist scientists and engineers engaged in army ordnance research and development programs, this well-known and highly regarded handbook is a ready reference for advanced undergraduate and graduate students as well as for professionals seeking engineering information and quantitative data for designing, developing, constructing, and testing equipment. Topics include characterizing and comparing the measured performance of a material, product, or process; general considerations in planning experiments; statistical techniques for analyzing extreme-value data; use of transformations
Cardiac Biomarkers and Cycling Race
Directory of Open Access Journals (Sweden)
Caroline Le Goff, Jean-François Kaux, Sébastien Goffaux, Etienne Cavalier
2015-06-01
Full Text Available In cycling, as in other types of strenuous exercise, there exists a risk of sudden death. It is important both to understand its causes and to see whether the behavior of certain biomarkers might highlight athletes at risk. Many reports describe changes in biomarkers after strenuous exercise (Nie et al., 2011), but interpreting these changes, and notably distinguishing normal physiological responses from pathological changes, is not easy. Here we have focused on the kinetics of different cardiac biomarkers: creatine kinase (CK), creatine kinase MB (CK-MB), myoglobin (MYO), highly sensitive troponin T (hs-TnT) and N-terminal pro-brain natriuretic peptide (NT-proBNP). The population studied was a group of young trained cyclists participating in a 177-km cycling race. The group of individuals was selected for maximal homogeneity. Their annual training volume was between 10,000 and 16,000 kilometers. The rhythm of races is comparable and averages 35 km/h, depending on the race’s difficulty. The cardiac frequency was recorded via a heart rate monitor. Three blood tests were taken. The first blood test, T0, was taken approximately 2 hours before the start of the race and was intended to gather values which would act as references for the following tests. The second blood test, T1, was taken within 5 minutes of their arrival. The third and final blood test, T3, was taken 3 hours following their arrival. The CK, CK-MB, MYO, hs-TnT and NT-proBNP were measured on the Roche Diagnostics Modular E (Mannheim, Germany). For the statistical analysis, an ANOVA and a Scheffé post hoc test were calculated with Statistica software, version 9.1. We observed a significant variation in the cardiac frequency between T0 and T1 (p < 0.0001), T0 and T3 (p < 0.0001), and T1 and T3 (p < 0.01). Table 1 shows the results obtained for the different biomarkers. CK and CK-MB showed significant variation between T0-T1 and T0-T3 (p < 0.0001). Myoglobin increased significantly
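The comparisons across the three draws (T0, T1, T3) rest on a one-way ANOVA. As a rough illustration of the statistic involved, here is a from-scratch one-way ANOVA F computation in Python; the heart-rate values below are invented placeholders, not the study's data.

```python
# Minimal one-way ANOVA sketch: compute the F statistic over k groups.
# The three sample lists are hypothetical heart rates, not the paper's data.

def one_way_anova_F(groups):
    """Return the F statistic for a one-way ANOVA over a list of samples."""
    k = len(groups)                      # number of groups (e.g. T0, T1, T3)
    n = sum(len(g) for g in groups)      # total observations
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-group sum of squares (weighted by group size)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

t0 = [62, 65, 60, 63, 61]    # before the race (hypothetical)
t1 = [95, 99, 97, 94, 98]    # just after arrival (hypothetical)
t3 = [74, 77, 72, 75, 73]    # three hours later (hypothetical)
F = one_way_anova_F([t0, t1, t3])
```

A large F relative to the F(k-1, n-k) reference distribution is what licenses the post hoc (e.g. Scheffé) pairwise comparisons reported in the abstract.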
Cuspidal discrete series for projective hyperbolic spaces
DEFF Research Database (Denmark)
Andersen, Nils Byrial; Flensted-Jensen, Mogens
2013-01-01
Abstract. We have in [1] proposed a definition of cusp forms on semisimple symmetric spaces G/H, involving the notion of a Radon transform and a related Abel transform. For the real non-Riemannian hyperbolic spaces, we showed that there exists an infinite number of cuspidal discrete series, and at most finitely many non-cuspidal discrete series, including in particular the spherical discrete series. For the projective spaces, the spherical discrete series are the only non-cuspidal discrete series. Below, we extend these results to the other hyperbolic spaces, and we also study the question...
Space-Time Discrete KPZ Equation
Cannizzaro, G.; Matetski, K.
2018-03-01
We study a general family of space-time discretizations of the KPZ equation and show that they converge to its solution. The approach we follow makes use of basic elements of the theory of regularity structures (Hairer in Invent Math 198(2):269-504, 2014) as well as its discrete counterpart (Hairer and Matetski in Discretizations of rough stochastic PDEs, 2015. arXiv:1511.06937). Since the discretization is in both space and time and we allow non-standard discretization for the product, the methods mentioned above have to be suitably modified in order to accommodate the structure of the models under study.
Monach, Paul A.
2014-01-01
Purpose of review Better biomarkers are needed for guiding management of patients with vasculitis. Large cohorts and technological advances have led to an increase in pre-clinical studies of potential biomarkers. Recent findings The most interesting markers described recently include a gene expression signature in CD8+ T cells that predicts tendency to relapse or remain relapse-free in ANCA-associated vasculitis, and a pair of urinary proteins that are elevated in Kawasaki disease but not other febrile illnesses. Both of these studies used “omics” technologies to generate and then test hypotheses. More conventional hypothesis-based studies have indicated that the following circulating proteins have potential to improve upon clinically available tests: pentraxin-3 in giant cell arteritis and Takayasu’s arteritis; von Willebrand factor antigen in childhood central nervous system vasculitis; eotaxin-3 and other markers related to eosinophils or Th2 immune responses in eosinophilic granulomatosis with polyangiitis (Churg-Strauss syndrome); and MMP-3, TIMP-1, and CXCL13 in ANCA-associated vasculitis. Summary New markers testable in blood and urine have the potential to assist with diagnosis, staging, assessment of current disease activity, and prognosis. However, the standards for clinical usefulness, in particular the demonstration of either very high sensitivity or very high specificity, have yet to be met for clinically relevant outcomes. PMID:24257367
Lassere, Marissa N
2008-06-01
There are clear advantages to using biomarkers and surrogate endpoints, but concerns about clinical and statistical validity and systematic methods to evaluate these aspects hinder their efficient application. Section 2 is a systematic, historical review of the biomarker-surrogate endpoint literature with special reference to the nomenclature, the systems of classification and statistical methods developed for their evaluation. In Section 3 an explicit, criterion-based, quantitative, multidimensional hierarchical levels of evidence schema - the Biomarker-Surrogacy Evaluation Schema - is proposed to evaluate and co-ordinate the multiple dimensions (biological, epidemiological, statistical, clinical trial and risk-benefit evidence) of the biomarker-clinical endpoint relationship. The schema systematically evaluates and ranks the surrogacy status of biomarkers and surrogate endpoints using defined levels of evidence. The schema incorporates three independent domains: Study Design, Target Outcome and Statistical Evaluation. Each domain has items ranked from zero to five. An additional category called Penalties incorporates additional considerations of biological plausibility, risk-benefit and generalizability. The total score (0-15) determines the level of evidence, with Level 1 the strongest and Level 5 the weakest. The term 'surrogate' is restricted to markers attaining Levels 1 or 2 only. Surrogacy status of markers can then be directly compared within and across different areas of medicine to guide individual, trial-based or drug-development decisions. This schema would facilitate communication between clinical, researcher, regulatory, industry and consumer participants necessary for evaluation of the biomarker-surrogate-clinical endpoint relationship in their different settings.
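To make the schema's arithmetic concrete, here is a hypothetical Python sketch of the scoring described above: three domains scored 0-5, a penalty deduction, and a 0-15 total mapped to evidence Levels 1-5. The band cut points are illustrative assumptions; the abstract does not specify them.

```python
def surrogacy_level(study_design, target_outcome, statistical_eval, penalties=0):
    """Total the three domain scores (each 0-5), subtract penalties, and map
    the resulting 0-15 total to an evidence level (1 strongest .. 5 weakest).
    The band boundaries below are illustrative assumptions, not the schema's.
    """
    for s in (study_design, target_outcome, statistical_eval):
        if not 0 <= s <= 5:
            raise ValueError("domain scores range from 0 to 5")
    total = max(0, study_design + target_outcome + statistical_eval - penalties)
    bands = [(13, 1), (10, 2), (7, 3), (4, 4)]   # assumed cut points
    for cutoff, level in bands:
        if total >= cutoff:
            return level
    return 5

# Example: strong design and outcome, moderate statistics, one penalty point.
level = surrogacy_level(4, 4, 3, penalties=1)
```

Under the mapping assumed here, a marker would only earn the label "surrogate" when its total score lands in the Level 1 or Level 2 band.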
Biomarkers for Wilms Tumor: a Systematic Review
Cone, Eugene B.; Dalton, Stewart S.; Van Noord, Megan; Tracy, Elizabeth T.; Rice, Henry E.; Routh, Jonathan C.
2016-01-01
Purpose Wilms tumor is the most common childhood renal malignancy and the fourth most common childhood cancer. Many biomarkers have been studied but there has been no comprehensive summary. We systematically reviewed the literature on biomarkers in Wilms Tumor with the objective of quantifying the prognostic implication of the presence of individual tumor markers. Methods We searched for English language studies from 1980–2015 performed on children with Wilms Tumor under 18 years old with prognostic data. The protocol was conducted as per PRISMA guidelines. Two reviewers abstracted data in duplicate using a standard evaluation form. We performed descriptive statistics, then calculated relative risks and 95% confidence intervals for markers appearing in multiple level 2 or 3 studies. Results 40 studies were included examining 32 biomarkers in 7381 Wilms patients. Studies had a median of 61 patients with 24 biomarker positive patients per study, and a median follow-up of 68.4 months. Median percent of patients in Stage 1, 2, 3, 4, and 5 were 28.5%, 26.4%, 24.5%, 14.1%, and 1.7%, with 10.2% anaplasia. The strongest negative prognostic association was loss of heterozygosity on 11p15, with a risk of recurrence of 5.00, although loss of heterozygosity on 1p and gain of function on 1q were also strongly linked to increased recurrence (2.93 and 2.86 respectively). Conclusions Several tumor markers are associated with an increased risk of recurrence or a decreased risk of overall survival in Wilms Tumor. These data suggest targets for development of diagnostic tests and potential therapies. PMID:27259655
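The review's effect measures can be reproduced with the standard log relative-risk formula. Below is a small Python sketch with invented 2×2 counts (marker-positive/negative versus recurrence yes/no); the numbers are not the paper's data, only the arithmetic behind an RR and its 95% CI.

```python
import math

def relative_risk_ci(a, b, c, d, z=1.96):
    """Relative risk of an event for marker-positive vs marker-negative
    patients, with a Wald confidence interval computed on the log scale.
    a: positive with event, b: positive without, c: negative with event,
    d: negative without event.
    """
    risk_pos = a / (a + b)
    risk_neg = c / (c + d)
    rr = risk_pos / risk_neg
    se_log = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
    lo = math.exp(math.log(rr) - z * se_log)
    hi = math.exp(math.log(rr) + z * se_log)
    return rr, (lo, hi)

# Hypothetical counts: 10/20 marker-positive recur vs 10/100 marker-negative.
rr, ci = relative_risk_ci(10, 10, 10, 90)
```

With these invented counts the recurrence risk is 0.5 versus 0.1, i.e. RR = 5, comparable in magnitude to the 11p15 loss-of-heterozygosity association quoted in the abstract.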
Statistical Model Checking for Biological Systems
DEFF Research Database (Denmark)
David, Alexandre; Larsen, Kim Guldstrand; Legay, Axel
2014-01-01
Statistical Model Checking (SMC) is a highly scalable simulation-based verification approach for testing and estimating the probability that a stochastic system satisfies a given linear temporal property. The technique has been applied to (discrete and continuous time) Markov chains, stochastic...
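The core SMC loop the abstract describes (simulate many traces, check a linear-time property on each, estimate the satisfaction probability) can be sketched in a few lines. The random-walk system and the safety property below are invented toy examples, not taken from the paper or any SMC tool.

```python
import random

def smc_estimate(property_holds, simulate, runs=10000, seed=1):
    """Statistical model checking in miniature: estimate P(property) by
    simulating the system `runs` times and counting satisfying traces."""
    rng = random.Random(seed)
    hits = sum(property_holds(simulate(rng)) for _ in range(runs))
    return hits / runs

# Toy stochastic system: a symmetric random walk over 20 steps (assumed example).
def walk(rng, steps=20, p_up=0.5):
    x, trace = 0, [0]
    for _ in range(steps):
        x += 1 if rng.random() < p_up else -1
        trace.append(x)
    return trace

# Safety property checked on each trace: the walk never drops to -5 or below.
p = smc_estimate(lambda tr: min(tr) > -5, walk)
```

The estimate converges at the usual Monte Carlo rate, and standard binomial confidence bounds on `p` give the statistical guarantee SMC tools report.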
Applicability of statistical process control techniques
Schippers, W.A.J.
1998-01-01
This paper concerns the application of Process Control Techniques (PCTs) for the improvement of the technical performance of discrete production processes. Successful applications of these techniques, such as Statistical Process Control Techniques (SPC), can be found in the literature. However, some
Infinite Random Graphs as Statistical Mechanical Models
DEFF Research Database (Denmark)
Durhuus, Bergfinnur Jøgvan; Napolitano, George Maria
2011-01-01
We discuss two examples of infinite random graphs obtained as limits of finite statistical mechanical systems: a model of two-dimensional discretized quantum gravity defined in terms of causal triangulated surfaces, and the Ising model on generic random trees. For the former model we describe a ...
Integrable discretizations of the short pulse equation
International Nuclear Information System (INIS)
Feng Baofeng; Maruno, Ken-ichi; Ohta, Yasuhiro
2010-01-01
In this paper, we propose integrable semi-discrete and full-discrete analogues of the short pulse (SP) equation. The key construction is the bilinear form and determinant structure of solutions of the SP equation. We also give the determinant formulas of N-soliton solutions of the semi-discrete and full-discrete analogues of the SP equations, from which the multi-loop and multi-breather solutions can be generated. In the continuous limit, the full-discrete SP equation converges to the semi-discrete SP equation, and then to the continuous SP equation. Based on the semi-discrete SP equation, an integrable numerical scheme, i.e. a self-adaptive moving mesh scheme, is proposed and used for the numerical computation of the short pulse equation.
Probability and statistics for computer science
Johnson, James L
2011-01-01
Comprehensive and thorough development of both probability and statistics for serious computer scientists; goal-oriented: "to present the mathematical analysis underlying probability results". Special emphases on simulation and discrete decision theory. A mathematically rich but self-contained text, at a gentle pace, with a review of calculus and linear algebra in an appendix. Mathematical interludes (in each chapter) examine mathematical techniques in the context of their probabilistic or statistical importance. Numerous section exercises, summaries, historical notes, and Further Readings for reinforcement.
Kerr, Kathleen F; Meisner, Allison; Thiessen-Philbrook, Heather; Coca, Steven G; Parikh, Chirag R
2014-08-07
The field of nephrology is actively involved in developing biomarkers and improving models for predicting patients' risks of AKI and CKD and their outcomes. However, some important aspects of evaluating biomarkers and risk models are not widely appreciated, and statistical methods are still evolving. This review describes some of the most important statistical concepts for this area of research and identifies common pitfalls. Particular attention is paid to metrics proposed within the last 5 years for quantifying the incremental predictive value of a new biomarker. Copyright © 2014 by the American Society of Nephrology.
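One of the classic metrics for the incremental predictive value of a new biomarker is the change in area under the ROC curve when the marker is added to a baseline risk model. A minimal sketch follows, using the rank-sum identity for the AUC; the labels and the before/after risk scores are hypothetical, not from any study.

```python
def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney identity: the
    probability a randomly chosen case scores higher than a random control."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical risk scores before and after adding a new biomarker.
labels   = [0, 0, 0, 0, 1, 1, 1, 1]
base     = [0.1, 0.3, 0.35, 0.4, 0.2, 0.5, 0.6, 0.7]
with_bio = [0.1, 0.2, 0.3, 0.35, 0.45, 0.5, 0.6, 0.7]
delta_auc = auc(with_bio, labels) - auc(base, labels)
```

The review's caution applies here too: a positive `delta_auc` on one small sample says little without validation, and newer metrics (net reclassification, risk-stratification measures) probe aspects the AUC misses.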
International Nuclear Information System (INIS)
Anon.
1989-01-01
World data from the United Nations' latest Energy Statistics Yearbook, first published in our last issue, are completed here. The 1984-86 data were revised and 1987 data added for world commercial energy production and consumption, world natural gas plant liquids production, world LP-gas production, imports, exports, and consumption, world residual fuel oil production, imports, exports, and consumption, world lignite production, imports, exports, and consumption, world peat production and consumption, world electricity production, imports, exports, and consumption (Table 80), and world nuclear electric power production
A residual Monte Carlo method for discrete thermal radiative diffusion
International Nuclear Information System (INIS)
Evans, T.M.; Urbatsch, T.J.; Lichtenstein, H.; Morel, J.E.
2003-01-01
Residual Monte Carlo methods reduce statistical error at a rate of exp(-bN), where b is a positive constant and N is the number of particle histories. Contrast this convergence rate with 1/√N, which is the rate of statistical error reduction for conventional Monte Carlo methods. Thus, residual Monte Carlo methods hold great promise for increased efficiency relative to conventional Monte Carlo methods. Previous research has shown that the application of residual Monte Carlo methods to the solution of continuum equations, such as the radiation transport equation, is problematic for all but the simplest of cases. However, the residual method readily applies to discrete systems as long as those systems are monotone, i.e., they produce positive solutions given positive sources. We develop a residual Monte Carlo method for solving a discrete 1D non-linear thermal radiative equilibrium diffusion equation, and we compare its performance with that of the discrete conventional Monte Carlo method upon which it is based. We find that the residual method provides efficiency gains of many orders of magnitude. Part of the residual gain is due to the fact that we begin each timestep with an initial guess equal to the solution from the previous timestep. Moreover, fully consistent non-linear solutions can be obtained in a reasonable amount of time because of the effective lack of statistical noise. We conclude that the residual approach has great potential and that further research into such methods should be pursued for more general discrete and continuum systems
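The 1/√N benchmark that residual Monte Carlo improves upon is easy to demonstrate empirically. The sketch below is a toy mean estimate, unrelated to the radiative diffusion problem, showing the conventional rate: quadrupling the number of histories roughly halves the statistical error.

```python
import random

def mc_mean_error(n, trials=200, seed=0):
    """Average absolute error of an n-sample conventional Monte Carlo
    estimate of E[U] = 0.5 for U ~ Uniform(0, 1)."""
    rng = random.Random(seed)
    errs = []
    for _ in range(trials):
        est = sum(rng.random() for _ in range(n)) / n
        errs.append(abs(est - 0.5))
    return sum(errs) / trials

# 1/sqrt(N) law: 4x the histories should roughly halve the error.
e1, e2 = mc_mean_error(100), mc_mean_error(400)
```

A residual method, by contrast, would shrink the error as exp(-bN), which is why the paper reports efficiency gains of many orders of magnitude on the discrete diffusion system.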
Discrete geometric structures for architecture
Pottmann, Helmut
2010-06-13
The emergence of freeform structures in contemporary architecture raises numerous challenging research problems, most of which are related to the actual fabrication and are a rich source of research topics in geometry and geometric computing. The talk will provide an overview of recent progress in this field, with a particular focus on discrete geometric structures. Most of these result from practical requirements on segmenting a freeform shape into planar panels and on the physical realization of supporting beams and nodes. A study of quadrilateral meshes with planar faces reveals beautiful relations to discrete differential geometry. In particular, we discuss meshes which discretize the network of principal curvature lines. Conical meshes are among these meshes; they possess conical offset meshes at a constant face/face distance, which in turn leads to a supporting beam layout with so-called torsion free nodes. This work can be generalized to a variety of multilayer structures and laid the ground for an adapted curvature theory for these meshes. There are also efforts on segmenting surfaces into planar hexagonal panels. Though these are less constrained than planar quadrilateral panels, this problem is still waiting for an elegant solution. Inspired by freeform designs in architecture which involve circles and spheres, we present a new kind of triangle mesh whose faces' in-circles form a packing, i.e., the in-circles of two triangles with a common edge have the same contact point on that edge. These "circle packing (CP) meshes" exhibit an aesthetic balance of shape and size of their faces. They are closely tied to sphere packings on surfaces and to various remarkable structures and patterns which are of interest in art, architecture, and design. CP meshes constitute a new link between architectural freeform design and computational conformal geometry. Recently, certain timber structures motivated us to study discrete patterns of geodesics on surfaces. This
Radiative transfer on discrete spaces
Preisendorfer, Rudolph W; Stark, M; Ulam, S
1965-01-01
Pure and Applied Mathematics, Volume 74: Radiative Transfer on Discrete Spaces presents the geometrical structure of natural light fields. This book describes in detail with mathematical precision the radiometric interactions of light-scattering media in terms of a few well established principles.Organized into four parts encompassing 15 chapters, this volume begins with an overview of the derivations of the practical formulas and the arrangement of formulas leading to numerical solution procedures of radiative transfer problems in plane-parallel media. This text then constructs radiative tran
DEFF Research Database (Denmark)
Jørgensen, John Bagterp; Thomsen, Per Grove; Madsen, Henrik
2007-01-01
We present a novel numerically robust and computationally efficient extended Kalman filter for state estimation in nonlinear continuous-discrete stochastic systems. The resulting differential equations for the mean-covariance evolution of the nonlinear stochastic continuous-discrete time systems... The resulting filter for nonlinear stochastic continuous-discrete time systems is more than two orders of magnitude faster than a conventional implementation. This is of significance in nonlinear model predictive control applications, statistical process monitoring as well as grey-box modelling of systems described by stochastic...
Modeling Anti-Air Warfare With Discrete Event Simulation and Analyzing Naval Convoy Operations
2016-06-01
Master's thesis by Ali E. Opcin, June 2016; Thesis Advisor: Arnold H. Buss.
Bioinformatics and biomarker discovery "Omic" data analysis for personalized medicine
Azuaje, Francisco
2010-01-01
This book is designed to introduce biologists, clinicians and computational researchers to fundamental data analysis principles, techniques and tools for supporting the discovery of biomarkers and the implementation of diagnostic/prognostic systems. The focus of the book is on how fundamental statistical and data mining approaches can support biomarker discovery and evaluation, emphasising applications based on different types of "omic" data. The book also discusses design factors, requirements and techniques for disease screening, diagnostic and prognostic applications. Readers are provided w
[Autoantibodies as biomarkers].
Tron, François
2014-01-01
Activation and differentiation of autoreactive B-lymphocytes lead to the production of autoantibodies, which are thus the direct consequence of the autoimmune process. They often constitute biomarkers of autoimmune diseases and are measured by tests displaying various diagnosis sensitivity and specificity. Autoantibody titers can be correlated to the disease activity and certain autoantibody populations associated with particular clinical manifestations or tissue lesions. The demonstration that autoantibodies appear years before the onset of autoimmune diseases indicates that their presence in healthy individuals may be a predictive marker of the occurrence of disease. Certain autoantibodies could also be predictive markers of a therapeutic response to biologics and of the occurrence of side effects as well. Thus, autoantibodies are useful tools in the diagnosis and the management of patients with organ specific or non-organ specific autoimmune diseases at different steps of the autoimmune process. Copyright © 2013. Published by Elsevier Masson SAS.
Biomarkers of adverse drug reactions.
Carr, Daniel F; Pirmohamed, Munir
2018-02-01
Adverse drug reactions can be caused by a wide range of therapeutics. Adverse drug reactions affect many bodily organ systems and vary widely in severity. Milder adverse drug reactions often resolve quickly following withdrawal of the causal drug or sometimes after dose reduction. Some adverse drug reactions are severe and lead to significant organ/tissue injury which can be fatal. Adverse drug reactions also represent a financial burden to both healthcare providers and the pharmaceutical industry. Thus, a number of stakeholders would benefit from development of new, robust biomarkers for the prediction, diagnosis, and prognostication of adverse drug reactions. There has been significant recent progress in identifying predictive genomic biomarkers with the potential to be used in clinical settings to reduce the burden of adverse drug reactions. These have included biomarkers that can be used to alter drug dose (for example, thiopurine methyltransferase (TPMT) and azathioprine dose) and drug choice. The latter have in particular included human leukocyte antigen (HLA) biomarkers which identify susceptibility to immune-mediated injuries to major organs such as skin, liver, and bone marrow from a variety of drugs. This review covers the current state of the art with regard to genomic adverse drug reaction biomarkers. We also review circulating biomarkers that have the potential to be used for both diagnosis and prognosis, and have the added advantage of providing mechanistic information. In the future, we will not be relying on single biomarkers (genomic/non-genomic), but on multiple biomarker panels, integrated through the application of different omics technologies, which will provide information on predisposition, early diagnosis, prognosis, and mechanisms. Impact statement • Genetic and circulating biomarkers present significant opportunities to personalize patient therapy to minimize the risk of adverse drug reactions. ADRs are a significant health issue
3-D discrete analytical ridgelet transform.
Helbert, David; Carré, Philippe; Andres, Eric
2006-12-01
In this paper, we propose an implementation of the 3-D Ridgelet transform: the 3-D discrete analytical Ridgelet transform (3-D DART). This transform uses the Fourier strategy for the computation of the associated 3-D discrete Radon transform. The innovative step is the definition of a discrete 3-D transform with the discrete analytical geometry theory by the construction of 3-D discrete analytical lines in the Fourier domain. We propose two types of 3-D discrete lines: 3-D discrete radial lines going through the origin defined from their orthogonal projections, and 3-D planes covered with 2-D discrete line segments. These discrete analytical lines have a parameter called arithmetical thickness, allowing us to define a 3-D DART adapted to a specific application. Indeed, the 3-D DART representation is not orthogonal; it is associated with a flexible redundancy factor. The 3-D DART has a very simple forward/inverse algorithm that provides an exact reconstruction without any iterative method. In order to illustrate the potentiality of this new discrete transform, we apply the 3-D DART and its extension, the Local-DART (with smooth windowing), to the denoising of 3-D images and color video. These experimental results show that simple thresholding of the 3-D DART coefficients is efficient.
Directory of Open Access Journals (Sweden)
Li-xin XIE
2013-01-01
Full Text Available There is a higher sepsis rate in intensive care unit (ICU) patients, and sepsis is one of the most important causes of patient death, but it lacks specific clinical manifestations. Exploring sensitive and specific molecular markers of infection that accurately reflect infection severity and prognosis is therefore clinically important. In this article, based on our previous study, we introduce some new biomarkers with high sensitivity and specificity for the diagnosis and for predicting the prognosis and severity of sepsis. An increase of serum soluble triggering receptor expressed on myeloid cells-1 (sTREM-1) suggests a poor prognosis in septic patients, and changes at locus rs2234237 of sTREM-1 may be one of the important mechanisms. Additionally, urine sTREM-1 can provide an early warning of possible secondary acute kidney injury (AKI) in sepsis patients. Serum sCD163 level was found to be a more important factor than procalcitonin (PCT) and C-reactive protein (CRP) in the prognosis of sepsis, especially severe sepsis. Moreover, urine sCD163 also shows excellent performance in the diagnosis of sepsis and sepsis-associated AKI. Circulating microRNAs, such as miR-150, miR-297, miR-574-5p, miR-146a, miR-223, miR-15a and miR-16, also play important roles in the evaluation of the status of septic patients. In the foreseeable future, newly emerging technologies, including proteomics, metabonomics and trans-omics, may exert profound effects on the discovery of valuable biomarkers for sepsis.
Discrete Model for the Structure and Strength of Cementitious Materials
Balopoulos, Victor D.; Archontas, Nikolaos; Pantazopoulou, Stavroula J.
2017-12-01
Cementitious materials are characterized by brittle behavior in direct tension and by transverse dilatation (due to microcracking) under compression. Microcracking causes increasingly larger transverse strains and a phenomenological Poisson's ratio that gradually increases to about ν =0.5 and beyond, at the limit point in compression. This behavior is due to the underlying structure of cementitious pastes which is simulated here with a discrete physical model. The computational model is generic, assembled from a statistically generated, continuous network of flaky dendrites consisting of cement hydrates that emanate from partially hydrated cement grains. In the actual amorphous material, the dendrites constitute the solid phase of the cement gel and interconnect to provide the strength and stiffness against load. The idealized dendrite solid is loaded in compression and tension to compute values for strength and Poisson's effects. Parametric studies are conducted, to calibrate the statistical parameters of the discrete model with the physical and mechanical characteristics of the material, so that the familiar experimental trends may be reproduced. The model provides a framework for the study of the mechanical behavior of the material under various states of stress and strain and can be used to model the effects of additives (e.g., fibers) that may be explicitly simulated in the discrete structure.
Statistical Hair on Black Holes
International Nuclear Information System (INIS)
Strominger, A.
1996-01-01
The Bekenstein-Hawking entropy for certain BPS-saturated black holes in string theory has recently been derived by counting internal black hole microstates at weak coupling. We argue that the black hole microstate can be measured by interference experiments even in the strong coupling region where there is clearly an event horizon. Extracting information which is naively behind the event horizon is possible due to the existence of statistical quantum hair carried by the black hole. This quantum hair arises from the arbitrarily large number of discrete gauge symmetries present in string theory. copyright 1996 The American Physical Society
Aptamer-based multiplexed proteomic technology for biomarker discovery.
Gold, Larry; Ayers, Deborah; Bertino, Jennifer; Bock, Christopher; Bock, Ashley; Brody, Edward N; Carter, Jeff; Dalby, Andrew B; Eaton, Bruce E; Fitzwater, Tim; Flather, Dylan; Forbes, Ashley; Foreman, Trudi; Fowler, Cate; Gawande, Bharat; Goss, Meredith; Gunn, Magda; Gupta, Shashi; Halladay, Dennis; Heil, Jim; Heilig, Joe; Hicke, Brian; Husar, Gregory; Janjic, Nebojsa; Jarvis, Thale; Jennings, Susan; Katilius, Evaldas; Keeney, Tracy R; Kim, Nancy; Koch, Tad H; Kraemer, Stephan; Kroiss, Luke; Le, Ngan; Levine, Daniel; Lindsey, Wes; Lollo, Bridget; Mayfield, Wes; Mehan, Mike; Mehler, Robert; Nelson, Sally K; Nelson, Michele; Nieuwlandt, Dan; Nikrad, Malti; Ochsner, Urs; Ostroff, Rachel M; Otis, Matt; Parker, Thomas; Pietrasiewicz, Steve; Resnicow, Daniel I; Rohloff, John; Sanders, Glenn; Sattin, Sarah; Schneider, Daniel; Singer, Britta; Stanton, Martin; Sterkel, Alana; Stewart, Alex; Stratford, Suzanne; Vaught, Jonathan D; Vrkljan, Mike; Walker, Jeffrey J; Watrobka, Mike; Waugh, Sheela; Weiss, Allison; Wilcox, Sheri K; Wolfson, Alexey; Wolk, Steven K; Zhang, Chi; Zichi, Dom
2010-12-07
The interrogation of proteomes ("proteomics") in a highly multiplexed and efficient manner remains a coveted and challenging goal in biology and medicine. We present a new aptamer-based proteomic technology for biomarker discovery capable of simultaneously measuring thousands of proteins from small sample volumes (15 µL of serum or plasma). Our current assay measures 813 proteins with low limits of detection (1 pM median), 7 logs of overall dynamic range (~100 fM-1 µM), and 5% median coefficient of variation. This technology is enabled by a new generation of aptamers that contain chemically modified nucleotides, which greatly expand the physicochemical diversity of the large randomized nucleic acid libraries from which the aptamers are selected. Proteins in complex matrices such as plasma are measured with a process that transforms a signature of protein concentrations into a corresponding signature of DNA aptamer concentrations, which is quantified on a DNA microarray. Our assay takes advantage of the dual nature of aptamers as both folded protein-binding entities with defined shapes and unique nucleotide sequences recognizable by specific hybridization probes. To demonstrate the utility of our proteomics biomarker discovery technology, we applied it to a clinical study of chronic kidney disease (CKD). We identified two well known CKD biomarkers as well as an additional 58 potential CKD biomarkers. These results demonstrate the potential utility of our technology to rapidly discover unique protein signatures characteristic of various disease states. We describe a versatile and powerful tool that allows large-scale comparison of proteome profiles among discrete populations. This unbiased and highly multiplexed search engine will enable the discovery of novel biomarkers in a manner that is unencumbered by our incomplete knowledge of biology, thereby helping to advance the next generation of evidence-based medicine.
Aptamer-based multiplexed proteomic technology for biomarker discovery.
Directory of Open Access Journals (Sweden)
Larry Gold
2010-12-01
Full Text Available The interrogation of proteomes ("proteomics") in a highly multiplexed and efficient manner remains a coveted and challenging goal in biology and medicine. We present a new aptamer-based proteomic technology for biomarker discovery capable of simultaneously measuring thousands of proteins from small sample volumes (15 µL of serum or plasma). Our current assay measures 813 proteins with low limits of detection (1 pM median), 7 logs of overall dynamic range (~100 fM-1 µM), and 5% median coefficient of variation. This technology is enabled by a new generation of aptamers that contain chemically modified nucleotides, which greatly expand the physicochemical diversity of the large randomized nucleic acid libraries from which the aptamers are selected. Proteins in complex matrices such as plasma are measured with a process that transforms a signature of protein concentrations into a corresponding signature of DNA aptamer concentrations, which is quantified on a DNA microarray. Our assay takes advantage of the dual nature of aptamers as both folded protein-binding entities with defined shapes and unique nucleotide sequences recognizable by specific hybridization probes. To demonstrate the utility of our proteomics biomarker discovery technology, we applied it to a clinical study of chronic kidney disease (CKD). We identified two well known CKD biomarkers as well as an additional 58 potential CKD biomarkers. These results demonstrate the potential utility of our technology to rapidly discover unique protein signatures characteristic of various disease states. We describe a versatile and powerful tool that allows large-scale comparison of proteome profiles among discrete populations. This unbiased and highly multiplexed search engine will enable the discovery of novel biomarkers in a manner that is unencumbered by our incomplete knowledge of biology, thereby helping to advance the next generation of evidence-based medicine.
Multiple Sclerosis Cerebrospinal Fluid Biomarkers
Directory of Open Access Journals (Sweden)
Gavin Giovannoni
2006-01-01
Full Text Available Cerebrospinal fluid (CSF) is the body fluid closest to the pathology of multiple sclerosis (MS). For many candidate biomarkers CSF is the only fluid that can be investigated. Several factors need to be standardized when sampling CSF for biomarker research: time/volume of CSF collection, sample processing/storage, and the temporal relationship of sampling to clinical or MRI markers of disease activity. Assays used for biomarker detection must be validated so as to optimize the power of the studies. A formal method for establishing whether or not a particular biomarker can be used as a surrogate end-point needs to be adopted. This process is similar to that used in clinical trials, where the reporting of studies has to be done in a standardized way with sufficient detail to permit a critical review of the study and to enable others to reproduce the study design. A commitment must be made to report negative studies so as to prevent publication bias. Pre-defined consensus criteria need to be developed for MS-related prognostic biomarkers. Currently no candidate biomarker is suitable as a surrogate end-point. Bulk biomarkers of the neurodegenerative process such as glial fibrillary acidic protein (GFAP) and neurofilaments (NF) have advantages over intermittent inflammatory markers.
Reinventing clinical trials: a review of innovative biomarker trial designs in cancer therapies.
Lin, Ja-An; He, Pei
2015-06-01
Recently, new clinical trial designs involving biomarkers have been studied and proposed in cancer clinical research, in the hope of incorporating the rapidly growing basic research into clinical practice. Sources of information include journal articles on various biomarkers and their role in cancer clinical trials, articles and books on statistical issues in trial design, and regulatory websites, documents, and guidance for submission of targeted cancer therapies. The drug development process involves four phases; the confirmatory Phase III is essential for regulatory approval of a specific treatment. Regulatory agencies place restrictions on confirmatory trials using adaptive designs, and there is no rule of thumb for picking the most appropriate design for biomarker-related trials. Open questions remain regarding the statistical issues to be solved in new designs and the regulatory acceptance of the newly proposed trial designs. The goal is biomarker-related trial designs that can resolve the statistical issues and satisfy the regulatory requirements.
Biomarkers for Detecting Mitochondrial Disorders
Directory of Open Access Journals (Sweden)
Josef Finsterer
2018-01-01
Full Text Available (1) Objectives: Mitochondrial disorders (MIDs) are a genetically and phenotypically heterogeneous group of slowly or rapidly progressive disorders with onset from birth to senescence. Because of their variegated clinical presentation, MIDs are difficult to diagnose and are frequently missed in their early and late stages. This is why there is a need to provide biomarkers, which can be easily obtained in the case of suspecting a MID to initiate the further diagnostic work-up. (2) Methods: Literature review. (3) Results: Biomarkers for diagnostic purposes are used to confirm a suspected diagnosis and to facilitate and speed up the diagnostic work-up. For diagnosing MIDs, a number of dry and wet biomarkers have been proposed. Dry biomarkers for MIDs include the history and clinical neurological exam and structural and functional imaging studies of the brain, muscle, or myocardium by ultrasound, computed tomography (CT), magnetic resonance imaging (MRI), MR-spectroscopy (MRS), positron emission tomography (PET), or functional MRI. Wet biomarkers from blood, urine, saliva, or cerebrospinal fluid (CSF) for diagnosing MIDs include lactate, creatine-kinase, pyruvate, organic acids, amino acids, carnitines, oxidative stress markers, and circulating cytokines. The role of microRNAs, cutaneous respirometry, biopsy, exercise tests, and small molecule reporters as possible biomarkers is unsolved. (4) Conclusions: The disadvantages of most putative biomarkers for MIDs are that they hardly meet the criteria for being acceptable as a biomarker (missing longitudinal studies, not validated, not easily feasible, not cheap, not ubiquitously available) and that not all MIDs manifest in the brain, muscle, or myocardium. There is currently a lack of validated biomarkers for diagnosing MIDs.
Tumor antigens as proteogenomic biomarkers in invasive ductal carcinomas
DEFF Research Database (Denmark)
Olsen, Lars Rønn; Campos, Benito; Winther, Ole
2014-01-01
Background: The majority of genetic biomarkers for human cancers are defined by statistical screening of high-throughput genomics data. While a large number of genetic biomarkers have been proposed for diagnostic and prognostic applications, only a small number have been applied in the clinic. Similarly, the use of proteomics methods for the discovery of cancer biomarkers is increasing. The emerging field of proteogenomics seeks to enrich the value of genomics and proteomics approaches by studying the intersection of genomics and proteomics data. This task is challenging due to the complex nature ... directly linked to the hallmarks of cancer. The results found by proteogenomic analysis of the 32 tumor antigens studied here capture largely the same pathway irregularities as those elucidated from large-scale screening of genomics analyses, where several thousands of genes are often found ...
Discrete geometry: speculations on a new framework for classical electrodynamics
International Nuclear Information System (INIS)
Hemion, G.
1988-01-01
An attempt is made to describe the basic principles of physics in terms of discrete partially ordered sets. Geometric ideas are introduced by means of an action-at-a-distance formulation of classical electrodynamics. The speculations are in two main directions: (i) Gravity, one of the four elementary forces of nature, seems to be fundamentally different from the other three forces. Could it be that gravity can be explained as a natural consequence of the discrete structure? (ii) The problem of the observer in quantum mechanics continues to cause conceptual problems. Can quantum statistics be explained in terms of finite ensembles of possible partially ordered sets? The development is guided at all stages by reference to the simplest and most well-established principles of physics.
Fermion systems in discrete space-time
International Nuclear Information System (INIS)
Finster, Felix
2007-01-01
Fermion systems in discrete space-time are introduced as a model for physics on the Planck scale. We set up a variational principle which describes a non-local interaction of all fermions. This variational principle is symmetric under permutations of the discrete space-time points. We explain how for minimizers of the variational principle, the fermions spontaneously break this permutation symmetry and induce on space-time a discrete causal structure
Fermion systems in discrete space-time
Energy Technology Data Exchange (ETDEWEB)
Finster, Felix [NWF I - Mathematik, Universitaet Regensburg, 93040 Regensburg (Germany)
2007-05-15
Fermion systems in discrete space-time are introduced as a model for physics on the Planck scale. We set up a variational principle which describes a non-local interaction of all fermions. This variational principle is symmetric under permutations of the discrete space-time points. We explain how for minimizers of the variational principle, the fermions spontaneously break this permutation symmetry and induce on space-time a discrete causal structure.
Fermion Systems in Discrete Space-Time
Finster, Felix
2006-01-01
Fermion systems in discrete space-time are introduced as a model for physics on the Planck scale. We set up a variational principle which describes a non-local interaction of all fermions. This variational principle is symmetric under permutations of the discrete space-time points. We explain how for minimizers of the variational principle, the fermions spontaneously break this permutation symmetry and induce on space-time a discrete causal structure.
Fermion systems in discrete space-time
Finster, Felix
2007-05-01
Fermion systems in discrete space-time are introduced as a model for physics on the Planck scale. We set up a variational principle which describes a non-local interaction of all fermions. This variational principle is symmetric under permutations of the discrete space-time points. We explain how for minimizers of the variational principle, the fermions spontaneously break this permutation symmetry and induce on space-time a discrete causal structure.
Directory of Open Access Journals (Sweden)
Are Hugo Pripp
Full Text Available Chronic subdural hematoma (CSDH) is characterized by an "old" encapsulated collection of blood and blood breakdown products between the brain and its outermost covering (the dura). Recognized risk factors for development of CSDH are head injury, old age and use of anticoagulation medication, but its underlying pathophysiological processes are still unclear. It is assumed that a complex local process of interrelated mechanisms including inflammation, neomembrane formation, angiogenesis and fibrinolysis could be related to its development and propagation. However, the association between the biomarkers of inflammation and angiogenesis, and the clinical and radiological characteristics of CSDH patients, needs further investigation. The high number of biomarkers compared to the number of observations, the correlation between biomarkers, missing data and skewed distributions may limit the usefulness of classical statistical methods. We therefore explored lasso regression to assess the association between 30 biomarkers of inflammation and angiogenesis at the site of lesions, and selected clinical and radiological characteristics in a cohort of 93 patients. Lasso regression performs both variable selection and regularization to improve the predictive accuracy and interpretability of the statistical model. The lasso regression analysis showed a lack of robust statistical association between the biomarkers in hematoma fluid and age, gender, brain infarct, neurological deficiencies and volume of hematoma. However, there were associations between several of the biomarkers and postoperative recurrence requiring reoperation. The statistical analysis with lasso regression supported previous findings that the immunological characteristics of CSDH are local. The relationship between biomarkers, the radiological appearance of lesions and recurrence requiring reoperation had been inconclusive using classical statistical methods on these data.
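The variable-selection behaviour described in this record can be illustrated with a minimal sketch using scikit-learn's `Lasso` (a hypothetical example on synthetic data, not the study's 30-biomarker cohort; the dimensions merely echo the abstract):

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n, p = 93, 30                      # 93 patients, 30 biomarkers, as in the abstract
X = rng.normal(size=(n, p))
# assumed toy outcome driven by only 3 of the 30 biomarkers
y = 2.0 * X[:, 0] - 1.5 * X[:, 4] + X[:, 9] + rng.normal(scale=0.5, size=n)

# standardize so the L1 penalty treats all biomarkers comparably
Xs = StandardScaler().fit_transform(X)
model = Lasso(alpha=0.1).fit(Xs, y)

# lasso shrinks most coefficients exactly to zero: built-in variable selection
selected = np.flatnonzero(model.coef_)
print("selected biomarker indices:", selected)
```

The L1 penalty is what distinguishes lasso from ordinary regression here: irrelevant biomarkers drop out of the model entirely rather than receiving small nonzero weights.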
National Statistical Commission and Indian Official Statistics*
Indian Academy of Sciences (India)
IAS Admin
a good collection of official statistics of that time. With more .... statistical agencies and institutions to provide details of statistical activities .... ing several training programmes. .... ful completion of Indian Statistical Service examinations, the.
Biomarker identification and effect estimation on schizophrenia – a high-dimensional data analysis
Directory of Open Access Journals (Sweden)
Yuanzhang eLi
2015-05-01
Full Text Available Biomarkers have been examined in schizophrenia research for decades. High medical morbidity and mortality rates, as well as personal and societal costs, are associated with schizophrenia. The identification of biomarkers and alleles, which often have a small effect individually, may help to develop new diagnostic tests for early identification and treatment. Currently, there is not a commonly accepted statistical approach to identify predictive biomarkers from high dimensional data. We used the space Decomposition-Gradient-Regression (DGR) method to select biomarkers which are associated with the risk of schizophrenia. Then, we used the gradient scores, generated from the selected biomarkers, as the prediction factor in regression to estimate their effects. We also used an alternative approach, classification and regression tree (CART), to compare with the biomarkers selected by DGR and found that about 70% of the selected biomarkers were the same. However, the advantage of DGR is that it can evaluate individual effects for each biomarker from their combined effect. In a DGR analysis of serum specimens of US military service members with a diagnosis of schizophrenia from 1992 to 2005 and their controls, Alpha-1-Antitrypsin (AAT), Interleukin-6 receptor (IL-6r) and Connective Tissue Growth Factor (CTGF) were selected to identify schizophrenia for males; and Alpha-1-Antitrypsin (AAT), Apolipoprotein B (Apo B) and Sortilin were selected for females. If these findings from military subjects are replicated by other studies, they suggest the possibility of a novel biomarker panel as an adjunct to earlier diagnosis and initiation of treatment.
Some Statistics for Measuring Large-Scale Structure
Brandenberger, Robert H.; Kaplan, David M.; Ramsey, Stephen A.
1993-01-01
Good statistics for measuring large-scale structure in the Universe must be able to distinguish between different models of structure formation. In this paper, two- and three-dimensional "counts in cell" statistics and a new "discrete genus statistic" are applied to toy versions of several popular theories of structure formation: the random phase cold dark matter model, cosmic string models, and the global texture scenario. All three statistics appear quite promising in terms of differentiating betw...
Wang, Jiahui; Fan, Zheng; Vandenborne, Krista; Walter, Glenn; Shiloh-Malawsky, Yael; An, Hongyu; Kornegay, Joe N; Styner, Martin A
2013-09-01
Golden retriever muscular dystrophy (GRMD) is a widely used canine model of Duchenne muscular dystrophy (DMD). Recent studies have shown that magnetic resonance imaging (MRI) can be used to non-invasively detect consistent changes in both DMD and GRMD. In this paper, we propose a semiautomated system to quantify MRI biomarkers of GRMD. Our system was applied to a database of 45 MRI scans from 8 normal and 10 GRMD dogs in a longitudinal natural history study. We first segmented six proximal pelvic limb muscles using a semiautomated full muscle segmentation method. We then performed preprocessing, including intensity inhomogeneity correction, spatial registration of different image sequences, intensity calibration of T2-weighted and T2-weighted fat-suppressed images, and calculation of MRI biomarker maps. Finally, for each of the segmented muscles, we automatically measured MRI biomarkers of muscle volume, intensity statistics over MRI biomarker maps, and statistical image texture features. The muscle volume and the mean intensities in T2 value, fat, and water maps showed group differences between normal and GRMD dogs. For the statistical texture biomarkers, both the histogram and run-length matrix features showed obvious group differences between normal and GRMD dogs. The full muscle segmentation showed significantly less error and variability in the proposed biomarkers when compared to the standard, limited muscle range segmentation. The experimental results demonstrated that this quantification tool could reliably quantify MRI biomarkers in GRMD dogs, suggesting that it would also be useful for quantifying disease progression and measuring therapeutic effect in DMD patients.
Discrete Haar transform and protein structure.
Morosetti, S
1997-12-01
The discrete Haar transform of the sequence of the backbone dihedral angles (phi and psi) was performed over a set of X-ray protein structures of high resolution from the Brookhaven Protein Data Bank. Afterwards, new dihedral angles were calculated by the inverse transform, using a growing number of Haar functions, from the lower to the higher degree. New structures were obtained using these dihedral angles, with standard values for bond lengths and angles, and with omega = 0 degrees. The reconstructed structures were compared with the experimental ones and analyzed by visual inspection and statistical analysis. When half of the Haar coefficients were used, the reconstructed structures had not yet collapsed to a tertiary folding, but they already showed most of the secondary motifs realized. These results indicate a substantial separation of structural information in the space of the Haar transform, with the secondary structural information mainly present in the Haar coefficients of lower degree, and the tertiary information present in the higher-degree coefficients. Because of this separation, the representation of the folded structures in the space of the Haar transform seems a promising candidate for addressing the problem of premature convergence in genetic algorithms.
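The coefficient-truncation idea in this record can be sketched with a minimal pure-NumPy Haar transform; the signal below is a toy angle sequence, not PDB data, and the exact normalization is one common convention:

```python
import numpy as np

def haar_forward(x):
    """Full discrete Haar transform of a length-2^k signal."""
    coeffs = []
    a = x.astype(float)
    while len(a) > 1:
        avg = (a[0::2] + a[1::2]) / np.sqrt(2)   # coarse averages
        det = (a[0::2] - a[1::2]) / np.sqrt(2)   # detail (higher-degree) coeffs
        coeffs.append(det)
        a = avg
    coeffs.append(a)                # single coarsest coefficient
    return coeffs[::-1]             # coarse first, finest details last

def haar_inverse(coeffs):
    a = coeffs[0]
    for det in coeffs[1:]:
        out = np.empty(2 * len(a))
        out[0::2] = (a + det) / np.sqrt(2)
        out[1::2] = (a - det) / np.sqrt(2)
        a = out
    return a

# toy "dihedral angle" sequence: slow trend plus fine wiggles
phi = np.linspace(-60, 60, 64) + 5 * np.sin(np.arange(64))
c = haar_forward(phi)

# keep only the lower-degree half of the coefficients (zero the finest details)
c_trunc = [np.zeros_like(d) if i == len(c) - 1 else d for i, d in enumerate(c)]
phi_coarse = haar_inverse(c_trunc)

print("exact reconstruction error:", np.max(np.abs(haar_inverse(c) - phi)))
```

Zeroing the finest-level details leaves adjacent pairs of reconstructed angles equal, illustrating how the low-degree coefficients carry the coarse (secondary-structure-like) information.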
Inevitable randomness in discrete mathematics
Beck, Jozsef
2009-01-01
Mathematics has been called the science of order. The subject is remarkably good for generalizing specific cases to create abstract theories. However, mathematics has little to say when faced with highly complex systems, where disorder reigns. This disorder can be found in pure mathematical arenas, such as the distribution of primes, the 3n+1 conjecture, and class field theory. The purpose of this book is to provide examples--and rigorous proofs--of the complexity law: (1) discrete systems are either simple or they exhibit advanced pseudorandomness; (2) a priori probabilities often exist even when there is no intrinsic symmetry. Part of the difficulty in achieving this purpose is in trying to clarify these vague statements. The examples turn out to be fascinating instances of deep or mysterious results in number theory and combinatorics. This book considers randomness and complexity. The traditional approach to complexity--computational complexity theory--is to study very general complexity classes, such as P...
Quantum evolution by discrete measurements
International Nuclear Information System (INIS)
Roa, L; Guevara, M L Ladron de; Delgado, A; Olivares-RenterIa, G; Klimov, A B
2007-01-01
In this article we review two ways of driving a quantum system to a known pure state via a discrete sequence of von Neumann measurements. The first of them assumes that the initial state of the system is unknown, and the evolution is attained only with the help of two non-commuting observables. For this method, the overall success probability is maximized when the eigenstates of the involved observables constitute mutually unbiased bases. The second method assumes the initial state is known and uses N observables which are consecutively measured to make the state of the system approach the target state. The probability of success of this procedure converges to 1 as the number of observables increases.
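The second procedure — consecutively measuring gradually rotated observables to drag a known state toward a target — can be sketched for a single qubit. This is a toy illustration under assumed choices (equally spaced rotation angles, real state vectors), not the authors' general construction:

```python
import numpy as np

rng = np.random.default_rng(1)

def rotated_basis(theta):
    """Orthonormal basis (columns) rotated by theta from the computational basis."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

def measure(state, basis):
    """Projective measurement of a qubit; returns the post-measurement state."""
    probs = np.abs(basis.T @ state) ** 2
    k = rng.choice(2, p=probs / probs.sum())
    return basis[:, k]

def drag(N):
    """Drive |0> toward |1> with N measurements along gradually rotated axes."""
    state = np.array([1.0, 0.0])
    for k in range(1, N + 1):
        state = measure(state, rotated_basis(k * np.pi / (2 * N)))
    return abs(state @ np.array([0.0, 1.0])) ** 2   # overlap with target |1>

trials = 2000
success = np.mean([drag(20) for _ in range(trials)])
print("empirical success probability for N=20:", success)
print("theory cos^(2N)(pi/(2N)):", np.cos(np.pi / 40) ** 40)
```

At each step the state follows the rotated eigenstate with probability cos²(π/(2N)), so the overall success probability cos^(2N)(π/(2N)) tends to 1 as N grows, matching the convergence claim in the abstract.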
Quantum evolution by discrete measurements
Energy Technology Data Exchange (ETDEWEB)
Roa, L [Center for Quantum Optics and Quantum Information, Departamento de Fisica, Universidad de Concepcion, Casilla 160-C, Concepcion (Chile); Guevara, M L Ladron de [Departamento de Fisica, Universidad Catolica del Norte, Casilla 1280, Antofagasta (Chile); Delgado, A [Center for Quantum Optics and Quantum Information, Departamento de Fisica, Universidad de Concepcion, Casilla 160-C, Concepcion (Chile); Olivares-RenterIa, G [Center for Quantum Optics and Quantum Information, Departamento de Fisica, Universidad de Concepcion, Casilla 160-C, Concepcion (Chile); Klimov, A B [Departamento de Fisica, Universidad de Guadalajara, Revolucion 1500, 44420 Guadalajara, Jalisco (Mexico)
2007-10-15
In this article we review two ways of driving a quantum system to a known pure state via a discrete sequence of von Neumann measurements. The first of them assumes that the initial state of the system is unknown, and the evolution is attained only with the help of two non-commuting observables. For this method, the overall success probability is maximized when the eigenstates of the involved observables constitute mutually unbiased bases. The second method assumes the initial state is known and uses N observables which are consecutively measured to make the state of the system approach the target state. The probability of success of this procedure converges to 1 as the number of observables increases.
Discrete stochastic processes and applications
Collet, Jean-François
2018-01-01
This unique text for beginning graduate students gives a self-contained introduction to the mathematical properties of stochastic processes and presents their applications to Markov processes, coding theory, population dynamics, and search engine design. The book is ideal for a newly designed course in an introduction to probability and information theory. Prerequisites include a working knowledge of linear algebra, calculus, and probability theory. The first part of the text focuses on the rigorous theory of Markov processes on countable spaces (Markov chains) and provides the basis for developing solid probabilistic intuition without the need for a course in measure theory. The approach taken is gradual, beginning with the case of discrete time and moving on to that of continuous time. The second part of this text is more applied; its core introduces various uses of convexity in probability and presents a nice treatment of entropy.
Discrete calculus methods for counting
Mariconda, Carlo
2016-01-01
This book provides an introduction to combinatorics, finite calculus, formal series, recurrences, and approximations of sums. Readers will find not only coverage of the basic elements of the subjects but also deep insights into a range of less common topics rarely considered within a single book, such as counting with occupancy constraints, a clear distinction between algebraic and analytical properties of formal power series, an introduction to discrete dynamical systems with a thorough description of Sarkovskii’s theorem, symbolic calculus, and a complete description of the Euler-Maclaurin formulas and their applications. Although several books touch on one or more of these aspects, precious few cover all of them. The authors, both pure mathematicians, have attempted to develop methods that will allow the student to formulate a given problem in a precise mathematical framework. The aim is to equip readers with a sound strategy for classifying and solving problems by pursuing a mathematically rigorous yet ...
Modeling discrete competitive facility location
Karakitsiou, Athanasia
2015-01-01
This book presents an up-to-date review of modeling and optimization approaches for location problems, along with a new bi-level programming methodology which captures the effect of competition of both producers and customers on facility location decisions. While many optimization approaches simplify location problems by assuming decision making in isolation, this monograph focuses on models which take into account the competitive environment in which such decisions are made. New insights in modeling, algorithmic and theoretical possibilities are opened by this approach and new applications become possible. Competition on equal terms as well as competition between a market leader and followers are considered in this study; consequently, bi-level optimization methodology is emphasized and further developed. This book provides insights regarding modeling complexity and algorithmic approaches to discrete competitive location problems. In traditional location modeling, assignment of customer demands to supply sources is made ...
Discrete modelling of drapery systems
Thoeni, Klaus; Giacomini, Anna
2016-04-01
Drapery systems are an efficient and cost-effective measure in preventing and controlling rockfall hazards on rock slopes. The simplest form consists of a row of ground anchors along the top of the slope connected to a horizontal support cable from which a wire mesh is suspended down the face of the slope. Such systems are generally referred to as simple or unsecured draperies (Badger and Duffy 2012). Variations such as secured draperies, where a pattern of ground anchors is incorporated within the field of the mesh, and hybrid systems, where the upper part of an unsecured drapery is elevated to intercept rockfalls originating upslope of the installation, are becoming more and more popular. This work presents a discrete element framework for the simulation of unsecured drapery systems and its variations. The numerical model is based on the classical discrete element method (DEM) and implemented in the open-source framework YADE (Šmilauer et al., 2010). The model takes all relevant interactions between block, drapery and slope into account (Thoeni et al., 2014) and was calibrated and validated based on full-scale experiments (Giacomini et al., 2012). The block is modelled as a rigid clump made of spherical particles, which allows any shape to be approximated. The drapery is represented by a set of spherical particles with remote interactions. The behaviour of the remote interactions is governed by the constitutive behaviour of the wire and generally corresponds to a piecewise linear stress-strain relation (Thoeni et al., 2013). The same concept is used to model wire ropes. The rock slope is represented by rigid triangular elements to which material properties (e.g., normal coefficient of restitution, friction angle) are assigned. The capabilities of the developed model to simulate drapery systems and estimate the residual hazard involved with such systems are shown. References Badger, T.C., Duffy, J.D. (2012) Drapery systems. In: Turner, A.K., Schuster R
Tellinghuisen, Joel
2008-01-01
The method of least squares is probably the most powerful data analysis tool available to scientists. Toward a fuller appreciation of that power, this work begins with an elementary review of statistics fundamentals, and then progressively increases in sophistication as the coverage is extended to the theory and practice of linear and nonlinear least squares. The results are illustrated in application to data analysis problems important in the life sciences. The review of fundamentals includes the role of sampling and its connection to probability distributions, the Central Limit Theorem, and the importance of finite variance. Linear least squares are presented using matrix notation, and the significance of the key probability distributions-Gaussian, chi-square, and t-is illustrated with Monte Carlo calculations. The meaning of correlation is discussed, including its role in the propagation of error. When the data themselves are correlated, special methods are needed for the fitting, as they are also when fitting with constraints. Nonlinear fitting gives rise to nonnormal parameter distributions, but the 10% Rule of Thumb suggests that such problems will be insignificant when the parameter is sufficiently well determined. Illustrations include calibration with linear and nonlinear response functions, the dangers inherent in fitting inverted data (e.g., Lineweaver-Burk equation), an analysis of the reliability of the van't Hoff analysis, the problem of correlated data in the Guggenheim method, and the optimization of isothermal titration calorimetry procedures using the variance-covariance matrix for experiment design. The work concludes with illustrations on assessing and presenting results.
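The matrix-notation linear least squares described in this record, including the variance-covariance matrix used for error propagation and experiment design, can be sketched as follows (simulated calibration data with an assumed noise scale, not an example from the review itself):

```python
import numpy as np

rng = np.random.default_rng(0)

# simulated calibration data: y = a + b*x with Gaussian noise of scale sigma
a_true, b_true, sigma = 1.0, 2.5, 0.3
x = np.linspace(0, 10, 50)
y = a_true + b_true * x + rng.normal(scale=sigma, size=x.size)

# design matrix for the linear model, solved by ordinary least squares
X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# variance-covariance matrix of the estimates: s^2 (X^T X)^{-1}
dof = x.size - 2                                  # points minus parameters
s2 = np.sum((y - X @ beta) ** 2) / dof            # estimated noise variance
cov = s2 * np.linalg.inv(X.T @ X)
stderr = np.sqrt(np.diag(cov))

print("estimates:", beta)
print("standard errors:", stderr)
```

The off-diagonal element of `cov` quantifies the correlation between intercept and slope estimates, which is exactly the quantity needed when propagating error through derived quantities.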
Urinary Biomarkers of Brain Diseases
Directory of Open Access Journals (Sweden)
Manxia An
2015-12-01
Full Text Available Biomarkers are measurable changes associated with a physiological or pathophysiological process. Unlike blood, urine is not subject to homeostatic mechanisms. Therefore, greater fluctuations can occur in urine than in blood, better reflecting changes in the human body. A roadmap for the urine biomarker era has been proposed. Although urine analysis has been attempted for clinical diagnosis, and urine has been monitored during the progression of many diseases, particularly urinary system diseases, whether urine can reflect brain disease status remains uncertain. As some biomarkers of brain diseases can be detected in body fluids such as cerebrospinal fluid and blood, there is a possibility that urine also contains biomarkers of brain diseases. This review summarizes the clues to brain diseases reflected in the urine proteome and metabolome.
Biomarkers of latent TB infection
DEFF Research Database (Denmark)
Ruhwald, Morten; Ravn, Pernille
2009-01-01
For the last 100 years, the tuberculin skin test (TST) has been the only diagnostic tool available for latent TB infection (LTBI) and no biomarker per se is available to diagnose the presence of LTBI. With the introduction of M. tuberculosis-specific IFN-gamma release assays (IGRAs), a new area of in vitro immunodiagnostic tests for LTBI based on biomarker readout has become a reality. In this review, we discuss existing evidence on the clinical usefulness of IGRAs and the indefinite number of potential new biomarkers that can be used to improve diagnosis of latent TB infection. We also present early data suggesting that the monocyte-derived chemokine inducible protein-10 may be useful as a novel biomarker for the immunodiagnosis of latent TB infection.
COPD Exacerbation Biomarkers Validated Using Multiple Reaction Monitoring Mass Spectrometry.
Directory of Open Access Journals (Sweden)
Janice M Leung
Full Text Available Acute exacerbations of chronic obstructive pulmonary disease (AECOPD) result in considerable morbidity and mortality. However, there are no objective biomarkers to diagnose AECOPD. We used multiple reaction monitoring mass spectrometry to quantify 129 distinct proteins in plasma samples from patients with COPD. This analytical approach was first performed in a biomarker cohort of patients hospitalized with AECOPD (Cohort A, n = 72). Proteins differentially expressed between AECOPD and convalescent states were chosen using false discovery rate and fold change (> 1.2) criteria. Protein selection and classifier building were performed using an elastic net logistic regression model. The performance of the biomarker panel was then tested in two independent AECOPD cohorts (Cohort B, n = 37, and Cohort C, n = 109) using leave-pair-out cross-validation methods. Five proteins were identified distinguishing AECOPD and convalescent states in Cohort A. Biomarker scores derived from this model were significantly higher during AECOPD than in the convalescent state in the discovery cohort (p < 0.001). The receiver operating characteristic cross-validation area under the curve (CV-AUC) statistic was 0.73 in Cohort A, while in the replication cohorts the CV-AUC was 0.77 for Cohort B and 0.79 for Cohort C. A panel of five biomarkers shows promise in distinguishing AECOPD from convalescence and may provide the basis for a clinical blood test to diagnose AECOPD. Further validation in larger cohorts is necessary for future clinical translation.
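The elastic net selection-plus-classification step described in this record can be sketched with scikit-learn's `LogisticRegression`. The data below are entirely synthetic; only the cohort and protein counts are taken from the abstract, while the effect sizes and penalty settings are assumptions:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n, p = 72, 129                    # 72 subjects, 129 plasma proteins, as in Cohort A
X = rng.normal(size=(n, p))
# assumed toy signal: exacerbation status driven by 5 of the 129 proteins
logit = X[:, :5] @ np.array([1.5, -1.2, 1.0, 0.8, -0.9])
y = (logit + rng.logistic(size=n) > 0).astype(int)

# elastic net = combined L1 (sparsity) and L2 (grouping) penalties
Xs = StandardScaler().fit_transform(X)
clf = LogisticRegression(penalty="elasticnet", solver="saga",
                         l1_ratio=0.5, C=0.1, max_iter=5000).fit(Xs, y)

panel = np.flatnonzero(clf.coef_[0])   # proteins surviving the combined penalty
print("proteins retained in the panel:", panel.size, "of", p)
```

The L1 component discards most of the 129 candidates, which is what makes a small diagnostic panel possible; the L2 component stabilizes the choice among correlated proteins.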
Biomarkers in differentiating clinical dengue cases: A prospective cohort study
Directory of Open Access Journals (Sweden)
Gary Kim Kuan Low
2015-12-01
Full Text Available Objective: To evaluate five biomarkers (neopterin, vascular endothelial growth factor-A, thrombomodulin, soluble vascular cell adhesion molecule 1 and pentraxin 3) in differentiating clinical dengue cases. Methods: A prospective cohort study was conducted whereby blood samples were obtained on the day of presentation and the final diagnoses were obtained at the end of the patients’ follow-up. All patients included in the study were 15 years old or older, not pregnant, not previously infected by dengue, and did not have cancer or an autoimmune or haematological disorder. The median test was performed to compare the biomarker levels. A subgroup Mann-Whitney U test was used to compare severe dengue and non-severe dengue cases. The Monte Carlo method was used to estimate the 2-tailed probability (P value) for independent variables with unequal numbers of patients. Results: All biomarkers except thrombomodulin had P values < 0.001 in differentiating among the healthy subjects, non-dengue fever, dengue without warning signs, and dengue with warning signs/severe dengue. Subgroup analysis for all the biomarkers between severe dengue and non-severe dengue cases was not statistically significant except for vascular endothelial growth factor-A (P < 0.05). Conclusions: Certain biomarkers were able to differentiate the clinical dengue cases. This could be potentially useful in classifying and determining the severity of dengue-infected patients in the hospital.
Breath biomarkers in toxicology.
Pleil, Joachim D
2016-11-01
Exhaled breath has joined blood and urine as a valuable resource for sampling and analyzing biomarkers in human media for assessing exposure, uptake, metabolism, and elimination of toxic chemicals. This article focuses on the current use of exhaled gas, aerosols, and vapor in human breath, the methods for collection, and ultimately the use of the resulting data. Some advantages of breath are the noninvasive and self-administered nature of collection, the essentially inexhaustible supply, and the fact that breath sampling does not produce potentially infectious waste such as needles, wipes, bandages, and glassware. In contrast to blood and urine, breath samples can be collected on demand in rapid succession and so allow toxicokinetic observations of uptake and elimination in any time frame. Furthermore, new technologies now allow capturing condensed breath vapor directly, or just the aerosol fraction alone, to gain access to inorganic species, lung pH, proteins and protein fragments, cellular DNA, and whole microorganisms from the pulmonary microbiome. Future applications are discussed, especially the use of isotopically labeled probes, non-targeted (discovery) analysis, cellular-level toxicity testing, and ultimately assessing the "crowd breath" of groups of people and its relation to dose of airborne and other environmental chemicals at the population level.
Semiclassical analysis, Witten Laplacians, and statistical mechanics
Helffer, Bernard
2002-01-01
This important book explains how the technique of Witten Laplacians may be useful in statistical mechanics. It considers the problem of analyzing the decay of correlations, after presenting its origin in statistical mechanics. In addition, it compares the Witten Laplacian approach with other techniques, such as the transfer matrix approach and its semiclassical analysis. The author concludes by providing a complete proof of the uniform Log-Sobolev inequality. Contents: Witten Laplacians Approach; Problems in Statistical Mechanics with Discrete Spins; Laplace Integrals and Transfer Operators; S
Statistical mechanics of cellular automata
International Nuclear Information System (INIS)
Wolfram, S.
1983-01-01
Cellular automata are used as simple mathematical models to investigate self-organization in statistical mechanics. A detailed analysis is given of ''elementary'' cellular automata consisting of a sequence of sites with values 0 or 1 on a line, with each site evolving deterministically in discrete time steps according to definite rules involving the values of its nearest neighbors. With simple initial configurations, the cellular automata either tend to homogeneous states, or generate self-similar patterns with fractal dimensions of approximately 1.59 or 1.69. With ''random'' initial configurations, the irreversible character of the cellular automaton evolution leads to several self-organization phenomena. Statistical properties of the structures generated are found to lie in two universality classes, independent of the details of the initial state or the cellular automaton rules. More complicated cellular automata are briefly considered, and connections with dynamical systems theory and the formal theory of computation are discussed.
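The ''elementary'' automata described above are easy to reproduce. A minimal sketch, assuming Wolfram's standard rule numbering: rule 90 evolved from a single nonzero seed generates the self-similar Sierpinski pattern whose fractal dimension log 3 / log 2 ≈ 1.58 is one of the values mentioned in the abstract.

```python
def step(cells, rule):
    """One synchronous update of an elementary CA (periodic boundary)."""
    n = len(cells)
    new = []
    for i in range(n):
        left, center, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        neighborhood = (left << 2) | (center << 1) | right   # value 0..7
        new.append((rule >> neighborhood) & 1)               # look up rule bit
    return new

def evolve(width, steps, rule):
    cells = [0] * width
    cells[width // 2] = 1        # single nonzero seed in the middle
    history = [cells]
    for _ in range(steps):
        cells = step(cells, rule)
        history.append(cells)
    return history

# Rule 90 is the XOR of the two neighbors; its space-time diagram from a
# single seed is Pascal's triangle mod 2 (the Sierpinski triangle).
history = evolve(width=33, steps=16, rule=90)
```

Row t of the resulting history has ones exactly where the binomial coefficients C(t, k) are odd, which is what produces the self-similar structure.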
A Discrete Spectral Problem and Related Hierarchy of Discrete Hamiltonian Lattice Equations
International Nuclear Information System (INIS)
Xu Xixiang; Cao Weili
2007-01-01
Starting from a discrete matrix spectral problem, a hierarchy of lattice soliton equations is presented through discrete zero curvature representation. The resulting lattice soliton equations possess non-local Lax pairs. The Hamiltonian structures are established for the resulting hierarchy by the discrete trace identity. Liouville integrability of the resulting hierarchy is demonstrated.
Lassere, Marissa N.; Johnson, Kent R.; Boers, Maarten; Tugwell, Peter; Brooks, Peter; Simon, Lee; Strand, Vibeke; Conaghan, Philip G.; Ostergaard, Mikkel; Maksymowych, Walter P.; Landewe, Robert; Bresnihan, Barry; Tak, Paul-Peter; Wakefield, Richard; Mease, Philip; Bingham, Clifton O.; Hughes, Michael; Altman, Doug; Buyse, Marc; Galbraith, Sally; Wells, George
2007-01-01
OBJECTIVE: There are clear advantages to using biomarkers and surrogate endpoints, but concerns about clinical and statistical validity and systematic methods to evaluate these aspects hinder their efficient application. Our objective was to review the literature on biomarkers and surrogates to
Statistical methods and their applications in constructional engineering
International Nuclear Information System (INIS)
1977-01-01
An introduction to the basic terms of statistics is followed by a discussion of elements of probability theory, customary discrete and continuous distributions, simulation methods, statistical supporting framework dynamics, and a cost-benefit analysis of the methods introduced. (RW) [de]
Geometry and Hamiltonian mechanics on discrete spaces
International Nuclear Information System (INIS)
Talasila, V; Clemente-Gallardo, J; Schaft, A J van der
2004-01-01
Numerical simulation is often crucial for analysing the behaviour of many complex systems which do not admit analytic solutions. To this end, one either converts a 'smooth' model into a discrete (in space and time) model, or models systems directly at a discrete level. The goal of this paper is to provide a discrete analogue of differential geometry, and to define on these discrete models a formal discrete Hamiltonian structure; in doing so we try to bring together various fundamental concepts from numerical analysis, differential geometry, algebraic geometry, simplicial homology and classical Hamiltonian mechanics. For example, the concept of a twisted derivation is borrowed from algebraic geometry for developing a discrete calculus. The theory is applied to a nonlinear pendulum and we compare the dynamics obtained through a discrete modelling approach with the dynamics obtained via the usual discretization procedures. Also an example of an energy-conserving algorithm on a simple harmonic oscillator is presented, and its effect on the Poisson structure is discussed.
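The contrast between structure-preserving discrete models and the usual discretization procedures can be illustrated on the simple harmonic oscillator mentioned at the end. This is the standard symplectic-vs-explicit Euler comparison, not necessarily the authors' own algorithm: on H = (p² + q²)/2, explicit Euler systematically pumps energy into the system, while the symplectic (semi-implicit) Euler update keeps the energy error bounded.

```python
def explicit_euler(q, p, h, steps):
    # Naive discretization: both updates use the old state.
    for _ in range(steps):
        q, p = q + h * p, p - h * q
    return q, p

def symplectic_euler(q, p, h, steps):
    # Structure-preserving discretization: update momentum first,
    # then position with the *new* momentum.
    for _ in range(steps):
        p = p - h * q
        q = q + h * p
    return q, p

def energy(q, p):
    return 0.5 * (q * q + p * p)

h, steps = 0.01, 10000        # integrate to t = 100, initial energy 0.5
qe, pe = explicit_euler(1.0, 0.0, h, steps)
qs, ps = symplectic_euler(1.0, 0.0, h, steps)
```

Each explicit Euler step multiplies the energy by (1 + h²), so after 10⁴ steps the energy has grown by a factor of roughly e; the symplectic scheme instead conserves a nearby "modified" Hamiltonian, so its energy merely oscillates within O(h) of the true value.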
Cuspidal discrete series for semisimple symmetric spaces
DEFF Research Database (Denmark)
Andersen, Nils Byrial; Flensted-Jensen, Mogens; Schlichtkrull, Henrik
2012-01-01
We propose a notion of cusp forms on semisimple symmetric spaces. We then study the real hyperbolic spaces in detail, and show that there exist both cuspidal and non-cuspidal discrete series. In particular, we show that all the spherical discrete series are non-cuspidal.
Discrete Riccati equation solutions: Distributed algorithms
Directory of Open Access Journals (Sweden)
D. G. Lainiotis
1996-01-01
Full Text Available In this paper new distributed algorithms for the solution of the discrete Riccati equation are introduced. The algorithms are used to provide robust and computationally efficient solutions to the discrete Riccati equation. The proposed distributed algorithms are theoretically interesting and computationally attractive.
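The abstract does not spell out the distributed algorithms themselves; as a baseline, the discrete algebraic Riccati equation can be solved by plain fixed-point iteration, shown here for the scalar case with made-up coefficients. The distributed schemes the paper proposes build on iterations of this kind.

```python
# Scalar discrete algebraic Riccati equation:
#     P = a^2 P - (a b P)^2 / (r + b^2 P) + q
# solved by fixed-point iteration. The coefficients are illustrative only.
def dare_scalar(a, b, q, r, tol=1e-12, max_iter=10000):
    p = q  # any positive initial guess
    for _ in range(max_iter):
        p_next = a * a * p - (a * b * p) ** 2 / (r + b * b * p) + q
        if abs(p_next - p) < tol:
            return p_next
        p = p_next
    raise RuntimeError("no convergence")

p = dare_scalar(a=0.9, b=1.0, q=1.0, r=1.0)
```

For these values the map simplifies to p ↦ 0.81 p / (1 + p) + 1, a contraction near its fixed point, so the iteration converges quickly; the fixed point is the positive root of p² − 0.81 p − 1 = 0.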
Painleve test and discrete Boltzmann equations
International Nuclear Information System (INIS)
Euler, N.; Steeb, W.H.
1989-01-01
The Painleve test for various discrete Boltzmann equations is performed. The connection with integrability is discussed. Furthermore, the Lie symmetry vector fields are derived and a group-theoretical reduction of the discrete Boltzmann equations to ordinary differential equations is performed. Lie-Bäcklund transformations are obtained by performing the Painleve analysis for the ordinary differential equations. 16 refs
Variance Swap Replication: Discrete or Continuous?
Directory of Open Access Journals (Sweden)
Fabien Le Floc’h
2018-02-01
Full Text Available The popular replication formula to price variance swaps assumes continuity of traded option strikes. In practice, however, there is only a discrete set of option strikes traded on the market. We present here different discrete replication strategies and explain why the continuous replication price is more relevant.
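The discrete replication the abstract refers to can be sketched numerically: under a flat Black volatility, summing out-of-the-money option prices over a discrete strike grid approximates the continuous replication value σ². This is a generic illustration (zero rates, an evenly spaced hypothetical strike grid), not the specific strategies proposed in the paper.

```python
import math

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def black_price(forward, strike, vol, t, is_call):
    # Undiscounted Black-76 price (zero rates for simplicity).
    d1 = (math.log(forward / strike) + 0.5 * vol * vol * t) / (vol * math.sqrt(t))
    d2 = d1 - vol * math.sqrt(t)
    call = forward * norm_cdf(d1) - strike * norm_cdf(d2)
    return call if is_call else call - (forward - strike)  # put via parity

def variance_swap_discrete(forward, vol, t, strikes):
    # Discrete version of the replication integral (2/T) * int dK/K^2 * Q(K),
    # using the out-of-the-money option at each traded strike.
    total = 0.0
    for i, k in enumerate(strikes):
        dk = (strikes[min(i + 1, len(strikes) - 1)]
              - strikes[max(i - 1, 0)]) / 2.0
        q = black_price(forward, k, vol, t, is_call=(k >= forward))
        total += dk / (k * k) * q
    return 2.0 / t * total

vol, t = 0.2, 1.0
strikes = [50.0 + i for i in range(151)]     # hypothetical strikes 50..200
fair_var = variance_swap_discrete(100.0, vol, t, strikes)
```

With a fine, wide grid the discrete sum lands very close to σ² = 0.04; coarsening or truncating the strike grid is exactly what opens the gap between discrete and continuous replication that the paper analyses.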
Discretization vs. Rounding Error in Euler's Method
Borges, Carlos F.
2011-01-01
Euler's method for solving initial value problems is an excellent vehicle for observing the relationship between discretization error and rounding error in numerical computation. Reductions in stepsize, in order to decrease discretization error, necessarily increase the number of steps and so introduce additional rounding error. The problem is…
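The stepsize trade-off is easy to demonstrate on y' = y, y(0) = 1 over [0, 1]. A sketch of the discretization-error half of the story; the rounding-error side needs on the order of 10⁸ steps in double precision, so it is only noted in a comment.

```python
import math

# Euler's method for y' = f(y): halving the step size roughly halves the
# global discretization error (first-order convergence). With ever-smaller
# steps, the growing number of floating-point operations eventually lets
# rounding error dominate -- not shown here, since in double precision
# that crossover needs ~1e8 steps.
def euler(f, y0, t_end, n_steps):
    h = t_end / n_steps
    y = y0
    for _ in range(n_steps):
        y += h * f(y)
    return y

errors = []
for n in (10, 20, 40, 80):
    y = euler(lambda y: y, 1.0, 1.0, n)
    errors.append(abs(y - math.e))   # exact solution is y(1) = e
```

The successive error ratios are close to 2, the signature of a first-order method.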
Discrete/PWM Ballast-Resistor Controller
King, Roger J.
1994-01-01
Circuit offers low switching loss and automatic compensation for failure of ballast resistor. Discrete/PWM ballast-resistor controller improved shunt voltage-regulator circuit designed to supply power from high-resistance source to low-impedance bus. Provides both coarse discrete voltage levels (by switching of ballast resistors) and continuous fine control of voltage via pulse-width modulation.
Current Density and Continuity in Discretized Models
Boykin, Timothy B.; Luisier, Mathieu; Klimeck, Gerhard
2010-01-01
Discrete approaches have long been used in numerical modelling of physical systems in both research and teaching. Discrete versions of the Schrodinger equation employing either one or several basis functions per mesh point are often used by senior undergraduates and beginning graduate students in computational physics projects. In studying…
Geometry and Hamiltonian mechanics on discrete spaces
Talasila, V.; Clemente-Gallardo, J.; Schaft, A.J. van der
2004-01-01
Numerical simulation is often crucial for analysing the behaviour of many complex systems which do not admit analytic solutions. To this end, one either converts a ‘smooth’ model into a discrete (in space and time) model, or models systems directly at a discrete level. The goal of this paper is to
Geometry and Hamiltonian mechanics on discrete spaces
Talasila, V.; Clemente Gallardo, J.J.; Clemente-Gallardo, J.; van der Schaft, Arjan
2004-01-01
Numerical simulation is often crucial for analysing the behaviour of many complex systems which do not admit analytic solutions. To this end, one either converts a 'smooth' model into a discrete (in space and time) model, or models systems directly at a discrete level. The goal of this paper is to
Discrete mathematics in the high school curriculum
Anderson, I.; Asch, van A.G.; van Lint, J.H.
2004-01-01
In this paper we present some topics from the field of discrete mathematics which might be suitable for the high school curriculum. These topics yield challenging problems that are easy to understand, as well as important applications of discrete mathematics. We choose elements from number theory and various
Discrete Fourier analysis of multigrid algorithms
van der Vegt, Jacobus J.W.; Rhebergen, Sander
2011-01-01
The main topic of this report is a detailed discussion of the discrete Fourier multilevel analysis of multigrid algorithms. First, a brief overview of multigrid methods is given for discretizations of both linear and nonlinear partial differential equations. Special attention is given to the
Genomic Biomarkers for Personalized Medicine: Development and Validation in Clinical Studies
Directory of Open Access Journals (Sweden)
Shigeyuki Matsui
2013-01-01
Full Text Available The establishment of high-throughput technologies has brought substantial advances to our understanding of the biology of many diseases at the molecular level, and increasing expectations on the development of innovative molecularly targeted treatments and molecular biomarkers or diagnostic tests in the context of clinical studies. In this review article, we position the two critical statistical analyses of high-dimensional genomic data, gene screening and prediction, in the framework of development and validation of genomic biomarkers or signatures, taking into consideration the possible different strategies for developing genomic signatures. A wide variety of biomarker-based clinical trial designs to assess the clinical utility of a biomarker or a new treatment with a companion biomarker are also discussed.
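The "gene screening" analysis usually means large-scale multiple testing with false-discovery-rate control. A sketch of the standard Benjamini-Hochberg procedure; the p-values are invented for illustration, and the review itself does not prescribe this exact recipe.

```python
# Benjamini-Hochberg step-up procedure: sort p-values, find the largest rank
# k with p_(k) <= k * alpha / m, and reject the k smallest hypotheses.
def benjamini_hochberg(p_values, alpha=0.05):
    """Return (sorted) indices of hypotheses rejected at FDR level alpha."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    k_max = 0
    for rank, i in enumerate(order, start=1):
        if p_values[i] <= rank * alpha / m:
            k_max = rank
    return sorted(order[:k_max])

# Hypothetical p-values for ten genomic features.
p = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205, 0.212, 0.36]
rejected = benjamini_hochberg(p, alpha=0.05)
```

Note the step-up nature of the procedure: a feature can be rejected even if its own p-value exceeds its per-rank threshold, as long as some larger rank clears its threshold, which is what makes BH less conservative than Bonferroni in genome-wide screens.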
Handbook on modelling for discrete optimization
Pitsoulis, Leonidas; Williams, H
2006-01-01
The primary objective underlying the Handbook on Modelling for Discrete Optimization is to demonstrate and detail the pervasive nature of Discrete Optimization. While its applications cut across an incredibly wide range of activities, many of the applications are only known to specialists. It is the aim of this handbook to correct this. It has long been recognized that "modelling" is a critically important mathematical activity in designing algorithms for solving these discrete optimization problems. Nevertheless solving the resultant models is also often far from straightforward. In recent years it has become possible to solve many large-scale discrete optimization problems. However, some problems remain a challenge, even though advances in mathematical methods, hardware, and software technology have pushed the frontiers forward. This handbook couples the difficult, critical-thinking aspects of mathematical modeling with the hot area of discrete optimization. It will be done in an academic handbook treatment...
Discrete elements method of neutral particle transport
International Nuclear Information System (INIS)
Mathews, K.A.
1983-01-01
A new discrete elements (L_N) transport method is derived and compared to the discrete ordinates S_N method, theoretically and by numerical experimentation. The discrete elements method is more accurate than discrete ordinates and strongly ameliorates ray effects for the practical problems studied. The discrete elements method is shown to be more cost effective, in terms of execution time with comparable storage to attain the same accuracy, for a one-dimensional test case using linear characteristic spatial quadrature. In a two-dimensional test case, a vacuum duct in a shield, L_N is more consistently convergent toward a Monte Carlo benchmark solution than S_N, using step characteristic spatial quadrature. An analysis of the interaction of angular and spatial quadrature in xy-geometry indicates the desirability of using linear characteristic spatial quadrature with the L_N method.
Spatially localized, temporally quasiperiodic, discrete nonlinear excitations
International Nuclear Information System (INIS)
Cai, D.; Bishop, A.R.; Gronbech-Jensen, N.
1995-01-01
In contrast to the commonly discussed discrete breather, which is a spatially localized, time-periodic solution, we present an exact solution of a discrete nonlinear Schroedinger breather which is a spatially localized, temporally quasiperiodic nonlinear coherent excitation. This breather is a multiple-soliton solution in the sense of the inverse scattering transform. A discrete breather of multiple frequencies is conceptually important in studies of nonlinear lattice systems. We point out that, for this breather, the incommensurability of its frequencies is a discrete lattice effect and these frequencies become commensurate in the continuum limit. To understand the dynamical properties of the breather, we also discuss its stability and its behavior in the presence of an external potential. Finally, we indicate how to obtain an exact N-soliton breather as a discrete generalization of the continuum multiple-soliton solution
Laplacians on discrete and quantum geometries
International Nuclear Information System (INIS)
Calcagni, Gianluca; Oriti, Daniele; Thürigen, Johannes
2013-01-01
We extend discrete calculus for arbitrary (p-form) fields on embedded lattices to abstract discrete geometries based on combinatorial complexes. We then provide a general definition of discrete Laplacian using both the primal cellular complex and its combinatorial dual. The precise implementation of geometric volume factors is not unique and, comparing the definition with a circumcentric and a barycentric dual, we argue that the latter is, in general, more appropriate because it induces a Laplacian with more desirable properties. We give the expression of the discrete Laplacian in several different sets of geometric variables, suitable for computations in different quantum gravity formalisms. Furthermore, we investigate the possibility of transforming from position to momentum space for scalar fields, thus setting the stage for the calculation of heat kernel and spectral dimension in discrete quantum geometries. (paper)
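The full primal/dual construction with geometric volume factors is beyond a short snippet, but the object it reduces to on the 1-skeleton of a complex, the graph Laplacian L = D − A, is easy to exhibit. A toy sketch (the 4-cycle is an arbitrary example):

```python
# Graph Laplacian L = D - A built edge by edge. Every row sums to zero,
# so constant fields are harmonic -- one of the "desirable properties"
# a good discrete Laplacian should retain.
def graph_laplacian(n_vertices, edges):
    L = [[0] * n_vertices for _ in range(n_vertices)]
    for a, b in edges:
        L[a][a] += 1    # degree contribution
        L[b][b] += 1
        L[a][b] -= 1    # adjacency contribution
        L[b][a] -= 1
    return L

cycle = [(i, (i + 1) % 4) for i in range(4)]   # the 4-cycle graph
L = graph_laplacian(4, cycle)
```

The geometric Laplacians discussed in the paper generalize this by weighting the ±1 entries with primal and dual volume factors, which is precisely where the circumcentric-vs-barycentric choice enters.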
Discrete breathers in graphane: Effect of temperature
Energy Technology Data Exchange (ETDEWEB)
Baimova, J. A., E-mail: julia.a.baimova@gmail.com [Russian Academy of Sciences, Institute of Metal Physics, Ural Branch (Russian Federation); Murzaev, R. T.; Lobzenko, I. P.; Dmitriev, S. V. [Russian Academy of Sciences, Institute for Metals Superplasticity Problems (Russian Federation); Zhou, Kun [Nanyang Technological University, School of Mechanical and Aerospace Engineering (Singapore)
2016-05-15
The discrete breathers in graphane in thermodynamic equilibrium in the temperature range 50–600 K are studied by molecular dynamics simulation. A discrete breather is a hydrogen atom vibrating along the normal to a sheet of graphane at a high amplitude. As was found earlier, the lifetime of a discrete breather at zero temperature corresponds to several tens of thousands of vibrations. The effect of temperature on the decay time of discrete breathers and the probability of their detachment from a sheet of graphane are studied in this work. It is shown that closely spaced breathers can exchange energy with each other at zero temperature. The data obtained suggest that thermally activated discrete breathers can be involved in the dehydrogenation of graphane, which is important for hydrogen energetics.
International Nuclear Information System (INIS)
Ding Qing
2007-01-01
We prove that the integrable-nonintegrable discrete nonlinear Schroedinger equation (AL-DNLS) introduced by Cai, Bishop and Gronbech-Jensen (Phys. Rev. Lett. 72, 591 (1994)) is discrete gauge equivalent to an integrable-nonintegrable discrete Heisenberg model from the geometric point of view. Then we study whether the transmission and bifurcation properties of the AL-DNLS equation are preserved under the action of discrete gauge transformations. Our results reveal that the transmission property of the AL-DNLS equation is completely preserved and the bifurcation property is conditionally preserved to those of the integrable-nonintegrable discrete Heisenberg model.
The dynamics of discrete populations and series of events
Hopcraft, Keith Iain; Ridley, Kevin D
2014-01-01
Introduction; References; Statistical Preliminaries: Introduction; Probability Distributions; Moment-Generating Functions; Discrete Processes; Series of Events; Summary; Further Reading; Markovian Population Processes: Introduction; Births and Deaths; Immigration and the Poisson Process; The Effect of Measurement; Correlation of Counts; Summary; Further Reading; The Birth-Death-Immigration Process: Introduction; Rate Equations for the Process; Equation for the Generating Function; General Time-Dependent Solution; Fluctuation Characteristics of a Birth-Death-Immigration Population; Sampling and Measurement Processes; Correlation of Counts; Summary…
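The birth-death-immigration (BDI) process that the book treats analytically is also easy to simulate directly. A hedged sketch (Gillespie-style event simulation; the rates are illustrative and not taken from the book): with per-capita birth rate lam, per-capita death rate mu > lam, and immigration rate nu, the population fluctuates around a finite stationary mean nu / (mu − lam).

```python
import random

def simulate_bdi(lam, mu, nu, t_end, n0=0, seed=1):
    """Event-driven simulation of a birth-death-immigration process."""
    rng = random.Random(seed)
    t, n = 0.0, n0
    samples = []
    while t < t_end:
        total = lam * n + mu * n + nu        # total event rate (>= nu > 0)
        t += rng.expovariate(total)          # exponential waiting time
        u = rng.random() * total
        if u < lam * n:
            n += 1                           # birth
        elif u < (lam + mu) * n:
            n -= 1                           # death (impossible when n == 0)
        else:
            n += 1                           # immigration
        samples.append(n)
    return samples

samples = simulate_bdi(lam=0.5, mu=1.0, nu=2.0, t_end=2000.0)
```

For these rates the stationary mean is 2.0 / (1.0 − 0.5) = 4; a long event-sampled average sits somewhat above that (busier states generate more events), but the population stays finite and non-negative, matching the fluctuation analysis in the book.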
Estimation in Discretely Observed Diffusions Killed at a Threshold
DEFF Research Database (Denmark)
Bibbona, Enrico; Ditlevsen, Susanne
2013-01-01
…are modelled as discretely observed diffusions which are killed when the threshold is reached. Statistical inference is often based on a misspecified likelihood ignoring the presence of the threshold, causing severe bias, e.g. the bias incurred in the drift parameters of the Ornstein–Uhlenbeck model… for biologically relevant parameters can be up to 25–100 per cent. We compute or approximate the likelihood function of the killed process. When estimating from a single trajectory, considerable bias may still be present, and the distribution of the estimates can be heavily skewed and with a huge variance…
Compatible Spatial Discretizations for Partial Differential Equations
Energy Technology Data Exchange (ETDEWEB)
Arnold, Douglas, N, ed.
2004-11-25
From May 11--15, 2004, the Institute for Mathematics and its Applications held a hot topics workshop on Compatible Spatial Discretizations for Partial Differential Equations. The numerical solution of partial differential equations (PDE) is a fundamental task in science and engineering. The goal of the workshop was to bring together a spectrum of scientists at the forefront of the research in the numerical solution of PDEs to discuss compatible spatial discretizations. We define compatible spatial discretizations as those that inherit or mimic fundamental properties of the PDE such as topology, conservation, symmetries, and positivity structures and maximum principles. A wide variety of discretization methods applied across a wide range of scientific and engineering applications have been designed to or found to inherit or mimic intrinsic spatial structure and reproduce fundamental properties of the solution of the continuous PDE model at the finite dimensional level. A profusion of such methods and concepts relevant to understanding them have been developed and explored: mixed finite element methods, mimetic finite differences, support operator methods, control volume methods, discrete differential forms, Whitney forms, conservative differencing, discrete Hodge operators, discrete Helmholtz decomposition, finite integration techniques, staggered grid and dual grid methods, etc. This workshop seeks to foster communication among the diverse groups of researchers designing, applying, and studying such methods as well as researchers involved in practical solution of large scale problems that may benefit from advancements in such discretizations; to help elucidate the relations between the different methods and concepts; and to generally advance our understanding in the area of compatible spatial discretization methods for PDE. Particular points of emphasis included: + Identification of intrinsic properties of PDE models that are critical for the fidelity of numerical
Discrete-Feature Model Implementation of SDM-Site Forsmark
International Nuclear Information System (INIS)
Geier, Joel
2010-03-01
A discrete-feature model (DFM) was implemented for the Forsmark repository site based on the final site descriptive model from surface based investigations. The discrete-feature conceptual model represents deformation zones, individual fractures, and other water-conducting features around a repository as discrete conductors surrounded by a rock matrix which, in the present study, is treated as impermeable. This approximation is reasonable for sites in crystalline rock which has very low permeability, apart from that which results from macroscopic fracturing. Models are constructed based on the geological and hydrogeological description of the sites and engineering designs. Hydraulic heads and flows through the network of water-conducting features are calculated by the finite-element method, and are used in turn to simulate migration of non-reacting solute by a particle-tracking method, in order to estimate the properties of pathways by which radionuclides could be released to the biosphere. Stochastic simulation is used to evaluate portions of the model that can only be characterized in statistical terms, since many water-conducting features within the model volume cannot be characterized deterministically. Chapter 2 describes the methodology by which discrete features are derived to represent water-conducting features around the hypothetical repository at Forsmark (including both natural features and features that result from the disturbance of excavation), and then assembled to produce a discrete-feature network model for numerical simulation of flow and transport. Chapter 3 describes how site-specific data and repository design are adapted to produce the discrete-feature model. Chapter 4 presents results of the calculations. These include utilization factors for deposition tunnels based on the emplacement criteria that have been set forth by the implementers, flow distributions to the deposition holes, and calculated properties of discharge paths as well as
Discrete-Feature Model Implementation of SDM-Site Forsmark
Energy Technology Data Exchange (ETDEWEB)
Geier, Joel (Clearwater Hardrock Consulting, Corvallis, OR (United States))
2010-03-15
A discrete-feature model (DFM) was implemented for the Forsmark repository site based on the final site descriptive model from surface based investigations. The discrete-feature conceptual model represents deformation zones, individual fractures, and other water-conducting features around a repository as discrete conductors surrounded by a rock matrix which, in the present study, is treated as impermeable. This approximation is reasonable for sites in crystalline rock which has very low permeability, apart from that which results from macroscopic fracturing. Models are constructed based on the geological and hydrogeological description of the sites and engineering designs. Hydraulic heads and flows through the network of water-conducting features are calculated by the finite-element method, and are used in turn to simulate migration of non-reacting solute by a particle-tracking method, in order to estimate the properties of pathways by which radionuclides could be released to the biosphere. Stochastic simulation is used to evaluate portions of the model that can only be characterized in statistical terms, since many water-conducting features within the model volume cannot be characterized deterministically. Chapter 2 describes the methodology by which discrete features are derived to represent water-conducting features around the hypothetical repository at Forsmark (including both natural features and features that result from the disturbance of excavation), and then assembled to produce a discrete-feature network model for numerical simulation of flow and transport. Chapter 3 describes how site-specific data and repository design are adapted to produce the discrete-feature model. Chapter 4 presents results of the calculations. These include utilization factors for deposition tunnels based on the emplacement criteria that have been set forth by the implementers, flow distributions to the deposition holes, and calculated properties of discharge paths as well as
Biomarkers in acute heart failure.
Mallick, Aditi; Januzzi, James L
2015-06-01
The care of patients with acutely decompensated heart failure is being reshaped by the availability and understanding of several novel and emerging heart failure biomarkers. The gold standard biomarkers in heart failure are B-type natriuretic peptide and N-terminal pro-B-type natriuretic peptide, which play an important role in the diagnosis, prognosis, and management of acute decompensated heart failure. Novel biomarkers that are increasingly involved in the processes of myocardial injury, neurohormonal activation, and ventricular remodeling are showing promise in improving diagnosis and prognosis among patients with acute decompensated heart failure. These include midregional proatrial natriuretic peptide, soluble ST2, galectin-3, highly-sensitive troponin, and midregional proadrenomedullin. There has also been an emergence of biomarkers for evaluation of acute decompensated heart failure that assist in the differential diagnosis of dyspnea, such as procalcitonin (for identification of acute pneumonia), as well as markers that predict complications of acute decompensated heart failure, such as renal injury markers. In this article, we will review the pathophysiology and usefulness of established and emerging biomarkers for the clinical diagnosis, prognosis, and management of acute decompensated heart failure. Copyright © 2015 Sociedad Española de Cardiología. Published by Elsevier España, S.L.U. All rights reserved.
The Power of Neuroimaging Biomarkers for Screening Frontotemporal Dementia
McMillan, Corey T.; Avants, Brian B.; Cook, Philip; Ungar, Lyle; Trojanowski, John Q.; Grossman, Murray
2014-01-01
Frontotemporal dementia (FTD) is a clinically and pathologically heterogeneous neurodegenerative disease that can result from either frontotemporal lobar degeneration (FTLD) or Alzheimer’s disease (AD) pathology. It is critical to establish statistically powerful biomarkers that can achieve substantial cost-savings and increase feasibility of clinical trials. We assessed three broad categories of neuroimaging methods to screen underlying FTLD and AD pathology in a clinical FTD series: global ...
Biomarkers to Measure Treatment Effects in Alzheimer's Disease: What Should We Look for?
Directory of Open Access Journals (Sweden)
Kenneth Rockwood
2011-01-01
Full Text Available It is often surprisingly difficult to tell whether a treatment for Alzheimer's disease is effective. Biomarkers might offer the potential of a quantifiable objective measure of treatment effectiveness. This paper suggests several criteria by which biomarkers might be evaluated as outcome measures. These include biological plausibility, statistical significance, dose dependence, convergence across measures, and replicability. If biomarkers can meet these criteria, then, pending regulatory approval, they may have a role in the evaluation of treatment effectiveness in Alzheimer's disease. If not, their usefulness may be in supplementing, but not supplanting, clinical profiles of treatment effects.
Biomarker Gene Signature Discovery Integrating Network Knowledge
Directory of Open Access Journals (Sweden)
Holger Fröhlich
2012-02-01
Full Text Available Discovery of prognostic and diagnostic biomarker gene signatures for diseases, such as cancer, is seen as a major step towards a better personalized medicine. During the last decade various methods, mainly coming from the machine learning or statistical domain, have been proposed for that purpose. However, one important obstacle to making gene signatures a standard tool in clinical diagnosis is the typically low reproducibility of these signatures, combined with the difficulty of achieving a clear biological interpretation. To address this, in recent years there has been a growing interest in approaches that try to integrate information from molecular interaction networks. Here we review the current state of research in this field by giving an overview of the approaches proposed so far.
Perfect discretization of reparametrization invariant path integrals
International Nuclear Information System (INIS)
Bahr, Benjamin; Dittrich, Bianca; Steinhaus, Sebastian
2011-01-01
To obtain a well-defined path integral one often employs discretizations. In the case of gravity and reparametrization-invariant systems, the latter of which we consider here as a toy example, discretizations generically break diffeomorphism and reparametrization symmetry, respectively. This has severe implications, as these symmetries determine the dynamics of the corresponding system. Indeed we will show that a discretized path integral with reparametrization-invariance is necessarily also discretization independent and therefore uniquely determined by the corresponding continuum quantum mechanical propagator. We use this insight to develop an iterative method for constructing such a discretized path integral, akin to a Wilsonian RG flow. This allows us to address the problem of discretization ambiguities and of an anomaly-free path integral measure for such systems. The latter is needed to obtain a path integral, that can act as a projector onto the physical states, satisfying the quantum constraints. We will comment on implications for discrete quantum gravity models, such as spin foams.
Serum prognostic biomarkers in head and neck cancer patients.
Lin, Ho-Sheng; Siddiq, Fauzia; Talwar, Harvinder S; Chen, Wei; Voichita, Calin; Draghici, Sorin; Jeyapalan, Gerald; Chatterjee, Madhumita; Fribley, Andrew; Yoo, George H; Sethi, Seema; Kim, Harold; Sukari, Ammar; Folbe, Adam J; Tainsky, Michael A
2014-08-01
A reliable estimate of survival is important as it may impact treatment choice. The objective of this study is to identify serum autoantibody biomarkers that can be used to improve prognostication for patients affected with head and neck squamous cell carcinoma (HNSCC). Prospective cohort study. A panel of 130 serum biomarkers, previously selected for cancer detection using microarray-based serological profiling and specialized bioinformatics, was evaluated for its potential as a source of prognostic biomarkers in a cohort of 119 HNSCC patients followed for up to 12.7 years. A biomarker was considered positive if its reactivity to the particular patient's serum was greater than one standard deviation above the mean reactivity to sera from the other 118 patients, using a leave-one-out cross-validation model. Survival curves were estimated according to the Kaplan-Meier method, and statistically significant differences in survival were examined using the log-rank test. Independent prognostic biomarkers were identified following analysis using multivariate Cox proportional hazards models. Poor overall survival was associated with African American race (hazard ratio [HR] for death = 2.61; 95% confidence interval [CI]: 1.58-4.33; P < .001), advanced stage (HR = 2.79; 95% CI: 1.40-5.57; P = .004), and recurrent disease (HR = 6.66; 95% CI: 2.54-17.44; P < .001). On multivariable Cox analysis adjusted for covariates (race and stage), six of the 130 markers evaluated were found to be independent prognosticators of overall survival. The results shown here are promising and demonstrate the potential use of serum biomarkers for prognostication in HNSCC patients. Further clinical trials to include larger samples of patients across multiple centers may be warranted. © 2014 The American Laryngological, Rhinological and Otological Society, Inc.
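The survival machinery named in the abstract (Kaplan-Meier curves compared by a log-rank test) is simple enough to sketch. Below is a minimal pure-Python Kaplan-Meier estimator on hypothetical toy data, not the study's dataset; in practice the log-rank test and Cox modelling would be delegated to a statistics package.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.

    times  -- follow-up time for each subject
    events -- 1 if the event (death) was observed, 0 if censored
    Returns a list of (time, S(t)) pairs at the observed event times.
    """
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(e for tt, e in data if tt == t)   # events at time t
        leaving = sum(1 for tt, _ in data if tt == t)  # subjects leaving at t
        if deaths:
            surv *= 1.0 - deaths / at_risk
            curve.append((t, surv))
        at_risk -= leaving
        i += leaving
    return curve

# Hypothetical toy cohort: four subjects, one censored at t = 2.
curve = kaplan_meier([1, 2, 2, 3], [1, 1, 0, 1])
# S(1) = 3/4, S(2) = 3/4 * 2/3 = 1/2, S(3) = 0
```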
Higher dimensional discrete Cheeger inequalities
Directory of Open Access Journals (Sweden)
Anna Gundert
2015-01-01
Full Text Available For graphs there exists a strong connection between spectral and combinatorial expansion properties. This is expressed, e.g., by the discrete Cheeger inequality, the lower bound of which states that $\lambda(G) \leq h(G)$, where $\lambda(G)$ is the second smallest eigenvalue of the Laplacian of a graph $G$ and $h(G)$ is the Cheeger constant measuring the edge expansion of $G$. We are interested in generalizations of expansion properties to finite simplicial complexes of higher dimension (or uniform hypergraphs). Whereas higher dimensional Laplacians were introduced already in 1945 by Eckmann, the generalization of edge expansion to simplicial complexes is not straightforward. Recently, a topologically motivated notion analogous to edge expansion that is based on $\mathbb{Z}_2$-cohomology was introduced by Gromov and independently by Linial, Meshulam and Wallach. It is known that for this generalization there is no direct higher dimensional analogue of the lower bound of the Cheeger inequality. A different, combinatorially motivated generalization of the Cheeger constant, denoted by $h(X)$, was studied by Parzanchevski, Rosenthal and Tessler. They showed that indeed $\lambda(X) \leq h(X)$, where $\lambda(X)$ is the smallest non-trivial eigenvalue of the $(k-1)$-dimensional upper Laplacian, for the case of $k$-dimensional simplicial complexes $X$ with complete $(k-1)$-skeleton. Whether this inequality also holds for $k$-dimensional complexes with non-complete $(k-1)$-skeleton had been an open question. We give two proofs of the inequality for arbitrary complexes. The proofs differ strongly in the methods and structures employed, and each allows for a different kind of additional strengthening of the original result.
International Nuclear Information System (INIS)
Maruno, Ken-ichi; Biondini, Gino
2004-01-01
We present a class of solutions of the two-dimensional Toda lattice equation, its fully discrete analogue and its ultra-discrete limit. These solutions demonstrate the existence of soliton resonance and web-like structure in discrete integrable systems such as differential-difference equations, difference equations and cellular automata (ultra-discrete equations).
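The ultra-discrete limit mentioned above is exemplified by the box-ball system, the classic soliton cellular automaton. Below is a minimal sketch of the Takahashi-Satsuma update rule, offered as a general illustration of ultra-discrete soliton dynamics, not as this paper's specific solutions.

```python
def bbs_step(state):
    """One time step of the Takahashi-Satsuma box-ball system.

    Scanning left to right, every ball (1) hops exactly once, into the
    nearest empty box (0) to its right; balls that have already hopped
    during this step stay put.
    """
    s = list(state)
    hopped = [False] * len(s)
    for i in range(len(s)):
        if s[i] == 1 and not hopped[i]:
            j = i + 1
            while j < len(s) and s[j] == 1:
                j += 1
            if j < len(s):
                s[i], s[j] = 0, 1
                hopped[j] = True
    return s

# A single block of k balls is a soliton travelling k cells per step.
s = bbs_step([1, 1, 1] + [0] * 12)   # the block of 3 advances 3 cells
```

Larger blocks travel faster, so a big soliton overtakes a small one; after the collision both re-emerge with their sizes intact, which is the soliton resonance picture in its simplest ultra-discrete form.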
Biomarkers in inflammatory bowel diseases
DEFF Research Database (Denmark)
Bennike, Tue; Birkelund, Svend; Stensballe, Allan
2014-01-01
Unambiguous diagnosis of the two main forms of inflammatory bowel diseases (IBD): Ulcerative colitis (UC) and Crohn's disease (CD), represents a challenge in the early stages of the diseases. The diagnosis may be established several years after the debut of symptoms. Hence, protein biomarkers...... for early and accurate diagnostic could help clinicians improve treatment of the individual patients. Moreover, the biomarkers could aid physicians to predict disease courses and in this way, identify patients in need of intensive treatment. Patients with low risk of disease flares may avoid treatment...... with medications with the concomitant risk of adverse events. In addition, identification of disease and course specific biomarker profiles can be used to identify biological pathways involved in the disease development and treatment. Knowledge of disease mechanisms in general can lead to improved future...
Biomarkers of replicative senescence revisited
DEFF Research Database (Denmark)
Nehlin, Jan
2016-01-01
Biomarkers of replicative senescence can be defined as those ultrastructural and physiological variations as well as molecules whose changes in expression, activity or function correlate with aging, as a result of the gradual exhaustion of replicative potential and a state of permanent cell cycle...... arrest. The biomarkers that characterize the path to an irreversible state of cell cycle arrest due to proliferative exhaustion may also be shared by other forms of senescence-inducing mechanisms. Validation of senescence markers is crucial in circumstances where quiescence or temporary growth arrest may...... be triggered or is thought to be induced. Pre-senescence biomarkers are also important to consider as their presence indicate that induction of aging processes is taking place. The bona fide pathway leading to replicative senescence that has been extensively characterized is a consequence of gradual reduction...
Hairs of discrete symmetries and gravity
Energy Technology Data Exchange (ETDEWEB)
Choi, Kang Sin [Scranton Honors Program, Ewha Womans University, Seodaemun-Gu, Seoul 03760 (Korea, Republic of); Center for Fields, Gravity and Strings, CTPU, Institute for Basic Sciences, Yuseong-Gu, Daejeon 34047 (Korea, Republic of); Kim, Jihn E., E-mail: jihnekim@gmail.com [Department of Physics, Kyung Hee University, 26 Gyungheedaero, Dongdaemun-Gu, Seoul 02447 (Korea, Republic of); Center for Axion and Precision Physics Research (IBS), 291 Daehakro, Yuseong-Gu, Daejeon 34141 (Korea, Republic of); Kyae, Bumseok [Department of Physics, Pusan National University, 2 Busandaehakro-63-Gil, Geumjeong-Gu, Busan 46241 (Korea, Republic of); Nam, Soonkeon [Department of Physics, Kyung Hee University, 26 Gyungheedaero, Dongdaemun-Gu, Seoul 02447 (Korea, Republic of)
2017-06-10
Gauge symmetries are known to be respected by gravity because gauge charges carry flux lines, but global charges do not carry flux lines and are not conserved by gravitational interaction. For discrete symmetries, they are spontaneously broken in the Universe, forming domain walls. Since the realization of discrete symmetries in the Universe must involve the vacuum expectation values of Higgs fields, a string-like configuration (hair) at the intersection of domain walls in the Higgs vacua can be realized. Therefore, we argue that discrete charges are also respected by gravity.
Discrete Tomography and Imaging of Polycrystalline Structures
DEFF Research Database (Denmark)
Alpers, Andreas
High resolution transmission electron microscopy is commonly considered as the standard application for discrete tomography. While this has yet to be technically realized, new applications with a similar flavor have emerged in materials science. In our group at Risø DTU (Denmark's National...... Laboratory for Sustainable Energy), for instance, we study polycrystalline materials via synchrotron X-ray diffraction. Several reconstruction problems arise, most of them exhibit inherently discrete aspects. In this talk I want to give a concise mathematical introduction to some of these reconstruction...... problems. Special focus is on their relationship to classical discrete tomography. Several open mathematical questions will be mentioned along the way....
Ensemble simulations with discrete classical dynamics
DEFF Research Database (Denmark)
Toxværd, Søren
2013-01-01
For discrete classical Molecular dynamics (MD) obtained by the "Verlet" algorithm (VA) with the time increment $h$ there exists a shadow Hamiltonian $\tilde{H}$ with energy $\tilde{E}(h)$, for which the discrete particle positions lie on the analytic trajectories for $\tilde{H}$. $\tilde{E}(h)$ is employed to determine the relation with the corresponding energy, $E$ for the analytic dynamics with $h=0$ and the zero-order estimate $E_0(h)$ of the energy for discrete dynamics, appearing in the literature for MD with VA. We derive a corresponding time-reversible VA algorithm for canonical dynamics...
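The shadow-Hamiltonian picture can be seen numerically: for a harmonic oscillator integrated with velocity Verlet, the energy along the discrete trajectory oscillates within an O(h^2) band around a shifted value rather than drifting. A minimal sketch follows (a one-particle toy, not the paper's molecular-dynamics ensembles).

```python
def verlet_energies(x, v, h, steps, omega=1.0):
    """Velocity-Verlet integration of a unit-mass harmonic oscillator,
    a = -omega^2 x; returns the energy along the discrete trajectory."""
    a = -omega**2 * x
    energies = []
    for _ in range(steps):
        x += v * h + 0.5 * a * h * h
        a_new = -omega**2 * x
        v += 0.5 * (a + a_new) * h
        a = a_new
        energies.append(0.5 * v * v + 0.5 * (omega * x) ** 2)
    return energies

E = verlet_energies(1.0, 0.0, h=0.05, steps=4000)
fluctuation = max(E) - min(E)   # stays O(h^2): bounded, no secular drift
```

The discrete energy never settles on the analytic value 0.5, but it also never wanders away from it, consistent with the particle following exact trajectories of a nearby shadow Hamiltonian.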
Stochastic Kuramoto oscillators with discrete phase states
Jörg, David J.
2017-09-01
We present a generalization of the Kuramoto phase oscillator model in which phases advance in discrete phase increments through Poisson processes, rendering both intrinsic oscillations and coupling inherently stochastic. We study the effects of phase discretization on the synchronization and precision properties of the coupled system both analytically and numerically. Remarkably, many key observables such as the steady-state synchrony and the quality of oscillations show distinct extrema while converging to the classical Kuramoto model in the limit of a continuous phase. The phase-discretized model provides a general framework for coupled oscillations in a Markov chain setting.
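The discrete-phase construction can be mimicked with a small Gillespie (continuous-time Markov chain) simulation. The hop rates below, an intrinsic frequency plus a clipped sinusoidal coupling term, are an illustrative guess and not the paper's exact transition rates.

```python
import cmath
import math
import random

def simulate(N=20, M=30, omega=1.0, K=2.0, T=50.0, seed=1):
    """Gillespie simulation of phase-discretized Kuramoto-like oscillators.

    Oscillator i sits on one of M phase states and hops one increment
    (2*pi/M) with rate omega + K * mean_j sin(theta_j - theta_i),
    clipped at (nearly) zero so rates stay non-negative.
    Returns the Kuramoto order parameter r = |mean_j exp(i theta_j)|.
    """
    rng = random.Random(seed)
    k = [rng.randrange(M) for _ in range(N)]
    t = 0.0
    while t < T:
        th = [2 * math.pi * ki / M for ki in k]
        rates = [max(1e-9, omega + K * sum(math.sin(tj - ti) for tj in th) / N)
                 for ti in th]
        R = sum(rates)
        t += rng.expovariate(R)            # waiting time to the next hop
        u, acc = rng.random() * R, 0.0
        for i, r in enumerate(rates):      # pick hopper proportional to rate
            acc += r
            if u <= acc:
                k[i] = (k[i] + 1) % M
                break
    z = sum(cmath.exp(2j * math.pi * ki / M) for ki in k) / N
    return abs(z)

r_sync = simulate()   # order parameter in [0, 1]
```

In the limit of large M and strong coupling this stochastic chain approaches the deterministic Kuramoto dynamics; the order parameter r quantifies the steady-state synchrony discussed in the abstract.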
Discrete-Time Biomedical Signal Encryption
Directory of Open Access Journals (Sweden)
Victor Grigoraş
2017-12-01
Full Text Available Chaotic modulation is a strong method of improving communication security. Analog and discrete chaotic systems are presented in the current literature. Due to the expansion of digital communication, discrete-time systems have become more efficient and closer to actual technology. The present contribution offers an in-depth analysis of the effects chaos encryption produces on 1D and 2D biomedical signals. The performed simulations show that modulating signals are precisely recovered by the synchronizing receiver if the discrete systems are digitally implemented and their coefficients precisely correspond. Channel noise is also applied and its effects on biomedical signal demodulation are highlighted.
Discrete symmetries and de Sitter spacetime
Energy Technology Data Exchange (ETDEWEB)
Cotăescu, Ion I., E-mail: gpascu@physics.uvt.ro; Pascu, Gabriel, E-mail: gpascu@physics.uvt.ro [West University of Timişoara, V. Pârvan Ave. 4, RO-300223 Timişoara (Romania)
2014-11-24
Aspects of the ambiguity in defining quantum modes on de Sitter spacetime using a commuting system composed only of differential operators are discussed. Discrete symmetries and their actions on the wavefunction in commonly used coordinate charts are reviewed. It is argued that the system of commuting operators can be supplemented by requiring the invariance of the wavefunction under combined discrete symmetries, a criterion which selects a single state out of the α-vacuum family. Two such members of this family are singled out by particular combined discrete symmetries: states between which a well-known thermality relation exists.
LABORATORY BIOMARKERS FOR ANKYLOSING SPONDYLITIS
Directory of Open Access Journals (Sweden)
E. N. Aleksandrova
2017-01-01
Full Text Available Ankylosing spondylitis (AS) is a chronic inflammatory disease from a group of spondyloarthritis (SpA), which is characterized by lesions of the sacroiliac joints and spine with the common involvement of entheses and peripheral joints in the pathological process. Advances in modern laboratory medicine have contributed to a substantial expansion of the range of pathogenetic, diagnostic, and prognostic biomarkers of AS. As of now, there are key pathogenetic biomarkers of AS (therapeutic targets), which include tumor necrosis factor-α (TNF-α), interleukin 17 (IL-17), and IL-23. Among the laboratory diagnostic and prognostic biomarkers, HLA-B27 and C-reactive protein are of the greatest value in clinical practice; the former for the early diagnosis of the disease and the latter for the assessment of disease activity, the risk of radiographic progression and the efficiency of therapy. Anti-CD74 antibodies are a new biomarker that has high sensitivity and specificity values in diagnosing axial SpA at an early stage. A number of laboratory biomarkers, including calprotectin, matrix metalloproteinase-3 (MMP-3), vascular endothelial growth factor, Dickkopf-1 (Dkk-1), and C-terminal telopeptide of type II collagen (CTX II), do not well reflect disease activity, but may predict progressive structural changes in the spine and sacroiliac joints in AS. Blood calprotectin level monitoring allows the effective prediction of a response to therapy with TNF inhibitors and anti-IL-17A monoclonal antibodies. The prospects for the laboratory diagnosis of AS are associated with the clinical validation of candidate biomarkers during large-scale prospective cohort studies and with a search for new proteomic, transcriptomic and genomic markers, by using innovative molecular and cellular technologies.
Static quarks with improved statistical precision
International Nuclear Information System (INIS)
Della Morte, M.; Duerr, S.; Molke, H.; Heitger, J.
2003-09-01
We present a numerical study for different discretisations of the static action, concerning cut-off effects and the growth of statistical errors with Euclidean time. An error reduction by an order of magnitude can be obtained with respect to the Eichten-Hill action, for time separations up to 2 fm, keeping discretization errors small. The best actions lead to a big improvement in the precision of the quark mass M_b and F_{B_s} in the static approximation. (orig.)
Signature Curves Statistics of DNA Supercoils
Shakiban, Cheri; Lloyd, Peter
2004-01-01
In this paper we describe the Euclidean signature curves for two-dimensional closed curves in the plane and their generalization to closed space curves. The focus will be on discrete numerical methods for approximating such curves. Further we will apply these numerical methods to plot the signature curves related to three-dimensional simulated DNA supercoils. Our primary focus will be on statistical analysis of the data generated for the signature curves of the supercoils. We will try to esta...
Biomarkers of cadmium and arsenic interactions
International Nuclear Information System (INIS)
Nordberg, G.F.; Jin, T.; Hong, F.; Zhang, A.; Buchet, J.P.; Bernard, A.
2005-01-01
Advances in proteomics have led to the identification of sensitive urinary biomarkers of renal dysfunction that are increasingly used in toxicology and epidemiology. Recent animal data show that combined exposure to inorganic arsenic (As) and cadmium (Cd) gives rise to more pronounced renal toxicity than exposure to each of the agents alone. In order to examine if a similar interaction occurs in humans, renal dysfunction was studied in population groups (619 persons in total) residing in two metal contaminated areas in China: a mainly Cd contaminated area in Zhejiang province (Z-area) and a mainly As contaminated area in Guizhou province (G-area). Nearby control areas without excessive metal exposure were also included. Measurements of urinary β2-microglobulin (UB2MG), N-acetyl-β-glucosaminidase (UNAG), retinol binding protein (URBP) and albumin (UALB) were used as markers of renal dysfunction. Urinary Cd (UCd) and total As (UTAs) were analyzed by graphite-furnace atomic absorption spectrometry. Urinary inorganic As and its mono- and di-methylated metabolites (UIAs) were determined by hydride generation. Results: As expected, the highest UCd values occurred in Z-area (geometric mean, GM = 11.6 μg/g crea) while the highest UTAs values occurred in G-area (GM = 288 μg/g crea). Statistically significant increases compared to the respective control areas were present for UTAs and UCd, as well as for UB2MG, UNAG and UALB, in both Z-area and G-area. UIAs was determined only in Z-area. In G-area, there was a clear dose-response pattern in relation to both UTAs and UCd for each of the biomarkers of renal dysfunction. An interaction effect was demonstrated at higher levels of combined exposure, with As and Cd together enhancing the effect on the kidney. In Z-area an increased prevalence of B2MG-uria, NAG-uria and ALB-uria was found in relation to UCd, but no relationship to UTAs was found. A statistically significant relationship between UIAs and UB2MG was found among
Biomarkers in scleroderma: Current status
Directory of Open Access Journals (Sweden)
Latika Gupta
2017-01-01
Full Text Available Scleroderma is an autoimmune disease characterized by indolent obliterative vasculopathy and widespread fibrosis. The two main morphological manifestations of the disease overlap and may make it difficult to separate activity from damage. Many patients, especially those with the limited subset of the disease, have an indolent course without clear-cut inflammatory manifestations. There is a felt need for validated biomarkers, which can differentiate activity from damage, and yet be sensitive to change with therapy. Multiplex arrays of biomarkers have ushered an era of targeted or personalized medicine based on phenotypic characteristics in an individual.
Exterior difference systems and invariance properties of discrete mechanics
International Nuclear Information System (INIS)
Xie Zheng; Xie Duanqiang; Li Hongbo
2008-01-01
Invariance properties describe the fundamental physical laws in discrete mechanics. Can those properties be described in a geometric way? We investigate an exterior difference system called the discrete Euler-Lagrange system, whose solutions are in one-to-one correspondence with solutions of the discrete Euler-Lagrange equations, and use it to define the first integrals. The preservation of the discrete symplectic form along the discrete Hamilton phase flows and the discrete Noether theorem are also described in the language of difference forms.
On organizing principles of discrete differential geometry. Geometry of spheres
International Nuclear Information System (INIS)
Bobenko, Alexander I; Suris, Yury B
2007-01-01
Discrete differential geometry aims to develop discrete equivalents of the geometric notions and methods of classical differential geometry. This survey contains a discussion of the following two fundamental discretization principles: the transformation group principle (smooth geometric objects and their discretizations are invariant with respect to the same transformation group) and the consistency principle (discretizations of smooth parametrized geometries can be extended to multidimensional consistent nets). The main concrete geometric problem treated here is discretization of curvature-line parametrized surfaces in Lie geometry. Systematic use of the discretization principles leads to a discretization of curvature-line parametrization which unifies circular and conical nets.
Solutions of several coupled discrete models in terms of Lamé ...
Indian Academy of Sciences (India)
Departments of Mathematics and Statistics, Stanford University, Stanford, CA 94305, USA. Corresponding author e-mail: avadh@lanl.gov. MS received 23 January 2012; revised 29 March 2012; accepted 18 April 2012. Abstract: Coupled discrete models are ubiquitous in a variety of physical contexts. We provide...
Will the alphabet soup of design criteria affect discrete choice experiment results?
DEFF Research Database (Denmark)
Olsen, Søren Bøye; Meyerhoff, Jürgen
2017-01-01
Every discrete choice experiment needs one, but the impacts of a statistical design on the results are still not well understood. Comparative studies have found that efficient designs outperform especially orthogonal designs. What has been little studied is whether efficient designs come at a cost...
An equivalence between the discrete Gaussian model and a generalized Sine Gordon theory on a lattice
International Nuclear Information System (INIS)
Baskaran, G.; Gupte, N.
1983-11-01
We demonstrate an equivalence between the statistical mechanics of the discrete Gaussian model and a generalized Sine-Gordon theory on a Euclidean lattice in arbitrary dimensions. The connection is obtained by a simple transformation of the partition function and is nonperturbative in nature. (author)
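The transformation behind such dualities is essentially Poisson summation applied to the integer-valued height field: a sum over integer heights n of Gaussian weights equals a dual sum over integer "charges" k, which is the seed of the Sine-Gordon form. The one-site identity, Σ_n e^{-a n²} = √(π/a) Σ_k e^{-π² k²/a}, can be checked numerically:

```python
import math

def theta_direct(a, cutoff=60):
    """Direct sum over integer heights n of exp(-a n^2)."""
    return sum(math.exp(-a * n * n) for n in range(-cutoff, cutoff + 1))

def theta_dual(a, cutoff=60):
    """Dual (Poisson-resummed) form: sum over integer charges k."""
    return math.sqrt(math.pi / a) * sum(
        math.exp(-math.pi**2 * k * k / a) for k in range(-cutoff, cutoff + 1))

lhs = theta_direct(1.3)
rhs = theta_dual(1.3)   # agrees with lhs to machine precision
```

In the full model the same resummation is applied at every lattice site under the quadratic coupling, turning the integer constraint into a cosine (Sine-Gordon) interaction; the snippet only illustrates the single-site identity.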
Can time be a discrete dynamical variable
International Nuclear Information System (INIS)
Lee, T.D.
1983-01-01
The possibility that time can be regarded as a discrete dynamical variable is examined through all phases of mechanics: from classical mechanics to nonrelativistic quantum mechanics, and to relativistic quantum field theories. (orig.)
Local discrete symmetries from superstring derived models
International Nuclear Information System (INIS)
Faraggi, A.E.
1996-10-01
Discrete and global symmetries play an essential role in many extensions of the Standard Model, for example, to preserve the proton lifetime, to prevent flavor changing neutral currents, etc. An important question is how can such symmetries survive in a theory of quantum gravity, like superstring theory. In a specific string model the author illustrates how local discrete symmetries may arise in string models and play an important role in preventing fast proton decay and flavor changing neutral currents. The local discrete symmetry arises due to the breaking of the non-Abelian gauge symmetries by Wilson lines in the superstring models and forbids, for example dimension five operators which mediate rapid proton decay, to all orders of nonrenormalizable terms. In the context of models of unification of the gauge and gravitational interactions, it is precisely this type of local discrete symmetries that must be found in order to insure that a given model is not in conflict with experimental observations
Breatherlike impurity modes in discrete nonlinear lattices
DEFF Research Database (Denmark)
Hennig, D.; Rasmussen, Kim; Tsironis, G. P.
1995-01-01
We investigate the properties of a disordered generalized discrete nonlinear Schrödinger equation, containing both diagonal and nondiagonal nonlinear terms. The equation models a linear host lattice doped with nonlinear impurities. We find different types of impurity states that form itinerant...
Inferring gene networks from discrete expression data
Zhang, L.; Mallick, B. K.
2013-01-01
graphical models applied to continuous data, which give a closed-form marginal likelihood. In this paper, we extend network modeling to discrete data, specifically data from serial analysis of gene expression, and RNA-sequencing experiments, both of which
A discrete control model of PLANT
Mitchell, C. M.
1985-01-01
A model of the PLANT system using the discrete control modeling techniques developed by Miller is described. Discrete control models attempt to represent in a mathematical form how a human operator might decompose a complex system into simpler parts and how the control actions and system configuration are coordinated so that acceptable overall system performance is achieved. Basic questions include knowledge representation, information flow, and decision making in complex systems. The structure of the model is a general hierarchical/heterarchical scheme which structurally accounts for coordination and dynamic focus of attention. Mathematically, the discrete control model is defined in terms of a network of finite state systems. Specifically, the discrete control model accounts for how specific control actions are selected from information about the controlled system, the environment, and the context of the situation. The objective is to provide a plausible and empirically testable accounting and, if possible, explanation of control behavior.
Running Parallel Discrete Event Simulators on Sierra
Energy Technology Data Exchange (ETDEWEB)
Barnes, P. D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Jefferson, D. R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
2015-12-03
In this proposal we consider porting the ROSS/Charm++ simulator and the discrete event models that run under its control so that they run on the Sierra architecture and make efficient use of the Volta GPUs.
Effective Hamiltonian for travelling discrete breathers
MacKay, Robert S.; Sepulchre, Jacques-Alexandre
2002-05-01
Hamiltonian chains of oscillators in general probably do not sustain exact travelling discrete breathers. However solutions which look like moving discrete breathers for some time are not difficult to observe in numerics. In this paper we propose an abstract framework for the description of approximate travelling discrete breathers in Hamiltonian chains of oscillators. The method is based on the construction of an effective Hamiltonian enabling one to describe the dynamics of the translation degree of freedom of moving breathers. Error estimate on the approximate dynamics is also studied. The concept of the Peierls-Nabarro barrier can be made clear in this framework. We illustrate the method with two simple examples, namely the Salerno model which interpolates between the Ablowitz-Ladik lattice and the discrete nonlinear Schrödinger system, and the Fermi-Pasta-Ulam chain.
Comparing the Discrete and Continuous Logistic Models
Gordon, Sheldon P.
2008-01-01
The solutions of the discrete logistic growth model based on a difference equation and the continuous logistic growth model based on a differential equation are compared and contrasted. The investigation is conducted using a dynamic interactive spreadsheet. (Contains 5 figures.)
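The comparison can be reproduced without a spreadsheet. A minimal sketch contrasting the difference equation with the closed-form solution of the differential equation follows; parameter values are illustrative, not taken from the paper.

```python
import math

def discrete_logistic(p0, r, K, steps):
    """Difference equation P_{n+1} = P_n + r * P_n * (1 - P_n / K)."""
    p = p0
    traj = [p]
    for _ in range(steps):
        p = p + r * p * (1 - p / K)
        traj.append(p)
    return traj

def continuous_logistic(p0, r, K, t):
    """Closed-form solution of dP/dt = r * P * (1 - P/K)."""
    return K / (1 + (K / p0 - 1) * math.exp(-r * t))

K = 100.0
disc = discrete_logistic(5.0, 0.1, K, 200)   # tracks the smooth curve
cont = continuous_logistic(5.0, 0.1, K, 200)
wild = discrete_logistic(5.0, 2.5, K, 200)   # overshoots K and oscillates
```

For small r both models rise monotonically to the carrying capacity, but for large r the difference equation overshoots K and settles into oscillations, behavior the differential equation can never produce; this is the essential contrast the abstract describes.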
Discrete-time nonlinear sliding mode controller
African Journals Online (AJOL)
user
Keywords: discrete-time delay system, sliding mode control, nonlinear sliding ... of engineering systems such as chemical process control, delay in the actuator ...
Rich dynamics of discrete delay ecological models
International Nuclear Information System (INIS)
Peng Mingshu
2005-01-01
We study multiple bifurcations and chaotic behavior of a discrete delay ecological model. A new form of chaos for the 2-D map is observed: the combination of potential period doubling and reverse period doubling leads to cascading bubbles.
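A flavor of such delay-induced bifurcations can be had from the delayed Ricker map x_{n+1} = x_n exp(r (1 - x_{n-1})), a standard discrete delay ecological model; this is a sketch for illustration and not necessarily the 2-D map studied in the paper.

```python
import math

def delayed_ricker_tail(r, steps=2000, tail=100):
    """Iterate x_{n+1} = x_n * exp(r * (1 - x_{n-1})) from a start near
    the fixed point x* = 1 and return the last `tail` orbit values."""
    prev, cur = 0.9, 0.95
    hist = []
    for n in range(steps):
        prev, cur = cur, cur * math.exp(r * (1 - prev))
        if n >= steps - tail:
            hist.append(cur)
    return hist

stable = delayed_ricker_tail(0.3)   # below the bifurcation: settles at x* = 1
cycling = delayed_ricker_tail(1.5)  # beyond it: sustained oscillations
```

Linearizing around x* = 1 gives u_{n+1} = u_n - r u_{n-1}, whose complex roots have modulus √r, so the fixed point loses stability at r = 1; past that value the delay term sustains oscillations even though the non-delayed Ricker map would still be stable.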
Discrete and Continuous Models for Partitioning Problems
Lellmann, Jan; Lellmann, Björn; Widmann, Florian; Schnörr, Christoph
2013-01-01
-based techniques. This work is concerned with the sources of such artifacts. We discuss the importance of differentiating between artifacts caused by discretization and those caused by relaxation and provide supporting numerical examples. Moreover, we consider
Memorized discrete systems and time-delay
Luo, Albert C J
2017-01-01
This book examines discrete dynamical systems with memory—nonlinear systems that exist extensively in biological organisms and financial and economic organizations, and time-delay systems that can be discretized into the memorized, discrete dynamical systems. The book further discusses stability and bifurcations of time-delay dynamical systems that can be investigated through memorized dynamical systems, as well as bifurcations of memorized nonlinear dynamical systems, discretization methods of time-delay systems, and periodic motions to chaos in nonlinear time-delay systems. The book helps readers find analytical solutions of MDS, change traditional perturbation analysis in time-delay systems, detect motion complexity and singularity in MDS, and determine stability, bifurcation, and chaos in any time-delay system.
Quadratic Term Structure Models in Discrete Time
Marco Realdon
2006-01-01
This paper extends the results on quadratic term structure models in continuous time to the discrete time setting. The continuous time setting can be seen as a special case of the discrete time one. Recursive closed form solutions for zero coupon bonds are provided even in the presence of multiple correlated underlying factors. Pricing bond options requires simple integration. Model parameters may well be time dependent without scuppering such tractability. Model estimation does not require a r...
Symmetries in discrete-time mechanics
International Nuclear Information System (INIS)
Khorrami, M.
1996-01-01
Based on a general formulation for discrete-time quantum mechanics, introduced by M. Khorrami (Annals Phys. 224 (1995), 101), symmetries in discrete-time quantum mechanics are investigated. It is shown that any classical continuous symmetry leads to a conserved quantity in classical mechanics, as well as quantum mechanics. The transformed wave function, however, has the correct evolution if and only if the symmetry is nonanomalous. Copyright 1996 Academic Press, Inc.
Nonlinear integrodifferential equations as discrete systems
Tamizhmani, K. M.; Satsuma, J.; Grammaticos, B.; Ramani, A.
1999-06-01
We analyse a class of integrodifferential equations of the `intermediate long wave' (ILW) type. We show that these equations can be formally interpreted as discrete, differential-difference systems. This allows us to link equations of this type with previous results of ours involving differential-delay equations and, on the basis of this, propose new integrable equations of ILW type. Finally, we extend this approach to pure difference equations and propose ILW forms for the discrete lattice KdV equation.
Definable maximal discrete sets in forcing extensions
DEFF Research Database (Denmark)
Törnquist, Asger Dag; Schrittesser, David
2018-01-01
Let R be a Σ11 binary relation, and recall that a set A is R-discrete if no two elements of A are related by R. We show that in the Sacks and Miller forcing extensions of L there is a Δ12 maximal R-discrete set. We use this to answer in the negative the main question posed in [5] by showing...
Application of multivariate splines to discrete mathematics
Xu, Zhiqiang
2005-01-01
Using methods developed in multivariate splines, we present an explicit formula for discrete truncated powers, which are defined as the number of non-negative integer solutions of linear Diophantine equations. We further use the formula to study some classical problems in discrete mathematics as follows. First, we extend the partition function of integers in number theory. Second, we exploit the relation between the relative volume of convex polytopes and multivariate truncated powers and giv...
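A discrete truncated power, as defined in the abstract, counts the non-negative integer solutions of a linear Diophantine equation a_1 x_1 + ... + a_n x_n = b. It can be computed directly with a coin-counting dynamic program (a sketch for small instances; the paper's explicit spline-based formula is not reproduced here):

```python
# Count non-negative integer solutions of a_1*x_1 + ... + a_n*x_n = b,
# i.e. the value of the discrete truncated power at b (coin-counting DP).
def discrete_truncated_power(coeffs, b):
    ways = [0] * (b + 1)
    ways[0] = 1
    for a in coeffs:                 # add one variable at a time
        for s in range(a, b + 1):
            ways[s] += ways[s - a]
    return ways[b]

# Example: partitions of 10 into parts 1, 2, 3 (a restricted partition count,
# the number-theoretic special case mentioned in the abstract).
print(discrete_truncated_power([1, 2, 3], 10))   # -> 14
```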
Discrete symmetries and solar neutrino mixing
Energy Technology Data Exchange (ETDEWEB)
Kapetanakis, D.; Mayr, P.; Nilles, H.P. (Physik Dept., Technische Univ. Muenchen, Garching (Germany) Max-Planck-Inst. fuer Physik, Werner-Heisenberg-Inst., Muenchen (Germany))
1992-05-21
We study the question of resonant solar neutrino mixing in the framework of the supersymmetric extension of the standard model. Discrete symmetries that are consistent with solar neutrino mixing and proton stability are classified. In the minimal model they are shown to lead to two distinct patterns of allowed dimension-four operators. Imposing anomaly freedom, only three different discrete Z_N-symmetries (with N=2, 3, 6) are found to be phenomenologically acceptable. (orig.)
Discrete symmetries and coset space dimensional reduction
International Nuclear Information System (INIS)
Kapetanakis, D.; Zoupanos, G.
1989-01-01
We consider the discrete symmetries of all the six-dimensional coset spaces and we apply them in gauge theories defined in ten dimensions which are dimensionally reduced over these homogeneous spaces. Particular emphasis is given in the consequences of the discrete symmetries on the particle content as well as on the symmetry breaking a la Hosotani of the resulting four-dimensional theory. (orig.)
On discrete models of space-time
International Nuclear Information System (INIS)
Horzela, A.; Kempczynski, J.; Kapuscik, E.; Georgia Univ., Athens, GA; Uzes, Ch.
1992-02-01
Analyzing the Einstein radiolocation method we come to the conclusion that results of any measurement of space-time coordinates should be expressed in terms of rational numbers. We show that this property is Lorentz invariant and may be used in the construction of discrete models of space-time different from the models of the lattice type constructed in the process of discretization of continuous models. (author)
Discrete approximations to vector spin models
Energy Technology Data Exchange (ETDEWEB)
Van Enter, Aernout C D [University of Groningen, Johann Bernoulli Institute of Mathematics and Computing Science, Postbus 407, 9700 AK Groningen (Netherlands); Kuelske, Christof [Ruhr-Universitaet Bochum, Fakultaet fuer Mathematik, D44801 Bochum (Germany); Opoku, Alex A, E-mail: A.C.D.v.Enter@math.rug.nl, E-mail: Christof.Kuelske@ruhr-uni-bochum.de, E-mail: opoku@math.leidenuniv.nl [Mathematisch Instituut, Universiteit Leiden, Postbus 9512, 2300 RA, Leiden (Netherlands)
2011-11-25
We strengthen a result from Kuelske and Opoku (2008 Electron. J. Probab. 13 1307-44) on the existence of effective interactions for discretized continuous-spin models. We also point out that such an interaction cannot exist at very low temperatures. Moreover, we compare two ways of discretizing continuous-spin models, and show that except for very low temperatures, they behave similarly in two dimensions. We also discuss some possibilities in higher dimensions. (paper)
A study of discrete nonlinear systems
International Nuclear Information System (INIS)
Dhillon, H.S.
2001-04-01
An investigation of various spatially discrete time-independent nonlinear models was undertaken. These models are generically applicable to many different physical systems including electron-phonon interactions in solids, magnetic multilayers, layered superconductors and classical lattice systems. To characterise the possible magnetic structures created on magnetic multilayers a model has been formulated and studied. The Euler-Lagrange equation for this model is a discrete version of the Sine-Gordon equation. Solutions of this equation are generated by applying the methods of Chaotic Dynamics - treating the space variable associated with the layer number as a discrete time variable. The states found indicate periodic, quasiperiodic and chaotic structures. Analytic solutions to the discrete nonlinear Schroedinger Equation (DNSE) with cubic nonlinearity are presented in the strong coupling limit. Using these as a starting point, a procedure is developed to determine the wave function and the energy eigenvalue for moderate coupling. The energy eigenvalues of the different structures of the wave function are found to be in excellent agreement with the exact strong coupling result. The solutions to the DNSE indicate commensurate and incommensurate spatial structures associated with different localisation patterns of the wave function. The states which arise may be fractal, periodic, quasiperiodic or chaotic. This work is then extended to solve a first order discrete nonlinear equation. The exact solutions for both the first and second order discrete nonlinear equations with cubic nonlinearity suggests that this method of studying discrete nonlinear equations may be applied to solve discrete equations with any order difference and cubic nonlinearity. (author)
Mohamed, Mamdouh S.; Hirani, Anil N.; Samtaney, Ravi
2016-05-01
A conservative discretization of incompressible Navier-Stokes equations is developed based on discrete exterior calculus (DEC). A distinguishing feature of our method is the use of an algebraic discretization of the interior product operator and a combinatorial discretization of the wedge product. The governing equations are first rewritten using the exterior calculus notation, replacing vector calculus differential operators by the exterior derivative, Hodge star and wedge product operators. The discretization is then carried out by substituting with the corresponding discrete operators based on the DEC framework. Numerical experiments for flows over surfaces reveal a second order accuracy for the developed scheme when using structured-triangular meshes, and first order accuracy for otherwise unstructured meshes. By construction, the method is conservative in that both mass and vorticity are conserved up to machine precision. The relative error in kinetic energy for inviscid flow test cases converges in a second order fashion with both the mesh size and the time step.
Explicit solutions to the semi-discrete modified KdV equation and motion of discrete plane curves
International Nuclear Information System (INIS)
Inoguchi, Jun-ichi; Kajiwara, Kenji; Matsuura, Nozomu; Ohta, Yasuhiro
2012-01-01
We construct explicit solutions to continuous motion of discrete plane curves described by a semi-discrete potential modified KdV equation. Explicit formulas in terms of the τ function are presented. Bäcklund transformations of the discrete curves are also discussed. We finally consider the continuous limit of discrete motion of discrete plane curves described by the discrete potential modified KdV equation to motion of smooth plane curves characterized by the potential modified KdV equation. (paper)
An integrative multi-platform analysis for discovering biomarkers of osteosarcoma
International Nuclear Information System (INIS)
Li, Guodong; Zhang, Wenjuan; Zeng, Huazong; Chen, Lei; Wang, Wenjing; Liu, Jilong; Zhang, Zhiyu; Cai, Zhengdong
2009-01-01
SELDI-TOF-MS (Surface Enhanced Laser Desorption/Ionization-Time of Flight-Mass Spectrometry) has become an attractive approach for cancer biomarker discovery due to its ability to resolve low mass proteins and high-throughput capability. However, the analytes from mass spectrometry are described only by their mass-to-charge ratio (m/z) values without further identification and annotation. To discover potential biomarkers for early diagnosis of osteosarcoma, we designed an integrative workflow combining data sets from both SELDI-TOF-MS and gene microarray analysis. After extracting the information for potential biomarkers from SELDI data and microarray analysis, their associations were further inferred by link-test to identify biomarkers that could likely be used for diagnosis. Immuno-blot analysis was then performed to examine whether the expression of the putative biomarkers was indeed altered in serum from patients with osteosarcoma. Six differentially expressed protein peaks with strong statistical significance were detected by SELDI-TOF-MS. Four of the proteins were up-regulated and two of them were down-regulated. Microarray analysis showed that, compared with an osteoblastic cell line, the expression of 653 genes was changed more than 2-fold in three osteosarcoma cell lines. While expression of 310 genes was increased, expression of the other 343 genes was decreased. The two sets of biomarker candidates were combined by the link-test statistics, indicating that 13 genes were potential biomarkers for early diagnosis of osteosarcoma. Among these genes, cytochrome c1 (CYC-1) was selected for further experimental validation. Link-test on datasets from both SELDI-TOF-MS and microarray high-throughput analysis can accelerate the identification of tumor biomarkers. The result confirmed that CYC-1 may be a promising biomarker for early diagnosis of osteosarcoma.
Theoretical Basics of Teaching Discrete Mathematics
Directory of Open Access Journals (Sweden)
Y. A. Perminov
2012-01-01
Full Text Available The paper deals with the research findings concerning the process of mastering the theoretical basics of discrete mathematics by the students of vocational pedagogic profile. The methodological analysis is based on the subject and functions of modern discrete mathematics and its role in mathematical modeling and computing. Modern discrete mathematics (i.e., the mathematics of finite-type structures) plays an important role in the modernization of vocational training. It is especially relevant to training students for vocational pedagogic qualifications, as in the future they will be responsible for training middle- and senior-level specialists in engineering and technical spheres. Nowadays in different industries, there arise problems which require for their solving both continual modeling, based on the classical mathematical methods, and discrete modeling. The teaching course of discrete mathematics for future vocational teachers should be relevant to the target qualification and aimed at mastering mathematical modeling, systems of computer mathematics and computer technologies. The author emphasizes the fundamental role of mastering the language of algebraic and serial structures, as well as the logical, algorithmic, and combinatory schemes dominating in discrete mathematics. The guidelines for selecting the content of the course in discrete mathematics are specified. The theoretical findings of the research can be put into practice whilst developing curricula and working programs for bachelors' and masters' training.
Current density and continuity in discretized models
International Nuclear Information System (INIS)
Boykin, Timothy B; Luisier, Mathieu; Klimeck, Gerhard
2010-01-01
Discrete approaches have long been used in numerical modelling of physical systems in both research and teaching. Discrete versions of the Schroedinger equation employing either one or several basis functions per mesh point are often used by senior undergraduates and beginning graduate students in computational physics projects. In studying discrete models, students can encounter conceptual difficulties with the representation of the current and its divergence because different finite-difference expressions, all of which reduce to the current density in the continuous limit, measure different physical quantities. Understanding these different discrete currents is essential and requires a careful analysis of the current operator, the divergence of the current and the continuity equation. Here we develop point forms of the current and its divergence valid for an arbitrary mesh and basis. We show that in discrete models currents exist only along lines joining atomic sites (or mesh points). Using these results, we derive a discrete analogue of the divergence theorem and demonstrate probability conservation in a purely localized-basis approach.
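The point that discrete currents exist only along links joining mesh points can be illustrated on a 1D tight-binding chain (a sketch with assumed units hbar = hopping t = 1, open boundaries, not the paper's general mesh-and-basis formalism). With a Crank-Nicolson time step, the bond current evaluated at the midpoint state satisfies the discrete continuity equation to machine precision.

```python
# Bond currents J_{i -> i+1} = 2 Im(conj(psi_i) psi_{i+1}) on a 1D chain,
# evaluated at the Crank-Nicolson midpoint state, satisfy the discrete
# continuity equation d|psi_i|^2/dt = J_in - J_out exactly.
def solve(a, rhs):
    """Gaussian elimination with partial pivoting for a small complex system."""
    n = len(rhs)
    m = [row[:] + [rhs[i]] for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0j] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

N, dt = 8, 0.05
H = [[-1.0 if abs(i - j) == 1 else 0.0 for j in range(N)] for i in range(N)]
A = [[(1 if i == j else 0) + 0.5j * dt * H[i][j] for j in range(N)] for i in range(N)]
B = [[(1 if i == j else 0) - 0.5j * dt * H[i][j] for j in range(N)] for i in range(N)]

psi = [0j] * N
psi[N // 2] = 1 + 0j                                  # localized start
rhs = [sum(B[i][j] * psi[j] for j in range(N)) for i in range(N)]
new = solve(A, rhs)                                   # one Crank-Nicolson step

mid = [(psi[i] + new[i]) / 2 for i in range(N)]       # midpoint state
J = [2 * (mid[i].conjugate() * mid[i + 1]).imag for i in range(N - 1)]

res = []
for i in range(N):
    drho = (abs(new[i]) ** 2 - abs(psi[i]) ** 2) / dt
    jin = J[i - 1] if i > 0 else 0.0
    jout = J[i] if i < N - 1 else 0.0
    res.append(abs(drho - (jin - jout)))
print(max(res))                                       # continuity residual
```

Evaluating the same current at the old state instead of the midpoint breaks the identity, which is the conceptual trap the abstract warns about: different finite-difference expressions for the current measure different things.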
Discrete Calculus as a Bridge between Scales
Degiuli, Eric; McElwaine, Jim
2012-02-01
Understanding how continuum descriptions of disordered media emerge from the microscopic scale is a fundamental challenge in condensed matter physics. In many systems, it is necessary to coarse-grain balance equations at the microscopic scale to obtain macroscopic equations. We report development of an exact, discrete calculus, which allows identification of discrete microscopic equations with their continuum equivalent [1]. This allows the application of powerful techniques of calculus, such as the Helmholtz decomposition, the Divergence Theorem, and Stokes' Theorem. We illustrate our results with granular materials. In particular, we show how Newton's laws for a single grain reproduce their continuum equivalent in the calculus. This allows introduction of a discrete Airy stress function, exactly as in the continuum. As an application of the formalism, we show how these results give the natural mean-field variation of discrete quantities, in agreement with numerical simulations. The discrete calculus thus acts as a bridge between discrete microscale quantities and continuous macroscale quantities. [1] E. DeGiuli & J. McElwaine, PRE 2011. doi: 10.1103/PhysRevE.84.041310
Recent developments in discrete ordinates electron transport
International Nuclear Information System (INIS)
Morel, J.E.; Lorence, L.J. Jr.
1986-01-01
The discrete ordinates method is a deterministic method for numerically solving the Boltzmann equation. It was originally developed for neutron transport calculations, but is routinely used for photon and coupled neutron-photon transport calculations as well. The computational state of the art for coupled electron-photon transport (CEPT) calculations is not as developed as that for neutron transport calculations. The only production codes currently available for CEPT calculations are condensed-history Monte Carlo codes such as the ETRAN and ITS codes. A deterministic capability for production calculations is clearly needed. In response to this need, we have begun the development of a production discrete ordinates code for CEPT calculations. The purpose of this paper is to describe the basic approach we are taking, discuss the current status of the project, and present some new computational results. Although further characterization of the coupled electron-photon discrete ordinates method remains to be done, the results to date indicate that the discrete ordinates method can be just as accurate and from 10 to 100 times faster than the Monte Carlo method for a wide variety of problems. We stress that these results are obtained with standard discrete ordinates codes such as ONETRAN. It is clear that even greater efficiency can be obtained by developing a new generation of production discrete ordinates codes specifically designed to solve the Boltzmann-Fokker-Planck equation. However, the prospects for such development in the near future appear to be remote
Discrete symmetries and their stringy origin
International Nuclear Information System (INIS)
Mayorga Pena, Damian Kaloni
2014-05-01
Discrete symmetries have proven to be very useful in controlling the phenomenology of theories beyond the standard model. In this work we explore how these symmetries emerge from string compactifications. Our approach is twofold: On the one hand, we consider the heterotic string on orbifold backgrounds. In this case the discrete symmetries can be derived from the orbifold conformal field theory, and it can be shown that they are in close relation with the orbifold geometry. We devote special attention to R-symmetries, which arise from discrete remnants of the Lorentz group in compact space. Further we discuss the physical implications of these symmetries both in the heterotic mini-landscape and in newly constructed models based on the Z_2 × Z_4 orbifold. In both cases we observe that the discrete symmetries favor particular locations in the orbifold where the particles of the standard model should live. On the other hand we consider a class of F-theory models exhibiting an SU(5) gauge group, times additional U(1) symmetries. In this case, the smooth compactification background does not permit us to track the discrete symmetries as transparently as in orbifold models. Hence, we follow a different approach and search for discrete subgroups emerging after the U(1)s are broken. We observe that in this approach it is possible to obtain the standard Z_2 matter parity of the MSSM.
Statistical phenomena - theory
International Nuclear Information System (INIS)
Hereward, H.G.
1977-01-01
This paper discusses two closely related fields, Schottky signals and stochastic cooling. These effects are both based on the fact that a uniform continuous beam really consists of a finite number of discrete particles of charge e. (Auth.)
Gemignani, Michael C
2006-01-01
Topics include applications of the derivative, sequences and series, the integral and continuous variates, discrete distributions, hypothesis testing, functions of several variables, and regression and correlation. 1970 edition. Includes 201 figures and 36 tables.
Early-Phase Studies of Biomarkers
DEFF Research Database (Denmark)
Pepe, Margaret S.; Janes, Holly; Li, Christopher I.
2016-01-01
of a positive biomarker test in cases (true positive) to cost associated with a positive biomarker test in controls (false positive). Guidance is offered on soliciting the cost/benefit ratio. The calculations are based on the longstanding decision theory concept of providing a net benefit on average...... impact on patient outcomes of using the biomarker to make clinical decisions....
Rostrocaudal Dynamics of CSF Biomarkers
Tarnaris, A.; Toma, A.K.; Chapman, M.D.; Petzold, A.F.S.; Keir, G.; Kitchen, N.D.; Watkins, L.D.
2011-01-01
The rostrocaudal gradient (RCG) of markers present in cerebrospinal fluid (CSF) has not been studied adequately due to lack of appropriate control populations and ethical restrictions. The aim of this study is to understand the rostrocaudal gradient of CSF biomarkers. We conducted a study comparing
Imaging Biomarkers for Adult Medulloblastomas
DEFF Research Database (Denmark)
Keil, V C; Warmuth-Metz, M; Reh, C
2017-01-01
BACKGROUND AND PURPOSE: The occurrence of medulloblastomas in adults is rare; nevertheless, these tumors can be subdivided into genetic and histologic entities each having distinct prognoses. This study aimed to identify MR imaging biomarkers to classify these entities and to uncover differences ...
Biomarkers of satiation and satiety
Graaf, de C.; Blom, W.A.M.; Smeets, P.A.M.; Stafleu, A.; Hendriks, H.F.J.
2004-01-01
This review's objective is to give a critical summary of studies that focused on physiologic measures relating to subjectively rated appetite, actual food intake, or both. Biomarkers of satiation and satiety may be used as a tool for assessing the satiating efficiency of foods and for understanding
Bias in Peripheral Depression Biomarkers
DEFF Research Database (Denmark)
Carvalho, André F; Köhler, Cristiano A; Brunoni, André R
2016-01-01
BACKGROUND: To aid in the differentiation of individuals with major depressive disorder (MDD) from healthy controls, numerous peripheral biomarkers have been proposed. To date, no comprehensive evaluation of the existence of bias favoring the publication of significant results or inflating effect...
Biomarkers of spontaneous preterm birth
DEFF Research Database (Denmark)
Polettini, Jossimara; Cobo, Teresa; Kacerovsky, Marian
2017-01-01
biomarkers associated with PTB published from January 2005 to March 2014. Retrieved citations (3631) were screened, and relevant studies (33) were selected for full-text reading. Ten studies were included in the review. Forty-two PTB-related proteins were reported, and RANTES and IL-10 (three studies...
Discrete integrable systems and deformations of associative algebras
International Nuclear Information System (INIS)
Konopelchenko, B G
2009-01-01
Interrelations between discrete deformations of the structure constants for associative algebras and discrete integrable systems are reviewed. Theory of deformations for associative algebras is presented. Closed left ideal generated by the elements representing the multiplication table plays a central role in this theory. Deformations of the structure constants are generated by the deformation driving algebra and governed by the central system of equations. It is demonstrated that many discrete equations such as discrete Boussinesq equation, discrete WDVV equation, discrete Schwarzian KP and BKP equations, discrete Hirota-Miwa equations for KP and BKP hierarchies are particular realizations of the central system. An interaction between the theories of discrete integrable systems and discrete deformations of associative algebras is reciprocal and fruitful. An interpretation of the Menelaus relation (discrete Schwarzian KP equation), discrete Hirota-Miwa equation for KP hierarchy, consistency around the cube as the associativity conditions and the concept of gauge equivalence, for instance, between the Menelaus and KP configurations are particular examples.
International Nuclear Information System (INIS)
Shi, Ying; Zhang, Da-jun; Nimmo, Jonathan J C
2014-01-01
The Hirota–Miwa equation can be written in ‘nonlinear’ form in two ways: the discrete KP equation and, by using a compatible continuous variable, the discrete potential KP equation. For both systems, we consider the Darboux and binary Darboux transformations, expressed in terms of the continuous variable, and obtain exact solutions in Wronskian and Grammian form. We discuss reductions of both systems to the discrete KdV and discrete potential KdV equation, respectively, and exploit this connection to find the Darboux and binary Darboux transformations and exact solutions of these equations. (paper)
State Transportation Statistics 2014
2014-12-15
The Bureau of Transportation Statistics (BTS) presents State Transportation Statistics 2014, a statistical profile of transportation in the 50 states and the District of Columbia. This is the 12th annual edition of State Transportation Statistics, a ...
Systems biology and biomarker discovery
Energy Technology Data Exchange (ETDEWEB)
Rodland, Karin D.
2010-12-01
Medical practitioners have always relied on surrogate markers of inaccessible biological processes to make their diagnosis, whether it was the pallor of shock, the flush of inflammation, or the jaundice of liver failure. Obviously, the current implementation of biomarkers for disease is far more sophisticated, relying on highly reproducible, quantitative measurements of molecules that are often mechanistically associated with the disease in question, as in glycated hemoglobin for the diagnosis of diabetes [1] or the presence of cardiac troponins in the blood for confirmation of myocardial infarcts [2]. In cancer, where the initial symptoms are often subtle and the consequences of delayed diagnosis often drastic for disease management, the impetus to discover readily accessible, reliable, and accurate biomarkers for early detection is compelling. Yet despite years of intense activity, the stable of clinically validated, cost-effective biomarkers for early detection of cancer is pathetically small and still dominated by a handful of markers (CA-125, CEA, PSA) first discovered decades ago. It is time, one could argue, for a fresh approach to the discovery and validation of disease biomarkers, one that takes full advantage of the revolution in genomic technologies and in the development of computational tools for the analysis of large complex datasets. This issue of Disease Markers is dedicated to one such new approach, loosely termed the 'Systems Biology of Biomarkers'. What sets the Systems Biology approach apart from other, more traditional approaches, is both the types of data used, and the tools used for data analysis - and both reflect the revolution in high throughput analytical methods and high throughput computing that has characterized the start of the twenty first century.
Parametric statistical change point analysis
Chen, Jie
2000-01-01
This work is an in-depth study of the change point problem from a general point of view and a further examination of change point analysis of the most commonly used statistical models. Change point problems are encountered in such disciplines as economics, finance, medicine, psychology, signal processing, and geology, to mention only several. The exposition is clear and systematic, with a great deal of introductory material included. Different models are presented in each chapter, including gamma and exponential models, rarely examined thus far in the literature. Other models covered in detail are the multivariate normal, univariate normal, regression, and discrete models. Extensive examples throughout the text emphasize key concepts, and different methodologies are used, namely the likelihood ratio criterion and the Bayesian and information criterion approaches. A comprehensive bibliography and two indices complete the study.
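For the simplest setting the book covers, a single change in the mean of a normal sequence, the likelihood ratio criterion reduces to scanning split points for the largest between-segment sum of squares (a minimal sketch, not code from the book):

```python
# Scan for a single change in the mean: for each split point k, score the
# two-segment fit against the one-mean fit. The score below is the reduction
# in within-segment sum of squares, which is monotone in the Gaussian
# log-likelihood ratio for a mean shift with known variance.
def change_point(xs):
    n = len(xs)
    total = sum(xs)
    best_k, best_score = None, float("-inf")
    left = 0.0
    for k in range(1, n):                 # split: xs[:k] | xs[k:]
        left += xs[k - 1]
        right = total - left
        m1, m2, m = left / k, right / (n - k), total / n
        score = k * (m1 - m) ** 2 + (n - k) * (m2 - m) ** 2
        if score > best_score:
            best_k, best_score = k, score
    return best_k

data = [0.0] * 50 + [5.0] * 50
print(change_point(data))                  # -> 50
```

In practice the maximized statistic is then compared against a threshold (asymptotic or bootstrap) to decide whether a change point is present at all.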
A modern course in statistical physics
Reichl, Linda E
2016-01-01
"A Modern Course in Statistical Physics" is a textbook that illustrates the foundations of equilibrium and non-equilibrium statistical physics, and the universal nature of thermodynamic processes, from the point of view of contemporary research problems. The book treats such diverse topics as the microscopic theory of critical phenomena, superfluid dynamics, quantum conductance, light scattering, transport processes, and dissipative structures, all in the framework of the foundations of statistical physics and thermodynamics. It shows the quantum origins of problems in classical statistical physics. One focus of the book is fluctuations that occur due to the discrete nature of matter, a topic of growing importance for nanometer scale physics and biophysics. Another focus concerns classical and quantum phase transitions, in both monatomic and mixed particle systems. This fourth edition extends the range of topics considered to include, for example, entropic forces, electrochemical processes in biological syste...
Renyi statistics in equilibrium statistical mechanics
International Nuclear Information System (INIS)
Parvan, A.S.; Biro, T.S.
2010-01-01
The Renyi statistics in the canonical and microcanonical ensembles is examined both in general and in particular for the ideal gas. In the microcanonical ensemble the Renyi statistics is equivalent to the Boltzmann-Gibbs statistics. By the exact analytical results for the ideal gas, it is shown that in the canonical ensemble, taking the thermodynamic limit, the Renyi statistics is also equivalent to the Boltzmann-Gibbs statistics. Furthermore it satisfies the requirements of the equilibrium thermodynamics, i.e. the thermodynamical potential of the statistical ensemble is a homogeneous function of first degree of its extensive variables of state. We conclude that the Renyi statistics arrives at the same thermodynamical relations, as those stemming from the Boltzmann-Gibbs statistics in this limit.
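The defining formula and the Boltzmann-Gibbs limit invoked in the abstract are easy to check numerically (a sketch with k_B = 1 assumed):

```python
# Renyi entropy S_q = ln(sum_i p_i^q) / (1 - q). As q -> 1 it reduces to the
# Boltzmann-Gibbs (Shannon) entropy S = -sum_i p_i ln p_i, and for a uniform
# distribution S_q = ln N for every q.
import math

def renyi_entropy(p, q):
    return math.log(sum(pi ** q for pi in p)) / (1 - q)

def gibbs_entropy(p):
    return -sum(pi * math.log(pi) for pi in p)

p = [0.5, 0.25, 0.125, 0.125]
print(gibbs_entropy(p))            # = 1.75 * ln 2
print(renyi_entropy(p, 1.000001))  # approaches the Gibbs value as q -> 1
```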
Sampling, Probability Models and Statistical Reasoning: Statistical Inference
Indian Academy of Sciences (India)
Home; Journals; Resonance – Journal of Science Education; Volume 1; Issue 5. Sampling, Probability Models and Statistical Reasoning Statistical Inference. Mohan Delampady V R Padmawar. General Article Volume 1 Issue 5 May 1996 pp 49-58 ...
Addressing the Challenge of Defining Valid Proteomic Biomarkers and Classifiers
LENUS (Irish Health Repository)
Dakna, Mohammed
2010-12-10
Abstract Background The purpose of this manuscript is to provide, based on an extensive analysis of a proteomic data set, suggestions for proper statistical analysis for the discovery of sets of clinically relevant biomarkers. As a tractable example we define the measurable proteomic differences between apparently healthy adult males and females. We choose urine as the body fluid of interest and CE-MS, a thoroughly validated platform technology, allowing for routine analysis of a large number of samples. The second urine of the morning was collected from apparently healthy male and female volunteers (aged 21-40) in the course of the routine medical check-up before recruitment at the Hannover Medical School. Results We found that the Wilcoxon test is best suited for the definition of potential biomarkers. Adjustment for multiple testing is necessary. Sample size estimation can be performed based on a small number of observations via resampling from pilot data. Machine learning algorithms appear ideally suited to generate classifiers. Assessment of any results in an independent test set is essential. Conclusions Valid proteomic biomarkers for diagnosis and prognosis can only be defined by applying proper statistical data mining procedures. In particular, a justification of the sample size should be part of the study design.
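The abstract above stresses that adjustment for multiple testing is necessary when screening many candidate biomarkers. A minimal sketch of one standard adjustment, the Benjamini-Hochberg false discovery rate procedure, is shown below; the p-values in the example are hypothetical and not from the study.

```python
# Benjamini-Hochberg adjustment, a minimal sketch of the kind of
# multiple-testing correction the abstract recommends for biomarker screens.
# The p-values below are hypothetical, not from the study.

def benjamini_hochberg(pvals):
    """Return BH-adjusted p-values (q-values), preserving input order."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    adjusted = [0.0] * m
    prev = 1.0
    # Walk from the largest p-value down, enforcing monotonicity.
    for rank in range(m, 0, -1):
        i = order[rank - 1]
        q = min(prev, pvals[i] * m / rank)
        adjusted[i] = q
        prev = q
    return adjusted

if __name__ == "__main__":
    print(benjamini_hochberg([0.01, 0.04, 0.03, 0.002]))
```

A biomarker would then be reported only if its adjusted p-value falls below the chosen false discovery rate threshold.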
Implementation of proteomic biomarkers: making it work.
Mischak, Harald; Ioannidis, John P A; Argiles, Angel; Attwood, Teresa K; Bongcam-Rudloff, Erik; Broenstrup, Mark; Charonis, Aristidis; Chrousos, George P; Delles, Christian; Dominiczak, Anna; Dylag, Tomasz; Ehrich, Jochen; Egido, Jesus; Findeisen, Peter; Jankowski, Joachim; Johnson, Robert W; Julien, Bruce A; Lankisch, Tim; Leung, Hing Y; Maahs, David; Magni, Fulvio; Manns, Michael P; Manolis, Efthymios; Mayer, Gert; Navis, Gerjan; Novak, Jan; Ortiz, Alberto; Persson, Frederik; Peter, Karlheinz; Riese, Hans H; Rossing, Peter; Sattar, Naveed; Spasovski, Goce; Thongboonkerd, Visith; Vanholder, Raymond; Schanstra, Joost P; Vlahou, Antonia
2012-09-01
While large numbers of proteomic biomarkers have been described, they are generally not implemented in medical practice. We have investigated the reasons for this shortcoming, focusing on hurdles downstream of biomarker verification, and describe major obstacles and possible solutions to ease valid biomarker implementation. Some of the problems lie in suboptimal biomarker discovery and validation, especially lack of validated platforms with well-described performance characteristics to support biomarker qualification. These issues have been acknowledged and are being addressed, raising the hope that valid biomarkers may start accumulating in the foreseeable future. However, successful biomarker discovery and qualification alone does not suffice for successful implementation. Additional challenges include, among others, limited access to appropriate specimens and insufficient funding, the need to validate new biomarker utility in interventional trials, and large communication gaps between the parties involved in implementation. To address this problem, we propose an implementation roadmap. The implementation effort needs to involve a wide variety of stakeholders (clinicians, statisticians, health economists, and representatives of patient groups, health insurance, pharmaceutical companies, biobanks, and regulatory agencies). Knowledgeable panels with adequate representation of all these stakeholders may facilitate biomarker evaluation and guide implementation for the specific context of use. This approach may avoid unwarranted delays or failure to implement potentially useful biomarkers, and may expedite meaningful contributions of the biomarker community to healthcare. © 2012 The Authors. European Journal of Clinical Investigation © 2012 Stichting European Society for Clinical Investigation Journal Foundation.
Biomarkers of PTSD: military applications and considerations
Directory of Open Access Journals (Sweden)
Amy Lehrner
2014-08-01
Background: Although there are no established biomarkers for posttraumatic stress disorder (PTSD) as yet, biological investigations of PTSD have made progress identifying the pathophysiology of PTSD. Given the biological and clinical complexity of PTSD, it is increasingly unlikely that a single biomarker of disease will be identified. Rather, investigations will more likely identify different biomarkers that indicate the presence of clinically significant PTSD symptoms, associate with risk for PTSD following trauma exposure, and predict or identify recovery. While there has been much interest in PTSD biomarkers, there has been less discussion of their potential clinical applications, and of the social, legal, and ethical implications of such biomarkers. Objective: This article will discuss possible applications of PTSD biomarkers, including the social, legal, and ethical implications of such biomarkers, with an emphasis on military applications. Method: Literature on applications of PTSD biomarkers and on potential ethical and legal implications will be reviewed. Results: Biologically informed research findings hold promise for prevention, assessment, treatment planning, and the development of prophylactic and treatment interventions. As with any biological indicator of disorder, there are potentially positive and negative clinical, social, legal, and ethical consequences of using such biomarkers. Conclusions: Potential clinical applications of PTSD biomarkers hold promise for clinicians, patients, and employers. The search for biomarkers of PTSD should occur in tandem with an interdisciplinary discussion regarding the potential implications of applying biological findings in clinical and employment settings.
Biomarkers of PTSD: military applications and considerations.
Lehrner, Amy; Yehuda, Rachel
2014-01-01
Although there are no established biomarkers for posttraumatic stress disorder (PTSD) as yet, biological investigations of PTSD have made progress identifying the pathophysiology of PTSD. Given the biological and clinical complexity of PTSD, it is increasingly unlikely that a single biomarker of disease will be identified. Rather, investigations will more likely identify different biomarkers that indicate the presence of clinically significant PTSD symptoms, associate with risk for PTSD following trauma exposure, and predict or identify recovery. While there has been much interest in PTSD biomarkers, there has been less discussion of their potential clinical applications, and of the social, legal, and ethical implications of such biomarkers. This article will discuss possible applications of PTSD biomarkers, including the social, legal, and ethical implications of such biomarkers, with an emphasis on military applications. Literature on applications of PTSD biomarkers and on potential ethical and legal implications will be reviewed. Biologically informed research findings hold promise for prevention, assessment, treatment planning, and the development of prophylactic and treatment interventions. As with any biological indicator of disorder, there are potentially positive and negative clinical, social, legal, and ethical consequences of using such biomarkers. Potential clinical applications of PTSD biomarkers hold promise for clinicians, patients, and employers. The search for biomarkers of PTSD should occur in tandem with an interdisciplinary discussion regarding the potential implications of applying biological findings in clinical and employment settings.
Implementation of proteomic biomarkers: making it work
Mischak, Harald; Ioannidis, John PA; Argiles, Angel; Attwood, Teresa K; Bongcam-Rudloff, Erik; Broenstrup, Mark; Charonis, Aristidis; Chrousos, George P; Delles, Christian; Dominiczak, Anna; Dylag, Tomasz; Ehrich, Jochen; Egido, Jesus; Findeisen, Peter; Jankowski, Joachim; Johnson, Robert W; Julien, Bruce A; Lankisch, Tim; Leung, Hing Y; Maahs, David; Magni, Fulvio; Manns, Michael P; Manolis, Efthymios; Mayer, Gert; Navis, Gerjan; Novak, Jan; Ortiz, Alberto; Persson, Frederik; Peter, Karlheinz; Riese, Hans H; Rossing, Peter; Sattar, Naveed; Spasovski, Goce; Thongboonkerd, Visith; Vanholder, Raymond; Schanstra, Joost P; Vlahou, Antonia
2012-01-01
While large numbers of proteomic biomarkers have been described, they are generally not implemented in medical practice. We have investigated the reasons for this shortcoming, focusing on hurdles downstream of biomarker verification, and describe major obstacles and possible solutions to ease valid biomarker implementation. Some of the problems lie in suboptimal biomarker discovery and validation, especially lack of validated platforms with well-described performance characteristics to support biomarker qualification. These issues have been acknowledged and are being addressed, raising the hope that valid biomarkers may start accumulating in the foreseeable future. However, successful biomarker discovery and qualification alone does not suffice for successful implementation. Additional challenges include, among others, limited access to appropriate specimens and insufficient funding, the need to validate new biomarker utility in interventional trials, and large communication gaps between the parties involved in implementation. To address this problem, we propose an implementation roadmap. The implementation effort needs to involve a wide variety of stakeholders (clinicians, statisticians, health economists, and representatives of patient groups, health insurance, pharmaceutical companies, biobanks, and regulatory agencies). Knowledgeable panels with adequate representation of all these stakeholders may facilitate biomarker evaluation and guide implementation for the specific context of use. This approach may avoid unwarranted delays or failure to implement potentially useful biomarkers, and may expedite meaningful contributions of the biomarker community to healthcare. PMID:22519700
Convergence of posteriors for discretized log Gaussian Cox processes
DEFF Research Database (Denmark)
Waagepetersen, Rasmus Plenge
2004-01-01
In Markov chain Monte Carlo posterior computation for log Gaussian Cox processes (LGCPs) a discretization of the continuously indexed Gaussian field is required. It is demonstrated that approximate posterior expectations computed from discretized LGCPs converge to the exact posterior expectations...... when the cell sizes of the discretization tend to zero. The effect of discretization is studied in a data example....
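The discretization discussed above replaces the continuously indexed Gaussian field by one value per grid cell, with cell counts Poisson-distributed given the resulting intensity. A minimal forward-simulation sketch is given below; for brevity the cell values are drawn independently, whereas a real LGCP uses a spatially correlated Gaussian field, and all parameter values are illustrative assumptions.

```python
import math
import random

# Minimal sketch of a discretized log Gaussian Cox process (LGCP):
# the continuous Gaussian field is replaced by one value per grid cell,
# and counts are Poisson with intensity exp(mu + sigma * Z) * cell_area.
# The cell values are drawn independently here; a real LGCP uses a
# spatially correlated Gaussian field (e.g. exponential covariance).

def poisson_sample(lam, rng):
    """Knuth's method; adequate for the small intensities used here."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def discretized_lgcp(n_cells, mu, sigma, cell_area, seed=0):
    rng = random.Random(seed)
    field = [rng.gauss(0.0, 1.0) for _ in range(n_cells)]
    intensity = [math.exp(mu + sigma * z) * cell_area for z in field]
    return [poisson_sample(lam, rng) for lam in intensity]

if __name__ == "__main__":
    counts = discretized_lgcp(100, mu=0.0, sigma=0.5, cell_area=0.1)
    print(sum(counts))
```

As the cell area shrinks (and the number of cells grows), this discretized process approximates the continuous LGCP, which is the regime the convergence result above concerns.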
Proteomic and metabolomic approaches to biomarker discovery
Issaq, Haleem J
2013-01-01
Proteomic and Metabolomic Approaches to Biomarker Discovery demonstrates how to leverage biomarkers to improve accuracy and reduce errors in research. Disease biomarker discovery is one of the most vibrant and important areas of research today, as the identification of reliable biomarkers has an enormous impact on disease diagnosis, selection of treatment regimens, and therapeutic monitoring. Various techniques are used in the biomarker discovery process, including techniques used in proteomics, the study of the proteins that make up an organism, and metabolomics, the study of chemical fingerprints created from cellular processes. Proteomic and Metabolomic Approaches to Biomarker Discovery is the only publication that covers techniques from both proteomics and metabolomics and includes all steps involved in biomarker discovery, from study design to study execution. The book describes methods, and presents a standard operating procedure for sample selection, preparation, and storage, as well as data analysis...
Meeting Report--NASA Radiation Biomarker Workshop
Energy Technology Data Exchange (ETDEWEB)
Straume, Tore; Amundson, Sally A,; Blakely, William F.; Burns, Frederic J.; Chen, Allen; Dainiak, Nicholas; Franklin, Stephen; Leary, Julie A.; Loftus, David J.; Morgan, William F.; Pellmar, Terry C.; Stolc, Viktor; Turteltaub, Kenneth W.; Vaughan, Andrew T.; Vijayakumar, Srinivasan; Wyrobek, Andrew J.
2008-05-01
A summary is provided of presentations and discussions from the NASA Radiation Biomarker Workshop held September 27-28, 2007, at NASA Ames Research Center in Mountain View, California. Invited speakers were distinguished scientists representing key sectors of the radiation research community. Speakers addressed recent developments in the biomarker and biotechnology fields that may provide new opportunities for health-related assessment of radiation-exposed individuals, including for long-duration space travel. Topics discussed include the space radiation environment, biomarkers of radiation sensitivity and individual susceptibility, molecular signatures of low-dose responses, multivariate analysis of gene expression, biomarkers in biodefense, biomarkers in radiation oncology, biomarkers and triage following large-scale radiological incidents, integrated and multiple biomarker approaches, advances in whole-genome tiling arrays, advances in mass-spectrometry proteomics, radiation biodosimetry for estimation of cancer risk in a rat skin model, and confounding factors. Summary conclusions are provided at the end of the report.
The Knowledge-Integrated Network Biomarkers Discovery for Major Adverse Cardiac Events
Jin, Guangxu; Zhou, Xiaobo; Wang, Honghui; Zhao, Hong; Cui, Kemi; Zhang, Xiang-Sun; Chen, Luonan; Hazen, Stanley L.; Li, King; Wong, Stephen T. C.
2010-01-01
The mass spectrometry (MS) technology in clinical proteomics is very promising for the discovery of new biomarkers for disease management. To overcome the obstacle of data noise in MS analysis, we proposed a new approach of knowledge-integrated biomarker discovery using data from Major Adverse Cardiac Events (MACE) patients. We first built a cardiovascular-related network based on protein information coming from protein annotations in Uniprot, protein–protein interaction (PPI), and signal transduction databases. Distinct from previous machine learning methods in MS data processing, we then used statistical methods to discover biomarkers in the cardiovascular-related network. Through the tradeoff between known protein information and data noise in mass spectrometry data, we could firmly identify high-confidence biomarkers. Most importantly, aided by the protein–protein interaction network, that is, the cardiovascular-related network, we proposed a new type of biomarker, the network biomarker, composed of a set of proteins and the interactions among them. The candidate network biomarkers can classify the two groups of patients more accurately than current single ones without consideration of biological molecular interactions. PMID:18665624
Procalcitonin as an adjunctive biomarker in sepsis
Directory of Open Access Journals (Sweden)
Mahua Sinha
2011-01-01
Sepsis can sometimes be difficult to substantiate, and its distinction from non-infectious conditions in critically ill patients is often a challenge. Serum procalcitonin (PCT) assay is one of the biomarkers of sepsis. The present study was aimed to assess the usefulness of the PCT assay in critically ill patients with suspected sepsis. The study included 40 patients from the intensive care unit with suspected sepsis. Sepsis was confirmed clinically and/or by positive blood culture. Serum PCT was assayed semi-quantitatively by rapid immunochromatographic technique (within 2 hours of sample receipt). Among 40 critically ill patients, 21 had clinically confirmed sepsis. There were 12 patients with serum PCT ≥10 ng/ml (8, blood culture positive; 1, rickettsia; 2, post-antibiotic blood culture sterile; and 1, non-sepsis); 7 patients with PCT 2-10 ng/ml (4, blood culture positive; 1, falciparum malaria; 2, post-antibiotic blood culture sterile); 3 patients with PCT of 0.5 to 2 ng/ml (sepsis in 1 patient); and 18 patients with PCT < 0.5 ng/ml (sepsis in 2 patients). Patients with PCT ≥ 2 ng/ml had a statistically significant correlation with the presence of sepsis (P<0.0001). The PCT assay revealed moderate sensitivity (86%) and high specificity (95%) at a cut-off ≥ 2 ng/ml. The PCT assay was found to be a useful biomarker of sepsis in this study. The assay could be performed and reported rapidly and provided valuable information before availability of culture results. This might assist in avoiding unwarranted antibiotic usage.
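The sensitivity and specificity figures reported above come from a standard 2x2 diagnostic table. The sketch below shows the computation; the counts (TP=18, FN=3, FP=1, TN=18) are back-calculated assumptions that reproduce the published 86% sensitivity and 95% specificity for the 21 sepsis / 19 non-sepsis patients, since the paper's table is not given here directly.

```python
# Sensitivity/specificity from a 2x2 table, illustrating the reported
# performance of the PCT assay at the >= 2 ng/ml cut-off. The counts
# used below are back-calculated assumptions consistent with the
# published 86% sensitivity and 95% specificity.

def sensitivity(tp, fn):
    """Fraction of true positives among all diseased: TP / (TP + FN)."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Fraction of true negatives among all non-diseased: TN / (TN + FP)."""
    return tn / (tn + fp)

if __name__ == "__main__":
    print(round(sensitivity(18, 3), 2))   # ~0.86
    print(round(specificity(18, 1), 2))   # ~0.95
```

Shifting the cut-off (e.g. to 0.5 ng/ml) trades specificity for sensitivity, which is why the study evaluates several thresholds.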
The expected loss in the discretization of multistage stochastic programming problems - estimation and convergence rate
Czech Academy of Sciences Publication Activity Database
Šmíd, Martin
2009-01-01
Roč. 165, č. 1 (2009), s. 29-45 ISSN 0254-5330 R&D Projects: GA ČR GA402/04/1294 Institutional research plan: CEZ:AV0Z10750506 Keywords : multistage stochastic programming problems * approximation * discretization * Monte Carlo Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.961, year: 2009 http://library.utia.cas.cz/separaty/2008/E/smid-the expected loss in the discretization of multistage stochastic programming problems - estimation and convergence rate.pdf
Discrete Feature Model (DFM) User Documentation
International Nuclear Information System (INIS)
Geier, Joel
2008-06-01
This manual describes the Discrete-Feature Model (DFM) software package for modelling groundwater flow and solute transport in networks of discrete features. A discrete-feature conceptual model represents fractures and other water-conducting features around a repository as discrete conductors surrounded by a rock matrix which is usually treated as impermeable. This approximation may be valid for crystalline rocks such as granite or basalt, which have very low permeability if macroscopic fractures are excluded. A discrete feature is any entity that can conduct water and permit solute transport through bedrock, and can be reasonably represented as a piecewise-planar conductor. Examples of such entities may include individual natural fractures (joints or faults), fracture zones, and disturbed-zone features around tunnels (e.g. blasting-induced fractures or stress-concentration induced 'onion skin' fractures around underground openings). In a more abstract sense, the effectively discontinuous nature of pathways through fractured crystalline bedrock may be idealized as discrete, equivalent transmissive features that reproduce large-scale observations, even if the details of connective paths (and unconnected domains) are not precisely known. A discrete-feature model explicitly represents the fundamentally discontinuous and irregularly connected nature of such systems, by constraining flow and transport to occur only within such features and their intersections. Pathways for flow and solute transport in this conceptualization are a consequence not just of the boundary conditions and hydrologic properties (as with continuum models), but also the irregularity of connections between conductive/transmissive features. The DFM software package described here is an extensible code for investigating problems of flow and transport in geological (natural or human-altered) systems that can be characterized effectively in terms of discrete features. With this software, the
Discrete Feature Model (DFM) User Documentation
Energy Technology Data Exchange (ETDEWEB)
Geier, Joel (Clearwater Hardrock Consulting, Corvallis, OR (United States))
2008-06-15
This manual describes the Discrete-Feature Model (DFM) software package for modelling groundwater flow and solute transport in networks of discrete features. A discrete-feature conceptual model represents fractures and other water-conducting features around a repository as discrete conductors surrounded by a rock matrix which is usually treated as impermeable. This approximation may be valid for crystalline rocks such as granite or basalt, which have very low permeability if macroscopic fractures are excluded. A discrete feature is any entity that can conduct water and permit solute transport through bedrock, and can be reasonably represented as a piecewise-planar conductor. Examples of such entities may include individual natural fractures (joints or faults), fracture zones, and disturbed-zone features around tunnels (e.g. blasting-induced fractures or stress-concentration induced 'onion skin' fractures around underground openings). In a more abstract sense, the effectively discontinuous nature of pathways through fractured crystalline bedrock may be idealized as discrete, equivalent transmissive features that reproduce large-scale observations, even if the details of connective paths (and unconnected domains) are not precisely known. A discrete-feature model explicitly represents the fundamentally discontinuous and irregularly connected nature of such systems, by constraining flow and transport to occur only within such features and their intersections. Pathways for flow and solute transport in this conceptualization are a consequence not just of the boundary conditions and hydrologic properties (as with continuum models), but also the irregularity of connections between conductive/transmissive features. The DFM software package described here is an extensible code for investigating problems of flow and transport in geological (natural or human-altered) systems that can be characterized effectively in terms of discrete features. With this
Peto, Maximus V; De la Guardia, Carlos; Winslow, Ksenia; Ho, Andrew; Fortney, Kristen; Morgen, Eric
2017-08-31
Biomarkers of all-cause mortality are of tremendous clinical and research interest. Because of the long potential duration of prospective human lifespan studies, such biomarkers can play a key role in quantifying human aging and quickly evaluating any potential therapies. Decades of research into mortality biomarkers have resulted in numerous associations documented across hundreds of publications. Here, we present MortalityPredictors.org, a manually-curated, publicly accessible database, housing published, statistically-significant relationships between biomarkers and all-cause mortality in population-based or generally healthy samples. To gather the information for this database, we searched PubMed for appropriate research papers and then manually curated relevant data from each paper. We manually curated 1,576 biomarker associations, involving 471 distinct biomarkers. Biomarkers ranged in type from hematologic (red blood cell distribution width) to molecular (DNA methylation changes) to physical (grip strength). Via the web interface, the resulting data can be easily browsed, searched, and downloaded for further analysis. MortalityPredictors.org provides comprehensive results on published biomarkers of human all-cause mortality that can be used to compare biomarkers, facilitate meta-analysis, assist with the experimental design of aging studies, and serve as a central resource for analysis. We hope that it will facilitate future research into human mortality and aging.
Discrete stochastic analogs of Erlang epidemic models.
Getz, Wayne M; Dougherty, Eric R
2018-12-01
Erlang differential equation models of epidemic processes provide more realistic disease-class transition dynamics from susceptible (S) to exposed (E) to infectious (I) and removed (R) categories than the ubiquitous SEIR model. The latter itself sits at one end of the spectrum of Erlang SE^mI^nR models with m concatenated E compartments and n concatenated I compartments. Discrete-time models, however, are computationally much simpler to simulate and fit to epidemic outbreak data than continuous-time differential equations, and are also much more readily extended to include demographic and other types of stochasticity. Here we formulate discrete-time deterministic analogs of the Erlang models, and their stochastic extension, based on a time-to-go distributional principle. Depending on which distributions are used (e.g. discretized Erlang, Gamma, Beta, or Uniform distributions), we demonstrate that our formulation represents both a discretization of Erlang epidemic models and generalizations thereof. We consider the challenges of fitting SE^mI^nR models and our discrete-time analog to data (the recent outbreak of Ebola in Liberia). We demonstrate that the latter performs much better than the former; confining fits to strict SEIR formulations reduces the numerical challenges but sacrifices best-fit likelihood scores by at least 7%.
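A minimal deterministic discrete-time SEIR update, the simplest member of the SE^mI^nR family the abstract describes, can be sketched as follows. Constant per-step exit probabilities give geometric waiting times; the Erlang analogs in the paper chain several E and I compartments and use a more general time-to-go principle. All parameter values below are illustrative assumptions.

```python
import math

# Minimal deterministic discrete-time SEIR update (one E and one I
# compartment). Constant per-step exit probabilities p_ei and p_ir give
# geometric waiting times; beta controls the per-step force of infection.

def seir_step(s, e, i, r, beta, p_ei, p_ir):
    n = s + e + i + r
    # Per-susceptible infection probability this step: 1 - exp(-beta * I/N).
    new_inf = s * (1.0 - math.exp(-beta * i / n))
    s2 = s - new_inf
    e2 = e + new_inf - p_ei * e
    i2 = i + p_ei * e - p_ir * i
    r2 = r + p_ir * i
    return s2, e2, i2, r2

if __name__ == "__main__":
    state = (999.0, 0.0, 1.0, 0.0)
    for _ in range(100):
        state = seir_step(*state, beta=0.4, p_ei=0.25, p_ir=0.2)
    print(state)
```

The update conserves total population and keeps all compartments non-negative whenever the exit probabilities lie in [0, 1], which is one reason discrete-time formulations are convenient to fit to outbreak data.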
Positivity for Convective Semi-discretizations
Fekete, Imre
2017-04-19
We propose a technique for investigating stability properties like positivity and forward invariance of an interval for method-of-lines discretizations, and apply the technique to study positivity preservation for a class of TVD semi-discretizations of 1D scalar hyperbolic conservation laws. This technique is a generalization of the approach suggested in Khalsaraei (J Comput Appl Math 235(1): 137–143, 2010). We give more relaxed conditions on the time-step for positivity preservation for slope-limited semi-discretizations integrated in time with explicit Runge–Kutta methods. We show that the step-size restrictions derived are sharp in a certain sense, and that many higher-order explicit Runge–Kutta methods, including the classical 4th-order method and all non-confluent methods with a negative Butcher coefficient, cannot generally maintain positivity for these semi-discretizations under any positive step size. We also apply the proposed technique to centered finite difference discretizations of scalar hyperbolic and parabolic problems.
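The positivity question studied above can be illustrated with the simplest convective semi-discretization: first-order upwind in space with forward Euler in time preserves positivity for the advection equation u_t + a u_x = 0 exactly when the CFL number nu = a*dt/dx lies in [0, 1]. The grid and data below are illustrative; the paper treats far more general TVD schemes and Runge-Kutta integrators.

```python
# Upwind advection with forward Euler on a periodic grid. Each update is
# u_i <- (1 - nu) * u_i + nu * u_{i-1}: a convex combination of old values
# when 0 <= nu <= 1, hence positivity-preserving; for nu > 1 the negative
# coefficient (1 - nu) can produce negative values from positive data.

def upwind_euler_step(u, nu):
    n = len(u)
    return [u[i] - nu * (u[i] - u[i - 1]) for i in range(n)]

def min_after_steps(u, nu, steps):
    for _ in range(steps):
        u = upwind_euler_step(u, nu)
    return min(u)

if __name__ == "__main__":
    u0 = [1.0 if 3 <= i <= 6 else 0.0 for i in range(20)]
    print(min_after_steps(u0, nu=0.8, steps=50))   # stays >= 0
    print(min_after_steps(u0, nu=1.5, steps=50))   # goes negative
```

The step-size restrictions derived in the paper generalize exactly this kind of bound from forward Euler to higher-order explicit Runge-Kutta methods.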
Noether symmetries of discrete mechanico–electrical systems
International Nuclear Information System (INIS)
Fu Jingli; Xie Fengping; Chen Benyong
2008-01-01
This paper focuses on studying Noether symmetries and conservation laws of the discrete mechanico-electrical systems with the nonconservative and the dissipative forces. Based on the invariance of discrete Hamilton action of the systems under the infinitesimal transformation with respect to the generalized coordinates, the generalized electrical quantities and time, it presents the discrete analogue of variational principle, the discrete analogue of Lagrange–Maxwell equations, the discrete analogue of Noether theorems for Lagrange–Maxwell and Lagrange mechanico-electrical systems. Also, the discrete Noether operator identity and the discrete Noether-type conservation laws are obtained for these systems. An actual example is given to illustrate these results. (general)
Discrete breathers for a discrete nonlinear Schrödinger ring coupled to a central site.
Jason, Peter; Johansson, Magnus
2016-01-01
We examine the existence and properties of certain discrete breathers for a discrete nonlinear Schrödinger model where all but one site are placed in a ring and coupled to the additional central site. The discrete breathers we focus on are stationary solutions mainly localized on one or a few of the ring sites and possibly also the central site. By numerical methods, we trace out and study the continuous families the discrete breathers belong to. Our main result is the discovery of a split bifurcation at a critical value of the coupling between neighboring ring sites. Below this critical value, families form closed loops in a certain parameter space, implying that discrete breathers with and without central-site occupation belong to the same family. Above the split bifurcation the families split up into several separate ones, which bifurcate with solutions with constant ring amplitudes. For symmetry reasons, the families have different properties below the split bifurcation for even and odd numbers of sites. It is also determined under which conditions the discrete breathers are linearly stable. The dynamics of some simpler initial conditions that approximate the discrete breathers are also studied and the parameter regimes where the dynamics remain localized close to the initially excited ring site are related to the linear stability of the exact discrete breathers.
Discrete Localized States and Localization Dynamics in Discrete Nonlinear Schrödinger Equations
DEFF Research Database (Denmark)
Christiansen, Peter Leth; Gaididei, Yu.B.; Mezentsev, V.K.
1996-01-01
Dynamics of two-dimensional discrete structures is studied in the framework of the generalized two-dimensional discrete nonlinear Schrodinger equation. The nonlinear coupling in the form of the Ablowitz-Ladik nonlinearity is taken into account. Stability properties of the stationary solutions...
Rosenstein, Joseph G., Ed.; Franzblau, Deborah S., Ed.; Roberts, Fred S., Ed.
This book is a collection of articles by experienced educators and explains why and how discrete mathematics should be taught in K-12 classrooms. It includes evidence for "why" and practical guidance for "how" and also discusses how discrete mathematics can be used as a vehicle for achieving the broader goals of the major…
Lassere, Marissa N; Johnson, Kent R; Boers, Maarten; Tugwell, Peter; Brooks, Peter; Simon, Lee; Strand, Vibeke; Conaghan, Philip G; Ostergaard, Mikkel; Maksymowych, Walter P; Landewe, Robert; Bresnihan, Barry; Tak, Paul-Peter; Wakefield, Richard; Mease, Philip; Bingham, Clifton O; Hughes, Michael; Altman, Doug; Buyse, Marc; Galbraith, Sally; Wells, George
2007-03-01
There are clear advantages to using biomarkers and surrogate endpoints, but concerns about clinical and statistical validity, and the lack of systematic methods to evaluate these aspects, hinder their efficient application. Our objective was to review the literature on biomarkers and surrogates to develop a hierarchical schema that systematically evaluates and ranks the surrogacy status of biomarkers and surrogates, and to obtain feedback from stakeholders. After a systematic search of Medline and Embase on biomarkers, surrogates (outcomes, endpoints, markers, indicators), intermediate endpoints, and leading indicators, a quantitative surrogate validation schema was developed and subsequently evaluated at a stakeholder workshop. The search identified several classification schemata and definitions. Components of these were incorporated into a new quantitative surrogate validation level-of-evidence schema that evaluates biomarkers along 4 domains: Target, Study Design, Statistical Strength, and Penalties. Scores derived from 3 domains (the Target that the marker is being substituted for, the Design of the best evidence, and the Statistical strength) are additive. Penalties are then applied if there is serious counterevidence. A total score (0 to 15) determines the level of evidence, with Level 1 the strongest and Level 5 the weakest. It was proposed that the term "surrogate" be restricted to markers attaining Levels 1 or 2 only. Most stakeholders agreed that this operationalization of the National Institutes of Health definitions of biomarker, surrogate endpoint, and clinical endpoint was useful. Further development and application of this schema provides incentives and guidance for effective biomarker and surrogate endpoint research, and more efficient drug discovery, development, and approval.
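The additive scoring logic described above (Target + Study Design + Statistical Strength, minus Penalties, mapped onto a 0-15 total and a Level 1-5 rating) can be sketched as a small function. The per-domain ranges and the score-to-level thresholds below are illustrative assumptions; the paper defines the actual rubric.

```python
# Sketch of the additive surrogate-validation score: three domain scores
# are summed, penalties subtracted, and the 0-15 total mapped to a level
# of evidence (Level 1 strongest, Level 5 weakest). The band thresholds
# here are hypothetical, chosen only to illustrate the mapping.

def surrogacy_level(target, design, statistical, penalties=0):
    total = max(0, min(15, target + design + statistical - penalties))
    if total >= 13:
        return total, 1
    if total >= 10:
        return total, 2
    if total >= 7:
        return total, 3
    if total >= 4:
        return total, 4
    return total, 5

if __name__ == "__main__":
    print(surrogacy_level(5, 5, 4, penalties=1))  # (13, 1)
```

Under the schema's proposal, only markers reaching Level 1 or 2 would earn the term "surrogate".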
Inference of Causal Relationships between Biomarkers and Outcomes in High Dimensions
Directory of Open Access Journals (Sweden)
Felix Agakov
2011-12-01
We describe a unified computational framework for learning causal dependencies between genotypes, biomarkers, and phenotypic outcomes from large-scale data. In contrast to previous studies, our framework allows for noisy measurements, hidden confounders, missing data, and pleiotropic effects of genotypes on outcomes. The method exploits the use of genotypes as "instrumental variables" to infer causal associations between phenotypic biomarkers and outcomes, without requiring the assumption that genotypic effects are mediated only through the observed biomarkers. The framework builds on sparse linear methods developed in statistics and machine learning and modified here for inferring structures of richer networks with latent variables. Where the biomarkers are gene transcripts, the method can be used for fine mapping of quantitative trait loci (QTLs) detected in genetic linkage studies. To demonstrate our method, we examined effects of gene transcript levels in the liver on plasma HDL cholesterol levels in a sample of 260 mice from a heterogeneous stock.
Electroencephalography Is a Good Complement to Currently Established Dementia Biomarkers
DEFF Research Database (Denmark)
Ferreira, Daniel; Jelic, Vesna; Cavallin, Lena
2016-01-01
BACKGROUND/AIMS: Dementia biomarkers that are accessible and easily applicable in nonspecialized clinical settings are urgently needed. Quantitative electroencephalography (qEEG) is a good candidate, and the statistical pattern recognition (SPR) method has recently provided promising results. We ... [135 Alzheimer's disease (AD), 15 dementia with Lewy bodies/Parkinson's disease with dementia (DLB/PDD), 32 other dementias]. The EEG data were recorded in a standardized way. Structural imaging data were visually rated using scales of atrophy in the medial temporal, frontal, and posterior cortex. ... Adding qEEG to the diagnostic workup substantially increases the detection of AD pathology even in pre-dementia stages and improves differential diagnosis. EEG could serve as a good complement to currently established dementia biomarkers since it is cheap, noninvasive, and extensively applied outside academic centers.
Glycoscience aids in biomarker discovery
Directory of Open Access Journals (Sweden)
Serenus Hua; Hyun Joo An
2012-06-01
Full Text Available The glycome consists of all glycans (or carbohydrates) within a biological system, and modulates a wide range of important biological activities, from protein folding to cellular communications. The mining of the glycome for disease markers represents a new paradigm for biomarker discovery; however, this effort is severely complicated by the vast complexity and structural diversity of glycans. This review summarizes recent developments in analytical technology and methodology as applied to the fields of glycomics and glycoproteomics. Mass spectrometric strategies for glycan compositional profiling are described, as are potential refinements which allow structure-specific profiling. Analytical methods that can discern protein glycosylation at a specific site of modification are also discussed in detail. Biomarker discovery applications are shown at each level of analysis, highlighting the key role that glycoscience can play in helping scientists understand disease biology.
Candidate immune biomarkers for radioimmunotherapy.
Levy, Antonin; Nigro, Giulia; Sansonetti, Philippe J; Deutsch, Eric
2017-08-01
Newly available immune checkpoint blockers (ICBs), capable of reverting tumor immune tolerance, are revolutionizing the anticancer armamentarium. Recent evidence also established that ionizing radiation (IR) can produce antitumor immune responses, and may synergize with ICBs. Multiple radioimmunotherapy combinations are therefore currently being assessed in early clinical trials. Past examples have highlighted the need for treatment personalization, and there is an unmet need to decipher immunological biomarkers that could allow selecting patients who would benefit from these promising but expensive combinations. Recent studies have identified potential predictive and prognostic immune assays at the cellular (tumor microenvironment composition), genomic (mutational/neoantigen load), and peripheral blood levels. Within this review, we collected the available evidence regarding potential personalized immune biomarker-directed radiation therapy strategies that might be used for patient selection in the era of radioimmunotherapy. Copyright © 2017. Published by Elsevier B.V.
Limit sets for the discrete spectrum of complex Jacobi matrices
International Nuclear Information System (INIS)
Golinskii, L B; Egorova, I E
2005-01-01
The discrete spectrum of complex Jacobi matrices that are compact perturbations of the discrete Laplacian is studied. The precise stabilization rate (in the sense of order) of the matrix elements ensuring the finiteness of the discrete spectrum is found. An example of a Jacobi matrix with discrete spectrum having a unique limit point is constructed. These results are discrete analogues of Pavlov's well-known results on Schroedinger operators with complex potential on a half-axis.
Euler-Poincare reduction for discrete field theories
International Nuclear Information System (INIS)
Vankerschaver, Joris
2007-01-01
In this note, we develop a theory of Euler-Poincare reduction for discrete Lagrangian field theories. We introduce the concept of Euler-Poincare equations for discrete field theories, as well as a natural extension of the Moser-Veselov scheme, and show that both are equivalent. The resulting discrete field equations are interpreted in terms of discrete differential geometry. An application to the theory of discrete harmonic mappings is also briefly discussed
Integrals of Motion for Discrete-Time Optimal Control Problems
Torres, Delfim F. M.
2003-01-01
We obtain a discrete time analog of E. Noether's theorem in Optimal Control, asserting that integrals of motion associated to the discrete time Pontryagin Maximum Principle can be computed from the quasi-invariance properties of the discrete time Lagrangian and discrete time control system. As corollaries, results for first-order and higher-order discrete problems of the calculus of variations are obtained.
The ultimatum game: Discrete vs. continuous offers
Dishon-Berkovits, Miriam; Berkovits, Richard
2014-09-01
In many experimental setups in the social sciences, psychology, and economics, subjects are asked to accept or dispense monetary compensation, which is usually given in discrete units. Using computer and mathematical modeling we show that, in the framework of studying the dynamics of acceptance of proposals in the ultimatum game, the long-time dynamics of acceptance of offers are completely different for discrete vs. continuous offers. For discrete values the dynamics follow an exponential behavior, whereas for continuous offers the dynamics are described by a power law. This is shown using an agent-based computer simulation as well as by utilizing an analytical solution of a mean-field equation describing the model. These findings have implications for the design and interpretation of socio-economic experiments beyond the ultimatum game.
Symmetric, discrete fractional splines and Gabor systems
DEFF Research Database (Denmark)
Søndergaard, Peter Lempel
2006-01-01
In this paper we consider fractional splines as windows for Gabor frames. We introduce two new types of symmetric, fractional splines in addition to one found by Unser and Blu. For the finite, discrete case we present two families of splines: one is created by sampling and periodizing the continuous splines, and one is a truly finite, discrete construction. We discuss the properties of these splines and their usefulness as windows for Gabor frames and Wilson bases.
Sputtering calculations with the discrete ordinates method
International Nuclear Information System (INIS)
Hoffman, T.J.; Dodds, H.L. Jr.; Robinson, M.T.; Holmes, D.K.
1977-01-01
The purpose of this work is to investigate the applicability of the discrete ordinates (S/sub N/) method to light ion sputtering problems. In particular, the neutral particle discrete ordinates computer code, ANISN, was used to calculate sputtering yields. No modifications to this code were necessary to treat charged particle transport. However, a cross section processing code was written for the generation of multigroup cross sections; these cross sections include a modification to the total macroscopic cross section to account for electronic interactions and small-scattering-angle elastic interactions. The discrete ordinates approach enables calculation of the sputtering yield as a function of incident energy and angle, as well as of many related quantities such as ion reflection coefficients, angular and energy distributions of sputtered particles, the behavior of beams penetrating thin foils, etc. The results of several sputtering problems as calculated with ANISN are presented
Direct Discrete Method for Neutronic Calculations
International Nuclear Information System (INIS)
Vosoughi, Naser; Akbar Salehi, Ali; Shahriari, Majid
2002-01-01
The objective of this paper is to introduce a new direct method for neutronic calculations. This method, named the Direct Discrete Method, is simpler than the neutron transport equation and also more compatible with the physical meaning of problems. The method is based on the physics of the problem: after meshing the desired geometry, a balance equation is written for each mesh interval, and, by accounting for the connections between these mesh intervals, the final series of discrete equations is produced directly, without deriving the neutron transport differential equation or passing through that differential-equation bridge. We have produced neutron discrete equations for a cylindrical geometry with two boundary conditions in one energy group. The correctness of the results from this method is tested against an MCNP-4B code execution. (authors)
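The mesh-balance idea can be illustrated with a small sketch: write a particle balance (leakage to neighbours plus absorption equals source) for each cell of a mesh and solve the resulting linear system directly, never forming the differential equation. This is a 1-D, one-group, diffusion-style analogue with made-up coefficients and a vacuum-like boundary, not the paper's cylindrical formulation:

```python
import numpy as np

def direct_balance_1d(n, h, D, sigma_a, source):
    """Direct balance on a 1-D mesh of n cells of width h.

    For each cell i: leakage to both neighbours (D/h per face) plus
    absorption (sigma_a * h) equals the source (source * h). Missing
    neighbour terms at the two end cells impose a zero-flux ghost cell,
    i.e. a vacuum-like boundary.
    """
    A = np.zeros((n, n))
    b = np.full(n, source * h)
    for i in range(n):
        A[i, i] = 2 * D / h + sigma_a * h
        if i > 0:
            A[i, i - 1] = -D / h
        if i < n - 1:
            A[i, i + 1] = -D / h
    return np.linalg.solve(A, b)

flux = direct_balance_1d(n=50, h=0.1, D=1.0, sigma_a=0.5, source=1.0)
```

By a discrete maximum principle, the flux stays positive and below the infinite-medium value source/sigma_a, and the symmetric setup yields a symmetric profile.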
An algebra of discrete event processes
Heymann, Michael; Meyer, George
1991-01-01
This report deals with an algebraic framework for modeling and control of discrete event processes. The report consists of two parts. The first part is introductory, and consists of a tutorial survey of the theory of concurrency in the spirit of Hoare's CSP, and an examination of the suitability of such an algebraic framework for dealing with various aspects of discrete event control. To this end a new concurrency operator is introduced and it is shown how the resulting framework can be applied. It is further shown that a suitable theory that deals with the new concurrency operator must be developed. In the second part of the report the formal algebra of discrete event control is developed. At the present time the second part of the report is still an incomplete and occasionally tentative working paper.
Is Fitts' law continuous in discrete aiming?
Directory of Open Access Journals (Sweden)
Rita Sleimen-Malkoun
Full Text Available The lawful continuous linear relation between movement time and task difficulty (i.e., index of difficulty, ID) in a goal-directed rapid aiming task (Fitts' law) has recently been challenged in reciprocal performance. Specifically, a discontinuity was observed at a critical ID and was attributed to a transition between two distinct dynamic regimes that occurs with increasing difficulty. In the present paper, we show that such a discontinuity is also present in discrete aiming when ID is manipulated via target width (experiment 1) but not via target distance (experiment 2). Fitts' law's discontinuity appears, therefore, to be a suitable indicator of the underlying functional adaptations of the neuro-muscular-skeletal system to task properties/requirements, independently of the reciprocal or discrete nature of the task. These findings open new perspectives for the study of the dynamic regimes involved in discrete aiming and the sensori-motor mechanisms underlying the speed-accuracy trade-off.
Acceleration techniques for the discrete ordinate method
International Nuclear Information System (INIS)
Efremenko, Dmitry; Doicu, Adrian; Loyola, Diego; Trautmann, Thomas
2013-01-01
In this paper we analyze several acceleration techniques for the discrete ordinate method with matrix exponential and the small-angle modification of the radiative transfer equation. These techniques include the left eigenvectors matrix approach for computing the inverse of the right eigenvectors matrix, the telescoping technique, and the method of false discrete ordinate. The numerical simulations have shown that, on average, the relative speedups of the left eigenvector matrix approach and the telescoping technique are about 15% and 30%, respectively. -- Highlights: ► We presented the left eigenvector matrix approach. ► We analyzed the method of false discrete ordinate. ► The telescoping technique is applied to the matrix operator method. ► The considered techniques accelerate the computations by 20% on average.
Some challenges with statistical inference in adaptive designs.
Hung, H M James; Wang, Sue-Jane; Yang, Peiling
2014-01-01
Adaptive designs have generated a great deal of attention in clinical trial communities. The literature contains many statistical methods to deal with added statistical uncertainties concerning the adaptations. Increasingly encountered in regulatory applications are adaptive statistical information designs that allow modification of sample size or related statistical information and adaptive selection designs that allow selection of doses or patient populations during the course of a clinical trial. For adaptive statistical information designs, a few statistical testing methods are mathematically equivalent, as a number of articles have stipulated, but arguably there are large differences in their practical ramifications. We pinpoint some undesirable features of these methods in this work. For adaptive selection designs, the selection based on biomarker data for testing the correlated clinical endpoints may increase statistical uncertainty in terms of type I error probability, and most importantly the increased statistical uncertainty may be impossible to assess.
Biomarkers in adult posthemorrhagic hydrocephalus.
Hua, Cong; Zhao, Gang
2017-08-01
Posthemorrhagic hydrocephalus is a severe complication following intracranial hemorrhage. Posthemorrhagic hydrocephalus is often associated with high morbidity and mortality and serves as an important clinical predictor of adverse outcomes after intracranial hemorrhage. Currently, no effective medical intervention exists to improve functional outcomes in posthemorrhagic hydrocephalus patients because little is still known about the mechanisms of posthemorrhagic hydrocephalus pathogenesis. Because a better understanding of the posthemorrhagic hydrocephalus pathogenesis would facilitate development of clinical treatments, this is an active research area. The purpose of this review is to describe recent progress in elucidation of molecular mechanisms that cause posthemorrhagic hydrocephalus. What we are certain of is that the entry of blood into the ventricular system and subarachnoid space results in release of lytic blood products which cause a series of physiological and pathological changes in the brain. Blood components that can be linked to pathology would serve as disease biomarkers. From studies of posthemorrhagic hydrocephalus, such biomarkers are known to mutually synergize to initiate and promote posthemorrhagic hydrocephalus progression. These findings suggest that modulation of biomarker expression or function may benefit posthemorrhagic hydrocephalus patients.
Discrete quantum geometries and their effective dimension
International Nuclear Information System (INIS)
Thuerigen, Johannes
2015-01-01
In several approaches towards a quantum theory of gravity, such as group field theory and loop quantum gravity, quantum states and histories of the geometric degrees of freedom turn out to be based on discrete spacetime. The most pressing issue is then how the smooth geometries of general relativity, expressed in terms of suitable geometric observables, arise from such discrete quantum geometries in some semiclassical and continuum limit. In this thesis I tackle the question of suitable observables, focusing on the effective dimension of discrete quantum geometries. For this purpose I give a purely combinatorial description of the discrete structures which these geometries have support on. As a side topic, this makes it possible to present an extension of group field theory to cover the combinatorially larger kinematical state space of loop quantum gravity. Moreover, I introduce a discrete calculus for fields on such fundamentally discrete geometries, with a particular focus on the Laplacian. This permits defining the effective-dimension observables for quantum geometries. Analysing various classes of quantum geometries, I find as a general result that the spectral dimension is more sensitive to the underlying combinatorial structure than to the details of the additional geometric data thereon. Semiclassical states in loop quantum gravity approximate the classical geometries they are peaking on rather well, and there are no indications for stronger quantum effects. On the other hand, in the context of a more general model of states which are superpositions over a large number of complexes, based on analytic solutions, there is a flow of the spectral dimension from the topological dimension d on low energy scales to a real number between 0 and d on high energy scales. In the particular case of 1 these results allow one to understand the quantum geometry as effectively fractal.
Synchronization Of Parallel Discrete Event Simulations
Steinman, Jeffrey S.
1992-01-01
Adaptive, parallel, discrete-event-simulation-synchronization algorithm, Breathing Time Buckets, developed in Synchronous Parallel Environment for Emulation and Discrete Event Simulation (SPEEDES) operating system. Algorithm allows parallel simulations to process events optimistically in fluctuating time cycles that naturally adapt while simulation in progress. Combines best of optimistic and conservative synchronization strategies while avoiding major disadvantages. Algorithm processes events optimistically in time cycles adapting while simulation in progress. Well suited for modeling communication networks, for large-scale war games, for simulated flights of aircraft, for simulations of computer equipment, for mathematical modeling, for interactive engineering simulations, and for depictions of flows of information.
Speeding Up Network Simulations Using Discrete Time
Lucas, Aaron; Armbruster, Benjamin
2013-01-01
We develop a way of simulating disease spread in networks faster at the cost of some accuracy. Instead of a discrete event simulation (DES) we use a discrete time simulation, which aggregates events into time periods. We prove a bound on the accuracy attained. We also discuss the choice of step size and give an analytical comparison of the computational costs. Our error bound concept comes from the theory of numerical methods for SDEs, and the basic proof structure comes from the theory of numerical methods.
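The event-aggregation idea can be sketched as follows: rather than ordering individual transmission events in continuous time (as a DES would), each step processes one whole time period at once. The toy susceptible-infected model, the ring network, and the parameter values below are illustrative choices, not the paper's setup:

```python
import random

def discrete_time_si(adj, seed, p_transmit, steps):
    """Discrete-time approximation of epidemic spread on a network.

    Each step aggregates all transmission events of one time period:
    every infected node independently infects each susceptible
    neighbour with probability p_transmit per step. Aggregating into
    periods trades some accuracy for speed relative to a DES.
    """
    infected = {seed}
    history = [len(infected)]
    for _ in range(steps):
        new = set()
        for u in infected:
            for v in adj[u]:
                if v not in infected and random.random() < p_transmit:
                    new.add(v)
        infected |= new
        history.append(len(infected))
    return history

random.seed(1)
# Hypothetical toy graph: a ring of 20 nodes.
adj = {i: [(i - 1) % 20, (i + 1) % 20] for i in range(20)}
counts = discrete_time_si(adj, seed=0, p_transmit=0.5, steps=30)
```

Shrinking the step (and p_transmit with it) moves the approximation closer to the continuous-time DES, at the cost of more steps, which is exactly the step-size trade-off the abstract discusses.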
PHASE CHAOS IN THE DISCRETE KURAMOTO MODEL
DEFF Research Database (Denmark)
Maistrenko, V.; Vasylenko, A.; Maistrenko, Y.
2010-01-01
The paper describes the appearance of a novel, high-dimensional chaotic regime, called phase chaos, in a time-discrete Kuramoto model of globally coupled phase oscillators. This type of chaos is observed at small and intermediate values of the coupling strength. It arises from the nonlinear interaction among the oscillators, while the individual oscillators behave periodically when left uncoupled. For the four-dimensional time-discrete Kuramoto model, we outline the region of phase chaos in the parameter plane and determine the regions where phase chaos coexists with different periodic...
Digital and discrete geometry theory and algorithms
Chen, Li
2014-01-01
This book provides comprehensive coverage of the modern methods for geometric problems in the computing sciences. It also covers concurrent topics in data sciences including geometric processing, manifold learning, Google search, cloud data, and R-tree for wireless networks and BigData. The author investigates digital geometry and its related constructive methods in discrete geometry, offering detailed methods and algorithms. The book is divided into five sections: basic geometry; digital curves, surfaces and manifolds; discretely represented objects; geometric computation and processing; and a...
A Low Complexity Discrete Radiosity Method
Chatelier , Pierre Yves; Malgouyres , Rémy
2006-01-01
Rather than using Monte Carlo sampling techniques or patch projections to compute radiosity, it is possible to use a discretization of a scene into voxels and perform some discrete geometry calculus to quickly compute visibility information. In such a framework, the radiosity method may be as precise as a patch-based radiosity using hemicube computation for form-factors, but it lowers the overall theoretical complexity to O(N log N) + O(N), where the O(N) is largel...
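The discrete-geometry visibility query at the heart of such a voxel radiosity method can be sketched by stepping along the integer line between two cells and checking for opaque voxels. This is a simplified 2-D Bresenham-style sketch of the general idea, not the paper's 3-D method:

```python
def voxel_visible(grid, a, b):
    """Mutual visibility of two cells in a 2-D voxel grid.

    Walks the discrete line from a to b (integer Bresenham stepping)
    and reports False if any intermediate voxel is opaque (truthy).
    grid is indexed as grid[y][x].
    """
    (x0, y0), (x1, y1) = a, b
    dx, dy = abs(x1 - x0), abs(y1 - y0)
    sx = 1 if x1 > x0 else -1
    sy = 1 if y1 > y0 else -1
    err = dx - dy
    x, y = x0, y0
    while (x, y) != (x1, y1):
        if (x, y) != (x0, y0) and grid[y][x]:  # opaque voxel blocks the ray
            return False
        e2 = 2 * err
        if e2 > -dy:
            err -= dy
            x += sx
        if e2 < dx:
            err += dx
            y += sy
    return True

grid = [[0] * 5 for _ in range(5)]
grid[2][2] = 1  # one opaque blocker in the middle of the grid
```

Because each query touches only the voxels on one discrete line, visibility costs O(line length) rather than requiring a hemicube render per patch pair.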
Modeling and simulation of discrete event systems
Choi, Byoung Kyu
2013-01-01
Computer modeling and simulation (M&S) allows engineers to study and analyze complex systems. Discrete-event system (DES) M&S is used in modern management, industrial engineering, computer science, and the military. As computer speeds and memory capacities increase, DES-M&S tools become more powerful and more widely used in solving real-life problems. Based on over 20 years of evolution within a classroom environment, as well as on decades-long experience in developing simulation-based solutions for high-tech industries, Modeling and Simulation of Discrete-Event Systems is the only book on...
Logic and discrete mathematics a concise introduction
Conradie, Willem
2015-01-01
A concise yet rigorous introduction to logic and discrete mathematics. This book features a unique combination of comprehensive coverage of logic with a solid exposition of the most important fields of discrete mathematics, presenting material that has been tested and refined by the authors in university courses taught over more than a decade. The chapters on logic - propositional and first-order - provide a robust toolkit for logical reasoning, emphasizing the conceptual understanding of the language and the semantics of classical logic as well as practical applications through the easy
Semiclassical expanding discrete space-times
International Nuclear Information System (INIS)
Cobb, W.K.; Smalley, L.L.
1981-01-01
Given the close ties between general relativity and geometry, one might reasonably expect that quantum effects associated with gravitation might also be tied to the geometry of space-time, namely, to some sort of discreteness in space-time itself. In particular, it is supposed that space-time consists of a discrete lattice of points rather than the usual continuum. Since astronomical evidence seems to suggest that the universe is expanding, the lattice must also expand. Some of the implications of such a model are that the proton should presently be stable, and the universe should be closed, although the mechanism for closure is quantum mechanical. (author)